mltool-0.2.0.1: Machine Learning Toolbox

Copyright     (c) Alexander Ignatyev 2016
License       BSD-3
Stability     experimental
Portability   POSIX
Safe Haskell  None
Language      Haskell2010

MachineLearning.Optimization

Description

Optimization module: numerical methods for minimizing a model's cost function, plus a gradient checker.

Synopsis

data MinimizeMethod
minimize :: Model a => MinimizeMethod -> a -> R -> Int -> Regularization -> Matrix -> Vector -> Vector -> (Vector, Matrix)
checkGradient :: Model a => a -> Regularization -> Matrix -> Vector -> Vector -> R -> R

Documentation

data MinimizeMethod

Constructors

GradientDescent R

Gradient descent; takes alpha (the learning rate). Requires feature normalization.

MinibatchGradientDescent Int Int R

Minibatch gradient descent; takes seed, batch size, and alpha.

ConjugateGradientFR R R

Fletcher-Reeves conjugate gradient algorithm; takes size of first trial step (0.1 is fine) and tol (0.1 is fine).

ConjugateGradientPR R R

Polak-Ribiere conjugate gradient algorithm; takes size of first trial step (0.1 is fine) and tol (0.1 is fine).

BFGS2 R R

Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm; takes size of first trial step (0.1 is fine) and tol (0.1 is fine).
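
Each constructor above is a plain value, so selecting a method is just building one of them. A minimal sketch (the concrete numbers are illustrative; the 0.1/0.1 pair follows the suggestions above):

    import MachineLearning.Optimization (MinimizeMethod(..))

    -- Plain gradient descent with learning rate alpha = 0.01
    -- (normalize features first, as noted above).
    gd :: MinimizeMethod
    gd = GradientDescent 0.01

    -- Minibatch gradient descent: seed 42, batches of 32 samples, alpha = 0.01.
    mbgd :: MinimizeMethod
    mbgd = MinibatchGradientDescent 42 32 0.01

    -- BFGS with the suggested trial step and tolerance of 0.1 each.
    bfgs :: MinimizeMethod
    bfgs = BFGS2 0.1 0.1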

minimize

Arguments

  :: Model a
  => MinimizeMethod
  -> a                 -- model (Least Squares, Logistic Regression etc.)
  -> R                 -- epsilon, desired precision of the solution
  -> Int               -- maximum number of iterations allowed
  -> Regularization    -- regularization parameter
  -> Matrix            -- X
  -> Vector            -- y
  -> Vector            -- initial solution, theta
  -> (Vector, Matrix)  -- solution vector and optimization path

Returns the solution vector (theta) and the optimization path. Each row of the optimization path has the format [iteration number, cost function value, theta values...].
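
A usage sketch follows; LeastSquaresModel and RegNone are assumptions about the surrounding mltool modules (a least-squares model instance and a "no regularization" constructor), not names confirmed by this page:

    import MachineLearning.Optimization (MinimizeMethod(..), minimize)
    import MachineLearning.Regression (LeastSquaresModel(..))   -- assumed module/constructor
    import MachineLearning.Regularization (Regularization(..))  -- assumed RegNone constructor
    import Numeric.LinearAlgebra ((!), matrix, vector, toRows)

    main :: IO ()
    main = do
      let x      = matrix 2 [1,1, 1,2, 1,3, 1,4]  -- X with a bias column
          y      = vector [2, 3, 4, 5]
          theta0 = vector [0, 0]                  -- initial solution, theta
          (theta, path) = minimize (ConjugateGradientFR 0.1 0.1) LeastSquares
                                   1e-4 100 RegNone x y theta0
      print theta
      -- Each path row is [iteration number, cost function value, theta values...],
      -- so index 1 of every row gives the cost history.
      mapM_ (\row -> print (row ! 1)) (toRows path)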

checkGradient :: Model a => a -> Regularization -> Matrix -> Vector -> Vector -> R -> R

Gradient checking function. Approximates the derivatives of the model's cost function numerically, computes the derivatives with the model's gradient function, and returns the L2 norm (norm_2) of the difference between the two. Takes model, regularization, X, y, theta and epsilon (used to approximate the derivatives; 1e-4 is a good value).
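
The approximation half of this check is plain central differences. A minimal sketch of that idea, written against hmatrix; numericGrad is a hypothetical helper, not part of this module's API, and the cost/gradient names in the trailing comment stand for the Model class's cost and gradient functions:

    import Numeric.LinearAlgebra

    -- Central-difference approximation of the gradient of f at theta:
    -- for each component i, bump theta_i by +/- eps and take the slope.
    numericGrad :: (Vector R -> R) -> Vector R -> R -> Vector R
    numericGrad f theta eps = fromList [ slope i | i <- [0 .. size theta - 1] ]
      where
        slope i  = (f (bump i eps) - f (bump i (-eps))) / (2 * eps)
        bump i h = accum theta (+) [(i, h)]

    -- checkGradient then amounts to something like
    --   norm_2 (numericGrad (cost model reg x y) theta eps - gradient model reg x y theta)
    -- with eps around 1e-4, as suggested above.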