matrex v0.5.3 Matrex.Algorithms

Machine learning algorithms using matrices.

Summary

Functions

Minimizes a continuous differentiable multivariate function

Linear regression cost and gradient function with regularization from Andrew Ng’s course (ex3)

The same cost function, implemented with operators from the Matrex.Operators module

Functions

fmincg(f, x, fParams, length)
fmincg(
  (Matrex.t(), any() -> {float(), Matrex.t()}),
  Matrex.t(),
  any(),
  integer()
) :: {Matrex.t(), [float()], pos_integer()}

Minimizes a continuous differentiable multivariate function.

Ported to Elixir from the Octave version found in Andrew Ng’s course, (c) Carl Edward Rasmussen.

f — cost function that takes two parameters: the current value of x and fParams. For example, lr_cost_fun/2.

x — vector of parameters to optimize, so that the cost function returns its minimum value.

fParams — this value is passed as the second parameter to the cost function.

length — number of iterations to perform.

Returns the column matrix of found solutions, the list of cost function values, and the number of iterations used.

The starting point is given by x (a D by 1 column vector), and the function f must return a function value and a vector of partial derivatives. The Polack-Ribiere flavour of conjugate gradients is used to compute search directions, and a line search using quadratic and cubic polynomial approximations and the Wolfe-Powell stopping criteria is used together with the slope ratio method for guessing initial step sizes. Additionally, a number of checks are made to ensure that exploration is taking place and that extrapolation will not be unboundedly large.
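
For illustration, a minimal usage sketch; the dataset, shapes, and lambda value are invented for the example, while fmincg/4 and lr_cost_fun/2 come from this module:

# Tiny made-up binary classification problem: 4 samples, bias column first.
x = Matrex.new([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
y = Matrex.new([[0.0], [0.0], [1.0], [1.0]])
theta0 = Matrex.zeros(2, 1)   # starting point, D by 1
lambda = 0.1                  # regularization strength

{theta, costs, iterations} =
  Matrex.Algorithms.fmincg(&Matrex.Algorithms.lr_cost_fun/2, theta0, {x, y, lambda}, 100)
# theta: optimized parameters; costs: cost value at each step; iterations: steps actually used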

lr_cost_fun(theta, params)
lr_cost_fun(Matrex.t(), {Matrex.t(), Matrex.t(), number()}) ::
  {float(), Matrex.t()}

Linear regression cost and gradient function with regularization from Andrew Ng’s course (ex3).

Computes the cost of using theta as the parameter for regularized logistic regression, and the gradient of the cost w.r.t. the parameters.

Compatible with the fmincg/4 algorithm from this module.
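
For reference, this is the standard regularized logistic-regression cost from the course, where the bias parameter is excluded from the regularization term:

J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \left[ -y^{(i)} \log h_\theta(x^{(i)}) - \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right) \right] + \frac{\lambda}{2m} \sum_{j=2}^{n} \theta_j^2, \qquad h_\theta(x) = \sigma(\theta^T x)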

theta — parameters to compute the cost for.

X — training data input.

y — training data output.

lambda — regularization parameter.
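
A direct call might look like this, with theta0, x, y and lambda as in the fmincg/4 example above; note that X, y and lambda are packed into a single params tuple:

{cost, gradient} = Matrex.Algorithms.lr_cost_fun(theta0, {x, y, lambda})
# cost is a float; gradient is a column matrix with the same shape as theta0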

lr_cost_fun_ops(theta, params)

The same cost function, implemented with operators from the Matrex.Operators module.

Works about two times slower than the standard implementation, but is much more readable.
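
A sketch of what the operator style can look like (an illustration, not this module’s actual source): it assumes Matrex.Operators redefines +, -, * and / for matrices, with * as the matrix product, and provides shortcuts such as sigmoid/1, log/1 and t/1. Regularization is omitted for brevity:

defmodule CostSketch do
  # Operator-style logistic regression cost and gradient (a sketch).
  def cost(theta, {x, y, _lambda}) do
    # Replace Kernel arithmetic with the matrix-aware operators.
    import Kernel, except: [-: 1, +: 2, -: 2, *: 2, /: 2]
    import Matrex.Operators

    m = y[:rows]
    h = sigmoid(x * theta)                            # hypothesis, m by 1
    j = (-t(y) * log(h) - t(1 - y) * log(1 - h)) / m  # 1 by 1 cost matrix
    grad = t(x) * (h - y) / m                         # gradient w.r.t. theta
    {Matrex.scalar(j), grad}
  end
end

For comparison, the same hypothesis in the standard piped style would read roughly Matrex.dot(x, theta) |> Matrex.apply(:sigmoid).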