Deeppipe (DeepPipe2 v1.1.1)

Main module of DeepPipe2.

Functions for deep learning.

Summary

Functions

calculate accuracy

download(x) download (and decompress) the dataset selected by x: :mnist, :fashion, :cifar10, or :iris

forward returns all middle data

for debugging, invoke garbage collection forcibly.

gradient with backpropagation

learning(network1,network2) learning/2 1st arg is the old network list, 2nd arg is the network with gradients. Generates a new network with learned weights and biases. The update method is SGD.

learning(network1,network2,update_method) learning/3 update method is :momentum, :adagrad, or :sgd

load network from file

display newline

normalize dataset element. normalize(x,bias,div) computes (x + bias) / div, e.g. bias = -127, div = 255 maps 0~255 => -0.5~0.5

numerical_gradient(ts,network,train) numerical gradient for debugging. 1st arg input tensor, 2nd arg network, 3rd arg train matrix

display network

select m random samples from the image data and the train data, with indices ranging from 0 to n, and generate a tuple of two matrices

retrain: load a network from a file and restart learning

save network to file

for debugging, forcibly stop

translate a number into a one-hot list (the 2nd arg is the maximum index)

1st arg network
2nd arg train image list
3rd arg train onehot list
4th arg test image list
5th arg test label list
6th arg loss function (:cross or :square)
7th arg learning method
8th arg minibatch size
9th arg repeat number

automatically save the network to temp.ex

Functions

accuracy(image, network, label)

calculate accuracy
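
For example, a minimal sketch (ts_imag, ts_label, and network are placeholders for test data and a trained network built elsewhere):

iex(1)> Deeppipe.accuracy(ts_imag, network, ts_label)  # returns the accuracy as a float, e.g. 0.95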

download(x) downloads a dataset depending on x:
:mnist download and decompress the MNIST dataset
:fashion download and decompress the Fashion-MNIST dataset
:cifar10 download and decompress the CIFAR10 dataset
:iris download the iris dataset
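
For example:

iex(1)> Deeppipe.download(:mnist)  # fetch and decompress the MNIST dataset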

forward returns all middle data

1st arg is input data matrix
2nd arg is network list
3rd arg is generated middle layer result
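
A hedged sketch (x is an input data matrix; seeding the 3rd-arg accumulator with [x] is an assumption, not confirmed by this page):

iex(1)> Deeppipe.forward(x, network, [x])  # list of every middle-layer result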

for debugging, invoke garbage collection forcibly.

gradient with backpropagation

1st arg is input data matrix
2nd arg is network list
3rd arg is train matrix
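
A minimal sketch (x and t are placeholders for an input data matrix and a train matrix):

iex(1)> network1 = Deeppipe.gradient(x, network, t)  # network list with gradients attached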

learning(network1,network2) learning/2 1st arg is the old network list, 2nd arg is the network with gradients. Generates a new network with learned weights and biases. The update method is SGD.
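
One SGD step as a sketch, combining gradient/3 and learning/2 (variable names are placeholders):

iex(1)> network1 = Deeppipe.gradient(x, network, t)
iex(2)> network2 = Deeppipe.learning(network, network1)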

learning(network1, network2, atom)

learning(network1,network2,update_method) learning/3 update method is :momentum, :adagrad, or :sgd
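
For example, choosing the update method explicitly:

iex(1)> network2 = Deeppipe.learning(network, network1, :adagrad)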

load network from file
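
A hedged sketch, assuming load/1 takes a file name (the exact signature is not shown on this page):

iex(1)> network = Deeppipe.load("temp.ex")  # temp.ex is the file train/9 writes automatically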

display newline

normalize dataset element. normalize(x,bias,div) computes (x + bias) / div, e.g. bias = -127, div = 255 maps 0~255 => -0.5~0.5
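
Worked example with the values above: (0 + -127) / 255 ≈ -0.498 and (255 + -127) / 255 ≈ 0.502, so 0~255 maps to roughly -0.5~0.5. As a sketch (applying it element-wise to a dataset x is an assumption):

iex(1)> Deeppipe.normalize(x, -127, 255)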

numerical_gradient(x, network, t)

numerical_gradient(ts,network,train) numerical gradient for debugging. 1st arg input tensor, 2nd arg network, 3rd arg train matrix
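
A sketch for debugging, mirroring gradient/3 so the two results can be compared (placeholders as above):

iex(1)> network1 = Deeppipe.numerical_gradient(x, network, t)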

display network

random_select(image, train, m, n)

select m random samples from the image data and the train data, with indices ranging from 0 to n, and generate a tuple of two matrices
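
For example, a sketch drawing a minibatch of 100 samples out of 60000 (sizes are illustrative; tr_imag and tr_onehot are placeholders):

iex(1)> {image_batch, train_batch} = Deeppipe.random_select(tr_imag, tr_onehot, 100, 60000)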

retrain(file, tr_imag, tr_onehot, ts_imag, ts_label, loss_func, method, m, n)

retrain: load a network from a file and restart learning
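
A sketch resuming from the automatically saved file (temp.ex comes from train/9; the other arguments mirror train/9 and the names are placeholders):

iex(1)> Deeppipe.retrain("temp.ex", tr_imag, tr_onehot, ts_imag, ts_label, :cross, :sgd, 100, 100)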

save network to file
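
A hedged sketch, assuming save/2 takes a file name and a network (the exact signature is not shown on this page):

iex(1)> Deeppipe.save("mynetwork.ex", network)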

for debugging, forcibly stop

translate a number into a one-hot list (the 2nd arg is the maximum index)

iex(1)> Deeppipe.to_onehot(1,9)
[0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

train(network, tr_imag, tr_onehot, ts_imag, ts_label, loss_func, method, m, n)

1st arg network
2nd arg train image list
3rd arg train onehot list
4th arg test image list
5th arg test label list
6th arg loss function (:cross or :square)
7th arg learning method
8th arg minibatch size
9th arg repeat number

automatically save the network to temp.ex
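
Putting the arguments together, a minimal sketch of a full run (network, tr_imag, tr_onehot, ts_imag, ts_label are placeholders built elsewhere; minibatch size 100, repeated 100 times):

iex(1)> Deeppipe.train(network, tr_imag, tr_onehot, ts_imag, ts_label, :cross, :sgd, 100, 100)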