
N_Network

Neural Network implementation based on the Andrew Ng courses

Implements batch gradient descent, stochastic gradient descent (minibatch_size=1), and stochastic minibatch gradient descent:

  • Cost function: Cross Entropy Loss
  • Activation functions: relu, sigmoid, tanh
  • Regularization: l2 (lambd), Momentum (beta), Dropout (keep_prob)
  • Optimization: Minibatch Gradient Descent, RMS Prop, Adam
  • Learning rate decay: multiplies the learning rate by a fixed factor every given number of epochs
  • Fair minibatches: can build batches that preserve the same proportion of 1/0 labels as the training data
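As a rough illustration of the Adam optimizer listed above, here is a minimal single-step sketch in NumPy. The function name `adam_step` and its signature are hypothetical, not this library's API; the update itself follows the standard Adam formulation (first/second moment estimates with bias correction).

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Hypothetical helper, not this package's API: one Adam update.
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

With `beta2` set so the second moment dominates, RMSProp is recovered as a special case (drop the first moment and its bias correction).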
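The "fair minibatches" idea above can be sketched as stratified batching: split each class separately, then merge the chunks so every batch keeps roughly the training-set label proportion. The function below is an illustrative NumPy sketch for binary labels, not the package's actual implementation.

```python
import numpy as np

def fair_minibatches(X, y, batch_size, seed=0):
    # Illustrative sketch, not this package's API: yield minibatches
    # keeping roughly the same 1/0 label proportion as the full set.
    rng = np.random.default_rng(seed)
    pos = rng.permutation(np.flatnonzero(y == 1))
    neg = rng.permutation(np.flatnonzero(y == 0))
    n_batches = int(np.ceil(len(y) / batch_size))
    # Split each class into the same number of chunks, then merge.
    for p_chunk, n_chunk in zip(np.array_split(pos, n_batches),
                                np.array_split(neg, n_batches)):
        idx = rng.permutation(np.concatenate([p_chunk, n_chunk]))
        yield X[idx], y[idx]
```

For a 60/40 class split and batch_size=5, every batch then contains 3 zeros and 2 ones instead of a random mixture.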

Restriction:

  • Multiclass classification is supported only with one-hot encoded labels
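One-hot encoding, as required by the restriction above, can be produced with a few lines of NumPy (the helper name `to_onehot` is illustrative, not part of this package):

```python
import numpy as np

def to_onehot(labels, num_classes):
    # Illustrative helper: convert integer class labels to one-hot rows.
    onehot = np.zeros((len(labels), num_classes))
    onehot[np.arange(len(labels)), labels] = 1
    return onehot
```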

Install

pip install git+https://github.com/doctorado-ml/NeuralNetwork

Example

Console

python main.py

Jupyter Notebook

Test notebook

License: MIT