First commit

2020-05-24 18:47:27 +02:00
parent cc23dddc1b
commit d13081765a
9 changed files with 1184 additions and 2 deletions


@@ -1,2 +1,35 @@
# NeuralNetwork
Neural Network implementation based on the DeepLearning courses in Coursera
# N_Network
Neural Network implementation based on Andrew Ng's Deep Learning courses (Coursera)
Implements batch GD, stochastic GD (minibatch_size=1) and stochastic mini-batch GD:
- Cost function: cross-entropy loss
- Activation functions: relu, sigmoid, tanh
- Regularization: l2 (lambd), Momentum (beta), Dropout (keep_prob)
- Optimization: mini-batch gradient descent, RMSProp, Adam
- Learning rate decay: multiplies the learning rate by a decay factor every given number of epochs
- Fair mini-batches: can create batches with the same proportion of 1/0 labels as the training data (a minimal sketch follows below)
Restriction:
- Multiclass classification only with one-hot encoded labels
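
The fair mini-batch idea can be sketched independently of this package's internals. The snippet below is a generic NumPy illustration (function and variable names are illustrative, not the actual API of this repository): it shuffles the indices of each class separately and fills every batch with positives and negatives in the same proportion as the full training set.
```python
import numpy as np

def fair_minibatches(X, y, batch_size, seed=0):
    """Yield mini-batches keeping roughly the same proportion of
    1/0 labels as the full training set (generic illustration only,
    not this package's actual API)."""
    rng = np.random.default_rng(seed)
    y = y.ravel()
    pos = rng.permutation(np.where(y == 1)[0])
    neg = rng.permutation(np.where(y == 0)[0])
    n_batches = int(np.ceil(len(y) / batch_size))
    # Positives per batch proportional to the global class ratio
    pos_per_batch = int(round(batch_size * len(pos) / len(y)))
    neg_per_batch = batch_size - pos_per_batch
    for b in range(n_batches):
        p = pos[b * pos_per_batch:(b + 1) * pos_per_batch]
        n = neg[b * neg_per_batch:(b + 1) * neg_per_batch]
        idx = rng.permutation(np.concatenate([p, n]))
        if idx.size:
            yield X[idx], y[idx]

# Toy usage: 100 samples, ~30% positive labels
X = np.random.randn(100, 5)
y = (np.random.rand(100) < 0.3).astype(int)
for xb, yb in fair_minibatches(X, y, batch_size=20):
    print(xb.shape, round(yb.mean(), 2))  # each batch holds ~30% positives
```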
## Install
```bash
pip install git+https://github.com/doctorado-ml/NeuralNetwork
```
## Example
#### Console
```bash
python main.py
```
#### Jupyter Notebook
[![Test](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Doctorado-ML/NeuralNetwork/blob/master/test.ipynb) Test notebook