Mirror of https://github.com/Doctorado-ML/NeuralNetwork.git, synced 2025-08-15 15:35:54 +00:00
# N_Network
Neural Network implementation based on Andrew Ng's courses.
Implements batch GD, stochastic GD (minibatch_size=1), and stochastic minibatch GD:
- Cost function: Cross Entropy Loss
- Activation functions: relu, sigmoid, tanh
- Regularization: l2 (lambd), Momentum (beta), Dropout (keep_prob)
- Optimization: Minibatch Gradient Descent, RMS Prop, Adam
- Learning rate decay: multiplies the learning rate by a decay factor every given number of epochs
- Fair minibatches: can create batches with the same 1/0 label proportion as the training data
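As an illustration of the fair-minibatch idea, here is a minimal sketch of stratified batching; the function name and signature are assumptions for illustration, not the library's actual API:

```python
import numpy as np

def fair_minibatches(X, y, batch_size, seed=0):
    """Split (X, y) into minibatches that keep roughly the same
    proportion of 1/0 labels as the full training set.
    Hypothetical helper; the library's implementation may differ."""
    rng = np.random.default_rng(seed)
    pos = rng.permutation(np.where(y == 1)[0])  # shuffled positive indices
    neg = rng.permutation(np.where(y == 0)[0])  # shuffled negative indices
    n_batches = int(np.ceil(len(y) / batch_size))
    # distribute positives and negatives evenly across batches
    pos_splits = np.array_split(pos, n_batches)
    neg_splits = np.array_split(neg, n_batches)
    batches = []
    for p, n in zip(pos_splits, neg_splits):
        idx = rng.permutation(np.concatenate([p, n]))  # mix within the batch
        batches.append((X[idx], y[idx]))
    return batches
```

With a 30/70 label split and `batch_size=10`, every batch ends up with roughly 3 positive and 7 negative examples.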
Restriction:
- Multiclass classification is supported only with one-hot encoded labels
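One-hot encoding turns integer class labels into row vectors with a single 1. A minimal sketch of such a conversion (a generic helper, not part of this package's API):

```python
import numpy as np

def one_hot(labels, n_classes):
    """Convert integer class labels to a one-hot matrix,
    the label format required for multiclass training."""
    labels = np.asarray(labels)
    encoded = np.zeros((len(labels), n_classes))
    encoded[np.arange(len(labels)), labels] = 1  # set one column per row
    return encoded
```

For example, `one_hot([0, 2, 1], 3)` yields the rows `[1,0,0]`, `[0,0,1]`, `[0,1,0]`.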
## Install
```bash
pip install git+https://github.com/doctorado-ml/NeuralNetwork
```
## Example
#### Console
```bash
python main.py
```
#### Jupyter Notebook
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Doctorado-ML/NeuralNetwork/blob/master/test.ipynb) Test notebook