This is an attempt to make a Deep Learning Library from scratch for educational purposes.
Remember, this is just for understanding how neural networks work; I don't intend to compete with the likes of TensorFlow or PyTorch (as if I could, lol).
Implemented the following so far:
- Weight initialization techniques such as Xavier, He, and LeCun
- Loss functions such as MSE, MAE, and Huber loss
- Softmax activation layer
- Activation layers such as ReLU, Sigmoid, and Tanh
- The XOR problem as a working example
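To give a feel for the initialization schemes listed above, here is a minimal numpy sketch (illustrative only, not the library's actual API; function names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    # Xavier/Glorot: variance 2 / (fan_in + fan_out), suits tanh/sigmoid layers
    scale = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, scale, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    # He: variance 2 / fan_in, suits ReLU layers
    scale = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, scale, size=(fan_in, fan_out))

def lecun_init(fan_in, fan_out):
    # LeCun: variance 1 / fan_in, suits SELU layers
    scale = np.sqrt(1.0 / fan_in)
    return rng.normal(0.0, scale, size=(fan_in, fan_out))
```

The three differ only in how the variance is scaled by the layer's fan-in/fan-out, which is what keeps activations from shrinking or blowing up as depth grows.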
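The loss functions above can likewise be written in a few lines of numpy (a sketch of the standard definitions, not the library's code):

```python
import numpy as np

def mse(y_true, y_pred):
    # mean squared error: penalizes large errors quadratically
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    # mean absolute error: robust to outliers, non-smooth at zero
    return np.mean(np.abs(y_true - y_pred))

def huber(y_true, y_pred, delta=1.0):
    # Huber: quadratic for |error| <= delta, linear beyond it
    err = y_true - y_pred
    quadratic = 0.5 * err ** 2
    linear = delta * (np.abs(err) - 0.5 * delta)
    return np.mean(np.where(np.abs(err) <= delta, quadratic, linear))
```

Huber interpolates between the other two: it behaves like MSE for small residuals and like MAE for large ones.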
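And the activations, including the numerically stable form of softmax (again a generic sketch, not the repo's implementation):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def softmax(x):
    # subtract the row max before exponentiating to avoid overflow
    shifted = x - np.max(x, axis=-1, keepdims=True)
    e = np.exp(shifted)
    return e / np.sum(e, axis=-1, keepdims=True)
```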
I plan to add the following:
- ReLU variants such as Leaky ReLU, ELU, and SELU
- Batch Normalization
- More loss functions such as Cross Entropy Loss
- Convolutional layer
To try it out:
- Clone the repository
- Install the requirements (really just one: numpy)
- Run `python xor.py`
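For reference, the kind of problem xor.py solves can be sketched as a standalone numpy toy (this is not the repo's code; the 2-4-1 layer sizes, learning rate, and iteration count are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: output is 1 exactly when the inputs differ
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 2-4-1 network with Xavier-style initialization
W1 = rng.normal(0.0, np.sqrt(2.0 / 6), (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0.0, np.sqrt(2.0 / 5), (4, 1))
b2 = np.zeros(1)

lr = 2.0
losses = []
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))
    # backward pass: MSE gradient through the sigmoid nonlinearities
    d_out = 2.0 * (out - y) / len(X) * out * (1 - out)
    d_h = d_out @ W2.T * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)
```

XOR is the classic minimal example here because it is not linearly separable, so a single-layer network cannot solve it and the hidden layer is doing real work.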