This repository contains a full implementation of Artificial Neural Networks (ANN) and Image Processing algorithms built entirely from scratch using Python and NumPy. This project bypasses high-level frameworks to demonstrate a deep, foundational understanding of backpropagation, gradient descent optimization, and convolutional math.
- Adaptive Learning Rate: Implemented a custom optimization function (`training_opt`) that dynamically reduces the learning rate when the cross-entropy loss oscillates, preventing divergence and helping escape local minima.
- Loss Function: Implemented Categorical Cross-Entropy to handle multi-class classification tasks.
- Bottleneck Analysis: Conducted empirical studies showing that placing layers with fewer neurons towards the end of the network improves convergence stability.
- Neuron Distribution: Demonstrated that distributing neurons across multiple layers yields better optimization landscapes than a single massive layer.
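The adaptive learning rate idea above can be sketched as follows. This is a minimal illustration, not the repo's actual `training_opt` implementation: the function name `adapt_learning_rate`, the halving factor, and the "compare the last two loss values" oscillation check are all assumptions made here for clarity.

```python
import numpy as np

def cross_entropy(probs, labels):
    # Categorical cross-entropy averaged over the batch.
    # probs: (N, K) predicted class probabilities; labels: (N,) integer classes.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def adapt_learning_rate(lr, loss_history, factor=0.5, min_lr=1e-5):
    # Illustrative rule: if the loss just went back up (oscillation),
    # shrink the learning rate, but never below a floor.
    if len(loss_history) >= 2 and loss_history[-1] > loss_history[-2]:
        lr = max(lr * factor, min_lr)
    return lr
```

A training loop would append each epoch's loss to `loss_history` and call `adapt_learning_rate` before the next gradient step.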
- Upgraded the binary classifier to a K-class model using Softmax activation.
- Solved highly non-linear classification problems (up to 6 distinct, intertwined classes).
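The K-class upgrade hinges on the Softmax activation, which maps raw output scores to a probability distribution over classes. A minimal NumPy sketch (the max-subtraction trick for numerical stability is standard practice, assumed here rather than taken from the repo's code):

```python
import numpy as np

def softmax(z):
    # z: (N, K) raw scores. Subtract each row's max before exponentiating
    # so that np.exp never overflows; the result is unchanged mathematically.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)
```

Each output row sums to 1, so the largest score becomes the most probable class and the rows can feed directly into a categorical cross-entropy loss.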
- 1D & 2D Convolutions: Manual mathematical implementation of convolution operations without external computer vision libraries.
- Kernel Engineering: Application of custom feature extraction kernels, including Gaussian blur, Edge detection, Sharpening, and Embossing.
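A manual 2D convolution of this kind can be sketched as below. The loop-based "valid" convolution and the sharpening kernel are illustrative; the repo's actual functions in Exercice_3.py may differ in naming and padding behavior.

```python
import numpy as np

def convolve2d(image, kernel):
    # True convolution flips the kernel; with the flip removed this
    # would be cross-correlation instead.
    k = np.flipud(np.fliplr(kernel))
    kh, kw = k.shape
    h, w = image.shape
    # "Valid" mode: the output shrinks by (kernel size - 1) per axis.
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * k)
    return out

# Classic 3x3 sharpening kernel: its entries sum to 1, so flat
# regions pass through unchanged while edges are amplified.
SHARPEN = np.array([[0, -1, 0],
                    [-1, 5, -1],
                    [0, -1, 0]])
```

Swapping `SHARPEN` for a Gaussian, edge-detection, or embossing kernel changes the extracted feature without touching the convolution routine itself.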
- Python 3
- NumPy (Core matrix operations, forward/backward passes)
- Matplotlib (Decision boundary visualization, loss curves)
- PIL / Pillow (Image data handling)
- Exercice_1.py: Binary classification. Contains the core MLP architecture and the adaptive learning rate logic.
- Exercice_2.py: Multi-class classification. Extends the logic to multiple classes with advanced architectures.
- Exercice_3.py: Signal and image processing. Contains 1D/2D convolution algorithms and applies mathematical filters to images.