Neural Network from Scratch: MLP Implementation

January 18, 2025 · 1 minute read

A pure NumPy implementation of a Multilayer Perceptron (MLP) with custom backpropagation, exploring non-linearly separable problems from XOR to digit recognition. Features modular architecture with swappable optimizers (SGD, Momentum, Adam), multiple activation functions, and comprehensive training analytics—all without relying on high-level ML frameworks.

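To give a feel for what "custom backpropagation in pure NumPy" means, here is a minimal sketch of the idea: a tiny sigmoid network trained on XOR with mean squared error and manually derived gradients. The layer sizes, variable names, and hyperparameters are illustrative only and are not taken from the repository's modules.

  import numpy as np

  # Minimal 2-4-1 MLP trained on XOR with hand-rolled backprop (illustrative sketch).
  rng = np.random.default_rng(0)
  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
  y = np.array([[0], [1], [1], [0]], dtype=float)

  W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))
  W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))
  lr = 0.5

  def sigmoid(z):
      return 1 / (1 + np.exp(-z))

  for epoch in range(10000):
      # Forward pass
      h = sigmoid(X @ W1 + b1)
      out = sigmoid(h @ W2 + b2)

      # Backward pass: MSE loss, sigmoid derivative s * (1 - s)
      d_out = (out - y) * out * (1 - out)
      d_h = (d_out @ W2.T) * h * (1 - h)

      # Plain gradient-descent updates
      W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0, keepdims=True)
      W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0, keepdims=True)

  print(out.round())  # typically rounds to [[0], [1], [1], [0]] after training
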
Features

  • Hand-rolled Backpropagation: Manual gradient computation for deep learning education
  • Optimizer Playground: Compare SGD, Momentum, and Adam with built-in performance metrics (see the sketch after this list)
  • Config-Driven Experiments: YAML-based architecture for reproducible ML experiments
  • Real Problems: XOR gates, parity detection, and handwritten digit classification (MNIST-style)
  • Visual Analytics: Confusion matrices, decision boundaries, and convergence analysis built-in

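As a rough illustration of the optimizer comparison, the Adam update rule can be written directly in NumPy as below. The function name, signature, and state layout are hypothetical and meant only to show the mechanics; they are not the repository's API.

  import numpy as np

  def adam_step(param, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
      """One Adam update; `state` holds running moments and the step count."""
      state["t"] += 1
      state["m"] = beta1 * state["m"] + (1 - beta1) * grad       # first moment
      state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2  # second moment
      m_hat = state["m"] / (1 - beta1 ** state["t"])             # bias correction
      v_hat = state["v"] / (1 - beta2 ** state["t"])
      return param - lr * m_hat / (np.sqrt(v_hat) + eps)

  # Usage: keep one state dict per parameter tensor
  w = np.zeros(3)
  state = {"m": np.zeros_like(w), "v": np.zeros_like(w), "t": 0}
  w = adam_step(w, grad=np.array([0.1, -0.2, 0.3]), state=state)
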
Quick Start

  # Train on XOR problem
  python experiments/exercise_3/train_tp3.py xor_config.yaml

  # Run digit classification with noise robustness testing
  python experiments/exercise_3/train_tp3.py digit_classification_config.yaml

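A config for a run like the XOR experiment might look roughly like the following. The keys and values shown are hypothetical, intended only to convey the config-driven style rather than the repository's actual schema.

  # Hypothetical XOR config -- key names are illustrative, not the repo's schema
  network:
    layers: [2, 4, 1]
    activation: sigmoid
  optimizer:
    name: adam
    learning_rate: 0.01
  training:
    epochs: 5000
    loss: mse
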
Perfect For

  • Learning: Understand backprop internals without framework magic
  • Research: Experiment with optimizer behaviors and convergence patterns
  • Foundation: Extensible architecture for custom ML experiments

Built for the Artificial Intelligence Systems course at Buenos Aires Institute of Technology (ITBA).

GitHub Repo
Tags: Python, NumPy, Matplotlib, YAML