Unsupervised Learning: Classical Neural Networks

October 29, 2025 · 1 minute read

A from-scratch implementation of three foundational unsupervised learning architectures: Kohonen Self-Organizing Maps (SOM) with configurable topologies and decay strategies, Oja’s rule for principal component extraction, and Hopfield networks for associative memory. Each algorithm is built with NumPy, featuring modular hyperparameter configurations, multiple distance metrics (Euclidean, exponential), neighborhood functions (Gaussian, hard), and comprehensive evaluation metrics including quantization error, topological error, and U-matrix visualizations.

Features

  • Kohonen Self-Organizing Maps: Configurable grid topologies with multiple decay strategies and neighborhood functions (a single update step is sketched after this list)
  • Oja’s Rule: Principal component extraction with adaptive learning rates (sketched below)
  • Hopfield Networks: Associative memory implementation with pattern completion and storage capacity analysis (sketched below)
  • Distance Metrics: Euclidean and exponential distance functions
  • Evaluation Toolkit: Quantization error, topological error, and U-matrix visualizations (the two error metrics are sketched below)
  • Modular Configuration: YAML-based hyperparameter management for reproducible experiments (an example config is sketched below)
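
As a rough illustration of the Kohonen update at the heart of the SOM, here is a minimal single-step sketch in NumPy, assuming a rectangular grid, Euclidean BMU matching, a Gaussian neighborhood, and exponential decay of both the learning rate and the radius. The function name, signature, and default hyperparameters are illustrative, not the repo's actual API.

```python
import numpy as np

def som_step(weights, x, t, n_iters, lr0=0.5, sigma0=3.0):
    """One Kohonen update: find the BMU, then pull its neighbors toward x.

    weights: (rows, cols, dim) codebook, x: (dim,) sample,
    t: current iteration, n_iters: total iterations.
    Hypothetical helper; the repo's real interface may differ.
    """
    rows, cols, _ = weights.shape

    # Best-matching unit (BMU) by Euclidean distance
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)

    # Exponential decay of learning rate and neighborhood radius
    lr = lr0 * np.exp(-t / n_iters)
    sigma = sigma0 * np.exp(-t / n_iters)

    # Gaussian neighborhood on the grid, centered at the BMU
    ii, jj = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    grid_dist2 = (ii - bmu[0]) ** 2 + (jj - bmu[1]) ** 2
    h = np.exp(-grid_dist2 / (2 * sigma ** 2))

    # Move every unit toward x, weighted by its neighborhood value
    weights += lr * h[..., None] * (x - weights)
    return weights, bmu
```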
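Oja’s rule is Hebbian learning with a built-in normalization term, w ← w + η·y·(x − y·w) with y = wᵀx, which drives w toward the first principal component of zero-mean data. A minimal sketch, with an assumed decaying learning-rate schedule:

```python
import numpy as np

def oja_first_pc(X, n_epochs=100, eta0=0.01, seed=0):
    """Estimate the first principal component of X with Oja's rule.
    Illustrative sketch; the learning-rate schedule is an assumption."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)              # Oja's rule assumes zero-mean data
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for epoch in range(n_epochs):
        eta = eta0 / (1 + epoch)        # simple decaying learning rate
        for x in rng.permutation(X):    # shuffle samples each epoch
            y = w @ x
            w += eta * y * (x - y * w)  # Hebbian term minus normalization term
    return w / np.linalg.norm(w)
```

The result can be sanity-checked against the leading eigenvector of the sample covariance, e.g. from np.linalg.eigh(np.cov(X.T)).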
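For the Hopfield network, a minimal sketch of Hebbian storage and asynchronous recall of bipolar (+1/−1) patterns; the helper names are hypothetical:

```python
import numpy as np

def hopfield_train(patterns):
    """Hebbian weight matrix for bipolar patterns, with zero diagonal."""
    P = np.asarray(patterns, dtype=float)   # shape (n_patterns, n_units)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, probe, n_steps=50, seed=0):
    """Asynchronous updates from a noisy probe until a fixed point (or n_steps)."""
    rng = np.random.default_rng(seed)
    s = np.array(probe, dtype=float)
    for _ in range(n_steps):
        prev = s.copy()
        for i in rng.permutation(len(s)):    # update units one at a time
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
        if np.array_equal(s, prev):          # stable state reached
            break
    return s
```

Storage capacity can then be probed empirically by storing increasingly many random patterns and measuring recall accuracy; the commonly cited rule of thumb for this scheme is roughly 0.138·N patterns for N units.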
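The two numeric SOM metrics are simple to compute from a trained codebook: quantization error is the mean distance from each sample to its best-matching unit, and topological error is the fraction of samples whose two closest units are not adjacent on the grid. A minimal sketch, with illustrative function names:

```python
import numpy as np

def quantization_error(weights, X):
    """Mean Euclidean distance from each sample to its best-matching unit."""
    flat = weights.reshape(-1, weights.shape[-1])            # (rows*cols, dim)
    d = np.linalg.norm(X[:, None, :] - flat[None, :, :], axis=2)
    return d.min(axis=1).mean()

def topological_error(weights, X):
    """Fraction of samples whose two closest units are not grid neighbors."""
    rows, cols, dim = weights.shape
    flat = weights.reshape(-1, dim)
    errors = 0
    for x in X:
        d = np.linalg.norm(flat - x, axis=1)
        first, second = np.argsort(d)[:2]
        r1, c1 = divmod(int(first), cols)
        r2, c2 = divmod(int(second), cols)
        if max(abs(r1 - r2), abs(c1 - c2)) > 1:   # not in the 8-neighborhood
            errors += 1
    return errors / len(X)
```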
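The YAML-driven configuration could look roughly like the following, assuming PyYAML for parsing; the keys, values, and schema shown here are purely illustrative and not the repo's actual format:

```python
import yaml

# Hypothetical config contents; the repo's actual schema may differ.
EXAMPLE_CONFIG = """
som:
  grid_size: [10, 10]
  learning_rate: 0.5
  decay: exponential      # e.g. linear | exponential
  neighborhood: gaussian  # e.g. gaussian | hard
  distance: euclidean     # e.g. euclidean | exponential
  epochs: 100
"""

params = yaml.safe_load(EXAMPLE_CONFIG)["som"]
print(params["grid_size"], params["neighborhood"])
```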

Built for the Artificial Intelligence Systems course at Buenos Aires Institute of Technology (ITBA).

GitHub Repo
Python · NumPy · Pandas · Matplotlib · scikit-learn