December 2023

Genetic Algorithm Training of a PyTorch Neural Network

Project Overview

We built a custom Genetic Algorithm (GA) that trains small PyTorch neural networks by directly evolving their weights and biases. Adaptive parameter control enables the GA to balance exploration and exploitation automatically. The project also included a lightweight Python package for visualizing the architecture, weights, and biases of fully connected PyTorch models.

Objectives

  • Direct Weight Evolution: Optimize neural network parameters without gradient-based methods by using evolutionary principles (a gradient-free fitness sketch follows this list).
  • Adaptive Parameter Control: Automatically adjust mutation and crossover rates to maintain a balance between exploration and exploitation.
  • Visualization: Provide insights into network structure and parameter distributions throughout the evolution process.
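
Because no gradients are computed, the only training signal the GA needs is a scalar fitness value per candidate network. The sketch below shows one way that signal might be produced, assuming a supervised regression task; the helper name `evaluate_fitness` and the random example data are illustrative, not taken from the project.

```python
import torch
import torch.nn as nn

def evaluate_fitness(model: nn.Module, inputs: torch.Tensor, targets: torch.Tensor) -> float:
    """Score a candidate network with a single forward pass; backward() is never called."""
    model.eval()
    with torch.no_grad():          # gradient-free: only inference is needed
        loss = nn.functional.mse_loss(model(inputs), targets)
    return -loss.item()            # higher fitness corresponds to lower loss

# Example: score a tiny fully connected network on random data.
x, y = torch.randn(32, 4), torch.randn(32, 1)
net = nn.Sequential(nn.Linear(4, 8), nn.Tanh(), nn.Linear(8, 1))
print(evaluate_fitness(net, x, y))
```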

Solution

  1. Custom GA Framework

    • Encoded network weights and biases as GA chromosomes for direct evolution.
    • Implemented selection, crossover, and mutation operators tailored to PyTorch model parameters.
    • Integrated adaptive parameter control to dynamically adjust genetic operators based on population diversity (a sketch of the encoding, operators, and rate adaptation follows this list).
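
As a concrete illustration, the sketch below shows one way a model's weights and biases can be flattened into a chromosome, recombined, mutated, and governed by a diversity-based rate schedule. The function names (`model_to_chromosome`, `adapt_mutation_rate`) and the numeric thresholds are illustrative assumptions, not the project's actual API.

```python
import copy
import torch
import torch.nn as nn

# --- Chromosome encoding: all weights and biases as one flat vector ---

def model_to_chromosome(model: nn.Module) -> torch.Tensor:
    """Concatenate every parameter tensor of the model into a flat 1-D chromosome."""
    return torch.cat([p.detach().flatten() for p in model.parameters()])

def chromosome_to_model(chromosome: torch.Tensor, template: nn.Module) -> nn.Module:
    """Copy a flat chromosome back into a clone of the template model, slice by slice."""
    model = copy.deepcopy(template)
    offset = 0
    with torch.no_grad():
        for p in model.parameters():
            n = p.numel()
            p.copy_(chromosome[offset:offset + n].view_as(p))
            offset += n
    return model

# --- Operators tailored to flat parameter vectors ---

def crossover(parent_a: torch.Tensor, parent_b: torch.Tensor, rate: float) -> torch.Tensor:
    """Uniform crossover: each gene is taken from parent_b with probability `rate`."""
    mask = torch.rand_like(parent_a) < rate
    return torch.where(mask, parent_b, parent_a)

def mutate(chromosome: torch.Tensor, rate: float, sigma: float = 0.1) -> torch.Tensor:
    """Add Gaussian noise to a random subset of genes chosen with probability `rate`."""
    mask = (torch.rand_like(chromosome) < rate).float()
    return chromosome + mask * torch.randn_like(chromosome) * sigma

# --- Adaptive parameter control driven by population diversity ---

def adapt_mutation_rate(population: torch.Tensor, mutation_rate: float) -> float:
    """Raise the mutation rate when the population converges, lower it when it is diverse."""
    diversity = population.std(dim=0).mean().item()   # population shape: (pop_size, n_genes)
    if diversity < 0.05:                              # converging -> push exploration
        return min(mutation_rate * 1.5, 0.5)
    return max(mutation_rate * 0.9, 0.01)             # diverse -> favor exploitation
```

Selection (for example, tournament selection with elitism) can then operate purely on these flat tensors, and the best chromosome is decoded back into a model with `chromosome_to_model` for evaluation.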

Network Visualization

  1. Visualization Package

    Developed a Python library that shows (see the sketch after this list):

    • Network topology.
    • Weights and biases.
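
A minimal sketch of how a fully connected PyTorch model could be rendered with matplotlib is shown below. The function name `draw_network`, the color scheme, and the layout are illustrative assumptions, not the package's actual interface.

```python
import matplotlib.pyplot as plt
import torch.nn as nn

def draw_network(model: nn.Module) -> None:
    """Plot each Linear layer's neurons as a column and its weights as colored edges."""
    linears = [m for m in model.modules() if isinstance(m, nn.Linear)]
    layer_sizes = [linears[0].in_features] + [l.out_features for l in linears]

    fig, ax = plt.subplots(figsize=(8, 6))
    positions = []                                       # (x, y) coordinates per neuron
    for x, size in enumerate(layer_sizes):
        ys = [i - (size - 1) / 2 for i in range(size)]   # center each column vertically
        positions.append([(x, y) for y in ys])
        ax.scatter([x] * size, ys, s=300, color="steelblue", zorder=3)

    # Edge width encodes weight magnitude; color encodes sign.
    for layer, linear in enumerate(linears):
        w = linear.weight.detach()                       # shape: (out_features, in_features)
        for j, (x1, y1) in enumerate(positions[layer]):          # inputs
            for i, (x2, y2) in enumerate(positions[layer + 1]):  # outputs
                value = w[i, j].item()
                ax.plot([x1, x2], [y1, y2],
                        color="seagreen" if value >= 0 else "firebrick",
                        linewidth=0.5 + 2.0 * abs(value), alpha=0.6, zorder=1)

    ax.axis("off")
    plt.show()

# Example usage with a small fully connected network.
model = nn.Sequential(nn.Linear(4, 6), nn.Tanh(), nn.Linear(6, 2))
draw_network(model)
```

Biases could be shown the same way, for instance by sizing or coloring each neuron marker according to its bias value.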

Outcomes

  • Effective Optimization: Successfully trained small PyTorch networks without backpropagation, achieving competitive performance.
  • Automated Tuning: Adaptive GA parameters reduced manual hyperparameter tuning and improved convergence speed.
  • Enhanced Transparency: Visualization tools gave immediate feedback on how weights and biases evolved across generations.