micrograd

Language: Jupyter Notebook
Created: 04/13/2020
Last updated: 01/06/2024
License: MIT

autowiki
Revision: 0
Software Version: 0.0.4 (Basic)
Generated from commit: c91140
Generated on: 01/06/2024

Micrograd is a small library for building and training neural networks with automatic differentiation. It allows users to define computational graphs made up of scalar-valued objects, then efficiently compute gradients via backpropagation.

At the core is an autograd engine that handles building computation graphs and running backpropagation. The Value class represents a scalar that participates in a graph. Applying operators to Value instances performs the operation and returns a new Value, while also inserting a node into the graph. After running the forward pass, calling backward() on the output visits each node in reverse topological order and invokes its gradient function, computing gradients via the chain rule.
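
A minimal sketch of this flow (Value and backward() are micrograd's actual API):

```python
from micrograd.engine import Value

a = Value(2.0)
b = Value(-3.0)
c = a * b + a**2   # operators build the graph implicitly
c.backward()       # reverse-mode autodiff over the graph

print(a.grad)  # dc/da = b + 2a = 1.0
print(b.grad)  # dc/db = a = 2.0
```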

Built on top of this engine, Micrograd provides a neural network building API for easily constructing neural network layers and models. The Neuron, Layer, and MLP classes in …/nn.py allow users to define models by stacking layers of neurons. The forward pass applies each layer in sequence.
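
For example, a small multilayer perceptron can be defined and run in a few lines (MLP is micrograd's class; the layer sizes here are arbitrary):

```python
from micrograd.nn import MLP

model = MLP(3, [4, 4, 1])  # 3 inputs, two hidden layers of 4, 1 output
x = [2.0, 3.0, -1.0]
y = model(x)               # forward pass applies each layer in sequence
y.backward()               # gradients for every weight and bias
print(len(model.parameters()))  # 41 scalar parameters in this network
```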

Overall, Micrograd allows imperative-style Python code to implicitly construct computational graphs behind the scenes. It handles all the graph manipulation and gradient calculations automatically via operator overloading and topological sorting. This enables an easy-to-use API for building and training neural networks.

Autograd Engine

References: micrograd

The file …/engine.py defines the functionality for automatic differentiation. It implements the backward pass by topologically sorting the nodes using depth-first search, then calling each node's gradient callback in reverse order to backpropagate gradients through the entire computational graph. This allows gradients to be calculated efficiently via the chain rule.
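
A condensed sketch of that pattern (following the shape of Value.backward(); _prev holds a node's inputs and _backward is its gradient callback):

```python
def backward(root):
    # depth-first search to build a topological ordering of the graph
    topo, visited = [], set()
    def build_topo(v):
        if v not in visited:
            visited.add(v)
            for child in v._prev:
                build_topo(child)
            topo.append(v)
    build_topo(root)

    # seed d(root)/d(root) = 1, then apply the chain rule one node
    # at a time, visiting nodes in reverse topological order
    root.grad = 1.0
    for v in reversed(topo):
        v._backward()
```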

Neural Network Building

References: micrograd

The …/nn.py file defines classes for building neural networks in a high-level, object-oriented way: Module is the common base class (providing parameters() and zero_grad()), Neuron computes a weighted sum of its inputs plus a bias with an optional ReLU, and Layer and MLP compose neurons into layers and layers into a full network.
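
A condensed sketch of the Neuron pattern (simplified; initialization details may differ from the repository's code):

```python
import random
from micrograd.engine import Value

class Neuron:
    def __init__(self, nin, nonlin=True):
        # one weight per input, plus a bias
        self.w = [Value(random.uniform(-1, 1)) for _ in range(nin)]
        self.b = Value(0)
        self.nonlin = nonlin

    def __call__(self, x):
        # weighted sum of inputs plus bias, with optional ReLU
        act = sum((wi * xi for wi, xi in zip(self.w, x)), self.b)
        return act.relu() if self.nonlin else act
```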

Testing Autograd Engine

References: test

The test suite in test contains test cases that validate the gradient calculations performed by the autograd engine in micrograd, checking both forward-pass values and backpropagated gradients against a reference implementation. These tests are critical for ensuring the correctness of the automatic differentiation that is core to building and training neural networks with micrograd.
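
A simplified test in the same spirit (the repository's tests use PyTorch as the reference; the expression here is illustrative):

```python
import torch
from micrograd.engine import Value

def test_backward_matches_pytorch():
    # micrograd version
    x = Value(-4.0)
    z = 2 * x + 2 + x
    y = (z * x).relu() + z * x
    y.backward()
    xmg, ymg = x, y

    # identical computation in PyTorch as the reference
    x = torch.tensor([-4.0], dtype=torch.double, requires_grad=True)
    z = 2 * x + 2 + x
    y = (z * x).relu() + z * x
    y.backward()
    xpt, ypt = x, y

    assert ymg.data == ypt.data.item()   # forward values agree
    assert xmg.grad == xpt.grad.item()   # gradients agree
```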

Documentation

References: micrograd

The micrograd directory contains the code for building and training neural networks. The main class for representing graph nodes, Value, is defined in …/engine.py.

Usage

References: README.md

Micrograd is installed via pip (pip install micrograd). The core functionality involves defining computational graphs and calculating gradients using an autograd engine. This engine dynamically builds the computation graph behind the scenes as operations are applied to Value objects.
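
The README's example usage shows this end to end (reproduced from README.md; the printed values are taken from there):

```python
from micrograd.engine import Value

a = Value(-4.0)
b = Value(2.0)
c = a + b
d = a * b + b**3
c += c + 1
c += 1 + c + (-a)
d += d * 2 + (b + a).relu()
d += 3 * d + (b - a).relu()
e = c - d
f = e**2
g = f / 2.0
g += 10.0 / f
print(f'{g.data:.4f}')  # prints 24.7041, the outcome of this forward pass
g.backward()
print(f'{a.grad:.4f}')  # prints 138.8338, i.e. the numerical value of dg/da
print(f'{b.grad:.4f}')  # prints 645.5773, i.e. the numerical value of dg/db
```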

Examples

References: micrograd

The file …/test_engine.py contains tests for the core autograd functionality. It uses the Value class defined in …/engine.py to represent scalar values that implicitly build computational graphs as operations are applied to them.
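
For instance, a single gradient can be checked by hand (an illustrative snippet, not one of the repository's tests):

```python
from micrograd.engine import Value

x = Value(3.0)
y = x * x + 2 * x + 1   # y = (x + 1)^2, so dy/dx = 2x + 2
y.backward()
assert y.data == 16.0
assert x.grad == 8.0    # 2*3 + 2
```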

Neural Network Training

References: micrograd/nn.py

The file …/nn.py defines the classes for building neural networks; their parameters() method exposes every weight and bias so that a training loop can update them after backpropagation.
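
A minimal training loop built on that API (the data, learning rate, and layer sizes here are illustrative, not from the repository):

```python
from micrograd.nn import MLP

model = MLP(2, [8, 1])
xs = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
ys = [0.0, 1.0, 1.0, 0.0]  # XOR targets, purely for illustration

for step in range(100):
    # forward pass: squared-error loss over the tiny dataset
    loss = sum((model(x) - y) ** 2 for x, y in zip(xs, ys))
    # backward pass: clear stale gradients, then backpropagate
    model.zero_grad()
    loss.backward()
    # plain SGD update on the raw parameter values
    for p in model.parameters():
        p.data -= 0.05 * p.grad
```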

Package Metadata

References: micrograd

The setup.py file defines the metadata needed to distribute Micrograd. This includes attributes like the package name and version.

Setup.py

References: setup.py

The file setup.py contains the code for packaging and distributing the Micrograd library as a reusable Python package. It calls setuptools.setup() to declare metadata such as the package name, version, author, description, and classifiers.
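
A sketch of that pattern (the field values below are illustrative and may not match the repository's exact metadata):

```python
import setuptools

# use the README as the long description shown on PyPI
with open("README.md", "r") as fh:
    long_description = fh.read()

setuptools.setup(
    name="micrograd",
    version="0.1.0",  # illustrative version number
    author="Andrej Karpathy",
    description="A tiny scalar-valued autograd engine with a small "
                "PyTorch-like neural network library on top.",
    long_description=long_description,
    long_description_content_type="text/markdown",
    packages=setuptools.find_packages(),
    python_requires=">=3.6",
)
```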

Version

References: micrograd

The README.md file documents the Micrograd package, including installation instructions and example usage for getting started.
