# AutoDiff - automatic differentiation for Python

[![Python wheel](https://github.com/krippner/auto-diff-python/actions/workflows/wheel.yml/badge.svg)](https://github.com/krippner/auto-diff-python/actions/workflows/wheel.yml)

A lightweight Python package that provides fast **automatic differentiation (AD)** in forward and reverse mode for scalar and array computations.

AD is an **efficient algorithm** for computing **exact derivatives** of numeric programs.
It is a standard tool in numerous fields, including optimization, machine learning, and scientific computing.

> [!NOTE]
> This repository focuses on providing Python bindings and does not include the C++ backend, which is part of a separate, standalone C++ library. The C++ version offers additional features not available in these bindings. For more information, please visit the [AutoDiff repository](https://github.com/krippner/auto-diff).

## Features

- **Automatic differentiation**:
  - Jacobian matrix computation with forward- and reverse-mode AD
  - Jacobian-vector products, e.g., gradients and directional derivatives
  - Support for scalar, 1D and 2D array, and linear algebra computations
- **Fast and efficient implementation**:
  - Backend written in C++ (using [this repository](https://github.com/krippner/auto-diff))
  - Leverages the performance of the [Eigen](https://eigen.tuxfamily.org) linear algebra library
  - Memory efficient (see [Variables vs. expressions](docs/expressions.md#variables-vs-expressions))
- **Simple and intuitive API**:
  - Regular control flow: function calls, loops, branches
  - Eager evaluation: what you evaluate is what you differentiate
  - Lazy re-evaluation: precise control over what is evaluated and when
  - Math-inspired syntax

For more details, see the [documentation](#documentation).

## Installation

On Linux, download the latest wheel file from the [releases page](https://github.com/krippner/auto-diff-python/releases) and install it with pip:

```bash
python -m pip install autodiff-0.1.0-cp311-cp311-linux_x86_64.whl
```

The wheel contains the extension modules as well as Python stub files for autocompletion and documentation in your IDE.

## Usage

Below are two simple examples showing how to use the `autodiff` package to compute the gradient of a function with respect to its inputs.

The package provides two sub-modules: `array` and `scalar`.
The `array` module is the more general one and can be used to compute gradients of functions involving both scalars and arrays (1D and 2D).

> [!CAUTION]
> Do not mix the `array` and `scalar` modules in the same program; they use incompatible internal representations for variables.

### NumPy array example

```python
# Example: gradient computation with NumPy arrays
import numpy as np
from autodiff.array import Function, var, d

# Create two 1D array variables
x = var(np.array([1, 2, 3]))
y = var(np.array([4, 5, 6]))

# Assign their (element-wise) product to a new variable
z = var(x * y)

# Variables are evaluated eagerly
print("z =", z())  # z = [ 4. 10. 18.]

# Create the function f : (x, y) ↦ z = x * y
f = Function(z)  # short for: Function(sources=(x, y), target=z)

# Set the (element-wise) derivative of z with respect to itself
z.set_derivative(np.ones((1, 3)))

# Compute the gradient of f at (x, y) using reverse-mode AD
f.pull_gradient()

# Get the components of the (element-wise) gradient
print("∇_x f =", d(x))  # ∇_x f = [[4. 5. 6.]]
print("∇_y f =", d(y))  # ∇_y f = [[1. 2. 3.]]
```

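As a sanity check, the element-wise gradients printed above match the product rule: for z = x * y (element-wise), ∂z_i/∂x_i = y_i and ∂z_i/∂y_i = x_i. The following sketch verifies this with central finite differences in plain NumPy, entirely independent of `autodiff`:

```python
# Plain-NumPy check of the element-wise product rule: d(x*y)/dx = y, d(x*y)/dy = x
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

eps = 1e-6
# Central finite differences for each component of z = x * y
grad_x = ((x + eps) * y - (x - eps) * y) / (2 * eps)
grad_y = (x * (y + eps) - x * (y - eps)) / (2 * eps)

print(grad_x)  # ≈ [4. 5. 6.], matching ∇_x f above
print(grad_y)  # ≈ [1. 2. 3.], matching ∇_y f above
```
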
### Scalar example

For functions mapping only scalars to scalars, the `scalar` module is more efficient and convenient.
No further imports are required.

```python
# Example: gradient computation with scalars
from autodiff.scalar import Function, var, d

# Create two scalar variables
x = var(1.5)
y = var(-2.0)

# Assign their product to a new variable
z = var(x * y)

# Variables are evaluated eagerly
print("z =", z())  # z = -3.0

# Create the function f : (x, y) ↦ z = x * y
f = Function(z)  # short for: Function(sources=(x, y), target=z)

# Compute the gradient of f at (x, y) using reverse-mode AD
f.pull_gradient_at(z)

# Get the components of the gradient
print("∂f/∂x =", d(x))  # ∂f/∂x = -2.0
print("∂f/∂y =", d(y))  # ∂f/∂y = 1.5
```

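Here, too, the result agrees with the analytic derivatives of f(x, y) = x * y, namely ∂f/∂x = y and ∂f/∂y = x. A minimal finite-difference check in plain Python (no `autodiff` needed) at the point (1.5, -2.0):

```python
# Finite-difference check of ∂f/∂x and ∂f/∂y for f(x, y) = x * y at (1.5, -2.0)
def f(x, y):
    return x * y

x0, y0 = 1.5, -2.0
eps = 1e-6

dfdx = (f(x0 + eps, y0) - f(x0 - eps, y0)) / (2 * eps)
dfdy = (f(x0, y0 + eps) - f(x0, y0 - eps)) / (2 * eps)

print(round(dfdx, 6))  # -2.0 (= y0), matching ∂f/∂x above
print(round(dfdy, 6))  # 1.5 (= x0), matching ∂f/∂y above
```
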
## Documentation

1. [Variables and expressions](docs/expressions.md#top) - writing programs with `autodiff`
   1. [Variables](docs/expressions.md#variables)
   2. [Expressions](docs/expressions.md#expressions)
   3. [Variables vs. expressions](docs/expressions.md#variables-vs-expressions)
2. [Functions](docs/functions.md#top) - (deferred) evaluation and differentiation
   1. [Lazy evaluation](docs/functions.md#lazy-evaluation)
   2. [Forward-mode differentiation](docs/functions.md#forward-mode-differentiation)
   3. [Reverse-mode differentiation (aka backpropagation)](docs/functions.md#reverse-mode-differentiation-aka-backpropagation)
   4. [Advanced: changing the program after evaluation](docs/functions.md#advanced-changing-the-program-after-evaluation)
3. [The `autodiff.scalar` module](docs/scalar.md#top) - working with scalars only
   1. [Classes](docs/scalar.md#classes)
   2. [Variable factory functions](docs/scalar.md#variable-factory-functions)
   3. [Operations](docs/scalar.md#operations)
4. [The `autodiff.array` module](docs/array.md#top) - working with scalars and NumPy arrays
   1. [Classes](docs/array.md#classes)
   2. [Variable factory functions](docs/array.md#variable-factory-functions)
   3. [Operations](docs/array.md#operations)
   4. [Matrix-valued expressions](docs/array.md#matrix-valued-expressions)
5. [Applications](docs/applications.md#top) - common use cases and examples
   1. [Control flow](docs/applications.md#control-flow)
   2. [Computing the Jacobian matrix](docs/applications.md#computing-the-jacobian-matrix)
   3. [Gradient computation](docs/applications.md#gradient-computation)
   4. [Element-wise gradient computation](docs/applications.md#element-wise-gradient-computation)
   5. [Jacobian-vector products](docs/applications.md#jacobian-vector-products)