
XAD: The fastest automatic differentiation library for C++

XAD is a high-performance C++ automatic differentiation library designed for large-scale, performance-critical systems.

It provides forward and adjoint (reverse) mode automatic differentiation via operator overloading, with a strong focus on:

  • Low runtime overhead
  • Minimal memory footprint
  • Straightforward integration into existing C++ codebases

For Monte Carlo and other repetitive workloads, XAD also provides an abstract JIT backend interface, enabling record-once / replay-many execution for additional performance.
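The JIT interface itself is covered in samples/jit_tutorial. As a purely conceptual sketch of record-once / replay-many (hypothetical code, not XAD's JIT API): a computation is captured once as a list of steps, then re-executed for many inputs without re-tracing the original program.

// Conceptual record-once / replay-many sketch (hypothetical; NOT XAD's
// JIT interface, see samples/jit_tutorial for the real one).
#include <cmath>
#include <functional>
#include <iostream>
#include <vector>

int main() {
    // Record phase: capture f(x) = x * sin(x) once as a list of steps
    // operating on a small register file.
    using Step = std::function<void(std::vector<double>&)>;
    std::vector<Step> program = {
        [](std::vector<double>& r) { r[1] = std::sin(r[0]); }, // r1 = sin(x)
        [](std::vector<double>& r) { r[2] = r[0] * r[1]; },    // r2 = x * r1
    };

    // Replay phase: run the recorded steps for many inputs, e.g. the
    // paths of a Monte Carlo simulation, without re-recording.
    for (double x : {0.5, 1.0, 1.5}) {
        std::vector<double> regs = {x, 0.0, 0.0};
        for (const auto& step : program) step(regs);
        std::cout << "f(" << x << ") = " << regs[2] << "\n";
    }
}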


Key Features

  • Forward & Reverse (Adjoint) Mode: derivatives of any order via operator overloading (see the forward-mode sketch after this list).
  • Vector mode: Compute multiple derivatives at once.
  • Checkpointing Support: Efficient tape memory management for large-scale applications.
  • External Function Interface: Seamlessly connect with external libraries.
  • Eigen support: Works with the popular linear algebra library Eigen.
  • JIT Backend Support (optional): Infrastructure for pluggable JIT backends, enabling record-once/replay-many workflows. See samples/jit_tutorial. A native code generation backend is available separately under commercial license.
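As a taste of the API, here is a minimal forward-mode sketch (assuming the xad::fwd<double> mode and the value()/derivative() accessors from the documentation); forward mode needs no tape:

#include <XAD/XAD.hpp>
#include <iostream>

int main() {
    using mode = xad::fwd<double>;     // forward (tangent-linear) mode
    using Fdouble = mode::active_type;

    Fdouble x = 1.5;
    derivative(x) = 1.0;               // seed the input tangent dx = 1
    Fdouble y = x * sin(x);            // overloaded operators propagate tangents
    std::cout << "y = " << value(y)
              << ", dy/dx = " << derivative(y) << "\n";
}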

Ecosystem

  • xad-py: Python bindings for XAD
  • QuantLibAAD: full QuantLib integration; compute all Greeks at once, up to 3 orders of magnitude faster than bump-and-reval
  • QuantLib-Risks-Py: QuantLib risks from Python
  • xad-codegen: native code generation backend for maximum throughput (commercial)
  • AAD Training: hands-on AAD training for quants and quant developers, delivered to dozens of tier-1 banks and financial services firms

Example

Compute first-order derivatives of an arbitrary function func with two inputs and one output, using XAD in adjoint mode:

#include <XAD/XAD.hpp>
#include <iostream>

using mode = xad::adj<double>;     // adjoint (reverse) mode over double
using Adouble = mode::active_type; // active scalar type
using Tape = mode::tape_type;      // tape recording the operations

Tape tape;                     // the tape must outlive the active variables
Adouble x0 = 1.3;              // initialise inputs
Adouble x1 = 5.2;
tape.registerInput(x0);        // register independent variables
tape.registerInput(x1);        // with the tape
tape.newRecording();           // start recording derivatives
Adouble y = func(x0, x1);      // run main function
tape.registerOutput(y);        // register the output variable
derivative(y) = 1.0;           // seed output adjoint to 1.0
tape.computeAdjoints();        // roll back adjoints to inputs
std::cout << "dy/dx0=" << derivative(x0) << "\n"
          << "dy/dx1=" << derivative(x1) << "\n";

Getting Started

Build XAD from source using CMake:

git clone https://github.com/auto-differentiation/xad.git
cd xad
mkdir build
cd build
cmake ..
make
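To consume the installed library from your own CMake project, something like the following should work (a sketch assuming the package exports the XAD::xad target; see the Installation Guide for specifics):

cmake_minimum_required(VERSION 3.15)
project(my_xad_app LANGUAGES CXX)

find_package(XAD REQUIRED)                           # locate the installed XAD package
add_executable(my_xad_app main.cpp)
target_link_libraries(my_xad_app PRIVATE XAD::xad)   # pulls in headers and settings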

For more detailed guides, refer to our Installation Guide and explore the Tutorials.

Documentation

Full documentation, including API reference and usage examples, is available at: https://auto-differentiation.github.io/

Contributing

Contributions are welcome. Please see the Contributing Guide for details, and feel free to start a discussion in our GitHub Discussions.

Found a Bug?

Please report bugs and issues via the GitHub Issue Tracker.