
AENET Active Learning

A committee-based active learning framework for systematically building training datasets for AENET (Atomic Energy Network) interatomic potentials.

Overview

This framework automates the iterative loop of training neural network potentials, sampling new configurations via molecular dynamics, and selecting the most informative structures for DFT relabeling. It uses a committee of neural networks to estimate prediction uncertainty and identify structures where the model is least confident.

Workflow

┌──────────────────┐     ┌──────────────────┐     ┌───────────────────┐
│  Train Committee │────▶│  LAMMPS MD       │────▶│  Evaluate         │
│  (N models)      │     │  Sampling        │     │  Uncertainty      │
└──────────────────┘     └──────────────────┘     └───────┬───────────┘
        ▲                                                 │
        │                                                 ▼
┌───────┴───────────┐                           ┌───────────────────┐
│  Update Training  │◀──────────────────────────│  VASP DFT         │
│  Set              │                           │  Relabeling       │
└───────────────────┘                           └───────────────────┘

Each iteration:

  1. Train a committee of AENET neural network potentials
  2. Sample new configurations using LAMMPS molecular dynamics
  3. Evaluate candidate structures using committee disagreement (force/energy deviation)
  4. Relabel selected structures with VASP DFT calculations
  5. Update the training set and repeat
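The five steps above can be sketched as a toy loop in pure Python. Everything here (the oracle, the linear "models", the function names) is an illustrative stand-in for the real AENET/LAMMPS/VASP machinery, not the framework's actual API:

```python
import random

def oracle(x):
    """Stands in for VASP DFT relabeling (exact here, expensive in reality)."""
    return x * x

def train_model(data, seed):
    """Stands in for training one AENET committee member; each member
    gets a slightly different fit, which is what creates disagreement."""
    rng = random.Random(seed + len(data))
    slope = sum(y / x for x, y in data) / len(data) + rng.uniform(-0.3, 0.3)
    return lambda x: slope * x

def disagreement(committee, x):
    """Committee spread at x, used as an uncertainty proxy."""
    preds = [model(x) for model in committee]
    return max(preds) - min(preds)

training_set = [(1.0, oracle(1.0)), (2.0, oracle(2.0))]
for iteration in range(3):
    committee = [train_model(training_set, seed=i) for i in range(4)]  # step 1
    rng = random.Random(iteration)
    candidates = [rng.uniform(0.5, 5.0) for _ in range(20)]            # step 2
    ranked = sorted(candidates, key=lambda x: disagreement(committee, x))
    selected = ranked[-2:]                                             # step 3
    training_set += [(x, oracle(x)) for x in selected]                 # steps 4-5

print(len(training_set))  # 2 initial structures + 2 added per iteration = 8
```

In the real loop the candidates come from MD trajectories and relabeling goes through a SLURM queue, but the control flow is the same: retrain, sample, rank by disagreement, relabel the worst, repeat.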

Installation

pip install git+https://github.com/OneTrueJian/Active-learning-aenet.git

Or install in development mode:

git clone https://github.com/OneTrueJian/Active-learning-aenet.git
cd Active-learning-aenet
pip install -e .

Dependencies

  • Python >= 3.6
  • NumPy, SciPy
  • ASE (Atomic Simulation Environment)
  • AENET + AENET-PyTorch (external)
  • LAMMPS with AENET pair style (external)
  • VASP for DFT calculations (external)

Usage

1. Prepare configuration files

File                   Purpose
----                   -------
input.json             Main config: paths, target structure count, iteration control
aenet_input.json       AENET training parameters (architecture, hyperparameters)
judge.json             Uncertainty thresholds for structure selection
lammps_sampling.json   MD sampling parameters (temperature, timestep, steps)
vasp.json              VASP calculation and SLURM submission parameters
exe.json               Paths to external executables
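For orientation, a main config along these lines covers the items listed for input.json. The key names below are purely illustrative (flagged by the "_comment" entry); consult the repository's example configs for the real schema:

```json
{
  "_comment": "illustrative sketch only; key names are not the actual schema",
  "work_dir": "./al_run",
  "initial_dataset": "./data/initial_structures",
  "target_structure_count": 5000,
  "max_iterations": 20,
  "committee_size": 4
}
```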

2. Run the active learning loop

aenet_active_learning --input input.json --exe exe.json

To resume from a checkpoint:

aenet_active_learning --input input.json --exe exe.json --restart

3. Monitor progress

The framework tracks iteration state in state.json and supports automatic restart from any step within an iteration.
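A minimal way to check on a running loop is to read state.json directly. The keys inside it are framework-defined, so this sketch deliberately avoids assuming a particular schema and just loads whatever is recorded:

```python
import json
from pathlib import Path

def read_state(path="state.json"):
    """Return the recorded iteration state, or None if the loop
    has not written a checkpoint yet."""
    p = Path(path)
    if not p.exists():
        return None
    return json.loads(p.read_text())

state = read_state()
if state is None:
    print("No state.json yet; the loop has not started.")
else:
    for key, value in sorted(state.items()):
        print(f"{key}: {value}")
```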

Features

  • Committee-based uncertainty quantification using force and energy deviation across multiple neural network models
  • Automatic restart with step-level checkpointing within each iteration
  • Configurable selection criteria via delta_energy and delta_force thresholds
  • HPC integration with SLURM job submission and monitoring for VASP calculations
  • Flexible structure I/O supporting XSF, POSCAR, LAMMPS dump, and other ASE-compatible formats
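The committee-disagreement criterion can be sketched with NumPy, assuming a simple definition: the energy deviation is the standard deviation of the committee's energy predictions, and the force deviation is the largest per-component spread over any atom. The framework's actual delta_energy / delta_force criteria may differ in detail:

```python
import numpy as np

def committee_deviation(energies, forces):
    """energies: shape (n_models,); forces: shape (n_models, n_atoms, 3)."""
    e_dev = float(np.std(energies))                 # spread of total energies
    f_dev = float(np.max(np.std(forces, axis=0)))   # worst per-atom component spread
    return e_dev, f_dev

def needs_dft(e_dev, f_dev, delta_energy=0.005, delta_force=0.1):
    """Flag a structure for relabeling if either threshold is exceeded."""
    return e_dev > delta_energy or f_dev > delta_force

# four committee members, eight atoms
energies = np.array([-10.01, -10.03, -9.98, -10.02])            # eV
forces = np.random.default_rng(0).normal(0.0, 0.05, (4, 8, 3))  # eV/Å
e_dev, f_dev = committee_deviation(energies, forces)
print(needs_dft(e_dev, f_dev))  # True: the ~19 meV energy spread exceeds 5 meV
```

Structures where the committee agrees are cheap to discard; only the flagged ones are worth a DFT calculation, which is what keeps the relabeling budget small.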

License

MIT License
