Note
Accepted at IEEE Robotics and Automation Letters (RA-L), 2026. The full multi-season AgriGS-SLAM dataset (apple + pear orchards across dormancy, flowering, and harvesting) will be publicly released in collaboration with AgrifoodTEF (EU Digital Europe Programme, GA Nº 101100622) — coming soon. In the meantime, you can try the pipeline on the demo dataset on Hugging Face, which is fetched automatically by the setup wizard.
AgriGS-SLAM is a unified Visual–LiDAR SLAM framework that couples direct LiDAR odometry and loop closures with multi-camera 3D Gaussian Splatting (3DGS) rendering. A batch rasterization strategy over three synchronized RGB-D cameras recovers orchard structure even under heavy occlusions and limited lateral viewpoints, while a gradient-driven map lifecycle — executed asynchronously between keyframes — preserves geometric detail and keeps GPU memory bounded throughout long field traversals. Pose refinement is driven by a probabilistic KL depth-consistency term derived from LiDAR measurements, back-propagated through differentiable camera projection to tighten the geometry–appearance coupling.
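For intuition only: the closed-form KL divergence between two univariate Gaussian depth models, a LiDAR-derived depth $d_L$ with variance $\sigma_L^2$ and a rendered depth $d_R$ with variance $\sigma_R^2$, has the standard form below. The exact probabilistic model and weighting used in AgriGS-SLAM are specified in the paper; these symbols are illustrative, not the paper's notation.

$$
D_{\mathrm{KL}}\big(\mathcal{N}(d_L,\sigma_L^2)\,\|\,\mathcal{N}(d_R,\sigma_R^2)\big)
= \log\frac{\sigma_R}{\sigma_L} + \frac{\sigma_L^2 + (d_L - d_R)^2}{2\sigma_R^2} - \frac{1}{2}.
$$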
The system is validated on a tractor-mounted platform in apple and pear orchards across three phenological stages — dormancy, flowering, and harvesting — using a standardized trajectory protocol that evaluates both training-view reconstruction and novel-view synthesis. Across all seasons and orchards, AgriGS-SLAM consistently outperforms Photo-SLAM, Splat-SLAM, PINGS, and OpenGS-SLAM in rendering fidelity and trajectory accuracy, while operating in real time on-tractor.
For full results, qualitative comparisons, and dataset previews, visit the project page.
- src/odometry/ — C++ LiDAR odometry module (pybind11 bindings)
- src/scancontext/ — C++ Scan Context loop-closure module (pybind11 bindings)
- src/pipeline.py — Python entry point for the full pipeline
- config/default.yaml — default configuration
- docker/ — Dockerfile and docker-compose.yml
- scripts/setup.sh — one-shot setup wizard (recommended)
- NVIDIA GPU
- Docker + Docker Compose plugin
- NVIDIA Container Toolkit
- (Optional) A running X11 server, for the viewer. Allow container access once per session:
xhost +local:docker
From the repo root:
./scripts/setup.sh

The wizard performs every step required to go from a clean checkout to a ready-to-run environment:
- Checks host prerequisites (Docker, Compose, NVIDIA runtime)
- Creates host-side `data/` and `results/` (mounted into the container at `/agri_gs_slam/data` and `/agri_gs_slam/results`)
- Builds the Docker image (`mirkousuelli/agri-gs-slam:stable`)
- Starts `agri-gs-slam-container` with the repo mounted at `/agri_gs_slam`
- Compiles `src/odometry` and `src/scancontext` in Release mode inside the container, each into its own `build/` folder
- Downloads the `foxmirko/agri-gs-slam-dataset-demo` dataset from Hugging Face into `data/agri-gs-slam-dataset-demo/` (skipped if already present); `config/default.yaml → dataloader.path` already points there
- Extracts any PLY archives shipped with the demo dataset and verifies them
- Drops you into an interactive shell inside the container
Flags:
| flag | effect |
|---|---|
| `--rebuild` | force `docker compose build --no-cache` |
| `--clean-cpp` | wipe `src/*/build` before recompiling |
| `--no-shell` | skip the final interactive shell |
| `-h, --help` | show usage |
Override host-side mount paths by exporting DATA_DIR=/abs/path or RESULTS_DIR=/abs/path before running the wizard.
Once the wizard finishes you are inside the container — just run the pipeline:
cd /agri_gs_slam/src
python3 pipeline.py --gs-slam

Pick exactly one modality:

- `--odom` — LiDAR odometry only (no loop closure)
- `--slam` — full SLAM with loop closure
- `--gs-odom` — odometry + Gaussian Splatting mapping
- `--gs-slam` — full SLAM + Gaussian Splatting (default if no flag is given)
Add --gs-viewer together with --gs-odom / --gs-slam to enable the live viewer. The container exposes ports 8080 (viewer) and 8500 (dashboard).
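The modality flags are mutually exclusive, with `--gs-slam` as the fallback when none is given. A minimal sketch of how such a CLI can be modeled with `argparse` — hypothetical, not the actual `pipeline.py` parser:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Hypothetical sketch of the mutually exclusive modality CLI."""
    parser = argparse.ArgumentParser(description="AgriGS-SLAM pipeline (sketch)")
    group = parser.add_mutually_exclusive_group()
    group.add_argument("--odom", action="store_true", help="LiDAR odometry only")
    group.add_argument("--slam", action="store_true", help="full SLAM with loop closure")
    group.add_argument("--gs-odom", action="store_true", help="odometry + 3DGS mapping")
    group.add_argument("--gs-slam", action="store_true", help="full SLAM + 3DGS (default)")
    parser.add_argument("--gs-viewer", action="store_true", help="enable the live viewer")
    return parser


def resolve_modality(args: argparse.Namespace) -> str:
    """Return the selected modality, defaulting to full SLAM + 3DGS."""
    for name in ("odom", "slam", "gs_odom", "gs_slam"):
        if getattr(args, name):
            return name
    return "gs_slam"
```

For example, `resolve_modality(build_parser().parse_args(["--gs-odom", "--gs-viewer"]))` yields `"gs_odom"`, while passing two modality flags at once is rejected by the exclusive group.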
Configuration is loaded from config/default.yaml; edit it to point at your dataset and tune SLAM / splatting parameters.
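For reference, a minimal fragment of what the dataset entry looks like after the wizard has fetched the demo dataset. Only `dataloader.path` is documented above; every other key in the real `default.yaml` may differ:

```yaml
# Fragment only: dataloader.path is the documented key; the real
# default.yaml contains many more SLAM / splatting parameters.
dataloader:
  path: /agri_gs_slam/data/agri-gs-slam-dataset-demo
```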
Open another shell into the running container:
docker compose -f docker/docker-compose.yml exec agri-gs-slam bash

Stop the container:

docker compose -f docker/docker-compose.yml down

# 1. Build and start
cd docker
docker compose build
docker compose up -d
docker compose exec agri-gs-slam bash
# 2. Compile the C++ modules (Release) — inside the container
for mod in odometry scancontext; do
cd /agri_gs_slam/src/$mod
mkdir -p build && cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
cmake --build . -j"$(nproc)"
done
# 3. Run the pipeline
cd /agri_gs_slam/src
python3 pipeline.py --gs-slam

If you use AgriGS-SLAM in your research, please cite:
@article{usuelli2026agrigs,
author={Usuelli, Mirko and Rapado-Rincon, David and Kootstra, Gert and Matteucci, Matteo},
journal={IEEE Robotics and Automation Letters},
title={AgriGS-SLAM: Orchard Mapping Across Seasons via Multi-View Gaussian Splatting SLAM},
year={2026},
volume={11},
number={6},
pages={7102-7109},
doi={10.1109/LRA.2026.3685453}
}

- Mirko Usuelli¹* and Matteo Matteucci¹, Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano, 20133 Milano, Italy. {mirko.usuelli, matteo.matteucci}@polimi.it
- David Rapado-Rincon² and Gert Kootstra², Agricultural Biosystems Engineering, Wageningen University & Research, 6708 PB Wageningen, The Netherlands. {david.rapadorincon, gert.kootstra}@wur.nl
*Corresponding author
The authors thank the Fruit Research Center (FRC) in Randwijk for access to the orchards. Mirko Usuelli's work was carried out within the Agritech National Research Center and funded by the European Union — Next-GenerationEU (PNRR – M4C2, Inv. 1.4 – D.D. 1032 17/06/2022, CN00000022). This manuscript reflects only the authors' views and opinions; neither the European Union nor the European Commission can be held responsible for them. Contributions from Matteo Matteucci, Gert Kootstra, and David Rapado-Rincon were co-funded by the European Union — Digital Europe Programme (AgrifoodTEF, GA Nº 101100622).
This project is released under the Apache License 2.0 — a permissive license well-suited to academic and industrial research use, with an explicit patent grant and attribution requirements.
Warning
Research code — not production-ready. AgriGS-SLAM is intended for research and academic experimentation. While it has been validated on our tractor-mounted platform across multiple orchards and seasons, it may contain untested edge cases, hardware-specific assumptions, and limitations that have not been characterized for safety-critical or real-world deployment scenarios. Use at your own risk; the authors provide no warranty regarding fitness for any particular purpose. See the LICENSE file for the full disclaimer.


AIRLab@POLIMI