Summary
Example showing how to use Memtrace with local models via Ollama (Llama 3, Mistral, etc.). This is the fully local, zero-API-cost story: Memtrace + Arc + Ollama, no cloud dependencies.
Value Proposition
- Zero cloud cost for development and testing
- Full privacy — nothing leaves your machine
- Uses OpenAI-compatible Ollama API
Scope
- Setup instructions for local stack
- Memory loop example with Ollama
- Standalone in the examples/ollama/ directory
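The OpenAI-compatible API mentioned above means the example can talk to Ollama with a plain HTTP request against its default local endpoint (http://localhost:11434/v1). A minimal sketch of building such a chat-completion request follows; the model name `llama3` is an assumption (any locally pulled model works), and the Memtrace memory loop itself is not shown here.

```python
import json

# Ollama serves an OpenAI-compatible API under /v1 on its default port.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, messages: list) -> tuple:
    """Build the URL and JSON body for an OpenAI-style chat completion."""
    url = f"{OLLAMA_BASE_URL}/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return url, body

# Example: a single-turn request (assumes `ollama pull llama3` was run locally).
url, body = build_chat_request(
    "llama3",
    [{"role": "user", "content": "Summarize my last session."}],
)
print(url)  # → http://localhost:11434/v1/chat/completions
```

Sending `body` to `url` with any HTTP client (or pointing an OpenAI SDK's `base_url` at Ollama) completes the fully local loop with no cloud dependency.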
Dependencies