feat(examples): Ollama + Llama local-first cookbook #15

@xe-nvdk

Description

Summary

Add an example showing how to use Memtrace with local models served by Ollama (Llama 3, Mistral, etc.). This is the fully local, zero-API-cost story: Memtrace + Arc + Ollama, with no cloud dependencies.

Value Proposition

  • Zero cloud cost for development and testing
  • Full privacy — nothing leaves your machine
  • Uses OpenAI-compatible Ollama API
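Because Ollama exposes an OpenAI-compatible endpoint on `http://localhost:11434/v1`, existing OpenAI-style clients only need a base-URL change. A minimal sketch of building such a chat request against the local server (the model name is an assumption; any Llama 3 or Mistral tag pulled via `ollama pull` works):

```python
import json

# Ollama's OpenAI-compatible chat endpoint (default local port).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(messages, model="llama3"):
    """Build an OpenAI-style chat payload for the local Ollama server."""
    return {
        "url": OLLAMA_URL,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"model": model, "messages": messages}),
    }

req = build_request([{"role": "user", "content": "hello"}])
print(req["url"])
```

No API key is required for a local Ollama server, which is what makes the zero-cost, fully private workflow possible.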

Scope

  • Setup instructions for the local stack
  • Memory-loop example with Ollama
  • Standalone example under the examples/ollama/ directory
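The memory loop in scope could be sketched as below. `MemoryStore` is a hypothetical stand-in for Memtrace's actual store API (which this issue does not specify), and the endpoint and model names assume Ollama's defaults:

```python
import json
import urllib.request

# Ollama's OpenAI-compatible chat endpoint (default local port).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

class MemoryStore:
    """Hypothetical stand-in for a Memtrace-style conversation store."""
    def __init__(self):
        self.turns = []

    def add(self, role, content):
        # Record one turn in OpenAI message format.
        self.turns.append({"role": role, "content": content})

    def context(self, limit=10):
        # Return the most recent turns to send as model context.
        return self.turns[-limit:]

def chat(store, user_msg, model="llama3"):
    """One iteration of the memory loop: recall, call the local model, store."""
    store.add("user", user_msg)
    payload = {"model": model, "messages": store.context()}
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)["choices"][0]["message"]["content"]
    store.add("assistant", reply)
    return reply

# The store logic runs without a server; chat() requires Ollama on localhost.
store = MemoryStore()
store.add("user", "hello")
store.add("assistant", "hi there")
print(len(store.context()))  # → 2
```

Calling `chat(store, "what did I just say?")` on a running Ollama instance would send the stored turns back as context, which is the whole point of the memory loop.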

Dependencies
