
I want to run this locally with my LLM, such as Ollama. #4

@nevenc

Description


I want to be able to run this application locally with my LLM, such as Ollama.

The application currently requires openai.api.key in LangChain4jConfig. There is no way to skip creating the langchain4j-openai model or to provide an alternative langchain4j-ollama one instead.
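One possible sketch of how the config could allow this, assuming a Spring Boot setup and a hypothetical `app.llm.provider` property (not part of the project today): guard each model bean with `@ConditionalOnProperty` so the OpenAI bean, and its API-key requirement, only exists when OpenAI is selected.

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LangChain4jConfig {

    // Only created when the (hypothetical) property selects OpenAI,
    // so openai.api.key is not required for other providers.
    @Bean
    @ConditionalOnProperty(name = "app.llm.provider", havingValue = "openai", matchIfMissing = true)
    ChatLanguageModel openAiModel(@Value("${openai.api.key}") String apiKey) {
        return OpenAiChatModel.builder()
                .apiKey(apiKey)
                .build();
    }

    // Local alternative: talk to an Ollama server instead.
    @Bean
    @ConditionalOnProperty(name = "app.llm.provider", havingValue = "ollama")
    ChatLanguageModel ollamaModel() {
        return OllamaChatModel.builder()
                .baseUrl("http://localhost:11434") // Ollama's default endpoint
                .modelName("llama3")               // assumed model name, pick any pulled model
                .build();
    }
}
```

With this shape, running against Ollama would be a matter of setting `app.llm.provider=ollama` and adding the langchain4j-ollama dependency; no code in the rest of the app changes, since both beans expose the same `ChatLanguageModel` interface.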

In Spring AI this is easy: you just use a different starter and the app works without code changes, i.e. spring-ai-ollama-spring-boot-starter instead of spring-ai-openai-spring-boot-starter.
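For comparison, the Spring AI swap is a single dependency change in pom.xml (version managed by the Spring AI BOM, assuming the project uses it):

```xml
<!-- Replace spring-ai-openai-spring-boot-starter with the Ollama starter. -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
</dependency>
```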
