I want to be able to run this application locally with a locally hosted LLM, for example via Ollama.
The application currently requires openai.api.key in LangChain4jConfig, and there is no way to skip the langchain4j-openai model creation or to provide langchain4j-ollama as an alternative.
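One possible way to make this optional (just a sketch, assuming Spring Boot conditional beans, the langchain4j-ollama module, and the pre-1.0 ChatLanguageModel interface; the property names ollama.base-url and the model names below are placeholders, not something the project defines today) would be to guard each model bean with @ConditionalOnProperty:

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LangChain4jConfig {

    // Created only when openai.api.key is set
    @Bean
    @ConditionalOnProperty(name = "openai.api.key")
    ChatLanguageModel openAiChatModel(@Value("${openai.api.key}") String apiKey) {
        return OpenAiChatModel.builder()
                .apiKey(apiKey)
                .modelName("gpt-4o-mini")   // example model name
                .build();
    }

    // Alternative for local runs: created only when ollama.base-url is set
    @Bean
    @ConditionalOnProperty(name = "ollama.base-url")
    ChatLanguageModel ollamaChatModel(@Value("${ollama.base-url}") String baseUrl) {
        return OllamaChatModel.builder()
                .baseUrl(baseUrl)           // e.g. http://localhost:11434
                .modelName("llama3")        // example local model
                .build();
    }
}
```

The rest of the application would only depend on the ChatLanguageModel interface, so either bean could satisfy it.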
In Spring AI this is easy: you just use a different starter and the application works without code changes, e.g. spring-ai-ollama-spring-boot-starter instead of spring-ai-openai-spring-boot-starter.
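For comparison, with Spring AI the application code stays provider-agnostic. Something like the following sketch compiles unchanged whether spring-ai-openai-spring-boot-starter or spring-ai-ollama-spring-boot-starter is on the classpath (AssistantService is a made-up example class, not something from this project):

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

@Service
public class AssistantService {

    private final ChatClient chatClient;

    // ChatClient.Builder is auto-configured by whichever Spring AI starter is present
    public AssistantService(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    public String answer(String question) {
        // The provider (OpenAI vs. Ollama) is selected purely by the dependency + properties
        return chatClient.prompt()
                .user(question)
                .call()
                .content();
    }
}
```

The only things that change between providers are the Maven/Gradle dependency and the spring.ai.openai.* vs. spring.ai.ollama.* properties. It would be great if the LangChain4j setup here allowed the same kind of swap.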