Bug description
When the 'ollama' and 'openai' models are used in the same project, neither auto-configured ChatModel bean can be marked with '@Primary', and the application fails to start.
The error is as follows:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 1 of method chatClientBuilder in org.springframework.ai.autoconfigure.chat.client.ChatClientAutoConfiguration required a single bean, but 2 were found:
- ollamaChatModel: defined by method 'ollamaChatModel' in class path resource [org/springframework/ai/autoconfigure/ollama/OllamaAutoConfiguration.class]
- openAiChatModel: defined by method 'openAiChatModel' in class path resource [org/springframework/ai/autoconfigure/openai/OpenAiAutoConfiguration.class]
This may be due to missing parameter name information
Action:
Consider marking one of the beans as @Primary, updating the consumer to accept multiple beans, or using @Qualifier to identify the bean that should be consumed
Ensure that your compiler is configured to use the '-parameters' flag.
You may need to update both your build tool settings as well as your IDE.
(See https://github.com/spring-projects/spring-framework/wiki/Upgrading-to-Spring-Framework-6.x#parameter-name-retention)
Environment
<!-- spring-ai-openai-spring-boot-starter -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
    <version>1.0.0-M1</version>
</dependency>
<!-- spring-ai-ollama-spring-boot-starter -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
    <version>1.0.0-M1</version>
</dependency>
Steps to reproduce
@Autowired(required = false)
@Qualifier("openAiChatModel")
private ChatModel chatModel;
@Resource
@Qualifier("ollamaChatModel")
OllamaChatModel ollamaChatModel;
Expected behavior
Multiple models, such as 'ollama' and 'openai', should be supported in the same project, with '@Primary' used to select the default one.
The problem comes from the bean definition of org.springframework.ai.autoconfigure.chat.client.ChatClientAutoConfiguration:
@Bean
@Scope("prototype")
@ConditionalOnMissingBean
ChatClient.Builder chatClientBuilder(ChatClientBuilderConfigurer chatClientBuilderConfigurer, ChatModel chatModel) {
    ChatClient.Builder builder = ChatClient.builder(chatModel);
    return chatClientBuilderConfigurer.configure(builder);
}
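Until this is fixed, one possible workaround (a sketch, not verified against 1.0.0-M1) is to re-expose one of the auto-configured models as a '@Primary' bean, so that the chatClientBuilder method above resolves a single ChatModel:

```java
@Bean
@Primary
ChatModel primaryChatModel(@Qualifier("openAiChatModel") ChatModel openAiChatModel) {
    // Delegate to the auto-configured OpenAI model;
    // swap the qualifier to "ollamaChatModel" to prefer Ollama instead.
    return openAiChatModel;
}
```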
You can solve this by defining your own ChatClient.Builder beans, each specifying the ChatModel it should use. For example:
@Bean
@Scope("prototype")
ChatClient.Builder openAiChatClientBuilder(ChatClientBuilderConfigurer chatClientBuilderConfigurer,
        @Qualifier("openAiChatModel") ChatModel chatModel) {
    ChatClient.Builder builder = ChatClient.builder(chatModel);
    return chatClientBuilderConfigurer.configure(builder);
}

@Bean
@Scope("prototype")
ChatClient.Builder ollamaChatClientBuilder(ChatClientBuilderConfigurer chatClientBuilderConfigurer,
        @Qualifier("ollamaChatModel") ChatModel chatModel) {
    ChatClient.Builder builder = ChatClient.builder(chatModel);
    return chatClientBuilderConfigurer.configure(builder);
}
Then, inject the desired ChatClient.Builder:
@Autowired
@Qualifier("openAiChatClientBuilder")
ChatClient.Builder openAiBuilder;
@Autowired
@Qualifier("ollamaChatClientBuilder")
ChatClient.Builder ollamaBuilder;
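As a sketch of how the qualified builders might then be used (the ChatService class and method names here are hypothetical; the fluent ChatClient API is as of 1.0.0-M1 and may change in later milestones):

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Service;

@Service
public class ChatService {

    private final ChatClient openAiClient;
    private final ChatClient ollamaClient;

    // Build one ChatClient per qualified builder bean.
    public ChatService(@Qualifier("openAiChatClientBuilder") ChatClient.Builder openAiBuilder,
                       @Qualifier("ollamaChatClientBuilder") ChatClient.Builder ollamaBuilder) {
        this.openAiClient = openAiBuilder.build();
        this.ollamaClient = ollamaBuilder.build();
    }

    public String askOpenAi(String question) {
        return this.openAiClient.prompt().user(question).call().content();
    }

    public String askOllama(String question) {
        return this.ollamaClient.prompt().user(question).call().content();
    }
}
```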