java spring-boot langchain langchain4j

gpt-4o-search-preview Model In LangChain4J - Spring


Tonight I received the following from OpenAI via email:

Web search delivers accurate and clearly-cited answers from the web. Using the same tool as search in ChatGPT, it’s great at conversation and follow-up questions—and you can integrate it with just a few lines of code. Web Search is available in the Responses API as a tool for the gpt-4o and gpt-4o-mini models, and can be paired with other tools. In the Chat Completions API, web search is available as a separate model, called gpt-4o-search-preview and gpt-4o-mini-search-preview. Available to all developers in preview

Question:

I currently use langchain4j with Spring Boot, and I want to use the gpt-4o-search-preview model. I'm unsure how to do it. How do I tell langchain4j to use the tool from OpenAI itself? And how do I tell langchain4j to set the web_search_options as in the Python code snippet below:

from openai import OpenAI
client = OpenAI()

completion = client.chat.completions.create(
    model="gpt-4o-search-preview",
    web_search_options={
        "user_location": {
            "type": "approximate",
            "approximate": {
                "country": "GB",
                "city": "London",
                "region": "London",
            }
        },
    },
    messages=[{
        "role": "user",
        "content": "What are the best restaurants around Granary Square?",
    }],
)

print(completion.choices[0].message.content)

Context

Below is the code I am currently using:

import dev.langchain4j.service.MemoryId;
import dev.langchain4j.service.Result;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import dev.langchain4j.service.V;
import dev.langchain4j.service.spring.AiService;

@AiService
public interface Assistant {

    @SystemMessage("{{systemMessage}}")
    Result<String> chat(@MemoryId long memoryId, @UserMessage String userMessage, @V("systemMessage") String systemMessage);
}

import dev.langchain4j.model.moderation.Moderation;
import dev.langchain4j.model.openai.OpenAiModerationModel;
import dev.langchain4j.model.output.Response;
import dev.langchain4j.service.Result;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.env.Environment;
import org.springframework.stereotype.Service;

@Service
public class ChatAiService {


    @Autowired
    private Assistant assistant;

    @Autowired
    private OpenAiModerationModel moderationModel;

    @Autowired
    private Environment environment;

    public ChatAiService() {

    }

    public String sendRequest(ChatAiDTO chatAiDTO) {

        String userInput = chatAiDTO.getInput();
        UserDTO loggedUser = LoginContext.getLoggedUser();


        String systemMessagePrompt = "You are an educated assistant. Answer me in HTML language.";
        String model = environment.getProperty("langchain4j.open-ai.chat-model.model-name");


        String response = "";
        Result<String> result = null;
        try {

            Response<Moderation> moderate = moderationModel.moderate(userInput);
            boolean flagged = moderate.content().flagged();

            if (flagged) {
                throw new ModerationException("Blocked by moderation!");
            }

            result = assistant.chat(loggedUser.getId(), userInput, systemMessagePrompt);

            response = result.content();
            
        } catch (ModerationException e) {
            
            response = "Blocked by moderation!";
        }
        return response;

    }

}

Solution

  • How do I tell langchain4j to use the tool from OpenAI itself?

    You need to set langchain4j.open-ai.chat-model.model-name = gpt-4o-search-preview. In the Chat Completions API the web search capability ships as part of that model, so there is no separate tool to register on the LangChain4J side; a programmatic alternative is sketched below.
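    If you prefer configuring the model in code rather than through the starter property, a minimal sketch could look like the following. This assumes the langchain4j-open-ai and langchain4j-spring-boot-starter dependencies, an OPENAI_API_KEY environment variable, and a LangChain4J version where the chat model interface is still called ChatLanguageModel (newer releases may name it differently):

    import dev.langchain4j.model.chat.ChatLanguageModel;
    import dev.langchain4j.model.openai.OpenAiChatModel;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class SearchModelConfig {

        // Builds the chat model that the @AiService-annotated Assistant will use.
        // The web search tool is baked into gpt-4o-search-preview itself,
        // so nothing else needs to be registered here.
        @Bean
        ChatLanguageModel chatLanguageModel() {
            return OpenAiChatModel.builder()
                    .apiKey(System.getenv("OPENAI_API_KEY"))
                    .modelName("gpt-4o-search-preview")
                    .build();
        }
    }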

  • And how do I tell langchain4j to set the web_search_options like in the Python snippet above?

    Unfortunately, there is currently no way to set it: LangChain4J's OpenAI chat model does not expose a web_search_options parameter. If you need user_location or other search options, you have to call the Chat Completions endpoint directly; see the sketch below.
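    As a workaround, you could bypass LangChain4J for these requests and call the OpenAI Chat Completions API yourself, putting web_search_options in the request body. A minimal sketch using java.net.http, with the JSON built by hand to mirror the Python example above (in a real service you would build and parse the JSON with a library such as Jackson):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class WebSearchChatClient {

        private static final String OPENAI_URL = "https://api.openai.com/v1/chat/completions";

        public static void main(String[] args) throws Exception {
            // Request body mirroring the Python snippet: model, web_search_options, messages.
            String body = """
                    {
                      "model": "gpt-4o-search-preview",
                      "web_search_options": {
                        "user_location": {
                          "type": "approximate",
                          "approximate": { "country": "GB", "city": "London", "region": "London" }
                        }
                      },
                      "messages": [
                        { "role": "user", "content": "What are the best restaurants around Granary Square?" }
                      ]
                    }
                    """;

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(OPENAI_URL))
                    .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            // The answer (with citations) is in choices[0].message.content of the returned JSON.
            System.out.println(response.body());
        }
    }

    You could wrap such a call in its own Spring @Service and keep using your existing Assistant for everything LangChain4J supports directly, switching to the raw client only when web_search_options is required.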