prompt, langchain, ollama

How to print input requests and output responses in the Ollama server?


I'm working with the LangChain and CrewAI libraries to gain an in-depth understanding of system prompting. Currently, I'm running the Ollama server manually (ollama serve) and trying to intercept the messages flowing through it with a proxy server I've written.
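
For reference, this is a stripped-down sketch of the kind of proxy I mean. It assumes Ollama's default address localhost:11434; the proxy port 8000 is an arbitrary choice, only POST is handled, and streamed responses are read whole rather than relayed chunk by chunk, to keep the example short:

    # Minimal logging proxy between a client (LangChain/CrewAI) and Ollama.
    # Assumptions: Ollama listens on its default localhost:11434; the proxy
    # port 8000 is arbitrary. Debugging sketch only, error handling omitted.
    import http.server
    import urllib.request

    OLLAMA_URL = "http://localhost:11434"
    PROXY_PORT = 8000

    class LoggingProxy(http.server.BaseHTTPRequestHandler):
        def do_POST(self):
            # Print the incoming request body (e.g. /api/chat, /api/generate).
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length)
            print(f"--> POST {self.path}\n{body.decode(errors='replace')}")

            # Forward the request to the real Ollama server.
            req = urllib.request.Request(
                OLLAMA_URL + self.path,
                data=body,
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                status = resp.status
                ctype = resp.headers.get("Content-Type", "application/json")
                payload = resp.read()  # reads a streamed response to the end

            # Print the response, then relay it back to the client.
            print(f"<-- {status}\n{payload.decode(errors='replace')}")
            self.send_response(status)
            self.send_header("Content-Type", ctype)
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)

    http.server.HTTPServer(("localhost", PROXY_PORT), LoggingProxy).serve_forever()

Pointing the client's base_url (e.g. in LangChain's Ollama wrapper) at http://localhost:8000 then makes every prompt and completion show up in the proxy's stdout.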

The goal is to log or print the input requests and output responses for debugging and analysis purposes.

Can anyone suggest a better way to achieve this?


Solution

  • For Ubuntu Users:

    To print the input requests on the server side, you need to enable debug mode. Follow these steps:

    1. Open Ollama's service file:

      sudo systemctl edit --full ollama.service

    2. Add the following line to the [Service] section:

      Environment="OLLAMA_DEBUG=1"

    3. Restart the Ollama service:

      sudo systemctl restart ollama.service

    4. Read the service logs to view debug information:

      journalctl -f -b -u ollama

    This enables debug mode and lets you see detailed logs for the input requests.
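
    If you start the server manually (ollama serve) instead of through systemd, as in the question, you can set the same variable inline for a one-off run; the debug output then appears directly in that terminal:

      OLLAMA_DEBUG=1 ollama serve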
