I'd like to give this prompt to the LLM:
Output this exact string:{{some_string}}
That is, {{some_string}} should not be substituted by Promptfoo as a template variable; it should appear verbatim in the prompt.
My current attempt looks as follows:
description: "test_2025-04-30_003"
prompts:
  - "Output this exact string:{{ '{{some_string}}' }}"
providers:
  - "openai:gpt-4o-mini"
tests:
  - assert:
      - type: contains-all
        value: "{{ '{{some_string}}' }}"
But the output of
promptfoo eval --verbose
shows this:
[chat.js:107] Calling OpenAI API: {"model":"gpt-4o-mini","messages":[{"role":"user","content":"Output this exact string:"}],"max_tokens":1024}
So the {{some_string}}
is missing from the message.
What am I doing wrong?
It has been implemented in Promptfoo version 0.112.2.
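For comparison, Promptfoo renders prompts with Nunjucks, and Nunjucks provides a {% raw %} tag that passes the enclosed text through unrendered. A sketch of the same config using that approach (an untested alternative, assuming the default Nunjucks rendering is active for both prompts and assertion values):

```yaml
description: "test_2025-04-30_003"
prompts:
  # {% raw %} … {% endraw %} tells Nunjucks to emit the enclosed text
  # verbatim instead of treating {{some_string}} as a template variable
  - "Output this exact string:{% raw %}{{some_string}}{% endraw %}"
providers:
  - "openai:gpt-4o-mini"
tests:
  - assert:
      - type: contains-all
        value: "{% raw %}{{some_string}}{% endraw %}"
```

Whether this survives Promptfoo's own pre-processing is the same open question as with the string-literal trick above, but it isolates the problem to one mechanism.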