python, azure, azure-openai

How to call AzureOpenAI API with PyRIT?


I'm trying to do a basic PyRIT project, but I'm not able to use Azure OpenAI with PyRIT.

I have this code working, but it doesn't use PyRIT:

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="myApiKey",
    api_version="myApiVersion",
    azure_endpoint="myEndpoint"
)

try:
    response = client.chat.completions.create(
        model="myModel",
        messages=[
            {"role": "system", "content": "You're a helpful assistant."},
            {"role": "user", "content": "Hello, how are you?"}
        ],
        temperature=0.0
    )
    print(response.choices[0].message.content)
except Exception as e:
    print(f"Error: {e}")

This code works, but now I want to use PyRIT.

This is my PyRIT code:

from pyrit.common import IN_MEMORY, initialize_pyrit
from pyrit.orchestrator import PromptSendingOrchestrator
from pyrit.prompt_target import OpenAIChatTarget

initialize_pyrit(memory_db_type=IN_MEMORY)

prompt = "Hello, how are you?"

target = OpenAIChatTarget(
    endpoint="myEndpoint",
    api_key="myApiKey"
)

orchestrator = PromptSendingOrchestrator(target)

response = await orchestrator.send_prompts_async(prompt_list=[prompt])  # type: ignore
await orchestrator.print_conversations_async()  # type: ignore

This code always raises this exception: Error sending prompt with conversation ID: "anyConversationIDThere".

I tried different ways to connect to the API, but the code always fails.

Any idea?


Solution

  • When using Azure OpenAI with PyRIT, always ensure that the endpoint includes the custom Azure OpenAI subdomain, the deployment name, and the correct API version (a plain REST sanity check of the same URL is sketched at the end of this answer):

    https://<openaiName>.openai.azure.com/openai/deployments/<deploymentName>/chat/completions?api-version=2023-03-15-preview
    

    I tried the code below using PyRIT with the correct endpoint and successfully got the output response.

    Code:

    import asyncio
    from pyrit.common import IN_MEMORY, initialize_pyrit
    from pyrit.orchestrator import PromptSendingOrchestrator
    from pyrit.prompt_target import OpenAIChatTarget
    
    initialize_pyrit(memory_db_type=IN_MEMORY)
    
    AZURE_OPENAI_ENDPOINT = "https://<openaiName>.openai.azure.com/"  
    DEPLOYMENT_NAME = "<modelName>"  
    API_VERSION = "2023-03-15-preview"
    API_KEY = "<apiKey>" 
    
    target = OpenAIChatTarget(
        endpoint=f"{AZURE_OPENAI_ENDPOINT}openai/deployments/{DEPLOYMENT_NAME}/chat/completions?api-version={API_VERSION}",
        api_key=API_KEY
    )
    
    orchestrator = PromptSendingOrchestrator(target)

    async def send_prompt(prompt: str):
        """ Sends the prompt and returns the responses. """
        try:
            return await orchestrator.send_prompts_async(prompt_list=[prompt])  
        except Exception as e:
            print(f"Error sending prompt: {e}")
            return None
    
    def extract_assistant_response(response):
        """ Extracts and prints the assistant's response pieces. """
        if not hasattr(response, "request_pieces"):
            print("Unexpected response format:", response)
            return

        for piece in response.request_pieces:
            # Each piece is a PromptRequestPiece; the model's reply has role "assistant".
            if getattr(piece, "role", None) == "assistant":
                print(f"Assistant: {piece.converted_value}")
    
    async def main():
        """ Main function to send the prompt and process responses. """
        prompt = "Hello, how are you?"
        responses = await send_prompt(prompt)
        if responses:
            for response in responses:
                print(f"Debug Response: {response.__dict__}")
                extract_assistant_response(response)
            await orchestrator.print_conversations_async()

    asyncio.run(main())
    

    Output:

    I successfully got the output response as shown below.

    (screenshot of the successful output response)
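
    If the PyRIT call still fails, it can help to confirm the endpoint and key outside of PyRIT first. The sketch below (the <...> placeholders are your own values, the same as above) calls the same deployment directly over REST; if this request also fails, the problem lies with the endpoint or key rather than with PyRIT.

    # Direct REST call to the same Azure OpenAI deployment (no PyRIT involved).
    # Minimal sketch: the <...> placeholders are assumptions -- replace them with your values.
    import requests

    AZURE_OPENAI_ENDPOINT = "https://<openaiName>.openai.azure.com/"
    DEPLOYMENT_NAME = "<modelName>"
    API_VERSION = "2023-03-15-preview"
    API_KEY = "<apiKey>"

    url = (
        f"{AZURE_OPENAI_ENDPOINT}openai/deployments/{DEPLOYMENT_NAME}"
        f"/chat/completions?api-version={API_VERSION}"
    )

    response = requests.post(
        url,
        headers={"api-key": API_KEY, "Content-Type": "application/json"},
        json={"messages": [{"role": "user", "content": "Hello, how are you?"}]},
        timeout=30,
    )

    print(response.status_code)  # 200 means the endpoint, deployment name, and key are correct
    print(response.json())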