azure, terraform, devops, infrastructure-as-code, azure-openai

Dependency in Terraform between OpenAI and VNet/Subnet


I am trying to create an OpenAI endpoint using the OpenAI module in Terraform. I have no problem creating the network and then OpenAI separately. However, when I try to put everything under one pipeline, it looks for the VNet and subnet as if they already exist instead of creating them first.

(I CHANGED PROPER NAMES TO 'TEST' FOR PRIVACY)

My directory structure is as follows:

terraform/
  modules/
    openai/
      main.tf
      variables.tf
      outputs.tf
    network/
      main.tf
      variables.tf
      outputs.tf
  main.tf
  variables.tf

In my network module I create outputs for the VNet and subnet names:

output "subnet_name" {
  description = "The name of the subnet."
  value       = azurerm_subnet.test_subnet.name
}

output "vnet_name" {
  description = "The name of the virtual network."
  value       = azurerm_virtual_network.test_vnet.name
}

In the root main.tf:

provider "azurerm" {
  features {}
}



# Network Module
module "network" {
  source                = "./modules/network"
  vnet_name             = "test_vnet"
  address_space         = ["10.0.0.0/16"]
  location              = "East US"
  resource_group_name   = "TestRG"
  subnet_name           = "test_subnet"
  subnet_prefixes       = ["10.0.1.0/24"]
}

module "openai" {
  source                = "./modules/openai"
  resource_group_name   = "TestRG"
  location              = "East US"
  openai_name           = "test_openai"
  vnet_name             = module.network.vnet_name
  subnet_name           = module.network.subnet_name
} 

The openai module's main.tf:

provider "azurerm" {
  features {}
}

module "openai" {
  source              = "Azure/openai/azurerm"
  resource_group_name = var.resource_group_name
  location            = var.location
  private_endpoint = {
    "pe_endpoint" = {
      private_dns_entry_enabled       = true
      dns_zone_virtual_network_link   = "dns_zone_link"
      is_manual_connection            = false
      name                            = var.private_endpoint_name
      private_service_connection_name = var.private_service_connection_name 
      subnet_name                     = var.subnet_name
      vnet_name                       = var.vnet_name
      vnet_rg_name                    = var.resource_group_name
    }
  }
  deployment = {
    "text-embedding-ada-002" = {
      name          = "text-embedding-ada-002"
      model_format  = "OpenAI"
      model_name    = "text-embedding-ada-002"
      model_version = "2"
      scale_type    = "Standard"
    }
  }

}

Should it not create the network components first, considering I am taking in variables from the network module? I know I can add a VNet module inside the openai folder; I just prefer to keep them separate and can't understand why the above does not work. Any ideas are welcome! I am not a Terraform expert :)

Thank you

I tried running terraform plan for the above and got an error that the VNet and subnet don't exist when creating the OpenAI pe_endpoint.


Solution

  • As you observed, the OpenAI resources and the network could only be created in separate Terraform runs.

    The issue seems to be the way the VNet is passed to the OpenAI module: Terraform provisions resources in parallel, and because the private endpoint only receives the VNet and subnet names, it can try to resolve them before they have been created.

    To overcome this, feed the network module's outputs into the OpenAI module and add an explicit depends_on so the VNet and subnet are created before the OpenAI resources (an alternative, root-level placement of the dependency is sketched after the module code below).

    Configuration:

    main.tf:

    provider "azurerm" {
      features {}
    }
    
    module "network" {
      source                = "./modules/network"
      vnet_name             = "test_vnet"
      address_space         = ["10.0.0.0/16"]
      location              = "East US"
      resource_group_name   = "vinay-rg"
      subnet_name           = "test_subnet"
      subnet_prefixes       = ["10.0.1.0/24"]
    }
    
    module "openai" {
      source                = "./modules/openai"
      resource_group_name   = "vinay-rg"
      location              = "East US"
      vnet_name             = module.network.vnet_name
      subnet_name           = module.network.subnet_name
      private_endpoint_name = "test_pe_endpoint"
      private_service_connection_name = "test_service_connection"
    }
    

    modules/network/main.tf:

    variable "vnet_name" {
      description = "Name of the Virtual Network"
      type        = string
    }
    
    variable "address_space" {
      description = "Address space for the Virtual Network"
      type        = list(string)
    }
    
    variable "location" {}
    
    variable "resource_group_name" {}
    
    variable "subnet_name" {}
    
    variable "subnet_prefixes" {}
    
    resource "azurerm_virtual_network" "test_vnet" {
      name                = var.vnet_name
      address_space       = var.address_space
      location            = var.location
      resource_group_name = var.resource_group_name
    }
    
    resource "azurerm_subnet" "test_subnet" {
      name                 = var.subnet_name
      resource_group_name  = var.resource_group_name
      virtual_network_name = azurerm_virtual_network.test_vnet.name
      address_prefixes     = var.subnet_prefixes
    }
    
    output "subnet_name" {
      value = azurerm_subnet.test_subnet.name
    }
    
    output "vnet_name" {
      value = azurerm_virtual_network.test_vnet.name
    }
    

    modules/openai/main.tf:

    provider "azurerm" {
      features {}
    }
    
    module "openai" {
      source              = "Azure/openai/azurerm"
      resource_group_name = var.resource_group_name
      location            = var.location
    
      private_endpoint = {
        "pe_endpoint" = {
          private_dns_entry_enabled       = true
          dns_zone_virtual_network_link   = "dns_zone_link"
          is_manual_connection            = false
          name                            = var.private_endpoint_name
          private_service_connection_name = var.private_service_connection_name 
          subnet_name                     = var.subnet_name
          vnet_name                       = var.vnet_name
          vnet_rg_name                    = var.resource_group_name
        }
      }
    
      deployment = {
        "text-embedding-ada-002" = {
          name          = "text-embedding-ada-002"
          model_format  = "OpenAI"
          model_name    = "text-embedding-ada-002"
          model_version = "2"
          scale_type    = "Standard"
        }
      }
    
      # Explicit dependency on the VNet/subnet name inputs (which come from the
      # network module's outputs), so the network is created before the endpoint.
      depends_on = [ var.vnet_name, var.subnet_name ]
    }
    
    variable "subnet_name" {}
    variable "vnet_name" {}
    variable "resource_group_name" {}
    variable "location" {}
    variable "private_endpoint_name" {}
    variable "private_service_connection_name" {}
    

    Deployment:

    (deployment screenshots)

    Refer:

    https://github.com/Azure/terraform-azurerm-openai/tree/v0.1.3/examples/azureopenai-private-endpoints

    https://msandbu.org/deploy-azure-openai-using-terraform-with-private-endpoint/

    https://techcommunity.microsoft.com/t5/azure-architecture-blog/azure-openai-private-endpoints-connecting-across-vnet-s/ba-p/3913325

    The depends_on Meta-Argument - Configuration Language | Terraform | HashiCorp Developer: https://developer.hashicorp.com/terraform/language/meta-arguments/depends_on