I have a Terraform workspace with its remote state file on Azure Storage. I authenticate locally using a Service Principal, and I have deployed and tested all the resources. Works like a charm!
I am using Microsoft-hosted (public) agents, not self-hosted ones, and my storage account isn't "private".
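For context, the backend block is nothing special; it is a plain azurerm backend along these lines (the values here are placeholders, not my real names):

terraform {
  backend "azurerm" {
    resource_group_name  = "rg-tfstate"        # placeholder
    storage_account_name = "sttfstatepoc"      # placeholder
    container_name       = "tfstate"
    key                  = "landingzone.tfstate"
  }
}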
Now that I am moving my deployment process to Azure DevOps, I am getting:
Error: Failed to get existing workspaces: listing blobs:
executing request: unexpected status 403 (403 This request is not authorized
to perform this operation.) with AuthorizationFailure: This request is not
authorized to perform this operation.
My SP has the correct permissions on the storage account and container; that's why it works locally with the same Service Principal.
I have also created the Service Connection correctly, and it is mapped to the right Service Principal.
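To be concrete, the data-plane access for the SP is granted roughly like this (a sketch with placeholder names; the real assignment was done outside this workspace):

data "azurerm_storage_account" "state" {
  name                = "sttfstatepoc"   # placeholder
  resource_group_name = "rg-tfstate"     # placeholder
}

resource "azurerm_role_assignment" "tfstate_sp" {
  scope                = data.azurerm_storage_account.state.id
  role_definition_name = "Storage Blob Data Contributor"
  principal_id         = var.deploy_sp_object_id  # object ID of the Service Principal (placeholder)
}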
I see some people suggesting that it has something to do with the "wrong agent VMs" being assigned, which would explain not getting access to the storage account. There are also "solutions" that amount to whitelisting 250+ IPs or accepting traffic from anywhere, but I don't like those.
What is wrong? What could be the issue here?
This is what my pipeline looks like:
stages:
  - stage: validation
    jobs:
      - job: validate
        continueOnError: false
        steps:
          - task: TerraformInstaller@1
            displayName: tfinstall
            inputs:
              terraformVersion: 'v1.11.2'
          - task: TerraformTaskV4@4
            displayName: init
            inputs:
              provider: 'azurerm'
              command: 'init'
              workingDirectory: '$(workingFolder)'
              backendAzureRmUseEnvironmentVariablesForAuthentication: true
              backendServiceArm: 'landingzone-poc-sc'
              backendAzureRmResourceGroupName: '$(state_rg)'
              backendAzureRmStorageAccountName: '$(state_storage)'
              backendAzureRmContainerName: '$(state_container)'
              backendAzureRmKey: '$(state_key)'
          - task: TerraformTaskV4@4
            displayName: validate
            inputs:
              provider: 'azurerm'
              command: 'validate'
Please guide!
You mentioned that the SP has the right access to the storage account (Storage Blob Data Contributor), and that the same SP authenticates fine locally from your CLI. That shows it has the right data-plane access; it only starts to fail when used from Azure DevOps agents.
So the issue does not seem to be a role or permission problem. Since it works locally, the blocker is on the DevOps agent side: Microsoft-hosted (public) agents run outside your Azure tenant and VNet, so if your storage account firewall is enabled (restricting public network access), Microsoft-hosted services like Azure DevOps don't get access unless you explicitly allow them.
Navigate to Networking > Firewalls and virtual networks and set "Allow Azure services on the trusted services list to access this storage account" to Enabled.
For more information on this, refer to Allow trusted Microsoft services.
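If the state storage account is itself managed with Terraform, the same setting corresponds roughly to the network_rules block below (a sketch; resource names and values are placeholders):

resource "azurerm_storage_account" "state" {
  name                     = "sttfstatepoc"   # placeholder
  resource_group_name      = "rg-tfstate"     # placeholder
  location                 = "westeurope"     # placeholder
  account_tier             = "Standard"
  account_replication_type = "LRS"

  network_rules {
    default_action = "Deny"            # keep the firewall on
    bypass         = ["AzureServices"] # the "Allow trusted Microsoft services" exception
  }
}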
The setup works locally because your machine reaches the storage account over the public internet using the SP.
Azure DevOps hosted agents, on the other hand, are shared Microsoft-managed services without fixed IPs, so they get blocked unless "Allow trusted Microsoft services" is enabled.
If you don't want to allow all Azure services, you can instead create a Private Endpoint for your storage account and make sure the agent can route through it (which in practice means running a self-hosted agent inside, or peered with, that VNet).
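A minimal sketch of that approach, assuming the VNet/subnet and storage account already exist in your configuration; all names below are placeholders:

resource "azurerm_private_endpoint" "state_blob" {
  name                = "pe-tfstate-blob"        # placeholder
  location            = "westeurope"             # placeholder
  resource_group_name = "rg-tfstate"             # placeholder
  subnet_id           = azurerm_subnet.agents.id # assumed existing subnet

  private_service_connection {
    name                           = "psc-tfstate-blob"
    private_connection_resource_id = azurerm_storage_account.state.id  # assumed existing storage account
    subresource_names              = ["blob"]    # Terraform state is stored as blobs
    is_manual_connection           = false
  }
}

# You will also need DNS so the agent resolves the account to the private IP,
# typically a privatelink.blob.core.windows.net private DNS zone linked to the VNet.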
Refer:
Creating Azure Storage Containers in a storage account with network rules, with Terraform (answer by Ansuman Bal)
Status=403 Code="AuthorizationFailure" Message="This request is not authorized to perform this operation" (answer by Rukmini)