Tags: azure, azure-storage, sas-token

When generating an ADLS SAS token, specifying the path causes an authorization failure


I use the following code to generate a SAS token. When the path is null or an empty string, it works.

TokenCredential tokenCredential =
    new ClientSecretCredentialBuilder()
        .tenantId(tenantId)
        .clientId(clientId)
        .clientSecret(clientSecret)
        .build();

DataLakeServiceClient dataLakeServiceClient =
    new DataLakeServiceClientBuilder()
        .endpoint(endpoint)
        .credential(tokenCredential)
        .buildClient();

UserDelegationKey userDelegationKey =
    dataLakeServiceClient.getUserDelegationKey(start, expiry);

PathSasPermission pathSasPermission =
    PathSasPermission.parse("rwdlca");
DataLakeServiceSasSignatureValues signatureValues =
    new DataLakeServiceSasSignatureValues(expiry, pathSasPermission);

String sasToken =
    new DataLakeSasImplUtil(signatureValues, container, path, true)
        .generateUserDelegationSas(userDelegationKey, accountName, Context.NONE);

However, when I specify a path, it fails with the error message below.

Status code 403
<Error>
<Code>AuthenticationFailed</Code>
<Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.</Message>
</Error>

I pass the path data/testTable and run the following SQL statements:

sql("CREATE DATABASE IF NOT EXISTS " + namespaceName);
sql(String.format("CREATE TABLE %s (id bigint COMMENT 'unique id',data string, ts timestamp) USING iceberg " + "PARTITIONED BY (bucket(2, id), days(ts))", tableName));
sql(String.format(" INSERT INTO %s VALUES (1, 'a', cast('2023-10-01 01:00:00' as timestamp));", tableName));
sql(String.format(" INSERT INTO %s VALUES (2, 'b', cast('2023-10-02 01:00:00' as timestamp));", tableName));
sql(String.format(" INSERT INTO %s VALUES (3, 'c', cast('2023-10-03 01:00:00' as timestamp));", tableName));

Diagnostic for failed authentication requests

I also tried the "Diagnostic for failed authentication requests" feature in the Azure portal, but it did not seem to show anything.

---Updated---

Code used to generate the token

Token generated with a path: new DataLakeSasImplUtil(signatureValues, "container", path, true)

Postman fails

My container in the Azure portal

My Access Role

Token generated without a path: new DataLakeSasImplUtil(signatureValues, "container", "", true). I then change the request URL to fetch the file.

Postman succeeds when the SAS token is generated without a path.
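For reference, this is roughly how I build the request URL I call from Postman (a minimal sketch; the account, container, and file names are placeholders for my real values):

// Minimal sketch of the request I send from Postman, expressed in Java.
// accountName, container, and sasToken are placeholders; the file path is only
// an example of a file that sits under data/testTable.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FetchWithSasToken {
    public static void main(String[] args) throws Exception {
        String accountName = "<storage account>";
        String container = "<container>";
        String filePath = "data/testTable/<some file>";
        String sasToken = "<token generated above>";

        // The SAS token is appended as the query string of the DFS file URL.
        String url = String.format("https://%s.dfs.core.windows.net/%s/%s?%s",
                accountName, container, filePath, sasToken);

        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofString());

        // 200 when the token is accepted, 403 when it is not.
        System.out.println(response.statusCode());
    }
}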


Solution

  • When generating an ADLS SAS token, specifying the path causes an authorization failure

    The above error occurs when incorrect permissions or credentials are used to access the Azure Data Lake Storage account.

    In my environment, I added the Storage Blob Data Contributor role on the Azure Data Lake Gen2 storage account.

    Portal: (screenshot of the role assignment)
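    As an optional sanity check, you can verify the role assignment has taken effect by listing paths in the container with the same service principal. This is a minimal sketch that assumes the same tenantId/clientId/clientSecret, account name, and container as the code further below:

    import com.azure.identity.ClientSecretCredentialBuilder;
    import com.azure.storage.file.datalake.DataLakeFileSystemClient;
    import com.azure.storage.file.datalake.DataLakeServiceClientBuilder;

    public class CheckAccess {
        public static void main(String[] args) {
            String endpoint = "https://venkat8912.dfs.core.windows.net";

            // Authenticate with the same service principal that will later request the user delegation key.
            DataLakeFileSystemClient fileSystem = new DataLakeServiceClientBuilder()
                    .endpoint(endpoint)
                    .credential(new ClientSecretCredentialBuilder()
                            .tenantId("<tenantId>")
                            .clientId("<clientId>")
                            .clientSecret("<clientSecret>")
                            .build())
                    .buildClient()
                    .getFileSystemClient("test");

            // Listing paths succeeds only once the RBAC role assignment has propagated.
            fileSystem.listPaths().forEach(path -> System.out.println(path.getName()));
        }
    }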

    In my environment, I stored files in the foo/bar directory, as shown below.

    Portal: (screenshot of the foo/bar directory)

    You can use the code below to generate a SAS token for a path using the Azure Java SDK.

    Code:

    import com.azure.identity.ClientSecretCredential;
    import com.azure.identity.ClientSecretCredentialBuilder;
    import com.azure.storage.file.datalake.*;
    import com.azure.storage.file.datalake.implementation.util.DataLakeSasImplUtil;
    import com.azure.storage.file.datalake.models.*;
    import com.azure.storage.file.datalake.sas.DataLakeServiceSasSignatureValues;
    import com.azure.storage.file.datalake.sas.PathSasPermission;
    import com.azure.core.util.Context;
    
    import java.time.OffsetDateTime;
    
    public class App {
       public static void main(String[] args) {
           // Replace with your details
           String tenantId = "";
           String clientId = "";
           String clientSecret = "";
           String accountName = "venkat8912";
           String containerName = "test";
           String path = "foo/bar"; // Path within the container
           String endpoint = String.format("https://%s.dfs.core.windows.net", accountName);
    
           OffsetDateTime start = OffsetDateTime.now();
           OffsetDateTime expiry = OffsetDateTime.now().plusHours(1); // Token expiry (1 hour)
    
           try {
              
               ClientSecretCredential tokenCredential = new ClientSecretCredentialBuilder()
                       .tenantId(tenantId)
                       .clientId(clientId)
                       .clientSecret(clientSecret)
                       .build();
    
               DataLakeServiceClient dataLakeServiceClient = new DataLakeServiceClientBuilder()
                       .endpoint(endpoint)
                       .credential(tokenCredential)
                       .buildClient();
    
               UserDelegationKey userDelegationKey = dataLakeServiceClient.getUserDelegationKey(start, expiry);
    
               PathSasPermission pathSasPermission = new PathSasPermission()
                       .setReadPermission(true)
                       .setWritePermission(true)
                       .setDeletePermission(true)
                       .setListPermission(true)
                       .setCreatePermission(true)
                       .setAddPermission(true);
    
               DataLakeServiceSasSignatureValues signatureValues = new DataLakeServiceSasSignatureValues(expiry, pathSasPermission);
    
           // The final boolean marks the path as a directory, so this produces a directory SAS (sr=d).
           String sasToken = new DataLakeSasImplUtil(signatureValues, containerName, path, true)
                       .generateUserDelegationSas(userDelegationKey, accountName, Context.NONE);
    
               String fullPathUrl = String.format("%s/%s/%s?%s", endpoint, containerName, path, sasToken);
               System.out.println("Full Path URL with SAS Token: " + fullPathUrl);
    
           } catch (Exception ex) {
               System.err.println("Error generating SAS token: " + ex.getMessage());
               ex.printStackTrace();
           }
       }
    }
    

    Output:

    Full Path URL with SAS Token: https://venkat8912.dfs.core.windows.net/test/foo/bar?sv=2023-11-03&se=2024-12-13T05%3A34%3A32Z&skoid=xxxx15&sktid=93xxxx6d&skt=2024-12-13T04%3A34%3A32Z&ske=2024-12-13T05%3A34%3A32Z&sks=b&skv=2023-11-03&sr=d&sp=racwdl&sdd=2&sig=redacted
    


    Now, I used the above URL and fetched the file stored at that path in the Azure Data Lake Storage account.

    Output: (screenshot of the fetched file contents)
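    If you prefer fetching the file from code rather than pasting the URL into a client, a minimal sketch like the one below reads a file using only the generated SAS token (sample.txt under foo/bar is a placeholder for an actual file in that directory):

    import com.azure.storage.file.datalake.DataLakeFileClient;
    import com.azure.storage.file.datalake.DataLakePathClientBuilder;

    import java.io.ByteArrayOutputStream;

    public class ReadWithSas {
        public static void main(String[] args) {
            String endpoint = "https://venkat8912.dfs.core.windows.net";
            String sasToken = "<SAS token generated above>";

            // The client authenticates with the directory SAS alone; no account key or AAD token is needed.
            DataLakeFileClient fileClient = new DataLakePathClientBuilder()
                    .endpoint(endpoint)
                    .fileSystemName("test")
                    .pathName("foo/bar/sample.txt") // placeholder file under the foo/bar directory
                    .sasToken(sasToken)
                    .buildFileClient();

            // Download the file contents into memory and print them.
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            fileClient.read(out);
            System.out.println(out.toString());
        }
    }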

    Update:

    The above code with DataLakeServiceClient works on Data Lake Gen2 storage accounts.