Tags: azure, shell, cloud, blob, azcopy

azcopy sync authentication fails when ssh session closes


I am working on automating an NFS filesystem sync from an Azure VM to a blob storage account, which I plan to run through crontab.

I am having authentication issues when the script runs in the background: azcopy sync errors out as soon as the SSH session is closed.

# Ran the script this way:

nohup ./script &

Cannot perform sync due to error: Login Credentials missing. No SAS token or OAuth token is present and the resource is not public

I first tried a managed identity, which did not work in the background, then tried a service principal secret, which had the same issue.
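
For context, the crontab entry I plan to use would look something like this (the schedule and script path here are just examples):

# Run the sync daily at 02:00 and append all output to a log file
0 2 * * * /root/azcopy_sync.sh >> /root/.azcopy/cron_run.log 2>&1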

Here is my script:

#!/bin/bash

# Set variables
SRC_BASE_DIR=/shared2/attachments
DEST_BASE_BLOB_URL=https://stgontadapreprodgennlp.blob.core.windows.net/fileattachments
LOG_DIR=/root/.azcopy
LOG_FILE="$LOG_DIR/azcopy_sync_$(date '+%Y-%m-%d_%H-%M-%S').log"
EMAIL_RECIPIENT=example@xyz.com
FILESYSTEM_USAGE_THRESHOLD=85
SCRIPT_START=$(date '+%Y-%m-%d %H:%M:%S')

#echo "$DEST_BASE_BLOB_URL"

# Get the client ID, client secret, and tenant ID from the app registration
CLIENT_ID=cfe10752-d606-46f8-a427-ef8956711234
CLIENT_SECRET=kP28Q~rtZrNdW4mU5Z6Ho.sgXbcU5Hq55Kb1234
TENANT_ID=da67ef1b-ca59-4db2-9a8c-aa8d9412345

# Authenticate AzCopy with VM's managed identity
#azcopy login --identity
#azcopy login --service-principal --tenant-id "$TENANT_ID" --application-id "$CLIENT_ID" --client-secret "$CLIENT_SECRET"
export AZCOPY_SPA_CLIENT_SECRET="$CLIENT_SECRET"
azcopy login --service-principal --tenant-id "$TENANT_ID" --application-id "$CLIENT_ID"
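# Note (assumption worth checking): azcopy login caches its OAuth token under
# $HOME/.azcopy, so when this script runs from cron/nohup with a different or
# stripped-down environment, the cached token may not be found and the sync
# fails with "Login Credentials missing".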


# Iterate over directories starting with /shared2/attachments/as
for src_dir in "$SRC_BASE_DIR"/as*/; do

  # Get the directory name (e.g. as01, as02) from the path
  src_dir_name=$(basename "$src_dir")
  echo "in main directory" $src_dir_name

  # Check the filesystem usage before syncing (compute it first, then report it;
  # df -P keeps the output for each filesystem on a single line)
  filesystem_usage=$(df -P "$LOG_DIR" | awk 'NR==2{print $5}' | sed 's/%//g')
  echo "Filesystem usage: $filesystem_usage%"
  echo "Filesystem threshold: $FILESYSTEM_USAGE_THRESHOLD%"
  if ((filesystem_usage >= FILESYSTEM_USAGE_THRESHOLD)); then
    echo "Filesystem usage for $LOG_DIR is over $FILESYSTEM_USAGE_THRESHOLD%, removing old log files..." >> "$LOG_FILE"
    rm -f "$LOG_DIR"/*.log
  fi

  # Iterate over subdirectories under the current as directory
  for subdir in "$src_dir"*/; do
    # Get the subdirectory name from the path
    subdir_name=$(basename "$subdir")
    echo "in sub directory $subdir_name and main directory $src_dir_name"

    # Construct the source and destination URLs
    SRC_URL="$SRC_BASE_DIR/$src_dir_name/$subdir_name/"
    DEST_URL="$DEST_BASE_BLOB_URL/$src_dir_name/$subdir_name/"

    # Sync the directories with AzCopy
    azcopy sync "$SRC_URL" "$DEST_URL" --recursive --log-level=ERROR

echo "Source:"$SRC_URL
echo "Destination:" $DEST_URL
#echo $DEST_BASE_BLOB_URL

  done
done

# Log out from AzCopy
azcopy logout

SCRIPT_END=$(date '+%Y-%m-%d %H:%M:%S')
SCRIPT_DURATION=$(( $(date -d "$SCRIPT_END" '+%s') - $(date -d "$SCRIPT_START" '+%s') ))
SCRIPT_DURATION_MINUTES=$(( $SCRIPT_DURATION / 60 ))

echo "Script duration: $SCRIPT_DURATION_MINUTES minutes"


# Send email with start and end times, and duration
#echo "Subject: AzCopy Sync Script Completed" | cat - <(echo -e "From: user@example.com\nTo: $EMAIL_RECIPIENT\n") <(echo -e "AzCopy Sync Script completed.\nStart time: $SCRIPT_START\nEnd time: $SCRIPT_END\nDuration: $SCRIPT_DURATION seconds ($SCRIPT_DURATION_MINUTES minutes).") | sendmail -t

Any ideas how to resolve this authentication issue when the active session is closed?


Solution

  • Cannot perform sync due to error: Login Credentials missing. No SAS token or OAuth token is present and the resource is not public (Source: /shared2/attachments/as*/*/)

    To resolve this error, append a SAS token to the blob URL as shown below. When using a SAS token, there is no need to run azcopy login for authentication.

    Go to the storage account > Containers and generate a SAS token.


    Select the required permissions, click Generate SAS token, and copy the token for later use.
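
    If you prefer the command line, the same container SAS can also be generated with the Azure CLI. A minimal sketch, assuming the storage account and container names from the question and that you have rights to read the account key (the expiry value is illustrative):

    az storage container generate-sas \
        --account-name stgontadapreprodgennlp \
        --name fileattachments \
        --permissions racwdl \
        --expiry 2023-05-03T18:53Z \
        --auth-mode key \
        --output tsv

    This prints just the token (without a leading ?), ready to append to the container URL.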


    Now add the token to the blob URL in your script as below.

    DEST_BASE_BLOB_URL="https://<storage-account-name>.blob.core.windows.net/<container-name>?sp=racwdl&st=2023-05-03T10:53:27Z&se=2023-05-03T18:53:27Z&sv=2022-11-02&sig=<signature>"
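
    One caveat for the script in the question: it builds DEST_URL by appending /$src_dir_name/$subdir_name/ to the base URL, and the SAS query string must come after the path, not before it. Also note the quotes around the URL, since an unquoted & would otherwise be treated by the shell as a background operator. A minimal sketch of how that part could be restructured, keeping the SAS token in its own variable (the token value here is a placeholder):

    # Container URL and SAS token kept separate so the path can go in between
    DEST_BASE_BLOB_URL="https://<storage-account-name>.blob.core.windows.net/<container-name>"
    SAS_TOKEN="sp=racwdl&st=2023-05-03T10:53:27Z&se=2023-05-03T18:53:27Z&sv=2022-11-02&sig=<signature>"

    # Inside the loop: path segments first, then the SAS query string
    DEST_URL="$DEST_BASE_BLOB_URL/$src_dir_name/$subdir_name/?$SAS_TOKEN"
    azcopy sync "$SRC_URL" "$DEST_URL" --recursive --log-level=ERROR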