Tags: bash, shell, error-handling

Correct way to use a function with this "write to error log" command?


Let's say I want to print output to an error log file and the console. I assume I have 2 options:

  1. Include error logging inside the function

    copy_to_s3() {
        local INPUT_FILE_NAME=$1
        local BUCKET_NAME=$2
        if aws s3 cp "${INPUT_FILE_NAME}" "s3://${BUCKET_NAME}" >error.log 2>&1; then
            echo "Successfully copied the input file ${INPUT_FILE} to s3://${BUCKET_NAME}"
        else
            error=$(cat "error.log")
            # EMAIL this error to the admin
            echo "Something went wrong when copying the input file ${INPUT_FILE} to s3://${BUCKET_NAME}"
            exit 1
        fi
    
        rm -rf "${INPUT_FILE_NAME}"
    }
    
    copy_to_s3 "test.tar.gz" "test-s3-bucket"
    
    
  2. Include error logging when calling the function

    copy_to_s3() {
        local INPUT_FILE_NAME=$1
        local BUCKET_NAME=$2
        if aws s3 cp "${INPUT_FILE_NAME}" "s3://${BUCKET_NAME}"; then
            echo "Successfully copied the input file ${INPUT_FILE} to s3://${BUCKET_NAME}"
        else
            echo "Something went wrong when copying the input file ${INPUT_FILE} to s3://${BUCKET_NAME}"
            exit 1
        fi
    
        rm -rf "${INPUT_FILE_NAME}"
    }
    
    copy_to_s3 "test.tar.gz" "test-s3-bucket" >error.log 2>&1
    
    

Two questions:

  1. Which of these two approaches is the correct way to capture the error output?
  2. Will the logging behave the same when the script is run from crontab?


Solution

  • About the error checking:

    if aws s3 cp "${INPUT_FILE_NAME}" "s3://${BUCKET_NAME}" >error.log 2>&1; then
       ...    
    fi
    

    I don't recommend using a temporary file for this, as Bash can just capture the output of your command and store it in a variable for later use.

    For example:

    if Captured_Output="$(my_command 2>&1)"; then
       echo "Success"
    else
       echo "Failure = $Captured_Output"
    fi
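
    Applied to the function from the question, that pattern might look something like the sketch below (untested against your setup; the mixed-case variable names anticipate the naming advice further down, and the email step stays a comment as in your original):

    copy_to_s3() {
        local Input_File_Name=$1
        local Bucket_Name=$2
        local Aws_Output

        # Capture stdout and stderr of the aws command in a variable
        # instead of writing them to a temporary error.log file.
        # Declaring Aws_Output separately keeps the "if" testing the
        # aws exit status rather than the exit status of "local".
        if Aws_Output="$(aws s3 cp "${Input_File_Name}" "s3://${Bucket_Name}" 2>&1)"; then
            echo "Successfully copied the input file ${Input_File_Name} to s3://${Bucket_Name}"
        else
            # EMAIL ${Aws_Output} to the admin
            echo "Something went wrong when copying the input file ${Input_File_Name} to s3://${Bucket_Name}: ${Aws_Output}"
            exit 1
        fi

        rm -f "${Input_File_Name}"
    }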
    

  • About the crontab:

    Why are you setting this PATH in your crontab?

    PATH=/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/home/ec2-user/.local/bin:/home/ec2-user/bin
    

    You are calling your script directly with its full path ("/home/ec2-user/test.sh"), so cron will not be searching your PATH to find it. The only reason to keep that PATH is if commands inside your script rely on it, and in that case I'd recommend setting the PATH within the script itself.
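
    If the script does depend on those directories (for example, if the aws command lives under /home/ec2-user/.local/bin), a minimal sketch is to move that same PATH line from the crontab to the top of the script:

    #!/bin/bash
    # Setting PATH here instead of in the crontab keeps the script's
    # behaviour identical whether cron or a human runs it.
    PATH=/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/home/ec2-user/.local/bin:/home/ec2-user/bin
    export PATH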

    As for the logging happening from crontab, it should behave the same as when the script is run from the command line, i.e. it should work fine.

    The only potential difference when run from a crontab is that the shell runs "non-interactively", and some commands behave differently when not attached to an interactive terminal -- for example, by not outputting any text at all. I'm not sure whether your aws command is affected; try it and see, but most likely it will behave exactly as it does on the command line.
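
    One rough way to preview that behaviour before waiting on cron is to run the script with a stripped-down environment (a sketch only; it reuses the script path from your crontab):

    # "env -i" clears the environment, which loosely approximates cron's
    # sparse, non-interactive setting.
    env -i HOME="$HOME" /bin/sh -c '/home/ec2-user/test.sh'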

  • Also, I highly recommend not using ALL_UPPERCASE variable names:

    Bash and other shells reserve all-uppercase names for special internal variables. Sooner or later you will overwrite one of them, and then either your assignment will not take effect at all (read-only Bash variables), or it will change your Bash configuration (internal settings variables), or it will alter what you stored in some way (some internal variables restrict what they can hold, such as only accepting numbers).

    It is much safer to use $TitleCase, $Title_Case, $camelCase, $lowercase, $lower_case, or some other mix of upper and lowercase letters for your variable names.
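
    A quick illustration of the kind of collision this avoids (assuming Bash; the values are made up for the example):

    # PATH is special: overwriting it breaks command lookup for the
    # rest of the script.
    PATH="oops"
    ls    # likely fails now: "ls: command not found"

    # UID is read-only in Bash, so this assignment is simply rejected.
    UID=12345    # bash: UID: readonly variable

    # Mixed-case names collide with nothing.
    Input_File_Name="test.tar.gz"
    Bucket_Name="test-s3-bucket"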