I am starting a pipeline using an external webhook that invokes the Runs endpoint.
The webhook is providing a JSON string which I want to parse and use inside the pipeline:
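A rough sketch of the Runs call the webhook makes (the organization, project, pipeline id, the variable name payload, and the payload value are placeholders, not the real ones):

POST https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=7.1

{
  "variables": {
    "payload": {
      "isSecret": false,
      "value": "{\"key1\": \"value1\", \"key2\": \"value2\"}"
    }
  }
}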
I am collecting the JSON string in a pipeline variable that can be overridden at runtime, and I intend to use that variable inside a Bash@3 task:
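Roughly like this (reconstructed for illustration; the variable name payload and the exact script lines are assumptions):

- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      # $(payload) is Azure DevOps macro syntax: it is replaced textually
      # before bash runs, so bash then re-parses the pasted JSON itself
      jsonPayload=$(payload)
      echo "Payload: ${jsonPayload}"
      echo "${jsonPayload}" | jq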
If I try to parse and use the variable value with jq, I get errors like "Invalid numeric literal at line 1, column 5", which makes me think there must be a problem with the JSON syntax.
And indeed, if I just log the value, it appears that all the double quotes have been stripped away.
Is this really the case? Why?
How can I ensure that the string enters the pipeline with all its double quotes intact?
Consider mapping the value to an environment variable in the task instead of assigning it to a local script variable. An environment variable is handed to the process as-is rather than being pasted into the script text, which avoids escaping issues with special characters such as quotes.
Example:
trigger: none
pool:
  vmImage: 'ubuntu-latest'
steps:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo "Payload: ${jsonPayload}"
      echo "${jsonPayload}" | jq
  displayName: Echo payload
  env:
    jsonPayload: $(payload) # <--------------------- set environment variable
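If individual fields are needed later in the script, they can be extracted from the environment variable with jq, for example (the field names here match the sample payload used below):

name=$(echo "${jsonPayload}" | jq -r '.name')
key1=$(echo "${jsonPayload}" | jq -r '.value | fromjson | .key1')
echo "name=${name}, key1=${key1}"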
Running the pipeline with the following JSON string, which contains both single quotes (') and double quotes ("):
{"name": "Patrick O'Brien", "age": 30, "value": "{\"key1\": \"value1\", \"key2\": \"value2\"}"}
Output:
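Assuming the mapping passes the value through intact, the script should print something along these lines (derived from the script and sample payload above, not a captured pipeline log):

Payload: {"name": "Patrick O'Brien", "age": 30, "value": "{\"key1\": \"value1\", \"key2\": \"value2\"}"}
{
  "name": "Patrick O'Brien",
  "age": 30,
  "value": "{\"key1\": \"value1\", \"key2\": \"value2\"}"
}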