I am trying to send an alert if the failureThreshold exceeds 1% over the last 15 minutes. The following is the Splunk query that returns the failureThreshold percentage:
index=*apiGateway* Consumer=* ServiceName=creditcard.* host=*gateway* HTTPStatus=* earliest=-15m@s
| stats count as Total, count(eval(HTTPStatus > 499)) as Failure
| eval failureThreshold=Failure*100/Total
| table failureThreshold
From the Alert window, I am running this query every 5 minutes and have chosen a custom trigger condition. I need suggestions on how to check the condition failureThreshold > 1.
I've found it easier and more reliable to put the alert threshold in the search and have the alert trigger when the number of results is not zero.
index=*apiGateway* Consumer=* ServiceName=creditcard.* host=*gateway* HTTPStatus=* earliest=-15m@s
| stats count as Total, count(eval(HTTPStatus > 499)) as Failure
| eval failureThreshold=Failure*100/Total
| where failureThreshold > 1
| table failureThreshold
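With this version of the search, configure the alert to trigger when the number of results is greater than 0. If you would rather keep your original search and use the custom trigger condition you mentioned, the custom condition box accepts a secondary search that runs over your query's results, so something along these lines should work (a sketch based on how Splunk evaluates custom trigger conditions):

search failureThreshold > 1

One edge case worth knowing: if no events match in the 15-minute window, Total is 0 and the division produces a null failureThreshold, so neither the where clause nor the custom condition will fire a false alert on an empty window.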