My preferred location to review database audit logs is in a database, an Azure database to be specific. To be even more specific, I would like to review the audit logs for my Azure database inside that same Azure database. How do I automate this?
In Azure SQL Database, under "Auditing", set the destination to Storage. Audit logs will accumulate in your storage account. Next, in the Azure portal, go to the storage account and open the container that holds the audit file(s). Under "Manage ACL", choose "Add principal" > Users and add your user, grant permissions (e.g., Read, Write, Execute), and click "Save" at the bottom.
SELECT *
FROM sys.fn_get_audit_file('https://{Azure storage account}.blob.core.windows.net/{container}/{...path to file...}/{audit file name}.xel',null, null);
The default Azure database auditing appears to create audit files with random names. To automate the load, I will likely need an Azure Function to iterate through the blob directory, find the file names, and hand them to the query above, which could then load each audit file into the database.
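Before building that function, it may be worth a test: the sys.fn_get_audit_file documentation indicates the first parameter can be a storage path (a folder) rather than a single file, in which case it reads every .xel file under that path, random names and all. A minimal sketch, with the URL segments and the staging table name as placeholders:

-- Read every audit file under one folder (e.g., one day's worth) instead of naming a single .xel file.
-- All URL segments and the table name are placeholders/assumptions.
SELECT *
INTO Audit_Staging
FROM sys.fn_get_audit_file(
    'https://{Azure storage account}.blob.core.windows.net/{container}/{server}/{database}/SqlDbAuditing_Audit/{yyyy-mm-dd}/',
    DEFAULT, DEFAULT);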
The Microsoft database audit log reference does not list a database as a destination option: Set up Auditing for Azure SQL Database and Azure Synapse Analytics
This SO question suggests a way to import blob storage into a SQL database: Automation to Read SQL Database Audit Logs in Azure. Is this the way?
Azure Active Directory audit logs have an API: Prerequisites to access the Azure Active Directory reporting API
Microsoft has a special SQL function for loading audit logs: sys.fn_xe_file_target_read_file. Getting warmer? Import Extended Events XEL File?, EXTENDED EVENTS DATA, Microsoft function docs
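A minimal sketch of calling it, assuming it accepts the same blob URL used above (the path placeholders carry over); each row returns the event name and its raw payload in the event_data column:

-- Read raw extended-event records; object_name is the event name, event_data its XML payload.
-- The URL placeholders are the same assumptions as in the earlier fn_get_audit_file query.
SELECT object_name,
       CONVERT(xml, event_data) AS event_data,
       file_name,
       file_offset
FROM sys.fn_xe_file_target_read_file(
    'https://{Azure storage account}.blob.core.windows.net/{container}/{...path to file...}/{audit file name}.xel',
    NULL, NULL, NULL);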
Connecting SQL with Azure: Transact-SQL code that uses Azure Storage container
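As I understand that article, the container is exposed to T-SQL through a database scoped credential named after the container URL. A sketch under that assumption; the SAS token, password, and URL are placeholders, and a database master key is required if one does not already exist:

-- One-time setup so the database can authenticate to the storage container.
-- The credential name must be the container URL; the SECRET is a SAS token without the leading '?'.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '{strong password}';  -- skip if the database already has one
GO
CREATE DATABASE SCOPED CREDENTIAL [https://{Azure storage account}.blob.core.windows.net/{container}]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET   = '{SAS token with Read and List permissions}';
GO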
Question: If these audit events are produced by the database, they must have some type of memory footprint. Why can't I access that memory and skip trying to load a file into the database?
Answer: While the database does produce the events, it seems they are stored in a buffer that does not/cannot write to a database table. (Perhaps writing to a database table would reduce database performance.) Therefore, I cannot readily access the audit logs produced by Azure before they are written to blob storage.
Targets for your Azure SQL Database event sessions
While it may be possible to re-create the Azure audit logging with session monitoring, it seems overly difficult. For example, the Azure audit logs are broken into daily files, which likely means starting/stopping the auditing session each day. The audit also captures specific events, and I would need to work out which ones to include or exclude. But these views appear to be where the data is stored in a tabular format (see the sketch after the link below).
New dynamic management views (DMVs)
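For reference, a minimal sketch of what the session-based approach might look like; the session name and the single event are illustrative stand-ins, not what Azure's auditing actually configures:

-- Create and start a database-scoped event session with an in-memory ring_buffer target.
CREATE EVENT SESSION audit_like_session ON DATABASE
ADD EVENT sqlserver.sql_batch_completed
ADD TARGET package0.ring_buffer;
GO
ALTER EVENT SESSION audit_like_session ON DATABASE STATE = START;
GO
-- Query the captured events through the database-scoped XE DMVs.
-- Note the target contents still come back as XML, not as a clean relational table.
SELECT s.name,
       CONVERT(xml, t.target_data) AS target_data
FROM sys.dm_xe_database_sessions AS s
JOIN sys.dm_xe_database_session_targets AS t
  ON s.address = t.event_session_address
WHERE s.name = 'audit_like_session';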
Learning about "Extended events": Extended events in Azure SQL Database
The Azure database auditing produces a .xel ("Extended Events") file. The file is binary and cannot easily be read without Microsoft tools (it is not something you open in Notepad). When the file is read, each event's payload comes back as XML, likely because each event has different fields, which makes the data non-relational and difficult to load into a database. Extended Events overview
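To see why this is awkward relationally, here is a sketch of shredding each event's XML payload with XQuery; which data elements exist (e.g. "statement") differs per event type, so those names are assumptions:

-- Shred each event's XML payload into columns; the data element names are illustrative.
;WITH xe AS (
    SELECT CONVERT(xml, event_data) AS x
    FROM sys.fn_xe_file_target_read_file(
        'https://{Azure storage account}.blob.core.windows.net/{container}/{...path to file...}/{audit file name}.xel',
        NULL, NULL, NULL)
)
SELECT x.value('(event/@name)[1]', 'nvarchar(128)') AS event_name,
       x.value('(event/@timestamp)[1]', 'datetime2') AS event_time,
       x.value('(event/data[@name="statement"]/value)[1]', 'nvarchar(max)') AS statement_text
FROM xe;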
Maybe Azure uses these events for auditing. SQL Server Event Class Reference
My preferred location to review database audit logs is in a database, an Azure database to be specific.
AFAIK there is no direct way to load an audit file into Azure SQL.
You can try this workaround: read the audit file with sys.fn_get_audit_file, create a view over it, and then create a table from that view using SELECT INTO.

--creating view
CREATE VIEW View1 AS
SELECT *
FROM sys.fn_get_audit_file('https://mystorage.blob.core.windows.net/sqldbauditlogs/ShiraServer/MayaDB/SqlDbAuditing_Audit/2017-07-14/10_45_22_173_1.xel', DEFAULT, DEFAULT);
GO

--Loading view into table
SELECT * INTO New_Table FROM View1;
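One note on this design: SELECT INTO creates New_Table and fails if it already exists, so for repeated loads you would create the table once and append to it afterwards (same names as above):

-- Append later audit reads to the table created above
INSERT INTO New_Table
SELECT * FROM View1;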