Came here looking for some help.
I have a Resource Graph Explorer query that normally generates a 900 KB .csv with about 5k lines of information.
I have embedded the query inside a PowerShell runbook that is executed in an Automation account, and the resulting .csv file is then sent as an email attachment. The thing is, I'm trying to export the file this way:
$result = Search-AzGraph -Query $query
if ($null -eq $result -or $null -eq $result.Data) {
    throw "No data returned from the query. Please check the query and try again."
}
$csvFilePath = "$env:TEMP\ResourcesInventory.csv"
try {
    $result.Data | Export-Csv -Path $csvFilePath -NoTypeInformation
    Start-Sleep -Seconds 30
    if (-Not (Test-Path $csvFilePath)) {
        throw "CSV file not found: $csvFilePath"
    }
}
catch {
    throw "Failed to export the CSV file: $_"
}
This way works; the only thing is that the generated .csv file is only 19 KB / 101 lines, and I haven't found a way to make sure the result is exported completely.
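The kind of check I've been trying is something like this (a rough sketch; it assumes $query is a plain tabular expression that a summarize can be appended to):
# Sketch only: compare the row count the service reports with what Export-Csv wrote.
$countResult = Search-AzGraph -Query "$query | summarize count()"
$expectedRows = $countResult.Data.count_                 # summarize count() yields a 'count_' column
$exportedRows = (Get-Content $csvFilePath).Count - 1     # subtract the CSV header line
Write-Output "Expected $expectedRows rows, exported $exportedRows rows."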
I have also tried to send the result to a storage account as follows:
# Execute the query
$result = Search-AzGraph -Query $query
# Convert result to CSV content
$csvContent = $result.Data | ConvertTo-Csv -NoTypeInformation
# Upload CSV content to Azure Storage Account
$StorageAccountName = "saresourceinventory"
$ContainerName = "resourceinventorycontainer"
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storagekey
$blobName = "ResourcesInventory.csv"
# This fails: Set-AzStorageBlobContent expects a local file path in -File, and no file named $blobName exists on disk
Set-AzStorageBlobContent -Container $ContainerName -Blob $blobName -BlobType Block -Context $StorageContext -File $blobName -Force
but it seems I need to have the file exported to disk first before it can be uploaded to the storage account.
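The only fallback I can think of is writing a temp file and uploading that (a rough sketch; assumes $csvContent already holds the ConvertTo-Csv output from above):
# Sketch: Set-AzStorageBlobContent only accepts a local path via -File,
# so persist the in-memory CSV to a temp file, upload it, then clean up.
$tempCsv = Join-Path $env:TEMP "ResourcesInventory.csv"
$csvContent | Set-Content -Path $tempCsv -Encoding UTF8
Set-AzStorageBlobContent -File $tempCsv -Container $ContainerName -Blob $blobName -Context $StorageContext -Force
Remove-Item -Path $tempCsv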
Do you guys have any other ideas to solve this?
Thanks for your help.
Yep, I already tried that. The thing is that when I export the file to a temporary path it is only 19 KB; I can't get it to produce a bigger file :(
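A quick way to see where it stops is to print how many records a single call returns (sketch; .Data is how the result object is exposed in my module version):
# Sketch: count the records one Search-AzGraph call actually returns.
$result = Search-AzGraph -Query $query
Write-Output "Records returned in one call: $($result.Data.Count)"
In my case it always prints 100, which matches the 101-line CSV (100 records plus the header).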
In my environment, when I used Search-AzGraph, the generated .csv file also contained only 101 lines: the header plus 100 records, because Search-AzGraph returns at most 100 records per request by default.
According to this MS-Document, you can use the below script, in which the Search-AzGraph command is used within a while loop to fetch data in batches of five records per request, utilizing the skipToken parameter to handle pagination.
Script:
$kqlQuery = "Resources | join kind=leftouter (ResourceContainers | where
type=='microsoft.resources/subscriptions' | project subscriptionName = name, subscriptionId) on
subscriptionId | where type =~ 'Microsoft.Compute/virtualMachines' | project VMResourceId = id,
subscriptionName, resourceGroup, name"
$batchSize = 5   # demo value; -First accepts up to 1000 records per request
$skipResult = 0
$kqlResult = @()   # collects the rows from every page
while ($true) {
    if ($skipResult -gt 0) {
        # Follow-up pages: pass the skip token returned by the previous call
        $graphResult = Search-AzGraph -Query $kqlQuery -First $batchSize -SkipToken $graphResult.SkipToken
    }
    else {
        $graphResult = Search-AzGraph -Query $kqlQuery -First $batchSize
    }
    $kqlResult += $graphResult.data
    if ($graphResult.data.Count -lt $batchSize) {
        break   # last page reached
    }
    $skipResult += $batchSize
}
$csvFilePath="$env:TEMP\sample.csv"
$kqlResult | Export-Csv -Path $csvFilePath -NoTypeInformation
$StorageAccountName = "venkat326"
$ContainerName = "test"
$blobName = "ResourcesInventory.csv"
$storagekey = "xzzzzzz"
# Get storage context and upload the csv file
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storagekey
Set-AzStorageBlobContent -File "$csvFilePath" -Container $ContainerName -Blob $blobName -Context $StorageContext -Force
Remove-Item -Path $csvFilePath
Output:
The above code executed and uploaded a CSV file of 392 lines to Azure Blob Storage.