I have some code which pulls a Kusto query out of Azure Monitor, and I need to upload the data to a blob storage account for long-term retention.
I can pull out the data and display it in the Azure Automation screen when I run it through the test pane, but it doesn't upload to Blob storage.
I think the error is here:
$SearchResult
$StorageAccountName = Get-AutomationVariable -Name "AccessKey"
$StorageAccountKey = Get-AutomationVariable -Name "StorageAccName"
foreach ($sr in $SearchResult){
    $ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName `
        -StorageAccountKey $StorageAccountKey
    $ContainerName = "Data"
    New-AzureStorageContainer -Name $ContainerName -Context $ctx -Permission Blob
    $BlobName = "$sr"
    Set-AzureStorageBlobContent -Container $ContainerName -Blob $BlobName `
        -Context $ctx
}
The full script is below:
https://pastebin.com/embed_iframe/RyLJZVKW
Basically it authenticates using some stored variables and then runs a query which returns the results below (everything up to that point works), but then I'd like to upload the data to Blob storage.
An example of the output can be found at:
https://pastebin.com/embed_iframe/fEF6NsnK
If there's a better way of getting a Kusto query's results stored straight to blob storage, I'd be happy to consider that. Thanks everyone :)
Your Set-AzureStorageBlobContent call seems to be missing the -File parameter. You probably get a message complaining about this in the job Error stream.
Assuming that you want to send the data stored in the $sr variable, something like this should work (not tested):
$tempFile = New-TemporaryFile
$sr | Out-File -FilePath $tempFile.FullName
Set-AzureStorageBlobContent ... -File $tempFile.FullName
Remove-Item $tempFile
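Filled in with the container, blob name, and context variables from the question's loop, the upload step would look like this (still an untested sketch):

```powershell
# Write the query result to a temporary file, upload it, then clean up
$tempFile = New-TemporaryFile
$sr | Out-File -FilePath $tempFile.FullName
Set-AzureStorageBlobContent -Container $ContainerName -Blob $BlobName `
    -File $tempFile.FullName -Context $ctx
Remove-Item $tempFile
```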
Script:
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName "RD-RDIPCloudMigration-AppResources-devtest" -AccountName "rdipstoragegen2").Value[0]
$ctx = New-AzStorageContext -StorageAccountName "rdipstoragegen2" -StorageAccountKey $storageAccountKey
Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem "rdipdata" -Path "eval/raw/rdipclinicaltrial/data/warehouse/integrated/clntrl_reference_use.db" | export-csv "test.csv" -NoTypeInformation
With this I can only get details one level deep; to get the details of the files inside a folder I have to run the script again and again at each level.
Kindly help me with a single script that returns all the details, including the files inside the folders.
There is a -Recurse switch within the Get-AzDataLakeGen2ChildItem cmdlet that:
Indicates if will recursively get the Child Item. The default is false.
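Applied to the script above (an untested sketch reusing the same account, file system, and path), the whole tree can be listed in one call:

```powershell
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName "RD-RDIPCloudMigration-AppResources-devtest" -AccountName "rdipstoragegen2").Value[0]
$ctx = New-AzStorageContext -StorageAccountName "rdipstoragegen2" -StorageAccountKey $storageAccountKey
# -Recurse walks every sub-directory, so files inside folders are included
Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem "rdipdata" -Path "eval/raw/rdipclinicaltrial/data/warehouse/integrated/clntrl_reference_use.db" -Recurse |
    Export-Csv "test.csv" -NoTypeInformation
```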
The code below works when run manually, but what I'm trying to do is find a way to run it in an automated way through a job, log in to my Azure account, and also not expose the account key in the code.
Any advice?
Connect-AzAccount
Select-AzSubscription -SubscriptionId <subId>
$ctx = New-AzStorageContext -StorageAccountName <Accounttest> -UseConnectedAccount
# or, using the account key:
# $ctx = New-AzStorageContext -StorageAccountName <Accounttest> -StorageAccountKey <accountKey>
$filesystemName = "DataFiles"
New-AzStorageContainer -Context $ctx -Name $filesystemName
# Create a folder (directory)
$dirname = "my-directory/"
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Directory
# Upload a file to the directory
$localSrcFile = "upload2.txt"
$destPath = $dirname + (Get-Item $localSrcFile).Name
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $destPath -Source $localSrcFile -Force
I think it would be better to put your PowerShell script in an Azure DevOps pipeline (Azure Pipelines, for instance). A service principal would be needed: configure a service connection in Azure DevOps, and make the service principal a Contributor on the resource group where the Data Lake Storage account is located.
See link: Manage service connections
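As a rough sketch (all names here are hypothetical), the pipeline could run your script with an AzurePowerShell task bound to the service connection, so the job authenticates as the service principal and no account key appears in the code:

```yaml
# azure-pipelines.yml -- names are placeholders
trigger: none
schedules:
  - cron: "0 2 * * *"        # run nightly at 02:00 UTC
    displayName: Nightly storage job
    branches:
      include: [ main ]
    always: true

pool:
  vmImage: windows-latest

steps:
  - task: AzurePowerShell@5
    inputs:
      azureSubscription: 'my-service-connection'  # name of the Azure DevOps service connection
      ScriptType: FilePath
      ScriptPath: scripts/datalake-upload.ps1
      azurePowerShellVersion: LatestVersion
```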
I am trying to convert a .CSV file to .ZIP using PowerShell and then store the ZIP file in Azure Blob storage.
Invoke-Sqlcmd -ServerInstance $SvrName -Database $dataname -Username $UserName -Password $Password -Query $getTABLEdetails | Export-CSV $filename1 -Append -NoTypeInformation
With the above command I am storing the output in a .CSV file. Now I want to convert the .CSV file to a .ZIP file and store the ZIP file in Azure Blob storage.
$context = New-AzureStorageContext -StorageAccountName 'Accountname' -StorageAccountKey 'storagekey'
Set-AzureStorageBlobContent -Container zipfiles -File $filename1 -Blob $filename1 -Context $context -Force
With the above command I can store the .CSV file, but I want to store the file in .ZIP format.
Is there any way I can achieve this? Please help.
Use Compress-Archive to place your CSV file in a ZIP archive file before uploading to Blob storage.
To extract the file name from the CSV path, I use -LeafBase from Split-Path (available in PowerShell 6 and later; on Windows PowerShell 5.1 you can use [System.IO.Path]::GetFileNameWithoutExtension($csvFile) instead), then format this name into a ZIP file path using the format operator -f. You could also just concatenate with (Split-Path -Path $csvFile -LeafBase) + '.zip'.
$storageAccountName = 'Accountname'
$storageAccountKey = 'storagekey'
$containerName = 'zipfiles'
$csvFile = 'data.csv'
# Format into data.zip
$zipFile = '{0}.zip' -f (Split-Path -Path $csvFile -LeafBase)
# Create archive file
# Use -Update to overwrite if necessary
Compress-Archive -Path $csvFile -DestinationPath $zipFile -Update
# Get current storage account context
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
# Upload .zip file to Blob storage container
# Use -Force to skip prompts
Set-AzureStorageBlobContent -Container $containerName -Blob $zipFile -File $zipFile -Context $context -Force
To use the newer Az modules, replace New-AzureStorageContext with New-AzStorageContext and Set-AzureStorageBlobContent with Set-AzStorageBlobContent.
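With those substitutions, the upload portion of the script above becomes:

```powershell
# Same logic, using the Az module cmdlets
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
Set-AzStorageBlobContent -Container $containerName -Blob $zipFile -File $zipFile -Context $context -Force
```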
I am trying to upload a file from my local machine to an Azure blob using PowerShell.
Code
$date = Get-Date -Format "dd-MM-yyyy"
$Ctx = New-AzStorageContext -StorageAccountName abc -StorageAccountKey 'xxxxxx=='
Set-AzStorageBlobContent -Container jmeter -Blob "Result/Load1/$date/" -File "C:\Result.jtl" -Context $Ctx -Force
The hierarchy is created successfully inside the container jmeter,
e.g.: Result/Load1/13-03-2020
but the file is uploaded with no name.
As mentioned in the comment, the -Blob value must end with the file name rather than a trailing /, e.g. -Blob "Load1/$date/Result.jtl".
Set-AzStorageBlobContent -Container test1 -Blob "Load1/$date/Result.jtl" -File "C:\Users\joyw\Desktop\Result.jtl" -Context $Ctx -Force
We are trying to grant read/write access to many folders in our Azure Data Lake Gen2 containers. Although we can do this through the UI, it's quite tedious and has to be repeated for every environment. Has anyone found a better way, using PowerShell, to automate or at least parameterize granting access to Azure Data Lake Gen2 containers and avoid granting access manually?
Unfortunately I couldn't get this to work using the following link (or other documentation), as it's for Gen1, but it's very similar to what I need to do for Gen2:
https://www.sqlchick.com/entries/2018/3/17/assigning-data-permissions-for-azure-data-lake-store-part-3
According to my test, we can use PowerShell to manage Azure Data Lake Gen2 permissions. For more details, please refer to the documentation.
Install the required modules:
Install-Module PowerShellGet -Repository PSGallery -Force
Install-Module Az.Storage -Repository PSGallery -RequiredVersion 1.9.1-preview -AllowPrerelease -AllowClobber -Force
Besides, please note that installing the module requires:
.NET Framework 4.7.2 or greater
PowerShell 5.1 or higher
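A quick way to check both prerequisites locally (my own addition, not part of the original answer):

```powershell
# PowerShell version -- needs to be 5.1 or higher
$PSVersionTable.PSVersion
# .NET Framework release key -- 461808 or higher means 4.7.2+
Get-ItemPropertyValue 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full' -Name Release
```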
Script
Connect-AzAccount
$groupName=""
$accountName=""
$account= Get-AzStorageAccount -ResourceGroupName $groupName -Name $accountName
$ctx = $account.Context
$filesystemName = "test"
$dirname="template/"
$Id = "<the Object ID of user, group or service principal>"
$dir=Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname
$acl = New-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityId $id -Permission "rw-" -InputObject $dir.ACL
Update-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Acl $acl
$dir=Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname
$dir.ACL
Thanks Jim Xu for providing the script above. I'm just complementing the code with the following items:
Get all folders from the container
Assign ACL for all folders
Propagate ACL to all subfolders
$groupName="resource group name"
$accountName="storage account name"
$account= Get-AzStorageAccount -ResourceGroupName $groupName -Name $accountName
$ctx = $account.Context
$filesystemName = "container name"
$Id = (Get-AzADGroup -DisplayName '<type user / group name here>').Id
$items = Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem $filesystemName
foreach ($item in $items) {
$dir = Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path "$($item.Path)/"
$acl = New-AzDataLakeGen2ItemAclObject -AccessControlType group -EntityId $id -Permission "rwx" -InputObject $dir.ACL -DefaultScope
# Update ACL on blob item
Update-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path "$($item.Path)/" -Acl $acl
# Propagate ACL to child blob items
Set-AzDataLakeGen2AclRecursive -Context $ctx -FileSystem $filesystemName -Path "$($item.Path)/" -Acl $acl
}