I am trying to upload a file from my local machine to an Azure blob using PowerShell.
Code
$date = Get-Date -Format "dd-MM-yyyy"
$Ctx = New-AzStorageContext -StorageAccountName abc -StorageAccountKey 'xxxxxx=='
Set-AzStorageBlobContent -Container jmeter -Blob "Result/Load1/$date/" -File "C:\Result.jtl" -Context $Ctx -Force
The hierarchy was created successfully inside the container jmeter,
e.g. Result/Load1/13-03-2020,
but the file was uploaded with no name.
As mentioned in the comment, the -Blob parameter must include the blob's file name rather than end in a slash, so it should be -Blob "Load1/$date/Result.jtl". With a trailing slash, the blob name at the leaf is empty, which is why the file appeared to have no name.
Set-AzStorageBlobContent -Container test1 -Blob "Load1/$date/Result.jtl" -File "C:\Users\joyw\Desktop\Result.jtl" -Context $Ctx -Force
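To confirm the blob now has the expected name, you can list the blobs under that prefix with the same context:
# List blobs under the virtual directory to verify the upload
Get-AzStorageBlob -Container test1 -Context $Ctx -Prefix "Load1/$date/"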
Related
Script:
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName "RD-RDIPCloudMigration-AppResources-devtest" -AccountName "rdipstoragegen2").Value[0]
$ctx = New-AzStorageContext -StorageAccountName "rdipstoragegen2" -StorageAccountKey $storageAccountKey
Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem "rdipdata" -Path "eval/raw/rdipclinicaltrial/data/warehouse/integrated/clntrl_reference_use.db" | export-csv "test.csv" -NoTypeInformation
By using this I am only able to get details at one level; I have to run the script again and again at each level to get the details of the files inside each folder.
Kindly help me with one script that gets all the details, including the files inside the folders.
There is a -Recurse switch within the Get-AzDataLakeGen2ChildItem cmdlet that:
Indicates if will recursively get the Child Item. The default is false.
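Applied to your script, keeping the same account and path:
# -Recurse walks every sub-directory, so one run lists files at all levels
Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem "rdipdata" -Path "eval/raw/rdipclinicaltrial/data/warehouse/integrated/clntrl_reference_use.db" -Recurse | Export-Csv "test.csv" -NoTypeInformation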
The code below works when run manually, but I'm trying to find a way to run it in an automated way through a job, logging in to my Azure account without exposing the account key in the code.
Any advice?
Connect-AzAccount
Select-AzSubscription -SubscriptionId <subId>
# Context from the signed-in account (no key in the code)
$ctx = New-AzStorageContext -StorageAccountName <Accounttest> -UseConnectedAccount
# Alternatively, a context built from the account key
$ctx = New-AzStorageContext -StorageAccountName <Accounttest> -StorageAccountKey <accountKey>
$filesystemName = "DataFiles"
New-AzStorageContainer -Context $ctx -Name $filesystemName
# Create a folder (directory)
$dirname = "my-directory/"
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Directory
# Upload a file to the directory
$localSrcFile = "upload2.txt"
$destPath = $dirname + (Get-Item $localSrcFile).Name
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $destPath -Source $localSrcFile -Force
I think it would be better to put your PowerShell script in a DevOps pipeline, in Azure Pipelines for instance. A service principal would be needed: configure a service connection in Azure DevOps, and the service principal should be Contributor on the resource group where the Data Lake Storage is located.
See link: Manage service connections
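When the script runs in an Azure PowerShell pipeline task, the service connection signs in for you, so Connect-AzAccount and the account key can be dropped entirely. A minimal sketch of what the task's script body could look like, assuming the service connection's principal has access to the storage account:
# The Azure PowerShell task authenticates via the service connection,
# so build the context from the connected account instead of a key
$ctx = New-AzStorageContext -StorageAccountName <Accounttest> -UseConnectedAccount
$filesystemName = "DataFiles"
$localSrcFile = "upload2.txt"
$destPath = "my-directory/" + (Get-Item $localSrcFile).Name
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $destPath -Source $localSrcFile -Force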
I am trying to convert a .CSV file to a .ZIP file using PowerShell and then store the ZIP file in Azure blob storage.
Invoke-Sqlcmd -ServerInstance $SvrName -Database $dataname -Username $UserName -Password $Password -Query $getTABLEdetails | Export-CSV $filename1 -Append -NoTypeInformation
With the above command I store the query output in a .CSV file. Now I want to convert the .CSV file to a .ZIP file and store the ZIP file in Azure blob storage.
$context = New-AzureStorageContext -StorageAccountName 'Accountname' -StorageAccountKey 'storagekey'
Set-AzureStorageBlobContent -Container zipfiles -File $filename1 -Blob $filename1 -Context $context -Force
With the above commands I can upload the .CSV file, but I want to store the file in .ZIP format.
Is there any way I can achieve this?
Please help
Use Compress-Archive to place your CSV file in a ZIP archive file before uploading to Blob storage.
To extract the base filename from the CSV path, I use -LeafBase from Split-Path (available in PowerShell 6 and later, not Windows PowerShell 5.1), then format it into a ZIP file path using the format operator -f. You could also just concatenate: (Split-Path -Path $csvFile -LeafBase) + '.zip'.
$storageAccountName = 'Accountname'
$storageAccountKey = 'storagekey'
$containerName = 'zipfiles'
$csvFile = 'data.csv'
# Format into data.zip
$zipFile = '{0}.zip' -f (Split-Path -Path $csvFile -LeafBase)
# Create archive file
# Use -Update to overwrite if necessary
Compress-Archive -Path $csvFile -DestinationPath $zipFile -Update
# Get current storage account context
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
# Upload .zip file to Blob storage container
# Use -Force to skip prompts
Set-AzureStorageBlobContent -Container $containerName -Blob $zipFile -File $zipFile -Context $context -Force
To use the newer Az modules, we can replace New-AzureStorageContext with New-AzStorageContext and Set-AzureStorageBlobContent with Set-AzStorageBlobContent.
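With those substitutions, the upload step becomes:
# Same upload using the newer Az module cmdlets
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
Set-AzStorageBlobContent -Container $containerName -Blob $zipFile -File $zipFile -Context $context -Force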
I have some code which pulls out a Kusto query from Azure Monitor and I need to upload the data to a blob storage account for long term retention.
I can pull the data out and display it in the Azure Automation screen when I run it through the test pane, but it doesn't upload to blob storage.
I think the error is here
$SearchResult
$StorageAccountName = Get-AutomationVariable -Name "AccessKey"
$StorageAccountKey = Get-AutomationVariable -Name "StorageAccName"
foreach ($sr in $SearchResult){
$ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName `
-StorageAccountKey $StorageAccountKey
$ContainerName = "Data"
New-AzureStorageContainer -Name $ContainerName -Context $ctx -Permission Blob
$BlobName = "$sr"
Set-AzureStorageBlobContent -Container $ContainerName -Blob $BlobName `
-Context $ctx
}
The full script is below
https://pastebin.com/embed_iframe/RyLJZVKW
Basically it authenticates using some stored variables and then runs a query which returns the results below (up to that part works), but then I'd like to upload the data to Blob.
The output example can be found at:
https://pastebin.com/embed_iframe/fEF6NsnK
If there's a better way of getting a Kusto query stored straight to blob storage I'd be happy to consider that... thanks everyone :)
Your Set-AzureStorageBlobContent call seems to be missing the -File parameter. You probably get a message complaining about this in the job Error stream.
Assuming that you want to send the data stored in the $sr variable, something like this should work (not tested):
# Write the query result to a temporary file
$tempFile = New-TemporaryFile
$sr | Out-File -FilePath $tempFile.FullName
# Upload the temp file as the blob content, then clean up
Set-AzureStorageBlobContent ... -File $tempFile.FullName
Remove-Item $tempFile
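Filled in with the container, blob name, and context from your own snippet, the loop body might look like this (also untested):
foreach ($sr in $SearchResult) {
    $tempFile = New-TemporaryFile
    $sr | Out-File -FilePath $tempFile.FullName
    Set-AzureStorageBlobContent -Container $ContainerName -Blob "$sr" -File $tempFile.FullName -Context $ctx
    Remove-Item $tempFile
}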
I have a storage account where I host a static website. I want to create a task that transfers my build files into that storage.
$context = New-AzureStorageContext -StorageAccountName $env:prSourceBranchName -StorageAccountKey "e4Nt0i1pcsYWnzLKw4PRdwu+************/qMj7fXyJK6lS4YTlPCOdbFlEG2LN9g2i5/yQ=="
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build ls -File -Recurse | Set-AzureStorageBlobContent -Container $env:prSourceBranchName -Context $context
When I run the code, I can see the files that need to be uploaded, but when I check my $web blob container (created by the static website feature) it's empty.
Check out the second example here: https://learn.microsoft.com/en-us/powershell/module/azure.storage/set-azurestorageblobcontent?view=azurermps-6.13.0
Can someone explain why nothing is happening?
I want to do this > https://learn.microsoft.com/en-us/cli/azure/storage/blob?view=azure-cli-latest#az-storage-blob-upload-batch but through AzureRM.
Remove ls from the last line of your code, the one used to upload to blob storage:
Change:
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build ls -File -Recurse | Set-AzureStorageBlobContent -Container $env:prSourceBranchName -Context $context
To:
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build -File -Recurse | Set-AzureStorageBlobContent -Container $env:prSourceBranchName -Context $context
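After the upload, you can list the container with the same context to confirm the blobs landed:
# List the uploaded blobs to verify the transfer
Get-AzureStorageBlob -Container $env:prSourceBranchName -Context $context | Select-Object Name, Length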