We have multiple blobs in an Azure Storage container. When we use PowerShell to download the blobs (files), 8 out of the 9 files download, however the 9th one fails. There is absolutely nothing different about this file; the only thing I've noticed in the blob properties is that the "Content-MD5" is blank, whereas the other 8 have a value. I'm not sure what this is or whether it has anything to do with it, but I was hoping someone could shed some light on why this one file is not downloading.
Thanks in advance :)
Try the below code for downloading files from Azure Blob Storage:
function Get-DLLFile
{
    param(
        [Parameter(Mandatory=$true)] [string] $connectionString,
        [Parameter(Mandatory=$true)] [String[]] $blobsName,
        [Parameter(Mandatory=$true)] [string] $container,
        [Parameter(Mandatory=$true)] [string] $filePath
    )
    Try
    {
        foreach ($blobName in $blobsName)
        {
            # Skip the download if the file already exists locally
            $file = $filePath + $blobName
            $fileAvailable = Get-Item -Path $file -ErrorAction SilentlyContinue
            if($null -eq $fileAvailable)
            {
                # Build the storage context, make sure the local folder exists, then download the blob
                $ctx = New-AzureStorageContext -ConnectionString $connectionString
                New-Item -Path $filePath -ItemType Directory -Force | Out-Null
                Get-AzureStorageBlobContent -Blob $blobName -Container $container -Destination $filePath -Context $ctx -Force | Out-Null
            }
        }
    }
    Catch
    {
        $_.Exception.Message
    }
}
Get-DLLFile -blobsName "File1.csv","File2.json" -container "myContainer" -connectionString "$(BlobConnectionString)" -filePath "$(System.DefaultWorkingDirectory)/Download"
Hope this works; if not, please share the exception you are getting.
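If it's only that one blob that keeps failing, a minimal sketch (reusing $ctx, $container and $filePath from the function above, with a hypothetical blob name) to surface the underlying storage exception:
try
{
    # Download just the failing blob and stop on error so the catch block sees the real exception
    Get-AzureStorageBlobContent -Blob "File9.csv" -Container $container -Destination $filePath -Context $ctx -Force -ErrorAction Stop
}
catch
{
    # The full exception text usually includes the HTTP status returned by the storage service
    Write-Host $_.Exception.ToString()
}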
I'm trying to convert a JSON file stored in a storage account into a PowerShell object, but I'm not getting the proper output: the output does not contain the proper values.
My code:
$storageAccountKey = "xxxx"
$Context = New-AzStorageContext -StorageAccountName 'xxxx' -StorageAccountKey $storageAccountKey
$b='C:\Temp\Scheduled.json'
Get-AzureStorageFileContent -Path $b -ShareName "file-share-name"
$newScheduledRules = Get-Content -Raw $b | ConvertFrom-Json
Write-Output ("########newScheduledRules::###########" + $newScheduledRules)
Output:
Could not get the storage context. Please pass in a storage context or set the current storage context.
########newScheduledRules::############{Scheduled=System.Object[]; Fusion=System.Object[]; MLBehaviorAnalytics=System.Object[]; MicrosoftSecurityIncidentCreation=System.Object[]}
I have reproduced this in my environment and got the expected results as below, by following the process below and this Microsoft document:
Scheduled.json:
{
"Rithwik":"Hello",
"Chotu":"Bojja"
}
First, I created a storage account and then added Scheduled.json to a file share as below:
Now I have created a runbook and executed the below script in the runbook:
$storageAccountKey = "PFHxFbVmAEvwBM6/9kW4nORJYA+AStA2QQ1A=="
$Context = New-AzStorageContext -StorageAccountName 'rithwik' -StorageAccountKey $storageAccountKey
$out = "$($env:TEMP)\$([guid]::NewGuid())"
New-Item -Path $out -ItemType Directory -Force
Get-AzStorageFileContent -ShareName "rithwik" -Context $Context -Path 'Scheduled.json' -Destination $out -Force
$newScheduledRules = Get-Content -Path "$out\Scheduled.json" -Raw | ConvertFrom-Json
Write-Output ("########newScheduledRules::###########" + $newScheduledRules)
Output:
Here $out is the destination variable.
-Path should be only the file name, Scheduled.json, in the Get-AzStorageFileContent command.
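If the file sits in a subfolder of the share, -Path stays relative to the share root; a small sketch with a hypothetical folder name:
# Hypothetical layout: the share contains a folder "rules" holding Scheduled.json
Get-AzStorageFileContent -ShareName "rithwik" -Context $Context -Path 'rules/Scheduled.json' -Destination $out -Force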
It seems the Get-AzureStorageFileContent call is missing the -Context parameter. It should be something like this:
$OutPath = "$($env:TEMP)\$([guid]::NewGuid())"
New-Item -Path $OutPath -ItemType Directory -Force
$storageContext = (Get-AzStorageAccount -ResourceGroupName xxxx -Name xxxx).Context
Get-AzStorageFileContent -ShareName "file-share-name" -Context $storageContext -Path 'Scheduled.json' -Destination $OutPath -Force
$newScheduledRules = Get-Content -Path "$OutPath\Scheduled.json" -Raw | ConvertFrom-Json
Write-Output ("########newScheduledRules::###########" + $newScheduledRules)
Could you please help me with an Azure PowerShell script that compares the files in the blob container with a local destination and downloads the files that are not available in the destination?
I tried the below, but wasn't able to get it working.
$blobNames = Get-Content
For ($i=0; $i -lt $blobNames.Length; $i++) {
    $blob = $blobNames[$i]
    Write-Host "Downloading $blob. Please wait."
    Get-AzStorageBlobContent -Blob $blob -Container $containerName -Destination $destination -Context $context -Verbose
}
PowerShell script to compare files in the blob container with a local destination and download the files that are not available in the destination:
I've created a script and it works for me:
$ContainerName = '<containername>'
$destination_path = 'C:\Users\xxxx\Desktop\blobcheck' #path to download to
$Ctx = New-AzureStorageContext '<storageaccount>' -StorageAccountKey 'accesskey'
$Blobs = Get-AzureStorageBlob -Container $ContainerName -Context $Ctx
$localfile = Get-ChildItem -LiteralPath C:\Users\xxxxxx
For ($i=0; $i -lt $Blobs.Length; $i++) {
    if($localfile.Name -contains $Blobs[$i].Name){
        Write-Host "Already present"
    }
    else{
        $blob = $Blobs[$i].Name
        Write-Host "Downloading $blob. Please wait."
        Get-AzureStorageBlobContent -Blob $blob -Container $ContainerName -Destination $destination_path -Context $Ctx -Verbose
    }
}
Compared and downloaded the files which are not present in my local folder:
Files now present in my local folder (destination_path):
It prompts to overwrite if the file already exists, as shown below:
If the prompt is not needed, you can simply pass the -Force switch to Get-AzureStorageBlobContent so it overwrites without asking.
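A one-line example of the same download with the prompt suppressed (same variables as the script above):
Get-AzureStorageBlobContent -Blob $blob -Container $ContainerName -Destination $destination_path -Context $Ctx -Force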
Note: get the storage access key from the Azure Portal:
Go to <Storageaccount> -> Access keys
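As an aside, the comparison can also be written so the local file names are collected once and only the missing blobs are downloaded; a minimal sketch reusing the variables above (and assuming the download folder is also the folder being compared):
# Collect local file names once, then download only blobs that are missing locally
$localNames = (Get-ChildItem -LiteralPath $destination_path -File).Name
foreach ($blobItem in $Blobs) {
    if ($localNames -notcontains $blobItem.Name) {
        Write-Host "Downloading $($blobItem.Name). Please wait."
        Get-AzureStorageBlobContent -Blob $blobItem.Name -Container $ContainerName -Destination $destination_path -Context $Ctx -Force
    }
}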
I have a PowerShell script which downloads the file with the current date as the filename from Azure Blob Storage. But how do I get a log file of the process, and how do I remove the downloaded file from Azure Blob Storage through the script? Could someone help me with this, please?
Example.
app_09102021.txt
app_10102021.txt
app_11102021.txt
Below is the script.
$container_name = '$XXX'
$destination_path = 'D:\test'
$Ctx = New-AzStorageContext $ZZZZ -StorageAccountKey $CCVCVCVCV
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$Listblobs = 'app_{0:ddMMyyyy}.txt' -f (Get-Date)
# Just download that blob
Get-AzStorageBlobContent -Context $Ctx -Container $container_name -Blob $Listblobs -Destination $destination_path
I have tested this in my environment to download the blob and delete it using the cmdlets below, and it was successfully downloaded and removed from Azure as well.
$container_name = 'testaj'
$destination_path = 'C:\Users\Desktop\test'
$Ctx = New-AzStorageContext 'accountname' -StorageAccountKey 'accountkeyrA=='
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
# Today's file name, e.g. app_11102021.txt
$Listblobs = 'app_{0:ddMMyyyy}.txt' -f (Get-Date)
$blob = Get-AzStorageBlobContent -Context $Ctx -Container $container_name -Blob $Listblobs -Destination $destination_path -ErrorAction SilentlyContinue
if($blob -ne $Null)
{
    Write-Host ("The file $Listblobs has been downloaded to $destination_path")
    Write-Host ("Proceeding to delete the downloaded file from $container_name in Azure!!!")
    Remove-AzStorageBlob -Container $container_name -Blob $Listblobs -Context $Ctx
    Write-Host ("Deleted file $Listblobs from Azure!!!")
}
else{
    Write-Host ("The file does not exist")
}
Here is the output for the downloaded and deleted file from the blob container:
If the file is not there (for example it was already deleted), the output will be something like below:
For more information please refer to this MS doc: Monitoring Azure Blob Storage
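For the log-file part of the question, one simple option is to wrap the script in a PowerShell transcript; a minimal sketch (the log path is just an example):
# Start a transcript so every download/delete message is captured in a log file
Start-Transcript -Path "D:\test\blob-download-$(Get-Date -Format 'ddMMyyyy').log" -Append
# ... download and delete logic from above ...
Stop-Transcript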
UPDATE:
I am looking for the solution to search & match the date with today's
date in filename, not sorting by last modified date/time.
Tried with the below code to download all the blobs with today's date (e.g. abc_04012022, app_04012022) and delete them from Azure.
PS script:
$container_name = 'test'
$destination_path = 'C:\Users\Desktop\test'
$Ctx = New-AzStorageContext 'accountname' -StorageAccountKey 'accountkey=='
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
# Today's date in the ddMMyyyy form used in the blob names
$latestBlob = Get-Date -UFormat '%d%m%Y'
$bloblist = Get-AzStorageBlob -Container $container_name -Context $Ctx | Select-Object -Property Name
foreach($item in $bloblist){
    if($item.Name -match $latestBlob){
        Write-Output "Here are the blobs"
        $blob = Get-AzStorageBlobContent -Context $Ctx -Container $container_name -Blob $item.Name -Destination $destination_path -ErrorAction SilentlyContinue
        if($blob -ne $Null)
        {
            Write-Output ("The file $($item.Name) has been downloaded to $destination_path")
            Write-Output ("Proceeding to delete the downloaded file from $container_name in Azure!!!")
            Remove-AzStorageBlob -Container $container_name -Blob $item.Name -Context $Ctx
            Write-Output ("Deleted file $($item.Name) from Azure!!!")
        }
        else{
            Write-Output ("The file does not exist")
        }
    }
}
Screenshot for reference:
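As a side note, the -Blob parameter of Get-AzStorageBlob accepts wildcards, so the same update could be sketched more compactly (assuming the ddMMyyyy date always appears somewhere in the blob name):
# List only blobs whose name contains today's date, then download and delete each one
$latestBlob = Get-Date -UFormat '%d%m%Y'
$bloblist = Get-AzStorageBlob -Container $container_name -Context $Ctx -Blob "*$latestBlob*"
foreach ($item in $bloblist) {
    Get-AzStorageBlobContent -Context $Ctx -Container $container_name -Blob $item.Name -Destination $destination_path -Force
    Remove-AzStorageBlob -Container $container_name -Blob $item.Name -Context $Ctx
}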
I'm trying to write a function in PowerShell that sets blobs in a specific container to a certain content type, as they are always written with the type application/octet-stream, which causes issues with downstream applications. I have written the below function, but it returns the error 'ContentType' is a ReadOnly property.
I was wondering if there is any way around this? I know the property can be set manually in Azure Storage Explorer; however, this is a daily task.
Function:
Function Set-ContentType {
    Param (
        [string]$accountName,
        [string]$accessKey,
        [string]$storageContainer
    )
    # Connect to blob storage and get blobs
    $context = New-AzureStorageContext -StorageAccountName $accountName -StorageAccountKey $accessKey
    $blobs = Get-AzureStorageBlob -Container $storageContainer -Context $context -Blob $fileMask
    foreach ($blob in $blobs) {
        if ($blob.ContentType -eq $genericMIME) {
            $blob.ContentType = $targetMIME
        }
    }
}
So based on the link from the comments, please give this solution a try.
Function Set-ContentType {
    Param (
        [string]$accountName,
        [string]$accessKey,
        [string]$storageContainer
    )
    # Connect to blob storage and get blobs
    $context = New-AzureStorageContext -StorageAccountName $accountName -StorageAccountKey $accessKey
    $blobs = Get-AzureStorageBlob -Container $storageContainer -Context $context -Blob $fileMask
    foreach ($blob in $blobs) {
        if ($blob.ContentType -eq $genericMIME) {
            # Set the content type on the underlying CloudBlob and push the change to the service
            $blob.ICloudBlob.Properties.ContentType = $targetMIME
            $blob.ICloudBlob.SetProperties()
        }
    }
}
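For completeness, the function reads $fileMask, $genericMIME and $targetMIME from outside its own scope; a hypothetical invocation could look like:
# Hypothetical script-scope values the function above relies on
$fileMask    = "*.csv"
$genericMIME = "application/octet-stream"
$targetMIME  = "text/csv"
Set-ContentType -accountName "mystorageaccount" -accessKey "<storage-account-key>" -storageContainer "mycontainer"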
I have solved my own issue by writing an alternative upload script that defines the ContentType at the time of writing the blob:
Function UploadFile {
    Param (
        [string]$accountName,
        [string]$accessKey
    )
    $context = New-AzureStorageContext -StorageAccountName $accountName -StorageAccountKey $accessKey
    $files = Get-ChildItem $workingDir -Filter $fileMask
    foreach ($file in $files) {
        # -Properties sets the blob's ContentType at upload time
        Set-AzureStorageBlobContent -File $file.FullName -Container $container -Properties @{"ContentType" = "$targetMIME"} -Context $context -Force
    }
}
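Likewise, $workingDir, $fileMask, $container and $targetMIME are assumed to be defined at script scope; a hypothetical call:
# Hypothetical script-scope values used by the UploadFile function above
$workingDir = "C:\exports"
$fileMask   = "*.csv"
$container  = "mycontainer"
$targetMIME = "text/csv"
UploadFile -accountName "mystorageaccount" -accessKey "<storage-account-key>"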
I have got a problem setting content in Azure Blob Storage.
Locally, I have succeeded in replacing characters in each file in a directory.
$sourceFolder = "C:\MyDirectory"
$targetFolder = "C:\MyDirectoryEncodeded"
$fileList = Dir $sourceFolder -Filter *.dat
MkDir $targetFolder -ErrorAction Ignore
ForEach($file in $fileList) {
    $file | Get-Content | %{$_ -replace '"',''} | %{$_ -replace ',','.'} | Set-Content -Path "tempDirectory\$file"
    $newFile = Get-Content "tempDirectory\$file"
    $Utf8NoBomEncoding = New-Object System.Text.UTF8Encoding $False
    [System.IO.File]::WriteAllLines("targetDirectory\$file" , $newFile, $Utf8NoBomEncoding)
}
exit
But now I need to do the same in Microsoft Azure.
I get the content from Azure Blob Storage, escape the characters, encode the file as UTF-8 without BOM, and then write the encoded file to a new blob directory.
Nevertheless, I'm facing an issue when I want to set the new content with the escaped characters (first line in my loop).
$storageContext = New-AzureStorageContext -ConnectionString "DefaultEndpointsProtocol=https;AccountName=<myAccountName>;AccountKey=<myAccountKey>;"
$sourceFolder = Get-AzureStorageBlob -Container "datablobnotencoded" -Blob "*.dat" -Context $storageContext
$targetFolder = Get-AzureStorageBlob -Container "datablob" -Context $storageContext
MkDir $targetFolder -ErrorAction Ignore
ForEach($file in $sourceFolder) {
    Get-AzureStorageBlob -Container "datablobnotencoded" -Blob $file.Name -Context $storageContext | Get-AzureStorageBlobContent | %{$_ -replace '"',''} | %{$_ -replace ',','.'} | Set-AzureStorageBlobContent -File $file.Name -Context $storageContext -CloudBlob $file
    $newFile = Get-AzureStorageFileContent -Path $file
    $Utf8NoBomEncoding = New-Object System.Text.UTF8Encoding $False
    [System.IO.File]::WriteAllLines($file , $newFile, $Utf8NoBomEncoding)
}
I've got this error:
Set-AzureStorageBlobContent : Cannot bind parameter 'CloudBlob'.
Cannot convert the
"Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob"
value of type
"Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob"
to type "Microsoft.WindowsAzure.Storage.Blob.CloudBlob". At line:7
char:264
+ ... lobContent -File $file.Name -Context $storageContext -CloudBlob $file
+ ~~~~~
+ CategoryInfo : InvalidArgument: (:) [Set-AzureStorageBlobContent], ParameterBindingException
+ FullyQualifiedErrorId : CannotConvertArgumentNoMessage,Microsoft.WindowsAzure.Commands.Storage.Blob.SetAzureBlobContentCommand
Thank you for your answers!
There are some mistakes in your PowerShell script:
1. You may have misunderstood the usage of Get-AzureStorageBlobContent: it downloads a blob to a local file; it does not give you the content of the blob in the pipeline. More details are here.
2. In the loop you used $newFile = Get-AzureStorageFileContent -Path $file, but the Get-AzureStorageFileContent cmdlet is for file share storage, not for blob storage.
You can use Get-AzureStorageBlobContent to download the blobs to a local folder, then operate on the local file that was downloaded from blob storage. After the file is modified, you can use Set-AzureStorageBlobContent to upload the local file to the specified Azure blob container.
Sample code is below, and it works fine on my side:
$context = New-AzureStorageContext -ConnectionString "xxxx"
#download the blobs in the specified container
$sourceFolder_blob = Get-AzureStorageBlob -Container "test-1" -Blob "*.txt" -Context $context
#the target azure container, which you want to upload the modified blobs to
$target_container = "test-2"
#the local paths used to store the downloaded blobs; make sure the folders exist before use
$sourceFolder_local = "d:\test\blob1\"
$targetFolder_local = "d:\test\blob2\"
foreach($file in $sourceFolder_blob)
{
    #download the specified blob to the local path
    Get-AzureStorageBlobContent -Container "test-1" -Blob $file.name -Destination $sourceFolder_local -Context $context
    #get the local file path
    $local_file_path = $sourceFolder_local + $file.name
    #build the path of the file in the target local folder
    $local_target_file_path = "$targetFolder_local" + $file.name
    #since the files are downloaded locally, you can do any operation on the local file
    Get-Content $local_file_path | %{$_ -replace '-','!'} | %{$_ -replace ',','.'} | Set-Content -Path $local_target_file_path
    $newFile = Get-Content -Path $local_target_file_path
    $Utf8NoBomEncoding = New-Object System.Text.UTF8Encoding $False
    [System.IO.File]::WriteAllLines($local_target_file_path , $newFile, $Utf8NoBomEncoding)
    #the last step: upload the modified file to another azure container
    Set-AzureStorageBlobContent -File $local_target_file_path -Context $context -Container $target_container
}