powershell upload blob to azure blob storage set content type - azure

I am making a REST API call and uploading the JSON payload to Azure Blob Storage using the below PowerShell.
PoSh:
$Params = @{"URI" = 'https://myapiexample.com/api/data/'}
$Result = Invoke-RestMethod @Params | ConvertTo-Json
$context=New-AzStorageContext -StorageAccountName "mystorage" -StorageAccountKey ""
$container=Get-AzStorageContainer -Name "input" -Context $context
$content = [system.Text.Encoding]::UTF8.GetBytes($Result)
$blobpath = "azblob/mydata.json"
$container.CloudBlobContainer.GetBlockBlobReference($blobpath).UploadFromByteArray($content,0,$content.Length)
$container.CloudBlobContainer.GetBlockBlobReference($blobpath).Properties.ContentType = "application/json"
$container.CloudBlobContainer.GetBlockBlobReference($blobpath).SetProperties()
I notice that when the blob is stored in Azure Blob Storage its content type is application/octet-stream, whereas I want it to be application/json. The code I use is not working as expected.

After reproducing on my end, I was able to achieve your requirement using Set-AzStorageBlobContent. Below is the complete code that works for me and sets the content type of the blob.
$url = "<YOUR_URL>"
$dest = "samplex.json"
Invoke-WebRequest -Uri $url -OutFile $dest
$context = New-AzStorageContext -StorageAccountName "<YOUR_ACCOUNT_NAME>" -StorageAccountKey "<YOUR_ACCOUNT_KEY>"
Set-AzStorageBlobContent -Context $context -Container "containers" -Blob "mydata.json" -File $dest -Properties @{"ContentType" = "application/json"}
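To confirm the content type, it can be read back from the blob listing; a quick check along these lines should work (ContentType appears in the standard Get-AzStorageBlob output):
Get-AzStorageBlob -Container "containers" -Blob "mydata.json" -Context $context | Select-Object Name, ContentType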

Related

Encrypt Azure Storage account key in powershell script

I'm developing a new PowerShell script to download blobs from a specific container, and the problem is security: I do not want to paste the Azure account key in plain text.
So I implemented a solution using the 'ConvertTo-SecureString' command, but the problem persists: when I create the connection string for the blob, a message appears saying: "Server failed to authenticate the request. Make sure the value of the Authorization header is formed correctly including the signature. HTTP Status Code 403 - HTTP Error".
With the key in plain text I am able to create the connection string properly and then list and download all blobs from the container.
I tried other solutions, for example $Credential = New-Object System.Management.Automation.PSCredential($ShareUser, $SharePassword),
but then there is another problem: the input is not a valid Base64 string.
Do you know how to avoid these issues and create a secure connection string for an Azure Storage Account?
Best regards and thanks in advance
Here is part of my PowerShell script:
$SecurePassword= Read-Host -AsSecureString | ConvertFrom-SecureString
$SecurePassword | Out-File -FilePath C:\test_blob\pass_file.xml
$ConfigFile= 'C:\Users\\config_file.xml'
IF (Test-Path $ConfigFile) {
[xml]$Config= Get-Content $ConfigFile
[string] $Server = $Config.Config.Server;
[string] $SharePassword = $Config.Config.SharePassword;
} ELSE
{
write-host "File do not exists: $ConfigFile"
}
#BlobStorageInformation
$StorageAccountName='test_acc'
$Container='test'
$DestinationFolder= 'C:\Users\user1\Blobs'
$Context = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $SharePassword
#List of Blobs
$ListBlob=@()
$ListBlob+= Get-AzStorageBlob -context $Context -container $Container | Where-Object {$_.LastModified -lt (Get-Date).AddDays(-1)}
Why would you maintain password files or enter the storage key manually when you have Az PowerShell? Just log in using Az PowerShell, set the subscription (see the sketch below), and enjoy!
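A minimal sketch of that sign-in step, assuming an interactive session (the subscription name is a placeholder):
# Sign in and select the subscription that owns the storage account
Connect-AzAccount
Set-AzContext -Subscription "<YOUR SUBSCRIPTION NAME OR ID>"
With the session established, the script below fetches the account key with Get-AzStorageAccountKey and downloads every blob in the container: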
$ResourceGroupName = "YOURRESOURCEGROUPNAME"
$StorageAccountName = "YOURSTORAGEACCOUNTNAME"
$ContainerName = "YOURCONTAINERNAME"
$LocalPath = "D:\Temp"
Write-Output 'Downloading Content from Azure blob to local...'
$storageKey = (Get-AzStorageAccountKey -ResourceGroupName $ResourceGroupName -AccountName $StorageAccountName).value[0]
$storageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storageKey
$blobs = Get-AzStorageBlob -Container $ContainerName -Context $storageContext
foreach($blob in $blobs)
{
Get-AzStorageBlobContent -Container $ContainerName -Context $storageContext -Force -Destination $LocalPath -Blob $blob.Name
}
Write-Output 'Content Downloaded Successfully !!!'

Undeleting a Soft Deleted Blob in Azure Storage Using a REST API call from PowerShell

I am trying to create a script to retrieve blobs for a given customer number from a storage account in Azure. All blobs reside in a single container, with 'actioned' blobs being soft deleted.
I can use PowerShell to display the relevant blobs, including their 'IsDeleted' status, but I understand that PowerShell doesn't have the necessary command to undelete blobs and so I'm trying to make a REST API call from the PowerShell script.
I do an initial login to the Azure platform and set a variable for a SAS token (which includes the necessary permissions to undelete):
$username = "<myUserName>"
$encryptedPwd = Get-Content <path\securepassword.txt> | ConvertTo-SecureString
$cred = New-Object System.Management.Automation.PsCredential($username, $encryptedPwd)
$strgaccname = "<myStorageAccount>"
$strgcontainer = "<myContainer>"
#SAS Token
$sastkn = "<mySAStoken>"
#Set StorageContext
$ctx = New-AzStorageContext -StorageAccountName $strgaccname -SasToken $sastkn
$subId = "mySubscriptionID"
Connect-AzAccount -Credential $cred -Subscription $subID
I can list all matching blobs with the following PowerShell:
$searchstring = '*'+<myCustomerNumber>+'*'
Get-AzStorageBlob -Blob $searchstring -Context $ctx -Container $strgcontainer -IncludeDeleted `
| Select-Object Name, Length, LastModified, IsDeleted `
| Sort-Object LastModified -Descending
I am unsure how to proceed with the REST API call. Looking at some other people's methods, I have something like the following, using a test blob that has been soft deleted:
$uri = "https://<myStorageAccount>.blob.core.windows.net/<myContainer>/<myTestBlob>?comp=undelete"
$headers = @{
'Authorization' = "Bearer <accessToken>";
'x-ms-date' = $((get-date -format r).ToString());
'x-ms-version' = "2020-12-06";
}
Invoke-RestMethod -Method 'Put' -Uri $uri -Headers $headers
However, I don't know how to create the Bearer Access Token that is mentioned.
We have done a repro in our local environment and it is working fine. The statements below are based on our analysis.
You can use the below PowerShell script to restore the soft-deleted blobs in your storage account.
Here is the PowerShell script:
Connect-AzAccount
#Get all deleted blob within a container
$StorageAccount = Get-AzStorageAccount | Where-Object { $_.StorageAccountName -eq "<storageAccountName>" }
$Blobs = Get-AzStorageContainer -Name "<ContainerName>" -Context $StorageAccount.Context | Get-AzStorageBlob -IncludeDeleted
$DeletedBlobs=$($Blobs| Where-Object {$_.IsDeleted -eq $true})
#Get your Bearer access token
$resource = "https://storage.azure.com"
$context = [Microsoft.Azure.Commands.Common.Authentication.Abstractions.AzureRmProfileProvider]::Instance.Profile.DefaultContext
$accessToken = [Microsoft.Azure.Commands.Common.Authentication.AzureSession]::Instance.AuthenticationFactory.Authenticate($context.Account, $context.Environment, $context.Tenant.Id.ToString(), $null, [Microsoft.Azure.Commands.Common.Authentication.ShowDialog]::Never, $null, $resource).AccessToken
#Restore
foreach ($DeletedBlob in $DeletedBlobs) {
Write-Host "Restoring : $($DeletedBlob.Name)"
$uri = "$($DeletedBlob.BlobBaseClient.Uri.AbsoluteUri)?comp=undelete"
$headers = @{
'Authorization' = "Bearer $accessToken";
'x-ms-date' = $((get-date -format r).ToString());
'x-ms-version' = "2020-12-06";
}
Invoke-RestMethod -Method 'Put' -Uri $uri -Headers $headers
}
Note:
In order to perform the restoration of soft-deleted blob, you need to have a Storage Blob Data Contributor RBAC role on the Storage Account.
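If that role still needs to be granted, a role assignment along these lines should work (the sign-in name and scope are placeholders, not values from the original question):
New-AzRoleAssignment -SignInName "<user@yourtenant.com>" `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storageAccountName>"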

how do you use rest to put a .csv file onto a storage account?

How do you use REST to upload data to a storage account in the format of .csv?
Get a token
Make the request using Invoke-RestMethod
Export the data to CSV and upload it to a storage account
$Request = Invoke-RestMethod @ParamRequest
$Request.value.properties | Export-Csv -Path $path -NoTypeInformation
API: https://learn.microsoft.com/en-us/rest/api/consumption/usage-details/list
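A minimal sketch of the token step and the request splat that feeds the two code lines above, assuming the Az.Accounts module and a prior Connect-AzAccount; the scope and api-version are illustrative, not taken from the question:
# Acquire an ARM access token for the signed-in account
$token = (Get-AzAccessToken -ResourceUrl "https://management.azure.com/").Token
$ParamRequest = @{
    "Method"  = "Get"
    "Uri"     = "https://management.azure.com/subscriptions/<subscription-id>/providers/Microsoft.Consumption/usageDetails?api-version=2021-10-01"
    "Headers" = @{ "Authorization" = "Bearer $token" }
}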
Not really a PowerShell expert :) but essentially the idea is to create a Shared Access Signature (SAS) URL for the blob and then use that SAS URL for uploading the content directly into Azure Storage.
Here's what I came up with:
$accountName = "storage-account-name"
$accountKey = "storage-account-key"
$containerName = "blob-container-name"
$blobName = "blob-name.csv"
# Get storage context
$context = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
# Get Shared Access Signature (SAS) Token expiration time. I have set it to expire after 1 hour.
$sasExpiry = (Get-Date).AddHours(1).ToUniversalTime()
# Get a SAS Token with "write" permission that will expire after one hour.
$sasToken = New-AzStorageBlobSASToken -Context $context -Container $containerName -Blob $blobName -Permission "w" -ExpiryTime $sasExpiry
# Create a SAS URL
$sasUrl = "https://$accountName.blob.core.windows.net/$containerName/$blobName$sasToken"
# Set request headers
$headers = @{"x-ms-blob-type" = "BlockBlob"}
# Set request content (body)
$body = "This is the content I wish to upload"
#Invoke "Put Blob" REST API
Invoke-RestMethod -Method "PUT" -Uri $sasUrl -Body $body -Headers $headers -ContentType "text/csv"

How to write data in Azure Blob storage programmatically?

I am using the below PowerShell script to read JSON data from the source using a REST API call. Now I want to load the data in $Result to Azure Blob Storage. Any ideas?
$Params = @{
"URI" = 'https://3ea5e53b-817e-4c41-ae0b-c5afc1610f4e-bluemix.cloudant.com/test/_all_docs?include_docs=true'
}
$Result = Invoke-RestMethod @Params | ConvertTo-Json -Depth 9
Regarding the issue, you can use the following approaches.
Save the JSON to a file, then upload the file to Azure Blob Storage:
$Params = @{
"URI" = 'https://3ea5e53b-817e-4c41-ae0b-c5afc1610f4e-bluemix.cloudant.com/test/_all_docs?include_docs=true'
}
$Result = Invoke-RestMethod @Params | ConvertTo-Json -Depth 9
$Result | Out-File "D:\file.json"
$context=New-AzStorageContext -StorageAccountName "andyprivate" -StorageAccountKey ""
Set-AzStorageBlobContent -File "D:\file.json" `
-Container "" `
-Blob "file.json" `
-Context $context `
-StandardBlobTier Hot
Upload directly to Azure Blob Storage:
$Params = @{
"URI" = 'https://3ea5e53b-817e-4c41-ae0b-c5afc1610f4e-bluemix.cloudant.com/test/_all_docs?include_docs=true'
}
$Result = Invoke-RestMethod @Params | ConvertTo-Json -Depth 9
Write-Host "the result is :"
$Result
$context=New-AzStorageContext -StorageAccountName "andyprivate" -StorageAccountKey ""
$container=Get-AzStorageContainer -Name "input" -Context $context
$content = [system.Text.Encoding]::UTF8.GetBytes($Result)
$container.CloudBlobContainer.GetBlockBlobReference("my.json").UploadFromByteArray($content,0,$content.Length)

Powershell function for modifying the ContentType of an object in Azure blob storage

I'm trying to write a function in PowerShell that sets blobs in a specific container to a certain content type, as they are always written as application/octet-stream, which causes issues with downstream applications. I have written the below function, but it returns the error "'ContentType' is a ReadOnly property".
Is there any way around this? I know the property can be set manually in Azure Storage Explorer; however, this is a daily task.
Function:
Function Set-ContentType {
Param (
[string]$accountName,
[string]$accessKey,
[string]$storageContainer
)
# Connect to blob storage and get blobs
$context = New-AzureStorageContext -StorageAccountName $accountName -StorageAccountKey $accessKey
$blobs = Get-AzureStorageBlob -Container $storageContainer -Context $context -Blob $fileMask
foreach ($blob in $blobs) {
if ($blob.ContentType -eq $genericMIME) {
$blob.ContentType = $targetMIME
}
}
}
So based on the link from the comments, please give this solution a try.
Function Set-ContentType {
Param (
[string]$accountName,
[string]$accessKey,
[string]$storageContainer
)
# Connect to blob storage and get blobs
$context = New-AzureStorageContext -StorageAccountName $accountName -StorageAccountKey $accessKey
$blobs = Get-AzureStorageBlob -Container $storageContainer -Context $context -Blob $fileMask
foreach ($blob in $blobs) {
if ($blob.ContentType -eq $genericMIME) {
$blob.ICloudBlob.Properties.ContentType = $targetMIME
$blob.ICloudBlob.SetProperties()
}
}
}
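Note that $fileMask, $genericMIME, and $targetMIME are script-scope variables rather than parameters, so a hypothetical invocation would look something like this (all values are illustrative):
$fileMask = "*.csv"
$genericMIME = "application/octet-stream"
$targetMIME = "text/csv"
Set-ContentType -accountName "<YOUR_ACCOUNT_NAME>" -accessKey "<YOUR_ACCOUNT_KEY>" -storageContainer "<YOUR_CONTAINER>"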
I have solved my own issue by writing an alternative upload script that defines the ContentType at the time of writing the blob:
Function UploadFile {
Param (
[string]$accountName,
[string]$accessKey
)
$context = New-AzureStorageContext -StorageAccountName $accountName -StorageAccountKey $accessKey
$files = Get-ChildItem $workingDir -Filter $fileMask
foreach ($file in $files) {
Set-AzureStorageBlobContent -File $file.FullName -Container $container -Properties @{"ContentType" = "$targetMIME"} -Context $context -Force
}
}
