How to save the file received as a response to a PowerShell web request within an Azure Function app to an Azure storage container?

I have an Azure function app that makes the following web request:
Invoke-RestMethod -Uri $uri -Method 'GET' -Headers $headers -Body $body -SkipCertificateCheck
On my local machine, I can save the returned XML file to local disk via -OutFile. However, the actual environment is an Azure Function app, so I don't think I can save the file to disk. Instead, I want to redirect it to a storage container in Azure.
I have tried Azure Functions output bindings to redirect the response there, but failed to write the actual file:
Push-OutputBinding -Name outputBlob -Value $response.content
All this writes to the storage container is a string value. So how do I write the actual file received as a response to Invoke-RestMethod within the Azure Function app to the Azure storage container in this environment?

Unless the file is huge, you can indeed save it to the local file system, even when running on Azure. Just select the proper location and don't forget to clean up:
$f = New-TemporaryFile
try {
    Invoke-RestMethod ... -OutFile $f.FullName
    # Do whatever you want with this file
    ...
} finally {
    Remove-Item $f.FullName
}

Read your file into a byte array and pass that as the binding value.
using namespace System.Net
# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)
# Write to the Azure Functions log stream.
Write-Host "PowerShell HTTP trigger function processed a request."
New-Item -Path 'test.txt' -ItemType File
Set-Content 'test.txt' 'Welcome to www.thecodemanual.pl'
$file = [System.IO.File]::ReadAllBytes('test.txt')
Push-OutputBinding -Name outputBlob -Value $file
# Associate values to output bindings by calling 'Push-OutputBinding'.
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body = $body
})
and the blob will then contain the file's binary content rather than a string.
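Putting the two answers together for the original scenario, a minimal sketch (assuming $uri, $headers, and $body are defined as in the question) could download the response to a temporary file and push its raw bytes to the blob binding:
# Sketch only: download the XML response to a temp file,
# then hand the raw bytes to the blob output binding.
$f = New-TemporaryFile
try {
    Invoke-RestMethod -Uri $uri -Method 'GET' -Headers $headers -Body $body `
        -SkipCertificateCheck -OutFile $f.FullName
    $bytes = [System.IO.File]::ReadAllBytes($f.FullName)
    Push-OutputBinding -Name outputBlob -Value $bytes
} finally {
    Remove-Item $f.FullName
}
You may also need to set the outputBlob binding's dataType to binary in function.json so the content isn't re-encoded as a string.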

Related

How to pass json file as parameter to Azure Powershell Runbook?

I'm trying to deploy Sentinel alerts into Sentinel from an Azure Runbook using the below command:
Import-AzSentinelAlertRule -WorkspaceName "xxx" -SettingsFile "test_alert.json"
The SettingsFile parameter of this command expects the path of a JSON file. How can we pass the JSON file to the runbook?
I have reproduced this in my environment by following the Microsoft documentation, and I got the expected results as below:
Param(
    [parameter(Mandatory=$true)]
    [object]$json
)
$json = $json | ConvertFrom-Json
Then save and publish the runbook.
Then open a local Windows PowerShell session and follow the steps below:
Step1:
Connect-AzAccount
Step2:
$json = (Get-Content -Path "C:\Downloads\xy.json") | Out-String
Step3:
$RBParams = @{
    AutomationAccountName = 'rithwikrunning'
    ResourceGroupName = 'XX'
    Name = 'xy'
    Parameters = @{ json = $json }
}
Here XX is the name of the resource group and xy is the name of the runbook.
Step4:
$job = Start-AzAutomationRunbook @RBParams
Now the JSON file is passed to the runbook and the job has started. The content of the file is available in the $json variable inside the runbook.
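If, as in the original question, the goal is to feed the JSON to a cmdlet that expects a file path (such as -SettingsFile), one hedged option is to write the received content back to a temporary file inside the runbook. A sketch, assuming the runbook receives the raw JSON string:
# Sketch: write the JSON string parameter to a temp file in the runbook
# sandbox so path-based cmdlets can consume it. The cmdlet call mirrors
# the question and is untested here.
Param(
    [parameter(Mandatory=$true)]
    [object]$json
)
$tempFile = Join-Path -Path $env:TEMP -ChildPath 'test_alert.json'
$json | Out-File -FilePath $tempFile -Encoding utf8
Import-AzSentinelAlertRule -WorkspaceName "xxx" -SettingsFile $tempFile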
References:
Transferring Microsoft Sentinel scheduled alert rules between different workspaces using PowerShell - Microsoft Community Hub

Downloading a file from Adls gen1 using powershell is not working

I am trying to download a file from Azure Data Lake Store using a PowerShell script. The code is triggered from a runbook in Azure. It looks like Export-AdlStoreItem is not working as expected: I don't get any error messages or compilation errors, but when this command is executed, a zero-byte file is generated in the destination. The name of that file is TemporaryFile2020-06-02_14-56-57.datc18371c2-d39c-4588-9af0-93aa3e136b01Segments.
What is happening? Please help!
$LocalDwldPath = "T:\ICE_PROD_DATA_SOURCING\FILE_DOWNLOAD_PATH\TemporaryFile$($TimeinSec).dat"
$SourcePath = "Dataproviders/Landing/GCW/HPIndirect/Orders/AMS/gcw_hp_indirect_orders_ams_745_20200601_04_34_01.dat"
$PRODAdlsName = "itgdls01"
Export-AdlStoreItem -Account $PRODAdlsName -Path $("/" + $SourcePath.Trim()) -Destination $LocalDwldPath -Force -ErrorAction Stop
if( Test-Path $LocalDwldPath.Trim() )
{
    Get-Content -Path $LocalDwldPath.Trim() -ReadCount 1000 | ForEach-Object { $FileCount += $_.Count }
    Remove-Item $LocalDwldPath.Trim()
    Set-Content -Path $cntCaptureFile -Value $FileCount
    $TimeinSec = TimeStamp2
    Add-Content -Value "$TimeinSec Log: Identified file for getting count is $($SourcePath.Trim()) and the count is $FileCount" -Path $logfile
}
else
{
    $TimeinSec = TimeStamp2
    Add-Content -Value "$TimeinSec Error: Identified file for getting count is $($SourcePath.Trim()) and the count capture failed as local file is not found!" -Path $logfile
}
According to my research, if you want to download a file from Azure Data Lake with PowerShell, you can use the cmdlet Export-AzDataLakeStoreItem.
For example
Export-AzDataLakeStoreItem -Account <> -Path '/test/test.csv' -Destination 'D:\myDirectory\test.csv'
For more details, please refer to the documentation.
The issue was with the local download path/destination
("T:\ICE_PROD_DATA_SOURCING\FILE_DOWNLOAD_PATH\TemporaryFile$($TimeinSec).dat").
The T:\ drive is a virtual/network drive connected to an Azure file share.
Instead of T:\, I pointed the destination to a local drive ("F:\ICE_PROD_DATA_SOURCING\FILE_DOWNLOAD_PATH\TemporaryFile$($TimeinSec).dat") and it worked fine.
It's surprising that PowerShell didn't give any error message when it was unable to save the file to a network path backed by an Azure file share.
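To surface this kind of silent failure sooner, a small defensive check (a sketch built on the script above) can fail fast when the exported file is missing or empty:
# Sketch: guard against silent zero-byte downloads like the one described above.
Export-AdlStoreItem -Account $PRODAdlsName -Path $("/" + $SourcePath.Trim()) `
    -Destination $LocalDwldPath -Force -ErrorAction Stop
if (-not (Test-Path $LocalDwldPath) -or (Get-Item $LocalDwldPath).Length -eq 0) {
    throw "Export produced no data at $LocalDwldPath"
}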

Upload blob with Set-AzStorageBlobContent via pipeline and set ContentType property

Using the Az PowerShell module, I'm trying to enumerate a directory on disk and pipe the output to Set-AzStorageBlobContent to upload to Azure, while preserving the folder structure. This works great, except the ContentType property of all blobs is set to application/octet-stream. I'd like to set it dynamically based on the file extension of the blob being uploaded.
Here's example code for the base case:
Get-ChildItem $SourceRoot -Recurse -File |
Set-AzStorageBlobContent -Container $ContainerName -Context $context -Force
To set the ContentType, I need to add a Properties parameter to Set-AzStorageBlobContent with a value like @{ "ContentType" = "<content type>" }. The content type should be determined from the specific file extension being uploaded. I've written a separate pipelined function that can add a MimeType property to the file object, but I can't figure out how to reference that for the parameter in the pipeline. Example:
function Add-MimeType {
    [cmdletbinding()]
    param(
        [parameter(
            Mandatory = $true,
            ValueFromPipeline = $true)]
        $pipelineInput
    )
    Process {
        $mimeType = Get-MimeType $pipelineInput.Extension
        Add-Member -InputObject $pipelineInput -NotePropertyName "MimeType" -NotePropertyValue $mimeType
        return $pipelineInput
    }
}
function Get-MimeType(
    [string]$FileExtension
)
{
    switch ($FileExtension.ToLowerInvariant())
    {
        '.txt' { return 'text/plain' }
        '.xml' { return 'text/xml' }
        default { return 'application/octet-stream' }
    }
}
Get-ChildItem $SourceRoot -Recurse -File |
    Add-MimeType |
    Set-AzStorageBlobContent -Container $ContainerName -Properties @{"ContentType" = "$($_.MimeType)"} -Context $context -Force
It seems that $_ isn't usable in this context. Is there another way to accomplish this?
The reason I'd like to continue using pipelining is that it appears to work much faster than using a ForEach-Object loop to call the function (where $_ does work).
If you are open to completely different solutions, you can also use AzCopy.
You can upload your whole folder with one command, and AzCopy can also automatically guess the correct mime type based on the file extension. There is also support for Azure Pipelines, if that is part of your setup.
Command could look something like this:
# AzCopy v10 will automatically guess the content type unless you pass --no-guess-mime-type
azcopy copy 'C:\myDirectory' 'https://mystorageaccount.blob.core.windows.net/mycontainer' --recursive
# AzCopy v8 (older flag syntax; add /DestKey or /DestSAS for auth)
AzCopy /Source:C:\myDirectory /Dest:https://mystorageaccount.blob.core.windows.net/mycontainer /S /SetContentType
Taken from the output of AzCopy.exe copy --help:
AzCopy automatically detects the content type of the files when uploading from the local disk, based on the file extension or content (if no extension is specified).
The built-in lookup table is small, but on Unix, it is augmented by the local system's mime.types file(s) if available under one or more of these names:
/etc/mime.types
/etc/apache2/mime.types
/etc/apache/mime.types
On Windows, MIME types are extracted from the registry. This feature can be turned off with the help of a flag. Please refer to the flag section.
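If you would rather stay within the Az PowerShell module, one workaround is to wrap only the upload in ForEach-Object, where $_ is bound. This is a sketch; deriving the blob name from the path relative to $SourceRoot is an assumption, and it gives up some of the pipeline's speed:
# Sketch: $_ is only available inside a script block, so move the upload there.
Get-ChildItem $SourceRoot -Recurse -File |
    Add-MimeType |
    ForEach-Object {
        $blobName = $_.FullName.Substring($SourceRoot.Length).TrimStart('\')
        Set-AzStorageBlobContent -File $_.FullName -Blob $blobName `
            -Container $ContainerName -Context $context `
            -Properties @{ "ContentType" = $_.MimeType } -Force
    }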

Using Set-AzStorageBlobContent to upload only new content without prompts

I'm enumerating a local folder and uploading to Azure storage. I want to only upload new content to my Azure storage. If I use Set-AzStorageBlobContent with -Force, it'll overwrite everything. If I use it without -Force, it'll prompt on items that already exist. I can use Get-AzStorageBlob to check if the item already exists, but it prints red errors if the item does not exist. I can't find a combination of these items that gracefully uploads only new content without printing any errors or prompting. Am I using the wrong approach?
FINAL EDIT: adding working solution based on suggestions from Ivan Yang. Now only new files are uploaded, without any error messages. The key was to use -ErrorAction Stop to convert the error message into an exception, and then catch the exception.
# In my code this is part of a Test-Blob function that returns $blobFound
$blobFound = $false
try
{
    $blobInfo = Get-AzStorageBlob `
        -Container $containerName `
        -Context $storageContext `
        -Blob $blobPath `
        -ErrorAction Stop
    $blobFound = ($null -ne $blobInfo)
}
catch [Microsoft.WindowsAzure.Commands.Storage.Common.ResourceNotFoundException]
{
    # Eat the error that'd otherwise be printed
}
# Note in my code this is actually a call to my Test-Blob function
if ($false -eq $blobFound)
{
    Set-AzStorageBlobContent `
        -Container $containerName `
        -Context $storageContext `
        -File $sourcePath `
        -Blob $blobPath `
        -Force # -Force is unnecessary but just being paranoid to avoid prompts
}
I see you have mentioned trying Get-AzStorageBlob; why not continue using it?
The trick here is that you can use try-catch-finally, which can properly handle the error when the blob does not exist in Azure.
This sample code works on my side for uploading a single file, and you can modify it to upload multiple files:
$account_name ="xxx"
$account_key ="xxx"
$context = New-AzStorageContext -StorageAccountName $account_name -StorageAccountKey $account_key
# Use this flag to determine if a blob exists or not in Azure. Assume it exists at first.
$is_exist = $true
try
{
    Get-AzStorageBlob -Container test3 -Blob a.txt -Context $context -ErrorAction Stop
}
catch [Microsoft.WindowsAzure.Commands.Storage.Common.ResourceNotFoundException]
{
    # If the blob does not exist in Azure, do the following
    $is_exist = $false
    Write-Output "the blob DOES NOT exist."
}
finally
{
    # Only execute the code when the blob does not exist in Azure blob storage.
    if(!$is_exist)
    {
        Set-AzStorageBlobContent -Container test3 -File "d:\myfolder\a.txt" -Blob a.txt -Context $context
        Write-Output "uploaded!"
    }
}
Not a PowerShell solution, but I would suggest that you take a look at AzCopy. It's like RoboCopy but for Azure storage: a command-line tool which allows you to sync, copy, move and more. It's free, works on macOS, Linux and Windows. And it is fast!
I use AzCopy from PowerShell scripts and it makes life a lot easier (I'm managing millions of files, and the stability and speed of AzCopy really help).
This command is not smart enough to detect which files are new, though. You need to keep just the files you want to upload in the folder.
Simply use Set-AzStorageBlobContent -Force all the time.
The alternative is to check for the existing file, download its content, compare the files, and upload only if they differ. The amount of processing/IO will only increase this way.
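If you do want a comparison without downloading blob content, a hedged variant is to compare the local file's MD5 against the blob's ContentMD5 property (this assumes ContentMD5 was populated at upload time):
# Sketch: compare a local file's MD5 (base64, as Azure stores it) with the
# blob's ContentMD5 and upload only when they differ or the blob is missing.
$md5 = [System.Security.Cryptography.MD5]::Create()
$localMd5 = [Convert]::ToBase64String($md5.ComputeHash([System.IO.File]::ReadAllBytes($sourcePath)))
$blob = Get-AzStorageBlob -Container $containerName -Context $storageContext `
    -Blob $blobPath -ErrorAction SilentlyContinue
if ($null -eq $blob -or $blob.ICloudBlob.Properties.ContentMD5 -ne $localMd5) {
    Set-AzStorageBlobContent -Container $containerName -Context $storageContext `
        -File $sourcePath -Blob $blobPath -Force
}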

Delete a corrupted Azure Functions App

I created a Function app and then created a function; the function name defaulted to TriggerCSharp1 or similar.
After adding code, I wondered how to change the function name, so I tried FTPing into the Function app and manually renamed the TriggerCSharp1 folder. Back in the Azure portal, when I click on the function app I now get the error "The access token is invalid." and nothing appears beneath it.
I am not sure how I can delete this function app now, since I can't get into its blade. The only way I can think of is to delete the resource group that contains it, but that is not something I can do, since I have tons of other resources in there too.
Edit:
As suggested by David, resources.azure.com is easier and requires no client bits.
Solved using Azure-CLI with the command azure site delete <site name>
One function app may contain multiple functions, so deleting the whole function app when only one function is corrupted may be overkill. In this situation, deleting the function folder in the Kudu console or via PowerShell is a better way.
USING KUDU CONSOLE
Go to https://<yourFunctionApp>.scm.azurewebsites.net
Click on DEBUG (top bar) -> CMD and, in the new page that appears, navigate to site -> wwwroot
Find your function there and delete it (click on the icon to the right of the Edit/pencil icon)
USING POWERSHELL
(based on this)
$username = '<publish username>' #IMPORTANT: use single quotes as username may contain $
$password = "<publish password>"
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))
$commandBody = @{
    command = "rm -d -r myAzureFunction"
    dir = "site\\wwwroot"
}
$deleteUrl = "https://<myFunctionApp>.scm.azurewebsites.net/api/command"
Invoke-RestMethod -Uri $deleteUrl -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -Method POST `
    -ContentType "application/json" -Body (ConvertTo-Json $commandBody)
The publish username and publish password can be obtained as detailed here
$creds = Invoke-AzureRmResourceAction -ResourceGroupName YourResourceGroup -ResourceType Microsoft.Web/sites/config -ResourceName YourWebApp/publishingcredentials -Action list -ApiVersion 2015-08-01 -Force
$username = $creds.Properties.PublishingUserName
$password = $creds.Properties.PublishingPassword
If you wanted to do this in PowerShell, something like the following should work ...
Login-AzureRmAccount #Enter your username / pw
$funcApp = Get-AzureRmWebApp -Name "Your Function App Name"
Remove-AzureRmWebApp -WebApp $funcApp
If you have more than one subscription, make sure you're using the right one. You could also add -Confirm:$false to the remove command if you didn't want the check prompt.
