My PowerShell script creates an output file like this:
$csvpath = "C:/temp/test.csv"
$FilePath = $CsvPath + $Compute.Name + "-" + (Get-Date).ToString("yyyyMMdd_HHmmss") + ".csv"
Export-Csv -InputObject $ComputeJobs -Path $FilePath
I want to automate this to run every day, but unfortunately I am not sure how to do it in an Azure runbook.
I am requesting help to understand how to save files to Azure Storage using an Azure runbook.
These are two separate things. You use Azure Automation to trigger the execution of your PowerShell script at predefined times. To save the output into Azure Storage, you actually need the PowerShell code for it.
I'm not a PowerShell dev, but I assume the following should work:
Connect-AzAccount
# Define Variables
$subscriptionId = "yourSubscriptionId"
$storageAccountRG = "yourResourceGroup"
$storageAccountName = "yourStorageAccount"
$storageContainerName = "yourContainer"
# Select right Azure Subscription
Select-AzSubscription -SubscriptionId $SubscriptionId
# Get Storage Account Key
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $storageAccountRG -AccountName $storageAccountName).Value[0]
# Set AzStorageContext
$ctx = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
# Write the CSV to the runbook sandbox's temp folder first
$CsvPath = $env:TEMP + "\"
$FilePath = $CsvPath + $Compute.Name + "-" + (Get-Date).ToString("yyyyMMdd_HHmmss") + ".csv"
Export-Csv -InputObject $ComputeJobs -Path $FilePath -NoTypeInformation
# Then upload the exported file as a blob
Set-AzStorageBlobContent -File $FilePath -Container $storageContainerName -Context $ctx
https://learn.microsoft.com/en-us/powershell/module/azure.storage/set-azurestorageblobcontent?view=azurermps-6.13.0
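To run it every day, you can link a daily schedule to the runbook with the Az.Automation cmdlets. A minimal sketch, assuming placeholder names for the Automation account, resource group, and runbook (these are not from the original post):
# Create a daily schedule and link it to the runbook (names are placeholders)
$resourceGroup = "yourResourceGroup"
$automationAccount = "yourAutomationAccount"
$runbookName = "yourRunbookName"
New-AzAutomationSchedule -ResourceGroupName $resourceGroup -AutomationAccountName $automationAccount -Name "DailyCsvExport" -StartTime (Get-Date).AddHours(1) -DayInterval 1
Register-AzAutomationScheduledRunbook -ResourceGroupName $resourceGroup -AutomationAccountName $automationAccount -RunbookName $runbookName -ScheduleName "DailyCsvExport"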
I am trying to get the list of unused/inactive storage accounts in Azure using PowerShell. Below is the script I am trying; it provides the storage account name and last modified date of my Azure storage accounts, but I need to list only the unused storage account names, not all of them. To achieve that, I need to add some condition/filter. Please assist me in solving this. Thanks in advance.
It will output the results into a table detailing the name and last modified date of your Azure storage accounts.
& {
    foreach ($storageAccount in Get-AzStorageAccount) {
        $storageAccountName = $storageAccount.StorageAccountName
        $resourceGroupName = $storageAccount.ResourceGroupName
        # Get storage account key
        $storageAccountKey = (Get-AzStorageAccountKey -Name $storageAccountName -ResourceGroupName $resourceGroupName).Value[0]
        # Create storage account context using above key
        $context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
        # Get the last modified date
        $lastModified = Get-AzStorageContainer -Context $context | Sort-Object -Property @{Expression = {$_.LastModified.DateTime}} | Select-Object -Last 1 -ExpandProperty LastModified
        # Collect the information to output to a table when the loop has completed
        New-Object psobject -Property @{
            Name = $storageAccountName;
            LastModified = $lastModified.DateTime;
            ResourceGroupName = $resourceGroupName
        }
    }
} | Format-Table Name, LastModified, ResourceGroupName -AutoSize
I tried to reproduce the same in my environment and got the same result as below:
By using the same script, I got the storage account name and last modified date of the Azure storage accounts.
To get only the unused/inactive storage accounts in Azure using PowerShell, I modified the script as below.
I agree with @Niclas; you need to make use of the Get-Date command.
& {
    foreach ($storageAccount in Get-AzStorageAccount) {
        $storageAccountName = $storageAccount.StorageAccountName
        $resourceGroupName = $storageAccount.ResourceGroupName
        $storageAccountKey = (Get-AzStorageAccountKey -Name $storageAccountName -ResourceGroupName $resourceGroupName).Value[0]
        $context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
        $lastModified = Get-AzStorageContainer -Context $context | Sort-Object -Property @{Expression = {$_.LastModified.DateTime}} | Select-Object -Last 1 -ExpandProperty LastModified
        $unusedacc = (Get-Date).AddDays(-10)
        if ($lastModified.DateTime -lt $unusedacc) {
            New-Object psobject -Property @{
                Name = $storageAccountName;
                LastModified = $lastModified.DateTime;
                ResourceGroupName = $resourceGroupName
            }
        }
    }
} | Format-Table Name, LastModified, ResourceGroupName -AutoSize
Note: Based on your requirement, you can change the number of days in the line $unusedacc = (Get-Date).AddDays(-10).
If there are no unused Storage accounts, then it will return blank results like below:
Use get-date
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/get-date
and use Where-Object.
# ADD THIS
$lastModDate = (get-date).AddDays(-5).Date
$lastMod = $lastModified | Where-Object { ($_.DateTime).Date -lt $lastModDate}
# If $lastMod.DateTime is NOT empty, then:
if ($lastMod.DateTime) {
    # Write-Host "variable is NOT null " + $storageAccountName # For testing purpose
    # Collect the information to output to a table when the for loop has completed
    New-Object psobject -Property @{
        Name = $storageAccountName;
        LastModified = $lastMod.DateTime; # CHANGE THIS
        ResourceGroupName = $resourceGroupName
    }
}
https://www.techielass.com/find-unused-storage-accounts-in-azure/
With your script:
With my changes:
I'm writing my first PowerShell script to load data from a CSV to an Azure Storage table. I'm not sure why the line
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName)[0].Value
is throwing an error:
Running Get-Module gives this result:
This is a snippet of the code that I have written till now:
# Step 1, Set variables
# Enter Table Storage location data
$resourceGroupName = "ComputeTesting"
$storageAccountName = 'computetestingdiag'
$tableName = 'strtable'
$dateTime = get-date
# Step 2, Login to your Azure subscription
$sub = Get-AzSubscription -ErrorAction SilentlyContinue
if(-not($sub))
{
Connect-AzAccount
}
# If you have multiple subscriptions, set the one to use
# Select-AzSubscription -SubscriptionId "<SUBSCRIPTIONID>"
# Step 3, Get the access key for the Azure Storage account
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName)[0].Value
# Step 4, Connect to Azure Table Storage
$storageCtx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$table = Get-AzureStorageTable -Name $tableName -Context $storageCtx
I checked some similar questions, and what I understand is that uninstalling and re-installing the Azure modules might help. I haven't tried this yet; is there any other workaround? Any help whatsoever would be much appreciated.
According to the script you provided, you use the Az and AzureRM modules in the same PowerShell session. This may cause conflicts. I suggest you use only one module in a session.
For example
$resourceGroupName = "<>"
$storageAccountName = '<>'
$tableName = '<>'
Connect-AzAccount
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName)[0].Value
$storageCtx = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$table = Get-AzStorageTable -Name $tableName -Context $storageCtx
For more details about how to manage Azure Table storage, please refer to the documentation.
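To then load the CSV rows into the table, one option (not part of the original answer) is the AzTable module; a rough sketch, assuming a hypothetical CSV file with ID and Name columns:
# Requires the AzTable module (Install-Module AzTable); CSV path and columns are hypothetical
Import-Module AzTable
$cloudTable = (Get-AzStorageTable -Name $tableName -Context $storageCtx).CloudTable
$rows = Import-Csv -Path "C:\temp\data.csv"
foreach ($row in $rows) {
    # Insert one table entity per CSV row
    Add-AzTableRow -Table $cloudTable -PartitionKey "csvImport" -RowKey $row.ID -Property @{ "Name" = $row.Name }
}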
I am looking to delete all files from an Azure Storage blob container which are older than 'x' days. I am trying the code below, but it is not working:
$StorageAccountName = '<name>'
$StorageAccountKey = '<key>'
$Ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
Get-AzureStorageBlob -Container "reports" -Context $Ctx -Blob *.csv
where {$_.LastModified -le (get-date).AddDays(-30) } | Remove-AzureStorageBlob
I referred to the following doc, but the query is not working for conditional deletion: link
I suggest you use the new Azure PowerShell module Az.
After installing the new Az module, try the code below:
$accountname="xx"
$accountkey="xxx"
$ctx = New-AzStorageContext -StorageAccountName $accountname -StorageAccountKey $accountkey
Get-AzStorageBlob -Container "aa1" -Blob *.jpg -Context $ctx | where {$_.LastModified -le (Get-Date).AddDays(-1)} | Remove-AzStorageBlob
After the code runs, you can check in the Azure portal or use the Get-AzStorageBlob cmdlet to see whether all the specified files are deleted. In my case, all the files older than one day were deleted.
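For example, a quick way to verify what is left in the container (same container and context as above):
# List remaining blobs with their last-modified timestamps
Get-AzStorageBlob -Container "aa1" -Context $ctx | Select-Object Name, LastModified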
Azure Storage has a feature called "Manage the Azure Blob storage lifecycle".
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts
For your test case, you can refer directly to
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts#powershell
$action = Add-AzStorageAccountManagementPolicyAction -BaseBlobAction Delete -daysAfterModificationGreaterThan 2555
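That action still has to be wrapped in a rule and applied to the storage account; a minimal sketch (the rule name, prefix, resource group, and account name below are placeholders, not from the original answer):
$filter = New-AzStorageAccountManagementPolicyFilter -PrefixMatch "reports/" -BlobType blockBlob
$rule = New-AzStorageAccountManagementPolicyRule -Name "delete-old-blobs" -Action $action -Filter $filter
Set-AzStorageAccountManagementPolicy -ResourceGroupName "yourResourceGroup" -StorageAccountName "yourStorageAccount" -Rule $rule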
Thank you Ivan. I compared my script with yours and found that I was missing a pipe before the where condition, which was the issue. After putting the pipe in, I am able to delete the files based on the condition. I didn't need to move to the Az module.
The script which is working now is:
$StorageAccountName = 'xx'
$StorageAccountKey = 'yyy'
$ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
Get-AzureStorageBlob -Container "abc" -Blob *.pdf -Context $ctx | where {$_.LastModified -le (Get-Date).AddDays(-4)} | Remove-AzureStorageBlob
I want to use a runbook to delete another runbook's output (an Azure file share snapshot).
Is it possible? If you know something, please share it here.
Runbook 1: Create an Azure File share snapshot
$context = New-AzureStorageContext -StorageAccountName <name> -StorageAccountKey <key>
$share = Get-AzureStorageShare -Context $context -Name "sharefile"
$snapshot = $share.Snapshot()
Runbook 2: Delete the first runbook's output. The problem with this is that it deletes all snapshots rather than just the one created by the first runbook.
$allsnapshots = Get-AzureStorageShare -Context $context | Where-Object { $_.Name -eq "sharefile" -and $_.IsSnapshot -eq $true }
foreach ($snapshot in $allsnapshots) {
    if ($snapshot.SnapshotTime -lt (Get-Date).AddHours()) {
        $snapshot.Delete()
    }
}
The sample code is below. I tested it in a runbook and it works well (it creates a snapshot, then deletes it after 3 minutes), and the other snapshots are not affected.
Code in my PowerShell runbook:
param(
    [string]$username,
    [string]$password,
    [string]$filesharename
)
$context = New-AzureStorageContext -StorageAccountName $username -StorageAccountKey $password
$share = Get-AzureStorageShare -Context $context -Name $filesharename
$s = $share.snapshot()
# get the snapshot name, which is always a UTC-time-formatted value
$s2= $s.SnapshotQualifiedStorageUri.PrimaryUri.ToString()
#the $snapshottime is actually equal to snapshot name
$snapshottime = $s2.Substring($s2.IndexOf('=')+1)
write-output "create a snapshot"
write-output $snapshottime
#wait 180 seconds, then delete the snapshot
start-sleep -s 180
write-output "delete the snapshot"
$snap = Get-AzureStorageShare -Context $context -SnapshotTime $snapshottime -Name $filesharename
$snap.Delete()
write-output "deleted successfully after 3 minutes"
After it runs, you can see the snapshot created in the Azure portal:
After it completes, the specified snapshot is deleted (you may need to open a new webpage to see the change due to a caching issue).
The output in the runbook:
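If you still want two separate runbooks, runbook 1 can Write-Output the $snapshottime value shown above, and runbook 2 can take it as a parameter and delete only that snapshot. A rough sketch of runbook 2 (parameter names are just illustrative):
param(
    [string]$username,
    [string]$password,
    [string]$filesharename,
    [string]$snapshottime
)
$context = New-AzureStorageContext -StorageAccountName $username -StorageAccountKey $password
# Delete only the snapshot whose name (a UTC timestamp) was produced by runbook 1
$snap = Get-AzureStorageShare -Context $context -SnapshotTime $snapshottime -Name $filesharename
$snap.Delete()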
Similar to this question How to get size of Azure CloudBlobContainer
How can one get the size of an Azure container in PowerShell? I can see a suggested script at https://gallery.technet.microsoft.com/scriptcenter/Get-Billable-Size-of-32175802, but want to know if there is a simpler way to do it in PowerShell.
With Azure PowerShell, you can list all blobs in the container with Get-AzureStorageBlob, using the Container and Context parameters, like:
$ctx = New-AzureStorageContext -StorageAccountName youraccountname -storageAccountKey youraccountkey
$blobs = Get-AzureStorageBlob -Container containername -Context $ctx
The output of Get-AzureStorageBlob is an array of AzureStorageBlob objects, each of which has a property named ICloudBlob; you can get the blob length from its Properties and then sum the lengths of all blobs to get the content length of the container.
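For example, a short way to sum the lengths, continuing from the $blobs variable above:
# Sum the blob lengths via the ICloudBlob.Properties.Length value mentioned above
$length = 0
$blobs | ForEach-Object { $length += $_.ICloudBlob.Properties.Length }
$length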
The following PowerShell script is a simple translation of the C# code in the accepted answer to the question How to get size of Azure CloudBlobContainer. Hope this suits your needs.
Login-AzureRmAccount
$accountName = "<your storage account name>"
$keyValue = "<your storage account key>"
$containerName = "<your container name>"
$storageCred = New-Object Microsoft.WindowsAzure.Storage.Auth.StorageCredentials ($accountName, $keyValue)
$storageAccount = New-Object Microsoft.WindowsAzure.Storage.CloudStorageAccount ($storageCred, $true)
$container = $storageAccount.CreateCloudBlobClient().GetContainerReference($containerName)
$length = 0
$blobs = $container.ListBlobs($null, $true, [Microsoft.WindowsAzure.Storage.Blob.BlobListingDetails]::None, $null, $null)
$blobs | ForEach-Object {$length = $length + $_.Properties.Length}
$length
Note: the leading Login-AzureRmAccount command will load the necessary .dll for you. If you know the path of "Microsoft.WindowsAzure.Storage.dll", you can replace it with [Reflection.Assembly]::LoadFile("$StorageLibraryPath") | Out-Null. The path is usually something like "C:\Program Files\Microsoft SDKs\Azure.NET SDK\v2.7\ToolsRef\Microsoft.WindowsAzure.Storage.dll".
Here's the solution I hammered through today. The examples above didn't give me what I wanted, which was (1) a byte sum of all blobs in a container and (2) a list of each blob + path + size so that it can be used to compare the results to a du -b on Linux (the origin).
Login-AzureRmAccount
$ResourceGroupName = ""
$StorageAccountName = ""
$StorageAccountKey = ""
$ContainerName = ""
New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
# Don't NEED the Resource Group but, without it, it fills the screen with red as it searches each RG...
$size = 0
$blobs = Get-AzureRmStorageAccount -ResourceGroupName $ResourceGroupName -Name $StorageAccountName -ErrorAction Ignore | Get-AzureStorageBlob -Container $ContainerName
foreach ($blob in $blobs) {$size = $size + $blob.length}
write-host "The container is $size bytes."
$properties = @{Expression={$_.Name};Label="Name";width=180}, @{Expression={$_.Length};Label="Bytes";width=80}
$blobs | ft $properties | Out-String -width 800 | Out-File -Encoding ASCII AzureBlob_files.txt
I then moved the file to Linux to do some flip-flopping of it and the find output to create a list of files to input into blobxfer. It is a solution to a different problem, but perhaps a suitable one for your needs as well.