The Azure PowerShell session has not been properly initialized. Please import the module and try again - azure

I'm writing my first PowerShell script to load data from a CSV into an Azure Storage table. I'm not sure why the line
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName)[0].Value
is throwing the error quoted in the title.
Running Get-Module shows the modules loaded in my session. This is a snippet of the code that I have written so far:
# Step 1, Set variables
# Enter Table Storage location data
$resourceGroupName = "ComputeTesting"
$storageAccountName = 'computetestingdiag'
$tableName = 'strtable'
$dateTime = get-date
# Step 2, Login to your Azure subscription
$sub = Get-AzSubscription -ErrorAction SilentlyContinue
if(-not($sub))
{
Connect-AzAccount
}
# If you have multiple subscriptions, set the one to use
# Select-AzSubscription -SubscriptionId "<SUBSCRIPTIONID>"
# Step 3, Get the access key for the Azure Storage account
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName)[0].Value
# Step 4, Connect to Azure Table Storage
$storageCtx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$table = Get-AzureStorageTable -Name $tableName -Context $storageCtx
I checked some similar questions, and from what I understand, uninstalling and re-installing the Azure modules might help. I haven't tried that yet; is there any other workaround for this? Any help would be greatly appreciated.

According to the script you provided, you are using the Az and AzureRM modules in the same PowerShell session, which may cause conflicts. I suggest using only one of the modules in a session.
For example:
$resourceGroupName = "<>"
$storageAccountName = '<>'
$tableName = '<>'
Connect-AzAccount
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName)[0].Value
$storageCtx = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$table = Get-AzStorageTable -Name $tableName -Context $storageCtx
For more details about how to manage Azure Table storage, please refer to the documentation.
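If you want to confirm that both module families are present before deciding what to remove, a quick check like the one below may help (a minimal sketch; as far as I know, Uninstall-AzureRm ships with the Az.Accounts module, and removing AzureRM is only necessary if you want to standardize on Az):
# List every Az and AzureRM module visible on this machine
Get-Module -ListAvailable -Name Az.*, AzureRM.* | Sort-Object Name | Select-Object Name, Version
# If both families show up and you want to keep only Az, remove the AzureRM modules
# Uninstall-AzureRm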

Related

Need help spitting Azure snapshots into another tenant/subscription

I have a script which snapshots all my disks in a certain RG.
However, when I do the snapshotting, I need the snapshots to be spat out into another tenant/subscription for a migration project!
I've got as far as snapshotting everything and spitting the snapshots into a different RG, but I need to take it a step further and put them into an identically named RG in a different tenant/sub.
My script is below:
Login-AzureRmAccount -Credential $psCred -SubscriptionId $SubscriptionId -ErrorAction Stop | Out-Null
Connect-AzureRmAccount
Get-AzureRmSubscription -SubscriptionId $SubscriptionId | Select-AzureRmSubscription
$tagResList = Get-AzureRmResource -TagName Environment -TagValue Staging
#$tagResList = Find-AzureRmResource -ResourceGroupNameEquals testrs
#$tagRsList[0].ResourceId.Split("//")
#subscriptions
#<SubscriptionId>
#resourceGroups
#<ResourceGroupName>
#providers
#Microsoft.Compute
#virtualMachines
#<vmName>
foreach($tagRes in $tagResList) {
if($tagRes.ResourceId -match "Microsoft.Compute")
{
$vmInfo = Get-AzureRmVM -ResourceGroupName $tagRes.ResourceId.Split("//")[4] -Name $tagRes.ResourceId.Split("//")[8]
#Set local variables
$location = $vmInfo.Location
$resourceGroupName = $vmInfo.ResourceGroupName
$timestamp = Get-Date -f MM-dd-yyyy_HH_mm_ss
#Snapshot name of OS data disk
$snapshotName = $vmInfo.Name + $timestamp
#Create snapshot configuration
$snapshot = New-AzureRmSnapshotConfig -SourceUri $vmInfo.StorageProfile.OsDisk.ManagedDisk.Id -Location $location -CreateOption copy
#Take snapshot
New-AzureRmSnapshot -Snapshot $snapshot -SnapshotName $snapshotName -ResourceGroupName $resourceGroupName
if($vmInfo.StorageProfile.DataDisks.Count -ge 1){
#Condition with more than one data disks
for($i=0; $i -le $vmInfo.StorageProfile.DataDisks.Count - 1; $i++){
#Snapshot name of OS data disk
$snapshotName = $vmInfo.StorageProfile.DataDisks[$i].Name + $timestamp
#Create snapshot configuration
$snapshot = New-AzureRmSnapshotConfig -SourceUri $vmInfo.StorageProfile.DataDisks[$i].ManagedDisk.Id -Location $location -CreateOption copy
#Take snapshot
New-AzureRmSnapshot -Snapshot $snapshot -SnapshotName $snapshotName -ResourceGroupName $ResourceGroupName
}
}
else{
Write-Host ($vmInfo.Name + " doesn't have any additional data disk.")
}
}
else{
$tagRes.ResourceId + " is not a compute instance"
}
}
$tagRgList = Get-AzureRmResourceGroup -Tag @{ Environment = "Staging" }
I am not sure you can save a snapshot into another tenant in one command, as you'd need to be authenticated there.
I would suggest using the azcopy tool to move snapshot files between storage accounts.
#######################################
I reviewed your comment and found that, indeed, you can't use azcopy on VM images.
But you can grant access to the snapshot:
#Generate the SAS for the snapshot
$sas = Grant-AzSnapshotAccess -ResourceGroupName $ResourceGroupName -SnapshotName $SnapshotName -DurationInSecond $sasExpiryDuration -Access Read
and save it to the destination storage account:
#Copy the snapshot to the storage account
Start-AzStorageBlobCopy -AbsoluteUri $sas.AccessSAS -DestContainer $storageContainerName -DestContext $destinationContext -DestBlob $destinationVHDFileName
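Note that $destinationContext, $sasExpiryDuration, and the destination names in the snippets above are placeholders. A minimal sketch of how they might be defined before running the copy, assuming you have already obtained the destination storage account's name and key in the target tenant/subscription:
# Placeholder values for the destination storage account (replace with your own)
$destStorageAccountName = "<destination storage account name>"
$destStorageAccountKey  = "<destination storage account key>"
$storageContainerName   = "<destination container name>"
$destinationVHDFileName = "<destination blob name>.vhd"
$sasExpiryDuration      = 3600
# Build the destination context from the target account's key
$destinationContext = New-AzStorageContext -StorageAccountName $destStorageAccountName -StorageAccountKey $destStorageAccountKey
# After Start-AzStorageBlobCopy, you can wait for the copy to finish
Get-AzStorageBlobCopyState -Container $storageContainerName -Blob $destinationVHDFileName -Context $destinationContext -WaitForComplete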
More details can be found here.

How to save files to azure storage using runbooks

My PowerShell script creates an output file:
$csvpath = "C:/temp/test.csv"
$FilePath = $CsvPath + $Compute.Name + "-" + (Get-Date).ToString("yyyyMMdd_HHmmss") + ".csv"
Export-Csv -InputObject $ComputeJobs -Path $FilePath
I want to automate this to run every day, but unfortunately I am not sure how to do it in an Azure runbook.
I'm requesting help to understand how to save files to Azure Storage using an Azure runbook.
These are two separate things. You use Azure Automation to trigger the execution of your PowerShell script at predefined times. To save the output into Azure Storage, you also need PowerShell code that performs the upload.
I'm not a PowerShell dev, but I assume the following should work:
Connect-AzAccount
# Define Variables
$subscriptionId = "yourSubscriptionId"
$storageAccountRG = "yourResourceGroup"
$storageAccountName = "yourStorageAccount"
$storageContainerName = "yourContainer"
# Select right Azure Subscription
Select-AzSubscription -SubscriptionId $SubscriptionId
# Get Storage Account Key
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $storageAccountRG -AccountName $storageAccountName).Value[0]
# Set AzStorageContext
$ctx = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$csvpath = "C:/temp/test.csv"
$FilePath = $CsvPath + $Compute.Name + "-" + (Get-Date).ToString("yyyyMMdd_HHmmss") + ".csv"
# Export the CSV locally, then upload the file as a blob
Export-Csv -InputObject $ComputeJobs -Path $FilePath -NoTypeInformation
Set-AzStorageBlobContent -File $FilePath -Container $storageContainerName -Context $ctx
https://learn.microsoft.com/en-us/powershell/module/azure.storage/set-azurestorageblobcontent?view=azurermps-6.13.0
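For the "run every day" part, once the script is published as a runbook you can link it to a daily schedule. A rough sketch, assuming an Automation account named "myAutomationAccount" in resource group "myAutomationRG" and a runbook named "Export-ComputeJobs" (all hypothetical names):
# Create a schedule that fires once a day, starting tomorrow at 06:00
New-AzAutomationSchedule -ResourceGroupName "myAutomationRG" -AutomationAccountName "myAutomationAccount" `
    -Name "DailyCsvExport" -StartTime (Get-Date).Date.AddDays(1).AddHours(6) -DayInterval 1
# Link the runbook to that schedule
Register-AzAutomationScheduledRunbook -ResourceGroupName "myAutomationRG" -AutomationAccountName "myAutomationAccount" `
    -RunbookName "Export-ComputeJobs" -ScheduleName "DailyCsvExport"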

Is it possible to use Azure Automation Runbook to delete another Runbook output (an Azure File share snapshot)?

I want to use the runbook to delete another runbook output (an Azure File Share snapshot).
Is it possible? If you know how, please share it here.
Runbook 1: Create an Azure File share snapshot
$context = New-AzureStorageContext -StorageAccountName "<storage account name>" -StorageAccountKey "<storage account key>"
$share = Get-AzureStorageShare -Context $context -Name "sharefile"
$snapshot = $share.Snapshot()
Runbook 2: Delete the Azure runbook output. The problem with this is that it deletes all snapshots rather than just deleting the one created by the first runbook.
$allsnapshots = Get-AzureStorageShare -Context $context | Where-Object { $_.Name -eq "sharefile" -and $_.IsSnapshot -eq $true }
foreach($snapshot in $allsnapshots){
if($snapshot.SnapshotTime -lt (Get-Date).AddHours(-1)){ # example threshold: snapshots older than one hour
$snapshot.Delete()
}
}
The sample code is below. I tested it in a runbook and it works well (it creates a snapshot, then deletes it after 3 minutes), and the other snapshots are not affected.
Code in my PowerShell runbook:
param(
[string]$username,
[string]$password,
[string]$filesharename
)
$context = New-AzureStorageContext -StorageAccountName $username -StorageAccountKey $password
$share = Get-AzureStorageShare -Context $context -Name $filesharename
$s = $share.snapshot()
#get the snapshot name, which is always a UTC time formatted value
$s2= $s.SnapshotQualifiedStorageUri.PrimaryUri.ToString()
#the $snapshottime is actually equal to snapshot name
$snapshottime = $s2.Substring($s2.IndexOf('=')+1)
write-output "create a snapshot"
write-output $snapshottime
#wait 180 seconds, then delete the snapshot
start-sleep -s 180
write-output "delete the snapshot"
$snap = Get-AzureStorageShare -Context $context -SnapshotTime $snapshottime -Name $filesharename
$snap.Delete()
write-output "deleted successfully after 3 minutes"
After it runs, you can see in the Azure portal that the snapshot has been created. After the runbook completes, the specified snapshot is deleted (you may need to open a new page to see the change due to caching), and the runbook output confirms each step.

How to move an azure VM image to a different location

I have an Azure VM image with a managed disk in East US and I want to move/copy it to West Europe.
Is there an easy way to do it?
I saw that there's an Azure CLI extension called az-image-copy, but it doesn't work for me because it outputs an error saying that it can't find the OS disk (even though the resource ID is correct and I can see it in the Azure portal):
ERROR: Resource ServerLinux_OsDisk_1_c208679747734937b10a1525aa84a7d7 is not found
So is there any other way to do it?
You could use Azure PowerShell to copy the managed image: create a snapshot, copy it to another region, then create the image from it. Here is a similar issue.
Create a snapshot:
<# -- Create a snapshot of the OS (and optionally data disks) from the generalized VM -- #>
$vm = Get-AzureRmVM -ResourceGroupName $resourceGroupName -Name $vmName
$disk = Get-AzureRmDisk -ResourceGroupName $resourceGroupName -DiskName $vm.StorageProfile.OsDisk.Name
$snapshot = New-AzureRmSnapshotConfig -SourceUri $disk.Id -CreateOption Copy -Location $region
$snapshotName = $imageName + "-" + $region + "-snap"
New-AzureRmSnapshot -ResourceGroupName $resourceGroupName -Snapshot $snapshot -SnapshotName $snapshotName
Copy the snapshot:
# Create the name of the snapshot, using the current region in the name.
$snapshotName = $imageName + "-" + $region + "-snap"
# Get the source snapshot
$snap = Get-AzureRmSnapshot -ResourceGroupName $resourceGroupName -SnapshotName $snapshotName
# Create a Shared Access Signature (SAS) for the source snapshot
$snapSasUrl = Grant-AzureRmSnapshotAccess -ResourceGroupName $resourceGroupName -SnapshotName $snapshotName -DurationInSecond 3600 -Access Read
# Set up the target storage account in the other region
$targetStorageContext = (Get-AzureRmStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName).Context
New-AzureStorageContainer -Name $imageContainerName -Context $targetStorageContext -Permission Container
# Use the SAS URL to copy the blob to the target storage account (and thus region)
Start-AzureStorageBlobCopy -AbsoluteUri $snapSasUrl.AccessSAS -DestContainer $imageContainerName -DestContext $targetStorageContext -DestBlob $imageBlobName
Get-AzureStorageBlobCopyState -Container $imageContainerName -Blob $imageBlobName -Context $targetStorageContext -WaitForComplete
# Get the full URI to the blob
$osDiskVhdUri = ($targetStorageContext.BlobEndPoint + $imageContainerName + "/" + $imageBlobName)
# Build up the snapshot configuration, using the target storage account's resource ID
$snapshotConfig = New-AzureRmSnapshotConfig -AccountType StandardLRS `
-OsType Windows `
-Location $targetRegionName `
-CreateOption Import `
-SourceUri $osDiskVhdUri `
-StorageAccountId "/subscriptions/${sourceSubscriptionId}/resourceGroups/${resourceGroupName}/providers/Microsoft.Storage/storageAccounts/${storageAccountName}"
# Create the new snapshot in the target region
$snapshotName = $imageName + "-" + $targetRegionName + "-snap"
$snap2 = New-AzureRmSnapshot -ResourceGroupName $resourceGroupName -SnapshotName $snapshotName -Snapshot $snapshotConfig
Create the image:
<# -- In the second subscription, create a new Image from the copied snapshot --#>
Select-AzureRmSubscription -SubscriptionId $targetSubscriptionId
$snap = Get-AzureRmSnapshot -ResourceGroupName $resourceGroupName -SnapshotName $snapshotName
$imageConfig = New-AzureRmImageConfig -Location $destinationRegion
Set-AzureRmImageOsDisk -Image $imageConfig `
-OsType Windows `
-OsState Generalized `
-SnapshotId $snap.Id
New-AzureRmImage -ResourceGroupName $resourceGroupName `
-ImageName $imageName `
-Image $imageConfig
For more details, refer to this link.
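Once the image is created, a quick way to confirm it exists (a small check using the same variable names as above):
# Verify the new image in the target subscription/resource group
Get-AzureRmImage -ResourceGroupName $resourceGroupName -ImageName $imageName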
If the resources are under the same subscription, then you can move resources from Resource Group 01 (East US) to Resource Group 02 (West Europe); a sketch using Move-AzResource follows the links below.
These documents may help:
Move Managed Disks and VMs
Move resources to new resource group or subscription
Copy managed disks in the same subscription or different subscription with PowerShell
Migrating Azure IaaS solutions using MigAz
The last link is about the MigAz tool, which allows you to migrate resources on Azure.
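As a rough sketch of the same-subscription move (hypothetical names; keep in mind that moved resources keep their original region, since a resource group's location is only metadata):
# Hypothetical example: move a managed image into another resource group
$image = Get-AzImage -ResourceGroupName "ResourceGroup01" -ImageName "myImage"
Move-AzResource -DestinationResourceGroupName "ResourceGroup02" -ResourceId $image.Id
# Add -DestinationSubscriptionId "<target subscription id>" to move across subscriptions in the same tenant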

How to get size of Azure Container in PowerShell

Similar to this question How to get size of Azure CloudBlobContainer
How can one get the size of an Azure container in PowerShell? I can see a suggested script at https://gallery.technet.microsoft.com/scriptcenter/Get-Billable-Size-of-32175802 but want to know if there is a simpler way to do it in PowerShell.
With Azure PowerShell, you can list all blobs in the container with Get-AzureStorageBlob, using the Container and Context parameters, like:
$ctx = New-AzureStorageContext -StorageAccountName youraccountname -storageAccountKey youraccountkey
$blobs = Get-AzureStorageBlob -Container containername -Context $ctx
The output of Get-AzureStorageBlob is an array of AzureStorageBlob objects, each of which has an ICloudBlob property; you can get the blob length from its Properties, and then sum the lengths of all blobs to get the content length of the container.
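For example, continuing from the $blobs variable above, the total can be computed like this (a small sketch; the AzureStorageBlob objects also expose a Length property directly):
# Sum the size (in bytes) of every blob in the container
$containerSizeInBytes = ($blobs | Measure-Object -Property Length -Sum).Sum
"$containerSizeInBytes bytes"
# Equivalent, going through the underlying ICloudBlob object
$containerSizeInBytes = ($blobs | ForEach-Object { $_.ICloudBlob.Properties.Length } | Measure-Object -Sum).Sum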
The following PowerShell script is a simple translation of the C# code in the accepted answer of the question How to get size of Azure CloudBlobContainer. Hope this suits your needs.
Login-AzureRmAccount
$accountName = "<your storage account name>"
$keyValue = "<your storage account key>"
$containerName = "<your container name>"
$storageCred = New-Object Microsoft.WindowsAzure.Storage.Auth.StorageCredentials ($accountName, $keyValue)
$storageAccount = New-Object Microsoft.WindowsAzure.Storage.CloudStorageAccount ($storageCred, $true)
$container = $storageAccount.CreateCloudBlobClient().GetContainerReference($containerName)
$length = 0
$blobs = $container.ListBlobs($null, $true, [Microsoft.WindowsAzure.Storage.Blob.BlobListingDetails]::None, $null, $null)
$blobs | ForEach-Object {$length = $length + $_.Properties.Length}
$length
Note: the leading Login-AzureRmAccount command will load the necessary .dll for you. If you know the path of "Microsoft.WindowsAzure.Storage.dll", you can replace it with [Reflection.Assembly]::LoadFile("$StorageLibraryPath") | Out-Null. The path is usually something like "C:\Program Files\Microsoft SDKs\Azure.NET SDK\v2.7\ToolsRef\Microsoft.WindowsAzure.Storage.dll".
Here's my solution, which I just hammered through today. The above examples didn't give me what I wanted, which was (1) a byte sum of all blobs in a container and (2) a list of each blob + path + size, so that the results can be compared to a du -b on Linux (the origin).
Login-AzureRmAccount
$ResourceGroupName = ""
$StorageAccountName = ""
$StorageAccountKey = ""
$ContainerName = ""
New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
# Don't NEED the Resource Group but, without it, it fills the screen with red as it searches each RG...
$size = 0
$blobs = Get-AzureRmStorageAccount -ResourceGroupName $ResourceGroupName -Name $StorageAccountName -ErrorAction Ignore | Get-AzureStorageBlob -Container $ContainerName
foreach ($blob in $blobs) {$size = $size + $blob.length}
write-host "The container is $size bytes."
$properties = @{Expression={$_.Name};Label="Name";width=180}, @{Expression={$_.Length};Label="Bytes";width=80}
$blobs | ft $properties | Out-String -width 800 | Out-File -Encoding ASCII AzureBlob_files.txt
I then moved the file to Linux to do some flip-flopping of it and of the find output to create a list of files to feed into blobxfer. It's a solution to a different problem, but perhaps it suits your needs as well.
