I found this blog post
https://www.techmanyu.com/automate-disk-snapshots-azure/
The author shows this script:
$clientID = "<client id>"
$key = "<client secret>"
$SecurePassword = $key | ConvertTo-SecureString -AsPlainText -Force
$cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $clientID, $SecurePassword
Add-AzureRmAccount -Credential $cred -Tenant "<Tenant ID>" -ServicePrincipal;
$disks=Get-AzureRmDisk | Select Name,Tags,Id,Location,ResourceGroupName ;
foreach ($disk in $disks) {
    foreach ($tag in $disk.Tags) {
        if ($tag.Snapshot -eq 'True') {
            $snapshotconfig = New-AzureRmSnapshotConfig -SourceUri $disk.Id -CreateOption Copy -Location $disk.Location -AccountType PremiumLRS
            $SnapshotName = $disk.Name + (Get-Date -Format "yyyy-MM-dd")
            New-AzureRmSnapshot -Snapshot $snapshotconfig -SnapshotName $SnapshotName -ResourceGroupName $disk.ResourceGroupName
        }
    }
}
While trying to understand the script, I came up with a question: where are the snapshots stored? In the same managed disk as the VM disk?
After you execute the script, it creates the snapshots, which you can check in the portal; they have the resource type Microsoft.Compute/snapshots.
Essentially, they are stored in blob storage. Navigate to the snapshot in the portal -> Export, and you will find it generates a SAS URL for the snapshot like https://md-nxxxqz.blob.core.windows.net/wxxxxxx0m/abcd?sv=2017-04-17&sr=b&si=31b3d91b-51be-4c1c-930e-996f382b8ad9&sig=xxxxxx. Here md-nxxxqz is the storage account that stores the snapshots; it is managed by Azure.
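If you want the same SAS without going through the portal, a minimal sketch with the AzureRM module used in the script (the resource group and snapshot names below are placeholders) could look like this:
# Generate a temporary read SAS for a snapshot (placeholder names), then revoke it when done
$sas = Grant-AzureRmSnapshotAccess -ResourceGroupName "<resource group>" -SnapshotName "<snapshot name>" -Access Read -DurationInSecond 3600
$sas.AccessSAS    # blob URL pointing into the Azure-managed md-... storage account
Revoke-AzureRmSnapshotAccess -ResourceGroupName "<resource group>" -SnapshotName "<snapshot name>"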
As you can see from the code below, I can upload one certificate at a time, but the problem is that doing so wipes out all of the existing certificates.
$Tenant_ID = '00000000-0000-0000-0000-000000000000'
$Subscription_ID = '00000000-0000-0000-0000-000000000000'
$Azure_PassWord = (Get-StoredCredential -Target 'Domain' -Type Generic -AsCredentialObject).Password
$UserName = "$($env:USERNAME)#Google.com"
$EncryptedPassword = ConvertTo-SecureString $Azure_PassWord -AsPlainText -Force
$Credential = New-Object System.Management.Automation.PsCredential($UserName,$EncryptedPassword)
$AzureConnection = (Connect-AzAccount -Credential $Credential -Tenant $Tenant_ID -Subscription $Subscription_ID -WarningAction 'Ignore').context
$AzureContext = (Set-AzContext -SubscriptionName $Subscription_ID -DefaultProfile $AzureConnection)
$Application_ID = '00000000-0000-0000-0000-000000000000'
$PFX_FileName = "Azure_Dev_V2"
$Cert_Password = "123456"
$Cert_Password = ConvertTo-SecureString -String $Cert_Password -Force -AsPlainText
$CurrentDate = Get-Date
$EndDate = $CurrentDate.AddYears(10)
$certificatePath = "C:\FilePath\Certificates\Certs\$($PFX_FileName).pfx" # OR Get-ChildItem -Path cert:\localmachine\my\$($Certificate_Thumbprint)
$cert = New-Object -TypeName System.Security.Cryptography.X509Certificates.X509Certificate2($certificatePath, $Cert_Password)
$keyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())
$Azure_App_Registration = Get-AzADApplication -ApplicationId $Application_ID -DefaultProfile $AzureContext
New-AzADAppCredential -ApplicationObject $Azure_App_Registration -CertValue $keyValue -EndDate $EndDate -StartDate $CurrentDate
Connect-AzAccount -Subscription $Subscription_ID -ApplicationId $Azure_App_Registration.AppId -Tenant $Tenant_ID -CertificateThumbprint $cert.Thumbprint | Out-Null
Get-AzADAppCredential -ApplicationId $Azure_App_Registration.AppId | Where-Object {$_.DisplayName -match "CN=Azure_Dev_V2"}
In the Azure portal, multiple certificates can be uploaded manually, and doing so does not delete the existing certificates.
EDIT: If bulk upload is not possible, then how would I go about making sure the existing certificates in the Azure AD App Registration certificate store do not get deleted?
Thanks in Advance.
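If bulk upload is not an option, one way to at least confirm that nothing is lost is to record the existing credentials before the upload and compare afterwards; a minimal sketch using the same cmdlets as the script above:
# Record existing credentials, add the new certificate, then verify the originals are still present
$before = Get-AzADAppCredential -ApplicationId $Azure_App_Registration.AppId
New-AzADAppCredential -ApplicationObject $Azure_App_Registration -CertValue $keyValue -EndDate $EndDate -StartDate $CurrentDate
$after = Get-AzADAppCredential -ApplicationId $Azure_App_Registration.AppId
$missing = $before | Where-Object { $_.KeyId -notin $after.KeyId }
if ($missing) { Write-Warning "These existing credentials were removed: $($missing.KeyId -join ', ')" }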
I need to automate domain join in a pipeline for an Azure VM. I'm using this code; however, I don't want the user to enter the credential at runtime. How can I use a saved credential?
$DomainName = "abc.com"
$VMName = "VMNAME01"
$credential = Get-Credential
$ResourceGroupName = "RG01"
Set-AzVMADDomainExtension -DomainName $DomainName -VMName $VMName -Credential $credential -ResourceGroupName $ResourceGroupName -JoinOption 0x00000001 -Restart -Verbose
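One option is to build the credential from a stored secret instead of prompting. A minimal sketch, assuming the pipeline identity can read secrets from an Azure Key Vault (the vault name, secret name, and join account below are hypothetical):
# Hypothetical vault/secret names; the identity running the pipeline needs read access to the secret
$secret = Get-AzKeyVaultSecret -VaultName "MyKeyVault" -Name "DomainJoinPassword"
$credential = New-Object System.Management.Automation.PSCredential ("abc\domainjoinuser", $secret.SecretValue)
Set-AzVMADDomainExtension -DomainName $DomainName -VMName $VMName -Credential $credential -ResourceGroupName $ResourceGroupName -JoinOption 0x00000001 -Restart -Verbose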
I am trying the below PowerShell commands to download a parquet file from ADLS Gen2 to my local system.
Below is the code snippet:
#this appid has access to ADL
[string] $AppID = "bbb88818-aaaa-44fb-q2345678901y"
[string] $TenantId = "ttt88888-xxxx-yyyy-q2345678901y"
[string] $SubscriptionName = "Sub Sample"
[string] $LocalTargetFilePathName = "D:\MoveToModern"
Write-Host "AppID = " $AppID
Write-Host "TenantId = " $TenantId
Write-Host "SubscriptionName = " $SubscriptionName
Write-Host "AzureDataLakeAccountName = " AzureDataLakeAccountName
Write-Host "AzureDataLakeSrcFilePath = " $AzureDataLakeSrcFilePath
Write-Host "LocalTargetFilePathName = " $LocalTargetFilePathName
#this is the access key of the appid
$AccessKeyValue = "1234567=u-r.testabcdefaORYsw5AN5"
$azurePassword = ConvertTo-SecureString $AccessKeyValue -AsPlainText -Force
$psCred = New-Object System.Management.Automation.PSCredential($AppID, $azurePassword)
Login-AzureRmAccount -Credential $psCred -ServicePrincipal -Tenant $TenantId
Get-AzureRmSubscription
Get-AzureRmSubscription -SubscriptionName $SubscriptionName | Set-AzureRmContext
Get-AzureStorageBlobContent -Container "/Test/Partner/Account/" -Blob "Account.parquet" -Destination "D:\MoveToModern"
But I am getting an error.
Maybe we have to set the storage context. Can you please let me know how to set the storage context with the service principal? (I have the app ID and app key of the service principal. Regarding the ADLS Gen2 source, I just have the path details; the source team has granted access to the service principal.)
If you want to download files from Azure Data Lake Gen2, I suggest you use the PowerShell module Az.Storage. Regarding how to implement this with a service principal, you have two choices.
1. Use an Azure RBAC role
If you use an Azure RBAC role, you need to assign the appropriate role (Storage Blob Data Reader) to the service principal.
For example
$AppID = ""
$AccessKeyValue = ""
$TenantId=""
$SubscriptionName = ""
#1. Assign role at the storage account level
#please use owner account to login
Connect-AzAccount -Tenant $TenantId -Subscription $SubscriptionName
New-AzRoleAssignment -ApplicationId $AppID -RoleDefinitionName "Storage Blob Data Reader" `
-Scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
# download
$azurePassword = ConvertTo-SecureString $AccessKeyValue -AsPlainText -Force
$psCred = New-Object System.Management.Automation.PSCredential($AppID, $azurePassword)
Connect-AzAccount -Credential $psCred -ServicePrincipal -Tenant $TenantId -Subscription $SubscriptionName
$AzureDataLakeAccountName = "testadls05"
$ctx = New-AzStorageContext -StorageAccountName $AzureDataLakeAccountName -UseConnectedAccount
$LocalTargetFilePathName = "D:\test.parquet"
$filesystemName="test"
$path="2020/10/28/test.parquet"
Get-AzDataLakeGen2ItemContent -Context $ctx -FileSystem $filesystemName -Path $path -Destination $LocalTargetFilePathName
2. Use Access Control Lists
If you use this method, then to grant a security principal read access to a file, you need to give the security principal Execute permissions on the container and on each folder in the hierarchy that leads to the file (see the ACL sketch after the download example below).
For example
$AppID = ""
$AccessKeyValue = ""
$TenantId=""
$azurePassword = ConvertTo-SecureString $AccessKeyValue -AsPlainText -Force
$psCred = New-Object System.Management.Automation.PSCredential($AppID, $azurePassword)
Connect-AzAccount -Credential $psCred -ServicePrincipal -Tenant $TenantId
$AzureDataLakeAccountName = "testadls05"
$ctx = New-AzStorageContext -StorageAccountName $AzureDataLakeAccountName -UseConnectedAccount
$filesystemName="test"
$path="2020/10/28/test.parquet"
$LocalTargetFilePathName = "D:\test1.parquet"
Get-AzDataLakeGen2ItemContent -Context $ctx -FileSystem $filesystemName -Path $path -Destination $LocalTargetFilePathName
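For completeness, here is a rough sketch of how those Execute/Read ACL entries could be added with Az.Storage. Run it with an account that is allowed to change ACLs; $spObjectId (the object ID of the service principal) is a placeholder, and the pattern of reading the current ACL, appending an entry, and writing it back avoids replacing existing entries:
# Append an Execute (--x) entry for each folder leading to the file
$ownerCtx   = New-AzStorageContext -StorageAccountName $AzureDataLakeAccountName -UseConnectedAccount
$spObjectId = "<object id of the service principal>"   # placeholder
foreach ($folder in @("2020", "2020/10", "2020/10/28")) {
    $current = (Get-AzDataLakeGen2Item -Context $ownerCtx -FileSystem $filesystemName -Path $folder).ACL
    $acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityId $spObjectId -Permission "--x" -InputObject $current
    Update-AzDataLakeGen2Item -Context $ownerCtx -FileSystem $filesystemName -Path $folder -Acl $acl
}
# The container (filesystem) root needs the same Execute entry, and the file itself needs Read (r--):
$fileCurrent = (Get-AzDataLakeGen2Item -Context $ownerCtx -FileSystem $filesystemName -Path $path).ACL
$fileAcl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityId $spObjectId -Permission "r--" -InputObject $fileCurrent
Update-AzDataLakeGen2Item -Context $ownerCtx -FileSystem $filesystemName -Path $path -Acl $fileAcl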
For more details, please refer to
https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control
https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control-model
https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-powershell
This PowerShell script is used to automate backups of an Azure Analysis Services (AAS) instance.
The instance has multi-factor authentication enabled, and I think that is the problem.
Powershell:
$TenantId = "TenentID"
$Cred = Get-AutomationPSCredential -Name 'SSASModelBackup'
$Server = "ServerName"
$RolloutEnvironment = "location.asazure.windows.net"
$ResourceGroup = "ReourceGroupName"
#Create Credentials to convertToSecureString
$applicationId = "applicationId "
$securePassword = "securePassword " | ConvertTo-SecureString -AsPlainText -Force $Credential = New-Object
-TypeName System.Management.Automation.PSCredential -ArgumentList $applicationId, $securePassword
#Define the list of AAS databases
$asDBs = @('database1','database2')
Write-Output "Logging in to Azure..."
#Add-AzureAnalysisServicesAccount -Credential $Credential -ServicePrincipal -TenantId $TenantId -RolloutEnvironment $RolloutEnvironment
ForEach($db in $asDBs)
{
Write-Output "Starting Backup..."
Backup-ASDatabase `
-backupfile ($db + "." + (Get-Date).ToString("ddMMyyyy") + ".abf") `
-name $db `
-server $Server `
-Credential $Cred
Write-Output "Backup Completed!"
}
You are correct that the issue is multi-factor authentication. Since the point of multi-factor authentication is to require interaction with a secondary source, such as your phone, there is no way to automate the process.
I would suggest that you look into using service principal authentication for taking backups. By using a service principal on your server, you can allow automated tasks to run without two-factor authentication while minimizing the security risk.
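A minimal sketch of that login, based on the commented-out line in the script above (it assumes $Credential holds the service principal's application ID and secret, and that the principal has administrator permissions on the Analysis Services server):
# Sign in non-interactively with the service principal instead of an MFA-enabled user
Add-AzureAnalysisServicesAccount -Credential $Credential -ServicePrincipal -TenantId $TenantId -RolloutEnvironment $RolloutEnvironment
# The Backup-ASDatabase loop can then run in this session; the user credential $Cred (which triggers MFA) is no longer needed for sign-in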
I am using this code found on Stack Overflow, and all connections are correct.
Import-Module Azure
Import-Module Azure.Storage
Get-AzureRmSubscription -SubscriptionName "Production" | Select-AzureRmSubscription
# Username for Azure SQL Database server
$ServerLogin = "username"
# Password for Azure SQL Database server
$serverPassword = ConvertTo-SecureString "abcd" -AsPlainText -Force
# Establish credentials for Azure SQL Database Server
$ServerCredential = new-object System.Management.Automation.PSCredential($ServerLogin, $serverPassword)
# Create connection context for Azure SQL Database server
$SqlContext = New-AzureSqlDatabaseServerContext -FullyQualifiedServerName "myspecialsqlserver.database.windows.net" -Credential $ServerCredential
$StorageContext = New-AzureStorageContext -StorageAccountName 'prodwad' -StorageAccountKey 'xxxxx'
$Container = Get-AzureStorageContainer -Name 'automateddbbackups' -Context $StorageContext
$exportRequest = Start-AzureSqlDatabaseExport -SqlConnectionContext $SqlContext -StorageContainer $Container -DatabaseName 'Users' -BlobName 'autobackupotest.bacpac' -Verbose -Debug
I am getting this error. I have spent hours on this.
Start-AzureSqlDatabaseExport : Cannot bind parameter 'StorageContainer'. Cannot convert the
"Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageContainer" value of type
"Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageContainer" to type
"Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageContainer".
At line:31 char:99
+ ... SqlConnectionContext $SqlContext -StorageContainer $Container -Databa ...
+ ~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Start-AzureSqlDatabaseExport], ParameterBindingException
+ FullyQualifiedErrorId : CannotConvertArgumentNoMessage,Microsoft.WindowsAzure.Commands.SqlDatabase.Database.Cmdlet.StartAzureSqlDatabaseExport
I am using AzureRM 3.8.0
According to your description and code, if you want to start a new SQL database export, I suggest you try the code below. It should work well.
$subscriptionId = "YOUR AZURE SUBSCRIPTION ID"
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionId $subscriptionId
# Database to export
$DatabaseName = "DATABASE-NAME"
$ResourceGroupName = "RESOURCE-GROUP-NAME"
$ServerName = "SERVER-NAME"
$serverAdmin = "ADMIN-NAME"
$serverPassword = "ADMIN-PASSWORD"
$securePassword = ConvertTo-SecureString -String $serverPassword -AsPlainText -Force
$creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $serverAdmin, $securePassword
# Generate a unique filename for the BACPAC
$bacpacFilename = $DatabaseName + (Get-Date).ToString("yyyyMMddHHmm") + ".bacpac"
# Storage account info for the BACPAC
$BaseStorageUri = "https://STORAGE-NAME.blob.core.windows.net/BLOB-CONTAINER-NAME/"
$BacpacUri = $BaseStorageUri + $bacpacFilename
$StorageKeytype = "StorageAccessKey"
$StorageKey = "YOUR STORAGE KEY"
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
-DatabaseName $DatabaseName -StorageKeytype $StorageKeytype -StorageKey $StorageKey -StorageUri $BacpacUri `
-AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password
$exportRequest
# Check status of the export
Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
Then you can use Get-AzureRmSqlDatabaseImportExportStatus to see the detailed status information of the export operation.
I got the same problem after updating some PowerShell packages (I do not remember exactly which). After the update my scripts started to fail.
My solution is:
Install the latest AzureRM module via PowerShell.
Use the other overload of Start-AzureSqlDatabaseExport, which takes the parameters -StorageContainerName and -StorageContext rather than -StorageContainer.
It looks like if you pass these parameters, the cmdlet creates the container object internally without crashing.
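For reference, a minimal sketch of that overload using the variables from the question above (a sketch only, not verified against every module version):
# Pass the container name plus the storage context instead of the container object
$exportRequest = Start-AzureSqlDatabaseExport -SqlConnectionContext $SqlContext `
    -StorageContext $StorageContext -StorageContainerName 'automateddbbackups' `
    -DatabaseName 'Users' -BlobName 'autobackupotest.bacpac' -Verbose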