I want to use Get-AzureStorageBlob in a PowerShell script so my client can download files from my Azure Blob storage (I use DevOps to put them there).
The keys are in the script so the client does not need an Azure account.
He does, however, need to install Azure PowerShell, and the instructions ask him to log in to Azure.
Is there an alternative?
If you just operate with Azure Storage, then you can ignore the Connect-AzAccount cmdlet.
After installing the Azure PowerShell module, since you have the account name and account key of the storage account, you can use them directly to download the blob (see the sketch below).
But if you want to operate on other resources such as VMs, then you need the Connect-AzAccount cmdlet.
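For the storage-only case, a minimal sketch of downloading a blob with just the account name and key (the account, container, blob, and path values are placeholders):

$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"
Get-AzStorageBlob -Container "mycontainer" -Context $ctx
Get-AzStorageBlobContent -Container "mycontainer" -Blob "myfile.zip" -Destination "C:\Downloads\myfile.zip" -Context $ctx

With the older Azure.Storage/AzureRM module that Get-AzureStorageBlob belongs to, the equivalents are New-AzureStorageContext and Get-AzureStorageBlobContent.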
When I click Show Details in the right Commands panel I get an error message: "... cannot be loaded because running scripts is disabled on this system."
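That error comes from the PowerShell execution policy rather than from Azure itself; a common fix (adjust the scope to what fits your setup) is:

Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser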
I am writing an azcopy script that captures the Linux 18.04 system log and stores it in a storage account container, and I am doing all of these steps with Terraform automation. I have created the machine with Terraform and integrated the shell script via a Terraform extension.
The issue is that when azcopy copies the file from the system and passes it to the storage account, it needs azcopy login to authenticate the process, but that step cannot be performed through automation.
I am using the following azcopy script (version v10). Please help me with this.
AzCopy /Source:/var/log/syslog /Dest:https://testingwt.blob.core.windows.net/insights-operational-logs/ /SourceKey:y/bUACOu/wogikUT1EG0XeaPC4Y6spHcZly2d26QeENKwMiRpjFu5PwmXrThRbNGS3PiPfqEX8WsYC3dg== /S
Update: the error from azcopy on a Linux machine in Azure.
To upload files to Azure Blob Storage from a shell script automatically, you can use a SAS token for the storage account, or use azcopy login with a service principal or the VM managed identity.
For the SAS token:
azcopy copy "/path/to/file" "https://account.blob.core.windows.net/mycontainer1/?sv=2018-03-28&ss=bjqt&srt=sco&sp=rwddgcup&se=2019-05-01T05:01:17Z&st=2019-04-30T21:01:17Z&spr=https&sig=MGCXiyEzbtttkr3ewJIh2AR8KrghSy1DGM9ovN734bQF4%3D" --recursive=true
For a service principal, you need to set the environment variable AZCOPY_SPA_CLIENT_SECRET to the secret of the service principal, and assign the Storage Blob Data Contributor or Storage Blob Data Owner role on the storage account.
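For example, on Linux the secret can be set like this before logging in (the value is a placeholder):

export AZCOPY_SPA_CLIENT_SECRET="<service-principal-secret>"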
azcopy login --service-principal --application-id <application-id> --tenant-id=<tenant-id>
azcopy copy "/path/to/file" "https://account.blob.core.windows.net/mycontainer1/" --recursive=true
For the VM managed identity, you also need to assign the VM managed identity the Storage Blob Data Contributor or Storage Blob Data Owner role on the storage account:
azcopy login --identity
azcopy copy "/path/to/file" "https://account.blob.core.windows.net/mycontainer1/" --recursive=true
But when you use the VM managed identity, you need to execute the shell script inside the Azure VM, which means you need to run the Terraform deployment from the Azure VM. So the best way is to use a service principal; then you can execute the shell script on another Linux machine, for example, your local Linux machine. The SAS token is also a good option and does not require assigning a role. For more details, see Use AzCopy for the Azure Storage Blob.
We use Azure CLI to write a script that deletes a storage account container's content and uploads the contents of a directory to the container:
az storage blob delete-batch --account-name $ACCOUNT_NAME --source $web
az storage blob upload-batch --account-name $ACCOUNT_NAME -s $SOURCE_PATH -d $web
Now, we want to host the script on Azure. According to our search, we cannot directly host an Azure CLI script on Azure and must migrate the script to PowerShell. But we cannot find similar PowerShell commands. Could someone help me?
The current version of Azure PowerShell at the time of writing this answer is 3.1 (https://learn.microsoft.com/en-us/powershell/module/az.storage/?view=azps-3.1.0#storage) and unfortunately batch operations on blobs are not supported there.
Thus, as of version 3.1, it is not possible to perform batch operations on blobs using Azure PowerShell cmdlets.
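If per-blob calls are acceptable, the same effect can be approximated with the regular Az.Storage cmdlets in a loop. This is a rough sketch with placeholder account, key, container, and path values, and it is not a true server-side batch:

$ctx = New-AzStorageContext -StorageAccountName $ACCOUNT_NAME -StorageAccountKey "<account-key>"
# Delete everything currently in the container
Get-AzStorageBlob -Container $CONTAINER_NAME -Context $ctx | Remove-AzStorageBlob
# Upload the contents of the source directory, preserving relative paths as blob names
Get-ChildItem -Path $SOURCE_PATH -File -Recurse | ForEach-Object {
    $blobName = $_.FullName.Substring($SOURCE_PATH.Length).TrimStart('\','/')
    Set-AzStorageBlobContent -File $_.FullName -Container $CONTAINER_NAME -Blob $blobName -Context $ctx -Force
}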
If you need a true batch operation, one option is to use the Azure Storage Blobs Batch client library for .NET. You can write a PowerShell script that makes use of this library to perform batch operations on blobs. You can find more information and code samples here: https://github.com/Azure/azure-sdk-for-net/blob/master/sdk/storage/Azure.Storage.Blobs.Batch/README.md.
Another option would be to consume the REST API. Again, you can write a PowerShell script that makes the HTTP requests to perform batch operations against your storage account. You can find more information about the REST API here: https://learn.microsoft.com/en-us/rest/api/storageservices/blob-batch.
P.S: I would've included the code but unfortunately my knowledge in PowerShell is very limited :).
In TFS I selected Azure VMs File Copy:
My machine is classic and I created a classic storage account. I set up the connection using username and password, not a management certificate.
I had to populate the storage account and cloud service myself, because they did not appear in the drop-down menu (so possibly something is wrong already at this stage).
In the Cloud Service I entered MyMachine.cloudapp.net.
The task starts and seems to log in successfully, but throws:
Unable to find type [Hyak.Common.CloudException]
Log:
2017-11-24T14:21:28.80333Z Add-AzureAccount -Credential $psCredential
2017-11-24T14:21:35.866333Z Select-AzureSubscription -SubscriptionId -Default
2017-11-24T14:21:35.882333Z Set-AzureSubscription -SubscriptionId yy -CurrentStorageAccountName yyy
2017-11-24T14:21:35.898333Z ##[debug]Starting Azure File Copy Task
2017-11-24T14:21:35.898333Z ##[debug]connectedServiceNameSelector = ConnectedServiceName
2017-11-24T14:21:35.898333Z ##[debug]connectedServiceName = yyyyyy
(..)
2017-11-24T14:21:35.991333Z ##[debug]Loading AzureUtilityLTE9.8.ps1
2017-11-24T14:21:36.007333Z ##[debug]Connection type used is UsernamePassword
2017-11-24T14:21:36.022333Z ##[debug]Azure Call: Retrieving storage key for the storage account: mystorageaccount
2017-11-24T14:21:38.924333Z ##[error]Unable to find type [Hyak.Common.CloudException].
Please help.
Actually, you don't need to manually type the storage account; it should appear automatically in the drop-down list. You just need to specify a pre-existing classic storage account. It is also used as an intermediary for copying files to Azure VMs.
Classic Storage Account: Required if you select Azure Classic for the Azure Connection Type parameter. The name of an existing storage account within the Azure subscription.
According to your log, the issue may be related to the storage account setting. Double-check this configuration under your Azure subscription.
I also suggest you go through this documentation to get more info on the Azure File Copy task, such as making sure the machine is configured to allow WinRM connections.
Is there a way to use Azure Automation to download a file from Azure Storage? I can currently connect to the VM using templates from the gallery to create files/folders, but how would I download a file from storage?
I am currently trying to use the Get-AzureStorageBlob command from Invoke-Command -ScriptBlock.
If you are trying to use the PowerShell cmdlets, you need to remember to log in to Azure prior to executing them. See the documentation. You would need to log in on the remote computer (i.e., inside the script block).
An alternative is to have azcopy accessible, and simply pass in the key information via Automation Credentials.
If you want to do this, based on my experience, you need to do the following steps.
1. Install Azure PowerShell on your target VM.
2. Enable WinRM on your VM: you need to open port 5986 on the Windows Firewall and in the Azure NSG. You also need to configure a certificate on your VM. You could check this blog for step-by-step instructions on enabling WinRM on an Azure VM.
Note: You should enable WinRM listening on HTTPS; if you enable it on HTTP, you will not be able to WinRM to your VM from the runbook script.
3. Log in to your Azure subscription in the runbook; you could refer to this link about this.
4. Use New-PSSession to connect to your VM in the runbook and execute your PowerShell cmdlet, as sketched below. You could check my answer about this.
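A rough sketch of what step 4 can look like inside the runbook (the credential asset name, VM address, storage account, key, container, and paths are placeholders, and it assumes WinRM over HTTPS on port 5986 and the classic Azure.Storage module on the VM as described above):

$vmCred = Get-AutomationPSCredential -Name 'VmAdminCredential'
$session = New-PSSession -ComputerName 'myvm.westeurope.cloudapp.azure.com' -Port 5986 -UseSSL -Credential $vmCred
Invoke-Command -Session $session -ScriptBlock {
    # Build a storage context from the account key so no Azure login is needed inside the VM
    $ctx = New-AzureStorageContext -StorageAccountName 'mystorageaccount' -StorageAccountKey '<account-key>'
    Get-AzureStorageBlobContent -Container 'mycontainer' -Blob 'myfile.zip' -Destination 'C:\Temp\myfile.zip' -Context $ctx
}
Remove-PSSession $session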
I am using a PowerShell script to create/set up a VM in Azure. I want to run the script without Azure credentials (right now I am doing it as below, but I don't want my.publishsettings or publishsettings details in the script).
create_vm.ps1
...
azure account import D:\my.publishsettings
...
Is there any way to do the same? Please suggest.
There are two ways by which you can connect to and manage your Azure subscription - one is using an X509 certificate (which is what you're doing when you use the publishsettings file) and the other is using Azure AD.
Please see this link for detailed instructions on how you can use Azure AD to manage your Azure subscriptions: https://azure.microsoft.com/en-in/documentation/articles/powershell-install-configure/. Scroll down to the section titled How to: Connect to your subscription.
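For example, with the classic Azure (service management) PowerShell module, an Azure AD organizational account can log in without a publishsettings file. A minimal sketch (the subscription name is a placeholder):

$cred = Get-Credential   # an Azure AD (organizational) account with access to the subscription
Add-AzureAccount -Credential $cred
Select-AzureSubscription -SubscriptionName 'My Subscription' -Default
# ... continue with the VM creation cmdlets from create_vm.ps1 ...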