Download file from Azure blob storage onto Azure Linux VM

I'm coming from an AWS background and trying to get something relatively simple to work in Azure, but currently having a rough time parsing through all the documentation and Microsoft-specific jargon to find what I'm looking for.
I'm trying to download a single file I have in Azure blob storage (which, from what I can gather, is the closest equivalent to storing an object in S3) onto a Linux VM with the CLI. From what I've read, the command I need to run is:
az storage copy -s https://myaccount.blob.core.windows.net/mycontainer/myfile -d .
A couple of questions for automation purposes, however. Is there an equivalent to IAM roles for VMs in Azure? That way, I won't have to keep credential files on the VM itself. If not, what type of credentials should I generate as a best practice? I ask because there seem to be about a half-dozen different choices in Azure, and all I'm really looking for is something basic: essentially what amounts to a "programmatic-access only" user in AWS, so I can also lock its permissions down to a very specific set of resources and/or actions.
As always, thanks in advance!

Is there an equivalent to IAM roles for VMs in Azure?
What you're looking for is Managed Identity. Basically, you assign an identity to your Azure resource (a Linux VM in your case) so that the resource behaves like any other user in your Azure AD, and then you assign the appropriate role/access to that identity.
You can learn more about Managed Identities in Azure here: https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview.
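For example, a minimal sketch with the Azure CLI might look like this (the resource group myRG, VM name myVM, and the subscription ID are placeholder assumptions; the storage account, container, and file names are taken from your command above):

# Give the VM a system-assigned managed identity
az vm identity assign --resource-group myRG --name myVM

# Grant that identity read access to blobs, scoped to just the one storage account
principalId=$(az vm show --resource-group myRG --name myVM --query identity.principalId -o tsv)
az role assignment create --assignee "$principalId" --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<sub-id>/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/myaccount"

# On the VM itself: sign in as the managed identity and download the blob - no credential files needed
az login --identity
az storage blob download --account-name myaccount --container-name mycontainer \
  --name myfile --file ./myfile --auth-mode login

Scoping the role assignment to a single storage account (or even a single container) gives you the "programmatic access only, locked down to specific resources" behaviour you're used to from IAM roles.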

Related

Azure Service Fabric backups (non-persistent data)

I have 3 applications deployed to Azure Service Fabric via an ARM template. The only items that have been identified as needing to be backed up are a few resources: a central blob storage account, about 5 SQL databases, and the key vault. The cluster and apps can be redeployed right away via the template.
Searching for backup solutions, I'm seeing a lot of info on backups for services, but not for specific resources like I have here. Can anyone point me to the right direction/sample code on how to do this or is it even an option?
OK, so blob storage cannot be backed up with a built-in Azure service. You have to create a program/script that does that for you.
For the key vault, you can use this PowerShell cmdlet.
For SQL there are several options, but perhaps you can settle for the built-in backups, which happen automatically and go back 7-35 days (depending on your tier).
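As a rough sketch of what that backup script could look like with the Azure CLI (all of the account, vault, server, and database names below are placeholders, not values from your deployment):

# Copy the central blob storage into a second "backup" storage account
az storage blob copy start-batch --source-account-name prodstorage --source-container data \
  --account-name backupstorage --destination-container data-backup

# Back up individual Key Vault secrets/keys to local files
az keyvault secret backup --vault-name myvault --name mysecret --file mysecret.backup
az keyvault key backup --vault-name myvault --name mykey --file mykey.backup

# Export a SQL database to a .bacpac if the built-in point-in-time backups are not enough
az sql db export --resource-group myRG --server myserver --name mydb \
  --admin-user sqladmin --admin-password '<password>' \
  --storage-key-type StorageAccessKey --storage-key '<storage-key>' \
  --storage-uri https://backupstorage.blob.core.windows.net/sql/mydb.bacpac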

Unable to create an Azure image in ARM

I have uploaded a backup of an on-prem OS disk to an Azure RM storage account. I want to create an image out of that OS disk to provision VMs in Azure.
Please let me know if that is possible.
Assuming your base system is running Windows, and you don't want to sysprep the machine, you're trying to create what's called a Specialized image in Azure. The best steps to follow to complete this are:
Create the Specialized image
Create a VM from an image
The first thing is to make sure that the disk is in the correct format for Azure, i.e. VHD. There are many third-party tools available to convert the disk to VHD (very easy if it's a Hyper-V machine).
Second, create the necessary infrastructure in Azure: a storage account to upload the disk to, a virtual network that your machine will connect to, a resource group, etc. Also note that this is currently only possible through PowerShell, not through the portal.
Azure Migrate has now made it very easy to migrate a large number of VMs if you are considering a production migration (much better than it was last year).
The question says you are unable to migrate the disk, so I assume you have gone through the Microsoft documentation and then run into a problem. Can you provide the error you got while uploading?
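As a latter-day alternative sketch with the Azure CLI (the answer above used PowerShell; the resource group, names, and blob URL here are placeholders):

# Create a managed image from the uploaded, specialized (non-sysprepped) VHD
az image create --resource-group myRG --name onprem-image --os-type Windows \
  --source https://mystorage.blob.core.windows.net/vhds/onprem-os.vhd

# Provision a VM from that image; --specialized keeps the original machine's accounts and settings
az vm create --resource-group myRG --name migrated-vm --image onprem-image --specialized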

Azure Blob Container Granting Read only Access through Shared Access Signature Access

I have two Azure Blob Storage containers, Container-A and Container-B. I would like to grant read-only access to another Azure user for Container-A. The second container, Container-B, should not be visible to that user. The Azure user will be accessing the blobs in Container-A from his Azure virtual machine. How do I achieve this? From what I've read on the web, it seems I would need to generate a Shared Access Signature, but I'm not sure how.
Exactly, that is the scenario where you want to use SAS.
First, please read the Azure Storage security guidance to make sure that you are aware of all of the available options.
Here is some very helpful guidance on the SAS model.
Second, you need to generate the SAS, ideally with stored access policies (please refer to the guidance above). It can be done programmatically (sample code is available in the guidance), and then you can give that SAS link to the user any way you want - it can be an online page where the user grabs the string, or a simple tool you write to generate the SAS. Be aware, however, that SAS tokens have a lifetime and you will need to renew them periodically.
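As a sketch of generating that read-only SAS with the Azure CLI (the account/container names and the expiry date are placeholders):

# Read + list SAS, valid only for Container-A; Container-B is never exposed
sas=$(az storage container generate-sas --account-name myaccount --name container-a \
  --permissions rl --expiry 2025-12-31T23:59Z --account-key '<account-key>' -o tsv)

# The other user appends the SAS token to blob URLs inside Container-A
echo "https://myaccount.blob.core.windows.net/container-a/somefile.txt?$sas"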

Is it possible to aggregate Azure resources from different subscriptions?

Our team has Windows Azure MSDN - Visual Studio Premium subscriptions for all our devs. I have been taking advantage of the $100 per month allowance and am building more infrastructure in the cloud.
However, I would like other members of our team to be able to access some of these assets. I am quite new to the Azure infrastructure, so this might be a dumb question, but can they access my blobs? And can I control exactly who can access my blobs?
They can obviously RDP into my VMs; that's not an issue. I assume they can hit my VMs too, via the IP address, inside Azure, etc. However, I am more interested in the blobs, mostly because I am starting to upload a lot of utility data (large sample datasets, common software we all install, etc.) and I would like to avoid all of us having to upload it again for each subscription.
As of today (11/8/2013), you cannot "pool" MSDN resources, meaning you can't have 4 subscriptions add up to $400/month and spend it à la carte on cloud services.
You can have one admin (or several) for multiple subscriptions; this will allow you to view the different subscriptions in the portal and manage them in a single spot.
You can also have different deployment profiles, so one Visual Studio instance can deploy to different Azure accounts.
Specific to your question: there are blob storage access keys, and if you share the storage account name and key, then yes, they can access your data located there.
Yes, it is possible to control access to your blobs by using SAS (Shared Access Signatures)
SAS grants granular access to containers, blobs, tables, and queues.
These should be good resources to start with:
Manage Access to Windows Azure Storage Resources
Create and Use a Shared Access Signature
However, I would like other members of our team to be able to access some of these assets. I am quite new to the Azure infrastructure, so this might be a dumb question, but can they access my blobs? And can I control exactly who can access my blobs?
To answer this question specifically: yes, your team members can access the data stored in any blob storage account in any of your subscriptions. There are two ways you can give them access to blob storage:
By giving them the account name/account key: with this, they get full access to the storage account and essentially become owners of that storage account.
By using a Shared Access Signature: if you want to give them restricted access to blob storage, you would need to use SAS as described by Dan Dinu. SAS basically gives you a URL; users in possession of that URL can explore the storage (by writing some code), but it is not possible to identify which user accessed which storage. For that you would need to write something of your own.
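To make the two options concrete with today's Azure CLI (the answers above predate it; the resource group, account, container, and blob names are placeholders):

# Option 1: share the storage account key - full access to everything in the account
az storage account keys list --resource-group myRG --account-name mystorageaccount -o table

# Option 2: hand out a limited, expiring SAS for just the data you want to share
az storage blob generate-sas --account-name mystorageaccount --container-name shared-data \
  --name dataset.zip --permissions r --expiry 2025-12-31T23:59Z --account-key '<account-key>' -o tsv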

Creating blob storage service programmatically in Azure

Currently I'm playing around with Azure and thinking about a multi-tenant web app where users can create an instance of the app, and more users can then register within that instance to upload and share files. I've created a blob storage service and created several containers. However, I'm not sure how customers will feel about the fact that they share their blob service with other users and that files are only separated by containers. I would prefer that each user instead gets his own blob service. The web app itself, however, should still be served by a single web/worker role.
This sounds easy for every instance you create by hand, but I want the blob service to be created automatically when the user registers and creates his instance of the web app. Unfortunately, I haven't yet found any information about how I could accomplish this. I've only found the blob storage API for querying the service, not for creating it.
Can anybody lead me in the right direction? Is this even possible?
You can create a storage account programmatically (see "Create Storage Account": http://msdn.microsoft.com/en-us/library/hh264518.aspx), but I wouldn't recommend creating a different account for each user. The limit on how many storage accounts can be created per subscription is fairly low. (I believe the default is five and you can call to get your quota increased to twenty.)
In general, the recommendation is to go ahead and use the same storage account for all your customers. I believe your concern is about data security, but adding multiple storage accounts doesn't really change the security dynamic. (The trust boundary is still between you and the end user, since only your code will directly access storage.)
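For illustration, with a single shared storage account you could create a container per tenant at registration time instead of a new account. A sketch with the Azure CLI (the names and the $TENANT_ID variable are placeholders; container names must be lowercase):

# One storage account for the whole app, created once up front
az storage account create --resource-group myRG --name myappstorage --sku Standard_LRS

# At sign-up time, create a per-tenant container rather than a new storage account
az storage container create --account-name myappstorage --name "tenant-$TENANT_ID" --auth-mode login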
