Team,
When I migrated the storage account from one subscription to another using a migration tool (MigAz), it did not migrate the underlying tables, queues, and blob storage. Is that an inherent limitation of the migration?
Is there a way to copy the tables over to the storage account in the newer subscription? When doing the copy, would there be any downtime for the resources in the older subscription? Would the data in the older subscription be lost?
From a brief read of the GitHub page for MigAz, it appears to be a tool that simply generates ARM templates from your existing resources and lets you apply them elsewhere. If that is the case, it will only create the storage account itself, not any of the underlying resources or data; that is the nature of ARM templates.
If both your subscriptions are under the same Azure AD tenant, you can simply move the resources using the Azure portal; there is a move option in the resource group. See here for instructions.
If the subscriptions are under different tenants, you will need to move your data manually.
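If you do end up copying manually, here is a minimal sketch using the azure-storage-blob Python SDK (the account URLs and keys are placeholders). It asks the destination account to pull each blob straight from the source with a server-side copy, so the source account stays online and nothing is deleted from it; tables and queues would need a similar pass with the azure-data-tables and azure-storage-queue packages, or a tool such as AzCopy.

```python
# pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

# Placeholder values; substitute your own account URLs and keys.
SRC_URL = "https://oldaccount.blob.core.windows.net"
DST_URL = "https://newaccount.blob.core.windows.net"
SRC_KEY = "<old-account-key>"
DST_KEY = "<new-account-key>"

src = BlobServiceClient(account_url=SRC_URL, credential=SRC_KEY)
dst = BlobServiceClient(account_url=DST_URL, credential=DST_KEY)

for container in src.list_containers():
    # Recreate each container in the destination account.
    dst_container = dst.get_container_client(container.name)
    if not dst_container.exists():
        dst_container.create_container()

    src_container = src.get_container_client(container.name)
    for blob in src_container.list_blobs():
        # Server-side copy: the destination pulls directly from the source URL.
        # If the source container is private, append a SAS token to source_url.
        source_url = f"{SRC_URL}/{container.name}/{blob.name}"
        dst_container.get_blob_client(blob.name).start_copy_from_url(source_url)
```

Because this is a copy rather than a move, the data in the old subscription is left exactly as it was.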
Related
I have a requirement to transfer files from one blob storage account to another through VNets that are deployed in different geographies and connected to each other. As I am very new to the Azure platform, I tried researching on the web but could not find a proper solution. I got a suggestion that I can achieve this by programming an App Service. Please let me know how I can achieve this.
It depends on your scenario; here are some options:
To keep a backup of the storage account in a different region, you can simply set the replication option (when creating a new storage account) to one of these values:
Geo-redundant storage
Read-access geo-redundant storage
Another article on HA applications:
Designing Highly Available Applications using RA-GRS
If you want to copy files from one storage account to another yourself, you can use Azure Storage events: an event is pushed to Event Grid every time a blob is created.
Reacting to Blob storage events
You can then use a Logic App or a Function App to copy blobs to another storage account.
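As a rough illustration, here is a minimal sketch of what the Function App side might look like with an Event Grid trigger and the Python programming model (the function.json binding configuration is omitted, and the app setting name DEST_STORAGE_CONNECTION is a placeholder I made up):

```python
# __init__.py of an Event Grid-triggered Azure Function.
import os

import azure.functions as func
from azure.storage.blob import BlobClient


def main(event: func.EventGridEvent):
    data = event.get_json()            # Microsoft.Storage.BlobCreated payload
    source_url = data["url"]           # e.g. https://src.blob.core.windows.net/container/file.txt
    blob_path = source_url.split(".net/", 1)[1]
    container, _, blob_name = blob_path.partition("/")

    dest = BlobClient.from_connection_string(
        os.environ["DEST_STORAGE_CONNECTION"],   # destination account (placeholder setting)
        container_name=container,
        blob_name=blob_name,
    )
    # Server-side copy; the destination container must already exist, and a SAS
    # token should be appended to source_url if the source container is private.
    dest.start_copy_from_url(source_url)
```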
Whenever I create a new Storage (classic) account through the Azure portal I consistently have issues whereby the Table/Queue/File storage is not created at all, leaving the account with only Blob storage, like this:
Instead of like this (separate account):
I have tried this multiple times, always with the same result. I don't see how I can be getting this wrong, as there are only four options on the form to create the account, and none of them govern the contents of the account.
When I then attempt to create a new Table or Queue in this new account I get a 502 Bad Gateway error.
Am I missing something here? Can anyone tell me how I can add the required storage types to the account?
Not sure what's up with the portal, but a storage account always comprises blob, table, queue, and file storage (unless you create a Premium storage account - that's strictly blobs).
You should be able to confirm this by creating an app to, say, create, write to, and read from a queue or table.
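For example, a quick sanity check against the queue endpoint might look like this (a minimal sketch with the azure-storage-queue Python SDK; the connection string and queue name are placeholders):

```python
# pip install azure-storage-queue
from azure.storage.queue import QueueClient

# Placeholder connection string taken from the storage account's access keys.
conn_str = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"

queue = QueueClient.from_connection_string(conn_str, queue_name="sanitycheck")
queue.create_queue()                    # fails immediately if the queue endpoint is missing
queue.send_message("hello from the SDK")

for msg in queue.receive_messages():
    print(msg.content)                  # should print the message just sent
```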
EDIT I see you edited your question, showing that you did try to create a table/queue. If this is a non-premium account, I suggest reaching out to support, as this makes no sense.
EDIT 4/2017 Aside from Premium storage accounts (which only have page blobs), there is another type of general (non-premium) storage account, specific to blobs only, where you won't be able to create tables and queues. It's not available via the "Classic" deployment model; it's available only via the "Resource Manager" deployment model:
In my case the issue was due to selecting Zone Redundant Storage (ZRS).
Since ZRS accounts only support Block Blobs, you will not see the table, queue or file endpoints listed on the portal for the new account.
https://blogs.msdn.microsoft.com/windowsazurestorage/2014/08/01/introducing-zone-redundant-storage/
Recreating the storage account using Geo-Redundant Storage (GRS) worked.
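For reference, if you recreate the account programmatically rather than through the portal, a sketch along these lines (using the azure-mgmt-storage Python SDK; the subscription ID, resource group, account name, and region are placeholders) would create it with the GRS SKU:

```python
# pip install azure-identity azure-mgmt-storage
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountCreateParameters, Sku

subscription_id = "<subscription-id>"               # placeholder
client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

# Create (or recreate) the account with the GRS SKU instead of ZRS.
poller = client.storage_accounts.begin_create(
    resource_group_name="my-rg",                    # placeholder
    account_name="mygrsaccount",                    # placeholder, must be globally unique
    parameters=StorageAccountCreateParameters(
        sku=Sku(name="Standard_GRS"),
        kind="StorageV2",
        location="westeurope",
    ),
)
account = poller.result()
print(account.primary_endpoints)   # table/queue/file endpoints should now be listed
```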
I have more than one Azure subscription: one for myself and others for clients. Can I change the subscription for one of my storage accounts so that it's associated with one of my other subscriptions?
I cannot find a way to do it through the Azure Management Portal. Perhaps it can be done with PowerShell?
Thanks!
I think your best approach would be to contact Microsoft Windows Azure support to see if they can help.
I wanted to post some updated guidance in case anyone else runs into this issue. Moving a storage account is easy for V1 or V2 accounts. You can move Classic storage, but it's not easy, and I would suggest upgrading it to the newer ARM-based V1 or V2 storage account first. This assumes you stay within the same region; moving to a subscription in another region is a different matter. This answer is probably very different from what it would have been back in 2014. The links below cover each case, and a sketch of scripting the move follows them.
Move Azure Storage account to another region
Move classic storage
Move V1 or V2 Storage
Move to another region
Migrate classic storage to ARM
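If you prefer to script the move (same tenant, same region), here is a minimal sketch of ARM's move-resources operation using the azure-mgmt-resource Python SDK; the subscription IDs, resource group names, and account name are placeholders, and the model/method names come from the current SDK rather than anything in the answer above:

```python
# pip install azure-identity azure-mgmt-resource
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import ResourcesMoveInfo

source_sub = "<source-subscription-id>"          # placeholders throughout
target_sub = "<target-subscription-id>"
storage_account_id = (
    f"/subscriptions/{source_sub}/resourceGroups/source-rg"
    "/providers/Microsoft.Storage/storageAccounts/mystorageacct"
)
target_rg_id = f"/subscriptions/{target_sub}/resourceGroups/target-rg"

client = ResourceManagementClient(DefaultAzureCredential(), source_sub)

# Re-parents the resource under the target resource group (which may live in
# another subscription); the data moves with the account rather than being copied.
poller = client.resources.begin_move_resources(
    source_resource_group_name="source-rg",
    parameters=ResourcesMoveInfo(
        resources=[storage_account_id],
        target_resource_group=target_rg_id,
    ),
)
poller.wait()
```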
Our team has Windows Azure MSDN - Visual Studio Premium subscriptions for all our devs. I have been taking advantage of the $100 per month allowance and am building more infrastructure in the cloud.
However, I would like other members of our team to access certain of the assets. I am quite new to the Azure infrastructure, so this might be a dumb question, but can they access my blobs? And can I control exactly who can access my blobs?
They can obviously RDP into my VMs; that's not an issue. I assume they can also hit my VMs via the IP address, inside Azure, etc. However, I am more interested in the blobs, mostly because I am starting to upload a lot of utility data (large sample datasets, common software we all install, etc.) and I would like to avoid all of us having to upload it again for each subscription.
As of today (11/8/2013), you cannot "pool" MSDN resources, meaning you cannot have 4 subscriptions add up to $400/month and spend it à la carte on cloud services.
You can have one admin (or several) across multiple subscriptions; this allows you to view the different subscriptions in the portal and manage them in a single place.
You can also have different deployment profiles, so one Visual Studio instance can deploy to different Azure accounts.
Specific to your question: there are storage account access keys, and if you share the storage account name and key, then yes, they can access your data located there.
Yes, it is possible to control access to your blobs by using SAS (Shared Access Signatures)
A SAS grants granular access to containers, blobs, tables, and queues.
This should be a good resource to start with:
Manage Access to Windows Azure Storage Resources
Create and Use a Shared Access Signature
However, I would like other members of our team to access certain of the assets. I am quite new to the Azure infrastructure, so this might be a dumb question, but can they access my blobs? And can I control exactly who can access my blobs?
To answer this question specifically: yes, your team members can access the data stored in any blob storage account in any of your subscriptions. There are two ways you can give them access to blob storage:
By giving them the account name/account key: with this, they get full access to the storage account and essentially become owners of that storage account.
By using a Shared Access Signature: if you want to give them restricted access to blob storage, use SAS as described by Dan Dinu. A SAS basically gives you a URL with which anyone in possession of that URL can access the storage (by writing some code); however, it is not possible to identify which user accessed which storage. For that you would need to build something of your own.
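To make the SAS option concrete, here is a minimal sketch of generating a read/list-only, time-limited SAS URL for a container with the azure-storage-blob Python SDK; the account name, key, and container name are placeholders:

```python
# pip install azure-storage-blob
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Placeholder account details.
account_name = "mystorageacct"
account_key = "<account-key>"
container_name = "shared-datasets"

# Read/list-only token that expires in 7 days; hand out the resulting URL
# instead of the account key.
sas_token = generate_container_sas(
    account_name=account_name,
    container_name=container_name,
    account_key=account_key,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)

print(f"https://{account_name}.blob.core.windows.net/{container_name}?{sas_token}")
```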
I'm trying to get up and running with Windows Azure. I understand that I need to create a "Storage Account"; however, I'm confused about how I should set it up. For instance, my Azure subscription is set to my company name. I intend to have multiple ASP.NET web applications (web roles) associated with my subscription. Each web application will have its own database.
My question is, should each web application have its own storage account? Or should only one storage account be used for all of my projects?
Thank you!
There's no one way to answer this, but here are some thoughts to help your decision:
Each storage account is limited to 100TB. If you feel that you will push the limits of this across multiple websites, then create multiple storage accounts for sure.
To make billing easier, I'd suggest separate storage accounts
Storage accounts have a scalability target of a few thousand transactions per second across the entire storage account. For performance purposes, it's probably better to have separate storage accounts.
Consider putting your diagnostic data in a separate storage account. This way, you can safely give your Storage Account key to a 3rd-party like ParaLeap (creators of AzureWatch) for monitoring your app, while not giving away the key to real customer data, for instance.
If you need more than 5 storage accounts, you'll need to contact Customer Support to increase this number.
Windows Azure Storage is a simple blob store, for when your app needs a file store. Any application, not just Azure web roles, can target the storage service. It's kind of like Amazon S3, if you're familiar with that.
Storage services are not required to run Azure applications. You just need a "compute" instance.
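To give a feel for how simple the blob API is, here is a minimal sketch of uploading and reading back a file with the azure-storage-blob Python SDK (note this is the current SDK, not the one from when this question was asked; the connection string, container, and file names are placeholders):

```python
# pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

# Placeholder connection string from the storage account's access keys.
conn_str = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"

service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("app-files")
if not container.exists():
    container.create_container()

# Upload a local file, then read it back, much like putting/getting an S3 object.
blob = container.get_blob_client("datasets/sample.csv")
with open("sample.csv", "rb") as data:
    blob.upload_blob(data, overwrite=True)

print(blob.download_blob().readall()[:100])
```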