We have been trying Microsoft's Azure platform for our startup.
A developer created servers, storage, and other resources in the account.
With the same login account I also have a OneDrive account with personal files.
I have stopped the servers and now want to delete the storage in Azure. Is it safe to delete it without deleting my storage on OneDrive?
Are they separate, so that I can delete the Azure storage without deleting the things I have on OneDrive?
Best Regards,
Daniel
Azure Storage and OneDrive are totally different services. Azure is a paid, commercial storage solution, and any changes you make there won't affect your OneDrive (which is targeted at personal use and free, I think).
Related
I'm working in an IaaS environment in Azure and need to create a shared file location for applications that will be sharing the same files uploaded by end users. The file share needs to be seen on various servers and appear as a fixed drive letter or mount point. I have already created a storage account and a file share in Azure, but cannot overcome the issue that the mapped drive is associated with a user's profile.
I was wondering if anyone has come up with a solution. ... I'm the system administrator assigned to this task and can do things in PowerShell or pass code information to developers for their review.
We did not resolve the issue; the developers are going to use Blob storage.
The trick with this was getting the application to see the drive letter. For us, having a local user running as a service with the associated Azure file share mapping might have worked.
NOTE: to map the Azure drive, a user would need the Azure Storage account name and the key generated for that account to access it.
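For reference, a minimal PowerShell sketch of that kind of mapping. The account name, key, and share name below are placeholders, and note that the mapped drive stays tied to the profile of the user running the command, which is exactly why the local-service-account idea above matters:

    # Placeholder storage account, key, and share name
    $storageAccount = "mystorageaccount"
    $storageKey     = "<storage-account-key>"
    $shareName      = "appfiles"

    # Azure Files authenticates with the storage account name and key
    $secureKey  = ConvertTo-SecureString $storageKey -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential `
        ("AZURE\$storageAccount", $secureKey)

    # Map the share to Z:; -Persist makes it a classic mapped drive,
    # visible only to the profile this runs under
    New-PSDrive -Name Z -PSProvider FileSystem `
        -Root "\\$storageAccount.file.core.windows.net\$shareName" `
        -Credential $credential -Persist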
Is there any way to create a drive in Windows Azure so that users at the company can map a drive in Windows Explorer and upload documents there?
If so, how can I do that?
At first I thought you could do that through the recently announced Azure File Service, but then I found out that you can only mount it in VMs running in the same region as the storage account.
You may want to look at the Gladinet Cloud Desktop tool, which allows you to mount a blob storage account as a drive on the local computer. More information about this can be found here: http://www.gladinet.com/p/map_azure_storage_as_virtual_drive.htm.
Other than that, AFAIK there is no other way to map a storage account as a drive on your local computer.
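If the same-region caveat is acceptable, creating the file share itself can be scripted. A sketch using the current Az.Storage PowerShell module (which postdates this thread; the account name, key, and share name are placeholders):

    # Placeholder account name and key
    $ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" `
        -StorageAccountKey "<storage-account-key>"

    # Create the file share that same-region VMs can then mount
    New-AzStorageShare -Name "companydocs" -Context $ctx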
Our team has Windows Azure MSDN - Visual Studio Premium subscriptions for all our devs. I have been taking advantage of the $100 per month allowance and am building more infrastructure in the cloud.
However, I would like other members of our team to access some of the assets. I am quite new to the Azure infrastructure, so this might be a dumb question: can they access my blobs? And can I control exactly who can access my blobs?
They can obviously RDP into my VMs; that's not an issue. I assume they can hit my VMs too, via the IP address, inside Azure, etc. However, I am more interested in the blobs, mostly because I am starting to upload a lot of utility data (large sample datasets, common software we all install, etc.) and I would like to avoid all of us having to upload it again for each subscription.
As of today (11/8/2013), you cannot "pool" MSDN resources, meaning you can't have four subscriptions add up to $400/month and use cloud services a la carte.
You can have one admin (or several) for multiple subscriptions; this will allow you to view the different subscriptions in the portal and manage them in a single spot.
You can also have different deployment profiles, so one Visual Studio instance can deploy to different Azure accounts.
Specific to your question: there are storage access keys, and if you share the name of the storage account and its key, then yes, they can access your data located there.
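As a sketch of what sharing the key looks like in practice, using the current Az.Storage PowerShell module (the account name, key, container, and blob below are all placeholders):

    # Context built from a teammate's shared account name and key
    $ctx = New-AzStorageContext -StorageAccountName "teamstorage" `
        -StorageAccountKey "<shared-account-key>"

    # With the key you effectively have full access: list and download blobs
    Get-AzStorageBlob -Container "datasets" -Context $ctx
    Get-AzStorageBlobContent -Container "datasets" -Blob "sample.zip" `
        -Destination "C:\data\" -Context $ctx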
Yes, it is possible to control access to your blobs by using SAS (Shared Access Signatures).
A SAS grants granular access to containers, blobs, tables, and queues.
These should be good resources to start with:
Manage Access to Windows Azure Storage Resources
Create and Use a Shared Access Signature
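For a concrete feel, here is a minimal sketch of issuing a read-only blob SAS with the current Az.Storage PowerShell module (the account, container, and blob names are made up):

    $ctx = New-AzStorageContext -StorageAccountName "teamstorage" `
        -StorageAccountKey "<account-key>"

    # Read-only SAS for a single blob, valid for 7 days
    $sas = New-AzStorageBlobSASToken -Container "datasets" -Blob "sample.zip" `
        -Permission "r" -ExpiryTime (Get-Date).AddDays(7) -Context $ctx

    # The token starts with "?", so it can be appended straight to the blob URL
    "https://teamstorage.blob.core.windows.net/datasets/sample.zip$sas"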
"However, I would like other members of our team to access some of the assets. I am quite new to the Azure infrastructure, so this might be a dumb question: can they access my blobs? And can I control exactly who can access my blobs?"
To answer this question specifically: yes, your team members can access the data stored in any blob storage account in any of your subscriptions. There are two ways to give them access to blob storage:
By giving them the account name/account key: with this, they get full access to the storage account and essentially become owners of it.
By using a Shared Access Signature: if you want to give them restricted access to blob storage, use SAS as described by Dan Dinu. A SAS basically gives you a URL; users in possession of that URL can explore the storage (by writing some code). However, it is not possible to identify which user accessed which storage; for that you would need to write something of your own.
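To illustrate the second option from the consumer's side: a teammate holding only a SAS token, and no account key, could do something like this (placeholder names, current Az.Storage module):

    # Connect with just the SAS token you were given; no account key needed
    $sasCtx = New-AzStorageContext -StorageAccountName "teamstorage" `
        -SasToken "<sas-token>"

    # Access is limited to whatever the SAS permits (here, reading one container)
    Get-AzStorageBlob -Container "datasets" -Context $sasCtx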
I think we have gone slightly wrong in the way we have used Azure storage in a SaaS system. We created a storage account per client (security was a prime consideration) and containers per system area, e.g. Vehicle, Work, etc.
Having done further reading, it seems the suggestion is that we should have used one account for all clients. Each client would have a container (so we can create it programmatically), which we then secure. Files would then be structured using a "virtual" folder structure, e.g. a container called "Client A", with files for Jobs (in the Work area of the system) stored like Work/Jobs/{entity id}/blah.pdf. Does this sound sensible?
If so, we now have about 10 accounts that we need to restructure. Are there any tools that will let us easily copy one account's contents to another account's containers? I appreciate we probably can't move the files between accounts (we set them up ages ago, so we can't use the native copy function), so I guess we need some sort of copy. There are GBs of files across all the accounts.
It may not be such a bad idea to keep different storage accounts per client. The benefits of doing that (to me) are:
Better security, as you mentioned.
You'll be able to achieve better throughput per client, as each client will have their own storage account. If you keep one storage account for all clients and one client starts hitting that account hard, other clients will be impacted.
Better scalability. Each storage account can hold up to 200 TB of data, so if you keep just one storage account and each client consumes 100 GB of data, you'll be able to accommodate only 2,000 clients (I hope my math is right :)). With individual storage accounts, you won't be restricted in that sense.
There are some downsides as well:
Management would be a nightmare. Imagine you have 2,000 customers: you would end up managing 2,000 storage accounts.
You may be limited by Windows Azure. Currently, by default you get about 10 or 20 storage accounts per subscription, and you would need to contact support to raise that limit. They can do that for you, but I imagine you would want a self-service model where you can create as many storage accounts as you like without contacting support.
Now, coming to your question about tooling: you could write something of your own that makes use of the Copy Blob functionality, which allows you to copy blob data across storage accounts asynchronously. Basically, you would do the following (a sketch follows the list):
First create a blob container for each client in the target storage account.
Enumerate all blob containers in source storage account.
For each blob container in source storage account, enumerate the blobs.
Copy each blob asynchronously to target storage account in the client's blob container.
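A rough PowerShell sketch of those four steps, assuming the current Az.Storage module (which postdates this thread) and placeholder account names. Copy Blob runs server-side, so the bytes never pass through your machine:

    $src  = New-AzStorageContext -StorageAccountName "clientaccount" `
        -StorageAccountKey "<source-key>"
    $dest = New-AzStorageContext -StorageAccountName "consolidated" `
        -StorageAccountKey "<target-key>"

    # Step 1: a container for this client in the target account
    New-AzStorageContainer -Name "client-a" -Context $dest

    # Steps 2-4: enumerate source containers and blobs, copy asynchronously;
    # the source container name becomes a "virtual" folder in the new layout
    foreach ($container in Get-AzStorageContainer -Context $src) {
        foreach ($blob in Get-AzStorageBlob -Container $container.Name -Context $src) {
            Start-AzStorageBlobCopy -SrcContainer $container.Name -SrcBlob $blob.Name `
                -Context $src -DestContainer "client-a" `
                -DestBlob "$($container.Name)/$($blob.Name)" -DestContext $dest
        }
    }

    # Get-AzStorageBlobCopyState can then be polled to track copy progress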
If you're a PowerShell fan, you can look into Cerebrata's Azure Management Cmdlets (http://www.cerebrata.com/Products/AzureManagementCmdlets) as well, which wrap this functionality. I could have recommended Cerebrata's Azure Management Studio too, but I haven't tried this functionality there just yet. [Disclosure: I'm one of the devs on the Cerebrata team.]
Hope this helps.
Adding to Gaurav Mantri's answer...
You can have a shared storage account for customers and use a Shared Access Signature (SAS) to limit access to a particular container or blobs (as well as tables and queues)...
http://msdn.microsoft.com/en-us/library/windowsazure/hh508996.aspx
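A quick sketch of a container-level SAS with the current Az.Storage module (placeholder names), so each customer can only reach their own container:

    $ctx = New-AzStorageContext -StorageAccountName "sharedaccount" `
        -StorageAccountKey "<account-key>"

    # Read/list SAS scoped to one customer's container, valid for 24 hours
    New-AzStorageContainerSASToken -Name "client-a" -Permission "rl" `
        -ExpiryTime (Get-Date).AddHours(24) -Context $ctx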
I'm trying to get up and going with Windows Azure. I understand that I need to create a "Storage Account"; however, what I'm confused about is how I should set it up. For instance, my Azure subscription is set to my company name. I intend to have multiple ASP.NET web applications (web roles) associated with my subscription. Each web application will have its own database.
My question is, should each web application have its own storage account? Or should only one storage account be used for all of my projects?
Thank you!
There's no one way to answer this, but here are some thoughts to help your decision:
Each storage account is limited to 100 TB. If you feel that you will push the limits of this across multiple websites, then create multiple storage accounts for sure.
To make billing easier, I'd suggest separate storage accounts.
Storage accounts have an SLA of a few thousand transactions per second across the entire storage account. For performance purposes, it's probably better to have separate storage accounts.
Consider putting your diagnostic data in a separate storage account. This way, you can safely give your storage account key to a third party like ParaLeap (creators of AzureWatch) for monitoring your app, while not giving away the key to real customer data, for instance.
If you need more than 5 storage accounts, you'll need to contact Customer Support to increase this number.
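If you do split things out, creating the accounts is easy to script. A sketch with the current Az PowerShell module (the resource group, names, and location are placeholders; resource groups postdate this thread, and the classic portal flow worked equally well):

    # One account for application data, one for diagnostics
    New-AzStorageAccount -ResourceGroupName "myapp-rg" -Name "myappdata" `
        -Location "eastus" -SkuName Standard_LRS
    New-AzStorageAccount -ResourceGroupName "myapp-rg" -Name "myappdiag" `
        -Location "eastus" -SkuName Standard_LRS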
Windows Azure Storage is a service for simple blob storage, for when your app needs a file store. Any application, not just Azure web roles, can target the storage service. It's kind of like Amazon S3, if you're familiar with that.
Storage services are not required to run Azure applications. You just need a "compute" instance.