Windows Azure Local Blob Storage Access? - azure

I don't understand why, when I call:
LocalResource ls = RoleEnvironment.GetLocalResource("mystorage");
ls.RootPath;
the local folder is not created. I don't understand how this works.
I expected a local path to be created, so I should have:
C:\Resources\directory\xxx.HelloAzureWeb.mystorage\
But there is no such folder.
Edit:
What I am trying to do is store some XML scene save files. In a Unity application, I need to post (save) and get (use) those files.

There is some misunderstanding here about how local storage is created. The code below does not create the local storage for you; instead, it returns an instance of the local storage object that was defined during application setup:
LocalResource ls = RoleEnvironment.GetLocalResource("mystorage");
To define local storage in your application, you can add the following directly to your CSDEF (the settings below create a local storage resource named mystorage with a size of 2 GB; if the VM is rebooted or the role restarts, the local storage will not be cleaned and its content will still be there):
<LocalResources>
<LocalStorage cleanOnRoleRecycle="false" name="mystorage" sizeInMB="2048" />
</LocalResources>
Alternatively, you can add local storage via [ Visual Studio > Role Properties > Local Storage > Add Local Storage ].
When you define local storage in your Windows Azure application, a new folder is added as shown below on drive C: of the Azure VM; this happens when your role is provisioned during VM start-up:
[In Azure VM]
C:\Resources\directory\[Your_deploymentID].[your_webrolename]_[Role_Instance_Count]\
[In Compute Emulator]
// Launch your application in the Compute Emulator and then open the "Compute Emulator UI" to see the local storage path in the command window related to your instance:
C:\Users\avkashc\AppData\Local\dftmp\Resources\3503509c-2112-45ea-8d63-377fe9f78491\directory\mystorage\
Once you add the local storage settings above to ServiceDefinition.csdef, your local storage will be created and the following code will work:
LocalResource ls = RoleEnvironment.GetLocalResource("mystorage");
ls.RootPath // returns the correct local storage path
// Now you can use ls to read/write your data.

For me, the files I store in storage emulator blobs go into C:\Users\[username]\AppData\Local\DevelopmentStorage\LDB\BlockBlobRoot. Unfortunately, they're not very useful: it's just a bunch of GUIDs, but the file sizes look correct.

Here is how you access Local Storage in your ASP.NET Web Role:
Step 1: Create a very simple ASP.NET Web Role project
Step 2: Include the following in ServiceDefinition.csdef:
<LocalResources>
<LocalStorage name="mystorage" cleanOnRoleRecycle="false" sizeInMB="2096" />
</LocalResources>
Step 3: Add the following code to any .aspx page (in this case about.aspx):
<div>
Local Storage file content: <b><asp:Label ID="fileContent" runat="server" /></b>
</div>
Step 4: Add the following code in any aspx.cs (in this case about.aspx.cs):
// Requires: using System.IO; using Microsoft.WindowsAzure.ServiceRuntime;
protected void Page_Load(object sender, EventArgs e)
{
    // Resolve the local storage resource defined in ServiceDefinition.csdef,
    // then write and read a file under its root path.
    LocalResource myStorage = RoleEnvironment.GetLocalResource("mystorage");
    string filePath = Path.Combine(myStorage.RootPath, "Lesson.txt");
    File.WriteAllText(filePath, "First Lesson");
    fileContent.Text = File.ReadAllText(filePath);
}
That's it.
I have tested this code in the compute emulator and in the cloud, and it works.

When using the storage emulator, a local resource allocation is just a file directory. The root path looks a bit different from what you describe, but you should be able to navigate to that directory on your local machine, which will initially be empty. You should see it under your \users\you\AppData (which is a hidden directory).
Oh, and local resources have nothing to do with Blob storage. In Windows Azure, it's just a locally-attached disk resource with a specific size quota. It's non-durable storage, unlike Blob storage. Perfect for temporary file writes, caching, etc. and faster than Blob Storage since it's a local disk, but for anything that needs to persist, you'll want to consider Blobs.
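To make the distinction concrete, here is a minimal sketch, assuming the classic Microsoft.WindowsAzure.StorageClient library that appears later on this page; the "StorageConnectionString" setting and "scenes" container are hypothetical names, not anything from your project. It writes a scratch copy of a file to the local resource and then persists the same file to Blob storage:
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public static class SceneStore
{
    // Writes a scratch copy to the local resource (fast, non-durable),
    // then persists the same file to Blob storage (durable).
    public static void SaveScene(string name, string xml)
    {
        // Non-durable scratch space on the role instance's local disk.
        LocalResource scratch = RoleEnvironment.GetLocalResource("mystorage");
        string tempFile = Path.Combine(scratch.RootPath, name);
        File.WriteAllText(tempFile, xml);

        // Durable copy in Blob storage. "StorageConnectionString" and the
        // "scenes" container are hypothetical names for this sketch.
        var account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString"));
        CloudBlobContainer container =
            account.CreateCloudBlobClient().GetContainerReference("scenes");
        container.CreateIfNotExist();
        container.GetBlobReference(name).UploadFile(tempFile);
    }
}
The local write disappears if the instance is reimaged or moved; only the blob copy survives.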

Related

How to upload a file from azure blob storage to Linux VM created on azure

I have one large file in my Azure blob storage container. I want to move the file from blob storage to a Linux VM created on Azure. How can I do that using Data Factory, or any PowerShell command?
The easiest approach, without any tools, is to generate a SAS token for the blob and run curl.
Generate the SAS,
and then run curl:
curl <blob_sas_url> -o output.txt
If you need this automated, you can generate the SAS URL from a script or just use AzCopy.
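If you would rather generate the SAS from code than from the portal, here is a minimal sketch, assuming the classic Microsoft.WindowsAzure.StorageClient library used elsewhere on this page; the connection string, container, and blob names are placeholders. It prints a read-only URL you can hand straight to curl:
using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class SasExample
{
    static void Main()
    {
        // Placeholder connection string, container and blob names.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
        CloudBlob blob = account.CreateCloudBlobClient()
            .GetContainerReference("backups")
            .GetBlobReference("largefile.bin");

        // Read-only SAS valid for one hour.
        string sas = blob.GetSharedAccessSignature(new SharedAccessPolicy
        {
            Permissions = SharedAccessPermissions.Read,
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
        });

        // Append the SAS query string to the blob URI; this is the <blob_sas_url> for curl.
        Console.WriteLine(blob.Uri.AbsoluteUri + sas);
    }
}
The printed URL can be used directly as the <blob_sas_url> in the curl command above, since the token is embedded in the query string.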
Please reference this blog: How to copy data to VM from blob storage. It gives you a way to solve the problem with Data Factory:
"To anyone who might get into same problem in future, I solved my problem by using 'copy wizard' present in ADF.
We need to install Data Management Gateway on VM and register it before we use 'copy wizard'.
We need to specify blob storage as source and in destination we need to choose 'File Server Share' option. In 'File Server Share' option we need to specify user credentials which I suppose pipeline uses to login to VM, folder on VM where pipeline will copy the data."
From the Azure Blob Storage documentation, there is another way that can help you: Mount Blob storage as a file system with blobfuse on Linux.
Blobfuse is a virtual file system driver for Azure Blob storage. Blobfuse allows you to access your existing block blob data in your storage account through the Linux file system. Blobfuse uses the virtual directory scheme with the forward-slash '/' as a delimiter.
This guide shows you how to use blobfuse, and mount a Blob storage container on Linux and access data. To learn more about blobfuse, read the details in the blobfuse repository.
If you want to use AzCopy, you can reference this document Transfer data with AzCopy and Blob storage. You can download the AzCopy for Linux. It provided the command for upload and download files.
For example, upload file:
azcopy copy "<local-file-path>" "https://<storage-account-name>.<blob or dfs>.core.windows.net/<container-name>/<blob-name>"
For PowerShell, you need PowerShell Core 6.x or later to run on all platforms. It works with Windows and Linux virtual machines using Windows PowerShell 5.1 (Windows only) or PowerShell 6 (Windows and Linux).
You can find the PowerShell commands in this document: Quickstart: Upload, download, and list blobs by using Azure PowerShell.
Here is another link that talks about Copy Files to Azure VM using PowerShell Remoting 6 (Windows and Linux).
Hope this helps.
You have many options to copy content from the blob store to the disk on the VM:
1. Use AzCopy
2. Use Azure Pipelines - File copy task
3. Use Powershell cmdlets
A lot of content is available on these approaches on SO!
It seems this is not properly documented anywhere, so I am sharing the most basic approach, which is to use the azcopy tool that is available for both Windows and Linux. This approach doesn't need the complexity of creating credentials/tokens.
Download azcopy.
It's a simple executable which can be run directly after extraction.
Create a managed identity (system-assigned identity) for your virtual machine. Navigate to VM -> Identity -> turn the Status to 'On' -> Save.
Now the VM can be assigned permission at the following levels:
Storage account
Container (file system)
Resource group
Subscription
For this case, navigate to storage account -> IAM -> Add role assignment -> Select role 'Storage Blob Data Contributor' -> Assign access to 'Virtual machine' -> Select the desired VM -> SAVE
NOTE: If you give access to the VM on IAM properties of a Resource Group, the VM will be able to access all the storage accounts of the RG.
Log in to the VM and assume the identity (run the command from the same location where azcopy is located):
For windows : azcopy login --identity
For linux : ./azcopy login --identity
Upload or download the files now:
azcopy cp "source-file" "storageUri/blob-container/" --recursive=true
Example: azcopy cp "C:\test.txt" "https://mystorageaccount.blob.core.windows.net/backup/" --recursive=true
IAM permissions can take a few minutes to propagate. If you change or add permissions/access levels anywhere, run the azcopy login --identity command again to pick up the updated identity.
More info on Azcopy is available here

access a file from a directory in azure blob storage through Azure Logic App

I am using a Logic App to import a set of files which are inside a directory (/devcontainer/sample1/abc.csv).
The problem here is that I could not even locate the Azure file from my Logic App; I am getting the following error:
verify that the path exists and does not contain the blob name. List Folder is not allowed on blobs.
Screenshots for reference
The problem here is that I could not even locate the Azure file from my Logic App.
The file explorer shows all the containers and blobs when you choose the blob path, and it caches the data for a period of time to keep the operation smooth. If a blob was added to the container recently, it will not be visible or selectable in the file explorer. The workaround is to click the change connection link and use a new connection to retrieve the data.
Is your blob connection pointing to the correct storage account? One thing you can try is, instead of providing the path directly, browse the path so that you can see which containers and blobs are present in the storage account you are trying to access.

How to attach my uploaded vhd to a virtual machine in Azure?

I have uploaded successfully my 1TB vhd (not containing Windows files) to Azure storage.
Now I want to attach it as a second drive to my virtual machine but in the attach list I can find only the "attach an empty disk" option!
I used Add-AzureVhd to upload the vhd file:
Creating new page blob of size 999653638656...
I linked the storage resource in Cloud Service but the vhd is still not available to mount.
The storage container where I uploaded my VHD is the same one where the C: drive of my VM is stored.
The container access is set to private.
Will it help if I change it to Public Blob or Public Container?
What else to try?
Thanks
Take a look at the PowerShell command Add-AzureDataDisk. This should be what you're looking for, as you can specify the media location of the uploaded VHD.
Alternatively, in the portal, go to Virtual Machines and navigate to the Disks tab, where you can create a new disk:
At this point, you can navigate to your uploaded vhd:
After this is done, the new disk should become available for you to add to a Virtual Machine.
It should show the options to attach (Empty Disk and Existing Disk) as shown in this link from the Azure documentation.
Assuming the above is not possible for whatever reason, the alternative is:
Since you say you can already see the Attach Empty Disk option, you can attach a 1 TB disk, then download the blob contents and put them there.
You won't be charged for the outbound bandwidth as it is all internal.
Make sure you used CSUpload and not just pushed the VHD to blob storage. See: http://msdn.microsoft.com/en-us/library/windowsazure/gg466228.aspx

Azure Drive addressing using local emulated blob store

I am unable to get a simple tech demo working for Azure Drive using a locally hosted service running the storage/compute emulator. This is not my first Azure project, only my first use of the Azure Drive feature.
The code:
var localCache = RoleEnvironment.GetLocalResource("MyAzureDriveCache");
CloudDrive.InitializeCache(localCache.RootPath, localCache.MaximumSizeInMegabytes);
var creds = new StorageCredentialsAccountAndKey("devstoreaccount1", "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==");
drive = new CloudDrive(new Uri("http://127.0.0.1:10000/devstoreaccount1/drive"), creds);
drive.CreateIfNotExist(16);
drive.Mount(0, DriveMountOptions.None);
With local resource configuration:
<LocalStorage name="MyAzureDriveCache" cleanOnRoleRecycle="false" sizeInMB="220000" />
The exception:
Uri http://127.0.0.1:10000/devstoreaccount1/drive is Invalid
Information on how to address local storage can be found here: https://azure.microsoft.com/en-us/documentation/articles/storage-use-emulator/
I have used the storage emulator UI to create the C:\Users...\AppData\Local\dftmp\wadd\devstoreaccount1 folder which I would expect to act as the container in this case.
However, I am following those guidelines (as far as I can tell) and yet still I receive the exception. Is anyone able to identify what I am doing wrong in this case? I had hoped to be able to resolve this easily using a working sample where someone else is using CloudDrive with 127.0.0.1 or localhost but was unable to find such on Google.
I think you have skipped some required steps before mounting.
You have to initialize the local cache for the drive, and define the URI of the page blob containing the cloud drive, before mounting it.
Initializing the cache:
// Initialize the local cache for the Azure drive
LocalResource cache = RoleEnvironment.GetLocalResource("LocalDriveCache");
CloudDrive.InitializeCache(cache.RootPath + "cache", cache.MaximumSizeInMegabytes);
Defining the URI of the page blob, usually done in the configuration file:
// Retrieve URI for the page blob that contains the cloud drive from configuration settings
string imageStoreBlobUri = RoleEnvironment.GetConfigurationSettingValue("< Configuration name>");
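Putting those pieces together, here is a hedged sketch of the full sequence (initialize the cache, read the page blob URI from configuration, then create and mount the drive); the "DriveBlobUri" setting name and the container in the example URI are placeholders, not anything from the original post:
// Assumes references to Microsoft.WindowsAzure.ServiceRuntime,
// Microsoft.WindowsAzure.StorageClient and the Microsoft.WindowsAzure.CloudDrive assembly.
LocalResource cache = RoleEnvironment.GetLocalResource("MyAzureDriveCache");
CloudDrive.InitializeCache(cache.RootPath, cache.MaximumSizeInMegabytes);

// The page blob URI comes from configuration; it must point to a blob inside a
// container, e.g. http://127.0.0.1:10000/devstoreaccount1/drives/mydrive.vhd
string driveBlobUri = RoleEnvironment.GetConfigurationSettingValue("DriveBlobUri");
var creds = new StorageCredentialsAccountAndKey("devstoreaccount1",
    "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==");

CloudDrive drive = new CloudDrive(new Uri(driveBlobUri), creds);
drive.CreateIfNotExist(16);                                  // drive size in MB
string mountPath = drive.Mount(25, DriveMountOptions.None);  // cache size in MB
// mountPath is the local path (for example "a:\") where the mounted volume is now available.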

Copying storage data from one Azure account to another

I would like to copy a very large storage container from one Azure storage account into another (which also happens to be in another subscription).
I would like an opinion on the following options:
Write a tool that would connect to both storage accounts and copy blobs one at a time using CloudBlob's DownloadToStream() and UploadFromStream(). This seems to be the worst option because it will incur costs when transferring the data and also be quite slow because data will have to come down to the machine running the tool and then get re-uploaded back to Azure.
Write a worker role to do the same - this should theoretically be faster and not incur any cost. However, this is more work.
Upload the tool to a running instance bypassing the worker role deployment and pray the tool finishes before the instance gets recycled/reset.
Use an existing tool - have not found anything interesting.
Any suggestions on the approach?
Update: I just found out that this functionality has finally been introduced (REST APIs only for now) for all storage accounts created on July 7th, 2012 or later:
http://msdn.microsoft.com/en-us/library/windowsazure/dd894037.aspx
You can also use AzCopy that is part of the Azure SDK.
Just click the download button for Windows Azure SDK and choose WindowsAzureStorageTools.msi from the list to download AzCopy.
After installing, you'll find AzCopy.exe here: %PROGRAMFILES(X86)%\Microsoft SDKs\Windows Azure\AzCopy
You can get more information on using AzCopy in this blog post: AzCopy – Using Cross Account Copy Blob
As well, you could remote desktop into an instance and use this utility for the transfer.
Update:
You can also copy blob data between storage accounts using Microsoft Azure Storage Explorer. Reference link
Since there's no direct way to migrate data from one storage account to another, you'd need to do something like what you were thinking. If this is within the same data center, option #2 is the best bet, and will be the fastest (especially if you use an XL instance, giving you more network bandwidth).
As far as complexity, it's no more difficult to create this code in a worker role than it would be with a local application. Just run this code from your worker role's Run() method.
To make things more robust, you could list the blobs in your containers, then place specific file-move request messages into an Azure queue (and optimize by putting more than one object name per message). Then use a worker role thread to read from the queue and process objects. Even if your role is recycled, at worst you'd reprocess one message. For performance increase, you could then scale to multiple worker role instances. Once the transfer is complete, you simply tear down the deployment.
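Here is a hedged sketch of that queue-driven pattern, assuming the same classic StorageClient library used in the code further down this thread; sourceAccount and targetAccount are CloudStorageAccount instances for the two accounts, the "copyrequests" queue and "data" container names are made up for the example, and both containers are assumed to exist already:
// Assumes: using System.IO; using Microsoft.WindowsAzure.StorageClient;
// Producer: list source blobs and enqueue one copy request per blob.
CloudQueue copyQueue = sourceAccount.CreateCloudQueueClient().GetQueueReference("copyrequests");
copyQueue.CreateIfNotExist();

CloudBlobContainer source = sourceAccount.CreateCloudBlobClient().GetContainerReference("data");
foreach (var item in source.ListBlobs(new BlobRequestOptions { UseFlatBlobListing = true }))
{
    copyQueue.AddMessage(new CloudQueueMessage(((CloudBlob)item).Name));
}

// Consumer (worker role Run() loop): pull a name, copy it, then delete the message.
// A real worker would sleep and retry when the queue is temporarily empty.
CloudBlobContainer target = targetAccount.CreateCloudBlobClient().GetContainerReference("data");
CloudQueueMessage msg;
while ((msg = copyQueue.GetMessage()) != null)
{
    CloudBlob sourceBlob = source.GetBlobReference(msg.AsString);
    CloudBlob targetBlob = target.GetBlobReference(msg.AsString);
    using (Stream output = targetBlob.OpenWrite())
    {
        sourceBlob.DownloadToStream(output);
    }
    copyQueue.DeleteMessage(msg); // delete only after a successful copy
}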
UPDATE - On June 12, 2012, the Windows Azure Storage API was updated, and now allows cross-account blob copy. See this blog post for all the details.
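With that API, each blob becomes a single server-side copy call instead of a download/upload round trip. A hedged sketch, assuming the 2.x Microsoft.WindowsAzure.Storage client library (where CloudBlockBlob exposes StartCopyFromBlob) and a SAS token on the source blob so the destination account is allowed to read it; destAccount (a CloudStorageAccount), the container and blob names, and sasToken are placeholders:
// Assumes: using System; using Microsoft.WindowsAzure.Storage.Blob; (2.x client library)
CloudBlobContainer destContainer =
    destAccount.CreateCloudBlobClient().GetContainerReference("backup");
destContainer.CreateIfNotExists();

// The source URI carries a SAS so the destination storage service can read it.
Uri sourceBlobWithSas = new Uri(
    "https://sourceaccount.blob.core.windows.net/data/file1.dat" + sasToken);
CloudBlockBlob destBlob = destContainer.GetBlockBlobReference("file1.dat");

// The copy runs inside the storage service; no data flows through your VM or role.
string copyId = destBlob.StartCopyFromBlob(sourceBlobWithSas);
// Poll destBlob.CopyState (after FetchAttributes) to see when the copy completes.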
Here is some code that leverages the .NET SDK for Azure, available at http://www.windowsazure.com/en-us/develop/net:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.WindowsAzure.StorageClient;
using System.IO;
using System.Net;

namespace benjguinAzureStorageTool
{
    class Program
    {
        // Context is a helper class (not shown here) that holds the two account
        // names/keys, the container names, the default blob request options, and
        // exposes a CloudBlobClient for each storage account.
        private static Context context = new Context();

        static void Main(string[] args)
        {
            try
            {
                string usage = string.Format("Possible Usages:\n"
                    + "benjguinAzureStorageTool CopyContainer account1SourceContainer account2SourceContainer account1Name account1Key account2Name account2Key\n"
                    );

                if (args.Length < 1)
                    throw new ApplicationException(usage);

                int p = 1;

                switch (args[0])
                {
                    case "CopyContainer":
                        if (args.Length != 7) throw new ApplicationException(usage);
                        context.Storage1Container = args[p++];
                        context.Storage2Container = args[p++];
                        context.Storage1Name = args[p++];
                        context.Storage1Key = args[p++];
                        context.Storage2Name = args[p++];
                        context.Storage2Key = args[p++];
                        CopyContainer();
                        break;
                    default:
                        throw new ApplicationException(usage);
                }

                Console.BackgroundColor = ConsoleColor.Black;
                Console.ForegroundColor = ConsoleColor.Yellow;
                Console.WriteLine("OK");
                Console.ResetColor();
            }
            catch (Exception ex)
            {
                Console.WriteLine();
                Console.BackgroundColor = ConsoleColor.Black;
                Console.ForegroundColor = ConsoleColor.Yellow;
                Console.WriteLine("Exception: {0}", ex.Message);
                Console.ResetColor();
                Console.WriteLine("Details: {0}", ex);
            }
        }

        private static void CopyContainer()
        {
            CloudBlobContainer container1Reference = context.CloudBlobClient1.GetContainerReference(context.Storage1Container);
            CloudBlobContainer container2Reference = context.CloudBlobClient2.GetContainerReference(context.Storage2Container);

            if (container2Reference.CreateIfNotExist())
            {
                Console.WriteLine("Created destination container {0}. Permissions will also be copied.", context.Storage2Container);
                container2Reference.SetPermissions(container1Reference.GetPermissions());
            }
            else
            {
                Console.WriteLine("destination container {0} already exists. Permissions won't be changed.", context.Storage2Container);
            }

            // Flat-list every blob in the source container and copy it by
            // streaming the content into the corresponding destination blob.
            foreach (var b in container1Reference.ListBlobs(
                new BlobRequestOptions(context.DefaultBlobRequestOptions)
                { UseFlatBlobListing = true, BlobListingDetails = BlobListingDetails.All }))
            {
                var sourceBlobReference = context.CloudBlobClient1.GetBlobReference(b.Uri.AbsoluteUri);
                var targetBlobReference = container2Reference.GetBlobReference(sourceBlobReference.Name);

                Console.WriteLine("Copying {0}\n to\n{1}",
                    sourceBlobReference.Uri.AbsoluteUri,
                    targetBlobReference.Uri.AbsoluteUri);

                using (Stream targetStream = targetBlobReference.OpenWrite(context.DefaultBlobRequestOptions))
                {
                    sourceBlobReference.DownloadToStream(targetStream, context.DefaultBlobRequestOptions);
                }
            }
        }
    }
}
It's very simple with AzCopy. Download the latest version from https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/
and in AzCopy type:
Copy a blob within a storage account:
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer1 /Dest:https://myaccount.blob.core.windows.net/mycontainer2 /SourceKey:key /DestKey:key /Pattern:abc.txt
Copy a blob across storage accounts:
AzCopy /Source:https://sourceaccount.blob.core.windows.net/mycontainer1 /Dest:https://destaccount.blob.core.windows.net/mycontainer2 /SourceKey:key1 /DestKey:key2 /Pattern:abc.txt
Copy a blob from the secondary region
If your storage account has read-access geo-redundant storage enabled, then you can copy data from the secondary region.
Copy a blob to the primary account from the secondary:
AzCopy /Source:https://myaccount1-secondary.blob.core.windows.net/mynewcontainer1 /Dest:https://myaccount2.blob.core.windows.net/mynewcontainer2 /SourceKey:key1 /DestKey:key2 /Pattern:abc.txt
I'm a Microsoft Technical Evangelist and I have developed a sample and free tool (no support/no guarantee) to help in these scenarios.
The binaries and source-code are available here: https://blobtransferutility.codeplex.com/
The Blob Transfer Utility is a GUI tool to upload and download thousands of small/large files to/from Windows Azure Blob Storage.
Features:
Create batches to upload/download
Set the Content-Type
Transfer files in parallel
Split large files in smaller parts that are transferred in parallel
The 1st and 3rd features are the answer to your problem.
You can learn from the sample code how I did it, or you can simply run the tool and do what you need to do.
Write your tool as a simple .NET Command Line or Win Forms application.
Create and deploy a dummy web/worker role with RDP enabled
Login to the machine via RDP
Copy your tool over the RDP connection
Run the tool on the remote machine
Delete the deployed role.
Like you, I am not aware of any off-the-shelf tools that support a copy-between function.
You might consider just installing Cloud Storage Studio into the role, dumping to disk, and then re-uploading. http://cerebrata.com/Products/CloudStorageStudiov2/Details.aspx?t1=0&t2=7
You could use 'Azure Storage Explorer' (free) or some other such tool. These tools provide a way to download and upload content. You will need to manually create containers and tables - and of course this will incur a transfer cost - but if you are short on time and your content is of reasonable size then this is a viable option.
I recommend using azcopy; you can copy the whole storage account, a container, a directory, or a single blob. Here is an example of cloning an entire storage account:
azcopy copy 'https://{SOURCE_ACCOUNT}.blob.core.windows.net{SOURCE_SAS_TOKEN}' 'https://{DESTINATION_ACCOUNT}.blob.core.windows.net{DESTINATION_SAS_TOKEN}' --recursive
You can get the SAS tokens from the Azure Portal. Navigate to the storage account overview (source and destination), then in the side navigation click on "Shared access signature" and generate your own.
More examples here
I had to do something similar to move 600 GB of content from a local file system to Azure Storage. After a couple of iterations of code, I ended up taking 'Azure Storage Explorer' and extending it with the ability to select folders instead of just files, then having it recursively drill into the selected folders and load a list of source/destination copy item statements into an Azure Queue. Then, in the upload section of 'Azure Storage Explorer', the Queue section pulls from the queue and executes the copy operation.
Then I launched around 10 instances of the 'Azure Storage Explorer' tool and had each one pull from the queue and execute the copy operation. I was able to move the 600 GB of items in just over 2 days. I added logic to use the modified timestamps on files so that files that have already been copied are skipped and are not re-added to the queue if they are already in sync. Now I can run "updates" or syncs within an hour or two across the whole library of content.
Try CloudBerry Explorer. It copies blobs within and between subscriptions.
For copying between subscriptions, edit the storage account container's access from Private to Public Blob.
The copying process took a few hours to complete. If you choose to reboot your machine, the process will continue. You can check the status by refreshing the target storage account container in the Azure management UI and checking the timestamp; the value keeps updating until the copy process completes.
