I am trying to upload a file, but I get the exception "The specified resource name contains invalid characters".
The path I am using is @"C:\Test\Test.txt". When I change to relative addressing (i.e., @".\Test.txt") and have the file in the exe folder, it works.
What I need to know is: is relative addressing the only option for uploading a file to Azure File Storage from a .NET client? Is there a way to reference a file by its full path and upload it to File Storage?
Update: Based on the comments and the answer below, I realized my mistake: I was supplying the incoming file path to the GetFileReference method, when this should be the name of the new file in Azure; hence it contained the ':' character, which is invalid. The comments are right: I should have provided code; the problem might have been diagnosed more easily.
public static async Task WriteFileToStorage(string filePath)
{
    CloudFileShare fileShare = GetCloudFileShare();
    CloudFileDirectory fileDirectory = fileShare.GetRootDirectoryReference();
    // Bug: filePath is a full local path (e.g. C:\Test\Test.txt), but GetFileReference
    // expects the name the file should have in Azure, so the ':' is rejected.
    CloudFile cloudFile = fileDirectory.GetFileReference(filePath);
    await cloudFile.UploadFromFileAsync(filePath);
}
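For reference, a minimal corrected sketch (assuming the same GetCloudFileShare helper as above): pass only the file name to GetFileReference, and keep the full local path for UploadFromFileAsync.

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.File;

public static async Task WriteFileToStorage(string filePath)
{
    CloudFileShare fileShare = GetCloudFileShare();
    CloudFileDirectory fileDirectory = fileShare.GetRootDirectoryReference();
    // Use only the file name (e.g. "Test.txt") as the name of the file in Azure...
    string azureFileName = Path.GetFileName(filePath);
    CloudFile cloudFile = fileDirectory.GetFileReference(azureFileName);
    // ...and the full local path (e.g. @"C:\Test\Test.txt") as the upload source.
    await cloudFile.UploadFromFileAsync(filePath);
}
```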
The .NET client does support a full path when uploading to Azure File Storage.
It would be better to provide the complete code you're using, including the file name/path both locally and in Azure File Storage.
Here is the code I tested with, and it works (I'm using the package WindowsAzure.Storage, version 9.3.3):
static void Main(string[] args)
{
    CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials("account_name", "account_key"), true);
    CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
    CloudFileShare fileShare = fileClient.GetShareReference("test");
    CloudFileDirectory rootDir = fileShare.GetRootDirectoryReference();
    CloudFile myfile = rootDir.GetFileReference("mytest.txt");
    // The full file path on the local machine
    myfile.UploadFromFile(@"C:\Test\Test.txt");
}
Related
I am using Azure Storage File Shares to store some files from our website, but it fails with the error message "The specified share already exists".
I have changed the file being uploaded, but the error persists.
Here is my code:
public static void Test2Upload()
{
    System.Net.ServicePointManager.SecurityProtocol = System.Net.SecurityProtocolType.Tls12;
    string connectionString = "DefaultEndpointsProtocol=https;AccountName=xxxxx;AccountKey=xxxxx;EndpointSuffix=core.windows.net";
    string shareName = "myapp-dev";
    string dirName = "files";
    string fileName = "catto.jpg";
    // Path to the local file to upload
    string localFilePath = @"d:\temp\two.jpg";
    // Get a reference to a share and then create it
    ShareClient share = new ShareClient(connectionString, shareName);
    share.Create();
    // Get a reference to a directory and create it
    ShareDirectoryClient directory = share.GetDirectoryClient(dirName);
    directory.Create();
    // Get a reference to a file and upload it
    ShareFileClient file = directory.GetFileClient(fileName);
    using (FileStream stream = File.OpenRead(localFilePath))
    {
        file.Create(stream.Length);
        file.UploadRange(
            new HttpRange(0, stream.Length),
            stream);
    }
}
It looks like I should not create a ShareClient with the same name several times.
Then how do I check for it and use it?
The most important question is: why is the file still not uploaded (even if I rename the ShareClient object)?
Looks like I should not create a ShareClient with the same name several times. Then how to check and use it?
You can use the ShareClient.CreateIfNotExists method instead of ShareClient.Create. The former will try to create a share, but if the share already exists it won't be changed.
You can also use ShareClient.Exists to check whether the share exists and then create it using ShareClient.Create if it does not. This is not recommended, however, as it might not work correctly if multiple users are executing that code at the same time. Furthermore, you would be making two network calls: the first to check the existence of the share and the second to create it.
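For contrast, the check-then-create pattern described above would look like this sketch (assuming the same Azure.Storage.Files.Shares ShareClient as in the question); note the race window between the two calls:

```csharp
using Azure.Storage.Files.Shares;

// Not recommended: two network calls, with a race window between them.
ShareClient share = new ShareClient(connectionString, shareName);
if (!share.Exists())       // first call: check existence
{
    // Another client may create the share right here...
    share.Create();        // ...making this second call throw anyway.
}
```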
The most important question is, why is the file still not uploaded (even if I rename the ShareClient object)?
Your code for uploading the file looks OK to me. Are you getting any error in that code?
We can use ShareClient.CreateIfNotExists when creating the ShareClient object to avoid the problem, like below:
ShareClient share = new ShareClient(connectionString, shareName);
share.CreateIfNotExists();
You might find a similar problem with ShareDirectoryClient.
This part's purpose is to create the folder structure.
The upload will fail if the destination folder does not exist,
and an error occurs if we create a folder that already exists.
So use the ShareDirectoryClient.CreateIfNotExists method, like below:
ShareDirectoryClient directory = share.GetDirectoryClient(dirName);
directory.CreateIfNotExists();
Here is my complete code:
public static void TestUpload()
{
    System.Net.ServicePointManager.SecurityProtocol = System.Net.SecurityProtocolType.Tls12;
    string connectionString = "DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=xx;EndpointSuffix=core.windows.net";
    string shareName = "myapp-dev";
    string dirName = "myfiles";
    string fileName = "catto.jpg";
    string localFilePath = @"d:\temp\two.jpg";
    // Get a reference to a share and create it if it does not exist
    ShareClient share = new ShareClient(connectionString, shareName);
    share.CreateIfNotExists();
    // Get a reference to a directory and create it if it does not exist
    ShareDirectoryClient directory = share.GetDirectoryClient(dirName);
    directory.CreateIfNotExists();
    // Get a reference to a file and upload it
    ShareFileClient file = directory.GetFileClient(fileName);
    using (FileStream stream = File.OpenRead(localFilePath))
    {
        file.Create(stream.Length);
        file.UploadRange(
            new HttpRange(0, stream.Length),
            stream);
    }
}
I want to create a logic app which lists all my files on Azure File Storage and then copies them to an SFTP server. I have set up the following flow:
1. List files in file storage
2. Get metadata of file
3. Get content of file
4. Create file on SFTP
With files smaller than 300 MB everything works fine, but when I want to copy a file > 300 MB I get
"The file contains 540.782 megabytes which exceeds the maximum 300 megabytes."
So is there a workaround or another solution for my issue?
This limit could be because the connector is still a preview feature; when I test with a blob it doesn't have this limit.
So I suppose you could call a function to copy the file to blob storage after getting the file name, then use the blob connector to get the blob content and upload it to the SFTP server.
Below is my function code. In my test I don't pass the file name from the request, so you could change it to get the name from the request.
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequestMessage req,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("storage connection string");
    CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
    CloudFileShare share = fileClient.GetShareReference("windows");
    CloudFileDirectory rootDir = share.GetRootDirectoryReference();
    String filename = "Downloads.zip";
    CloudFile file = rootDir.GetFileReference(filename);
    string fileSas = file.GetSharedAccessSignature(new SharedAccessFilePolicy()
    {
        // Only read permissions are required for the source file.
        Permissions = SharedAccessFilePermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
    });
    Uri fileSasUri = new Uri(file.StorageUri.PrimaryUri.ToString() + fileSas);
    var myClient = storageAccount.CreateCloudBlobClient();
    var container = myClient.GetContainerReference("test");
    var blockBlob = container.GetBlockBlobReference("test.zip");
    await blockBlob.StartCopyAsync(fileSasUri);
    return (ActionResult)new OkObjectResult($"success");
}
Below is my test flow. Note the blob path: you need to modify it to /test/@{body('Get_file_metadata')?['DisplayName']}.
And this is the result; from the file size you can see I tested with a file of more than 300 MB.
How do I copy a block (or page) blob in Azure Storage to an Azure file share?
My example code works fine if I download the block blob to a local file, but there does not appear to be a method to download to an Azure file share. I've looked at the Azure Data Movement library, but there's no example of how to do this.
void Main()
{
    string myfile = @"Image267.png";
    CredentialEntity backupCredentials = Utils.GetBackupsCredentials();
    CloudStorageAccount backupAccount = new CloudStorageAccount(new StorageCredentials(backupCredentials.Name, backupCredentials.Key), true);
    CloudBlobClient backupClient = backupAccount.CreateCloudBlobClient();
    CloudBlobContainer backupContainer = backupClient.GetContainerReference(@"archive");
    CloudBlockBlob blob = backupContainer.GetBlockBlobReference(myfile);
    CredentialEntity fileCredentials = Utils.GetFileCredentials();
    CloudStorageAccount fileAccount = new CloudStorageAccount(new StorageCredentials(fileCredentials.Name, fileCredentials.Key), true);
    CloudFileClient fileClient = fileAccount.CreateCloudFileClient();
    CloudFileShare share = fileClient.GetShareReference(@"xfer");
    if (share.Exists())
    {
        CloudFileDirectory rootDir = share.GetRootDirectoryReference();
        CloudFileDirectory sampleDir = rootDir.GetDirectoryReference("hello");
        if (sampleDir.Exists())
        {
            CloudFile file = sampleDir.GetFileReference(myfile);
            // blob.DownloadToFile(file.ToString());
        }
    }
}
The part that does not work is the commented-out line blob.DownloadToFile.
Any ideas on how I can do this?
There is an example in the official docs: it copies files from a file share to blob storage, but you can make a small change to copy from blob storage to a file share. I have also written a sample which copies from blob storage to a file share; you can take a look at it below.
You can use a SAS token (for the source blob or source file) to copy files to blobs, or blobs to files, in the same storage account or in different storage accounts.
A sample is below (it copies a blob to a file in the same storage account; make a small change if they are in different storage accounts):
static void Main(string[] args)
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("connection string");
    var blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = blobClient.GetContainerReference("t0201");
    CloudBlockBlob sourceCloudBlockBlob = cloudBlobContainer.GetBlockBlobReference("test.txt");
    // Note: if the file share is in a different storage account, use
    // CloudStorageAccount storageAccount2 = CloudStorageAccount.Parse("the other storage connection string"),
    // then use storageAccount2 for the file share.
    CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
    CloudFileShare share = fileClient.GetShareReference("testfolder");
    CloudFile destFile = share.GetRootDirectoryReference().GetFileReference("test.txt");
    // Create a SAS for the source blob
    string blobSas = sourceCloudBlockBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
    });
    Uri blobSasUri = new Uri(sourceCloudBlockBlob.StorageUri.PrimaryUri.ToString() + blobSas);
    destFile.StartCopy(blobSasUri);
    Console.WriteLine("done now");
    Console.ReadLine();
}
It works well on my side; hope it helps.
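One caveat worth noting: StartCopy only initiates a server-side copy. A small sketch of polling until it finishes (assuming the destFile variable from the sample above):

```csharp
// StartCopy is asynchronous on the service side; poll CopyState to see when it's done.
destFile.FetchAttributes();
while (destFile.CopyState.Status == CopyStatus.Pending)
{
    System.Threading.Thread.Sleep(500);
    destFile.FetchAttributes();
}
// Status will be Success, Failed, or Aborted once the copy has finished.
Console.WriteLine(destFile.CopyState.Status);
```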
There are a few samples here, but in .NET. A non-code alternative would be to use AzCopy; it can transfer data from blob storage to file shares and vice versa.
External suggestions:
The following tool, blobxfer, seems to support the transfer, but in Python.
I have rather foolishly uploaded a VHD to Azure File storage, thinking I could create a virtual machine from it, only to find out it really needs to be in blob storage.
I know I can just upload it again, but it is very large and my upload speed is very slow.
My question is: can I move a file from File storage to blob storage without downloading and uploading it again? I.e., is there anything in the Azure portal UI to do it, or even a PowerShell command?
You can try AzCopy:
AzCopy.exe /Source:<URL to source container> /Dest:<URL to dest container> /SourceKey:<key1> /DestKey:<key2> /S
When copying from File Storage to Blob Storage, the default blob type is block blob; you can specify the option /BlobType:page to change the destination blob type.
AzCopy by default copies data between two storage endpoints asynchronously, so the copy operation runs in the background using spare bandwidth capacity that has no SLA in terms of how fast a blob is copied, and AzCopy periodically checks the copy status until the copy completes or fails. The /SyncCopy option ensures that the copy operation runs at a consistent speed.
In C#:
public static CloudFile GetFileReference(CloudFileDirectory parent, string path)
{
    var filename = Path.GetFileName(path);
    // The directory part of the path (empty if the file is in the root).
    var dirPath = Path.GetDirectoryName(path);
    if (dirPath == string.Empty)
    {
        return parent.GetFileReference(filename);
    }
    var dirReference = GetDirectoryReference(parent, dirPath);
    return dirReference.GetFileReference(filename);
}
public static CloudFileDirectory GetDirectoryReference(CloudFileDirectory parent, string path)
{
    if (path.Contains(@"\"))
    {
        // Recurse one directory level at a time.
        var paths = path.Split('\\');
        return GetDirectoryReference(parent.GetDirectoryReference(paths.First()), string.Join(@"\", paths.Skip(1)));
    }
    else
    {
        return parent.GetDirectoryReference(path);
    }
}
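As a sketch of how these helpers resolve a nested path (the fileShare variable and the path below are placeholders), each backslash-separated segment becomes one directory reference:

```csharp
// Assumes a CloudFileShare named fileShare and the two helper methods above.
CloudFileDirectory root = fileShare.GetRootDirectoryReference();

// Resolves root -> "SourceFolder" -> "Nested" -> the file "fileName.pdf".
// Building these references is purely client-side; no network call is made
// until the file reference is actually used (e.g. by a copy or upload).
CloudFile cloudFile = GetFileReference(root, @"SourceFolder\Nested\fileName.pdf");
```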
The code to copy:
// Source File Storage
string azureStorageAccountName = "shareName";
string azureStorageAccountKey = "XXXXX";
string name = "midrive";
CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials(azureStorageAccountName, azureStorageAccountKey), true);
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare fileShare = fileClient.GetShareReference(name);
CloudFileDirectory directorio = fileShare.GetRootDirectoryReference();
CloudFile cloudFile = GetFileReference(directorio, "SourceFolder\\fileName.pdf");
// Destination Blob
string destAzureStorageAccountName = "xx";
string destAzureStorageAccountKey = "xxxx";
CloudStorageAccount destStorageAccount = new CloudStorageAccount(new StorageCredentials(destAzureStorageAccountName, destAzureStorageAccountKey), true);
CloudBlobClient destClient = destStorageAccount.CreateCloudBlobClient();
CloudBlobContainer destContainer = destClient.GetContainerReference("containerName");
CloudBlockBlob destBlob = destContainer.GetBlockBlobReference("fileName.pdf");
// copy
await TransferManager.CopyAsync(cloudFile, destBlob, true);
Another option is to use Azure CLI...
az storage copy -s /path/to/file.txt -d https://[account].blob.core.windows.net/[container]/[path/to/blob]
More info here: az storage copy
Thanks to Gaurav Mantri for pointing me in the direction of AzCopy.
This does allow me to copy between file and blob storage using the command:
AzCopy.exe /Source:<URL to source container> /Dest:<URL to dest container> /SourceKey:<key1> /DestKey:<key2> /S
However, as Gaurav also rightly points out in the comments, the resulting blob will be of type block blob, and this is no good for me. I need one of type page blob in order to create a VM out of it using https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-specialized-vhd
As far as I can see, there is no way to change the blob type once it is up there in the cloud, so it looks like my only option is to wait for a lengthy upload again.
I've got some files sitting in Azure File storage.
I'm trying to programmatically archive them to Azure Blobs and I'm not sure how to do this efficiently.
I keep seeing code samples about copying from one blob container to another blob container, but not from a file to a blob.
Is it possible to do without downloading the entire file content locally and then uploading it? Maybe using URIs or something?
More info:
The file share and blob container are in the same storage account.
The storage account is RA-GRS.
Here is some sample code I was thinking of using, but it just doesn't feel right :( (pseudocode, with validation and checks omitted):
var file = await ShareRootDirectory.GetFileReference(fileName);
using (var stream = new MemoryStream())
{
    await file.DownloadToStreamAsync(stream);
    // Custom method that basically does:
    // 1. GetBlockBlobReference
    // 2. UploadFromStreamAsync
    await cloudBlob.AddItemAsync("some-container", stream);
}
How do I copy an Azure file to an Azure blob?
We can also use CloudBlockBlob.StartCopy(CloudFile); the CloudFile type is accepted by the CloudBlockBlob.StartCopy function.
For how to copy a CloudFile to a blob, please refer to the documentation. The following demo code is a snippet from the document.
// Parse the connection string for the storage account.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    Microsoft.Azure.CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create a CloudFileClient object for credentialed access to File storage.
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
// Create a new file share, if it does not already exist.
CloudFileShare share = fileClient.GetShareReference("sample-share");
share.CreateIfNotExists();
// Create a new file in the root directory.
CloudFile sourceFile = share.GetRootDirectoryReference().GetFileReference("sample-file.txt");
sourceFile.UploadText("A sample file in the root directory.");
// Get a reference to the blob to which the file will be copied.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("sample-container");
container.CreateIfNotExists();
CloudBlockBlob destBlob = container.GetBlockBlobReference("sample-blob.txt");
// Create a SAS for the file that's valid for 24 hours.
// Note that when you are copying a file to a blob, or a blob to a file, you must use a SAS
// to authenticate access to the source object, even if you are copying within the same
// storage account.
string fileSas = sourceFile.GetSharedAccessSignature(new SharedAccessFilePolicy()
{
    // Only read permissions are required for the source file.
    Permissions = SharedAccessFilePermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
});
// Construct the URI to the source file, including the SAS token.
Uri fileSasUri = new Uri(sourceFile.StorageUri.PrimaryUri.ToString() + fileSas);
// Copy the file to the blob.
destBlob.StartCopy(fileSasUri);
Note:
If you are copying a blob to a file, or a file to a blob, you must use a shared access signature (SAS) to authenticate the source object, even if you are copying within the same storage account.
Use the TransferManager:
https://msdn.microsoft.com/en-us/library/azure/microsoft.windowsazure.storage.datamovement.transfermanager_methods.aspx
It has methods to copy from a CloudFile to a CloudBlob.
Add the "Microsoft.Azure.Storage.DataMovement" NuGet package.
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.File;
using Microsoft.WindowsAzure.Storage.DataMovement;

private string _storageConnectionString = "your_connection_string_here";

public async Task CopyFileToBlob(string blobContainer, string blobPath, string fileShare, string fileName)
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(_storageConnectionString);
    CloudFileShare cloudFileShare = storageAccount.CreateCloudFileClient().GetShareReference(fileShare);
    CloudFile source = cloudFileShare.GetRootDirectoryReference().GetFileReference(fileName);
    // Note: the local variable must not shadow the blobContainer parameter.
    CloudBlobContainer container = storageAccount.CreateCloudBlobClient().GetContainerReference(blobContainer);
    CloudBlockBlob target = container.GetBlockBlobReference(blobPath);
    await TransferManager.CopyAsync(source, target, true);
}
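A hypothetical call (the share, container, and file names below are placeholders, not values from the thread):

```csharp
// Copies "catto.jpg" from the root of the "myapp-dev" file share
// to "2020/catto.jpg" in the "archive" blob container, server-side.
await CopyFileToBlob("archive", "2020/catto.jpg", "myapp-dev", "catto.jpg");
```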