How to use StartCopyFromBlob between different accounts?

I am using this code to copy blobs from one account to another... but it throws an exception.
var srcAccount = CloudStorageAccount.Parse("connection string 1");
var dstAccount = CloudStorageAccount.Parse("connection string 2");
var srcBlobClient = srcAccount.CreateCloudBlobClient();
var dstBlobClient = dstAccount.CreateCloudBlobClient();
foreach (var srcCloudBlobContainer in srcBlobClient.ListContainers())
{
    var dstCloudBlobContainer = dstBlobClient
        .GetContainerReference(srcCloudBlobContainer.Name);
    dstCloudBlobContainer.CreateIfNotExists();
    foreach (var srcBlob in srcCloudBlobContainer.ListBlobs())
    {
        if (srcBlob.GetType() == typeof(CloudBlockBlob))
        {
            var srcBlockBlock = (CloudBlockBlob)srcBlob;
            var dstBlockBlock = dstCloudBlobContainer
                .GetBlockBlobReference(srcBlockBlock.Name);
            // throws exception StorageException:
            // The remote server returned an error: (404) Not Found.
            dstBlockBlock.StartCopyFromBlob(srcBlockBlock.Uri);
        }
    }
}
Microsoft states that cross-account copy is supported, but I cannot get it to work.
What am I doing wrong?

Can you check the source blob container's ACL? If it's Private, you need to either change the ACL to Public/Blob or create a SAS URL. You can use the following code if you wish to keep the blob container's ACL Private and make use of a SAS URL:
var srcAccount = CloudStorageAccount.Parse("connection string 1");
var dstAccount = CloudStorageAccount.Parse("connection string 2");
var srcBlobClient = srcAccount.CreateCloudBlobClient();
var dstBlobClient = dstAccount.CreateCloudBlobClient();
foreach (var srcCloudBlobContainer in srcBlobClient.ListContainers())
{
    var dstCloudBlobContainer = dstBlobClient
        .GetContainerReference(srcCloudBlobContainer.Name);
    dstCloudBlobContainer.CreateIfNotExists();
    // Assuming the source blob container ACL is "Private", let's create a Shared Access Signature with
    // Start Time = Current Time (UTC) - 15 minutes to account for clock skew
    // Expiry Time = Current Time (UTC) + 7 days - 7 days is the maximum time allowed for a copy operation to finish
    // Permission = Read so that the copy service can read the blob from the source
    var sas = srcCloudBlobContainer.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    {
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-15),
        SharedAccessExpiryTime = DateTime.UtcNow.AddDays(7),
        Permissions = SharedAccessBlobPermissions.Read,
    });
    foreach (var srcBlob in srcCloudBlobContainer.ListBlobs())
    {
        if (srcBlob.GetType() == typeof(CloudBlockBlob))
        {
            var srcBlockBlock = (CloudBlockBlob)srcBlob;
            var dstBlockBlock = dstCloudBlobContainer
                .GetBlockBlobReference(srcBlockBlock.Name);
            // Create a SAS URI for the source blob
            var srcBlockBlobSasUri = string.Format("{0}{1}", srcBlockBlock.Uri, sas);
            // Copy using the SAS URI so the copy service can read the private source blob
            dstBlockBlock.StartCopyFromBlob(new Uri(srcBlockBlobSasUri));
        }
    }
}
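Note that StartCopyFromBlob only schedules an asynchronous, server-side copy. If you also want to block until each copy finishes, a minimal polling sketch using the same classic SDK types could look like this (WaitForCopyToFinish is a hypothetical helper, not part of the SDK):
// Minimal sketch: poll the destination blob until the server-side copy completes.
private static void WaitForCopyToFinish(CloudBlockBlob dstBlob)
{
    dstBlob.FetchAttributes();
    while (dstBlob.CopyState != null && dstBlob.CopyState.Status == CopyStatus.Pending)
    {
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(2));
        dstBlob.FetchAttributes(); // refresh CopyState
    }
    if (dstBlob.CopyState != null && dstBlob.CopyState.Status != CopyStatus.Success)
    {
        throw new InvalidOperationException("Copy did not succeed: " + dstBlob.CopyState.StatusDescription);
    }
}
You would call WaitForCopyToFinish(dstBlockBlock) right after StartCopyFromBlob if you need the loop to behave synchronously.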

Related

Migrate from Azure to SharePoint using Migration API - "Must specify IV in Manifest or blob metadata"

I am trying to use the Migration API with C# to migrate from the default free Azure cloud storage (given to all SharePoint accounts) to SharePoint.
I can successfully generate a cloud container, and upload files, but the migration fails. Here is my code:
private void MigrateToAzureThenSharePoint()
{
    // Provision the default Azure containers and migration queue
    var migrationContainers = destinationCtx.Site.ProvisionMigrationContainers();
    var migrationQueue = destinationCtx.Site.ProvisionMigrationQueue();
    destinationCtx.ExecuteQuery();
    var containerInfoList = migrationContainers.Value;
    var dataContainerUri = containerInfoList.DataContainerUri;
    var metadataContainerUri = containerInfoList.MetadataContainerUri;
    var queueInfo = migrationQueue.Value;
    // Instantiate CloudQueue
    var cloudQueue = new CloudQueue(new Uri(queueInfo.JobQueueUri));
    // Instantiate CloudBlobContainer
    CloudBlobContainer dataContainer = new CloudBlobContainer(new Uri(dataContainerUri));
    CloudBlobContainer manifestContainer = new CloudBlobContainer(new Uri(metadataContainerUri));
    // Test files to upload
    var testfiles = new[]
    {
        new SourceFile
        {
            Filename = "test.txt",
            LastModified = DateTime.Now,
            Contents = Encoding.UTF8.GetBytes("Hi, this is a test text-file"),
            Title = "Title of file 1"
        },
        new SourceFile
        {
            Filename = "test2.txt",
            LastModified = DateTime.Now.AddDays(-1),
            Contents = Encoding.UTF8.GetBytes("Tesfile2"),
            Title = "Second title"
        }
    };
    // Upload test files to the cloud
    foreach (var testfile in testfiles)
    {
        var blobReference = dataContainer.GetBlockBlobReference(testfile.Filename);
        blobReference.UploadFromByteArray(testfile.Contents, 0, testfile.Contents.Length, null);
        blobReference.CreateSnapshot();
    }
    var manifestPackage = new ManifestPackage(destOrganisationList.Id,
        destinationCtx.Web.Id,
        destinationCtx.Web.Title,
        destOrganisationList.Title,
        @"\",
        destOrganisationList.RootFolder.UniqueId,
        destOrganisationList.RootFolder.ParentFolder.UniqueId
    );
    // Create XML files for the Azure migration (an easier way here would be great? I manually write them)
    var filesInManifestPackage = manifestPackage.GetManifestPackageFiles(testfiles);
    // Upload the manually created XML files
    foreach (var migrationPackageFile in filesInManifestPackage)
    {
        var blobReference = manifestContainer.GetBlockBlobReference(migrationPackageFile.Filename);
        blobReference.UploadFromByteArray(migrationPackageFile.Contents, 0, migrationPackageFile.Contents.Length, null);
        blobReference.CreateSnapshot();
    }
    // Start migration
    var result = destinationCtx.Site.CreateMigrationJobEncrypted(destinationCtx.Web.Id,
        containerInfoList.DataContainerUri,
        containerInfoList.MetadataContainerUri,
        queueInfo.JobQueueUri,
        new EncryptionOption()
        {
            AES256CBCKey = containerInfoList.EncryptionKey
        });
    destinationCtx.ExecuteQuery();
    // Check for messages
    while (true)
    {
        var msg = cloudQueue.GetMessage();
        if (msg == null)
        {
            Task.Delay(TimeSpan.FromSeconds(1)).Wait();
            continue;
        }
        var message = JsonConvert.DeserializeObject<EncryptedMessage>(msg.AsString);
        cloudQueue.DeleteMessage(msg);
        // Decode the encrypted message
        var decoded = JsonConvert.DeserializeObject<UpdateMessage>(Decrypt(message.Content, containerInfoList.EncryptionKey, message.IV));
        // Error
    }
}
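The Decrypt helper called above isn't shown; a minimal AES-256-CBC sketch consistent with that call (assuming message.Content and message.IV are base64 strings and containerInfoList.EncryptionKey is the raw 32-byte key) would be:
// Sketch of a Decrypt helper matching the call above.
// Requires: using System.Security.Cryptography;
private static string Decrypt(string base64Content, byte[] key, string base64Iv)
{
    using (var aes = Aes.Create())
    {
        aes.Key = key;                               // AES-256 key from containerInfoList.EncryptionKey
        aes.IV = Convert.FromBase64String(base64Iv); // IV from the queue message
        aes.Mode = CipherMode.CBC;
        using (var decryptor = aes.CreateDecryptor())
        {
            var cipherBytes = Convert.FromBase64String(base64Content);
            var plainBytes = decryptor.TransformFinalBlock(cipherBytes, 0, cipherBytes.Length);
            return Encoding.UTF8.GetString(plainBytes);
        }
    }
}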
There are a few errors, but they all are similar to this:
"Unable to download SystemData.xml with exception 'Must specify IV in
Manifest or blob metadata'"
I believe I may have to encrypt the files when I upload from this method:
blobReference.UploadFromByteArray
There is an option:
var blobRequestOptions = new BlobRequestOptions();
blobRequestOptions.EncryptionPolicy =
But I don't know how to instantiate the encryption policy (and I may be going down a rabbit hole).
Any help would be greatly appreciated, as the documentation online is shockingly minimal.
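For what it's worth, the classic storage SDK's client-side encryption policy is normally constructed from an IKey implementation and assigned through BlobRequestOptions. The sketch below uses SymmetricKey from the Azure Key Vault client libraries (package and namespace vary by version) and is not verified to resolve the Migration API's IV error:
// Hedged sketch only: wiring up client-side encryption in the classic storage SDK.
// NOT verified to resolve the "Must specify IV" migration error.
var aesKey = containerInfoList.EncryptionKey;                 // 32-byte AES-256 key
var symmetricKey = new SymmetricKey("migration-key", aesKey); // "migration-key" is an arbitrary key id
var blobRequestOptions = new BlobRequestOptions
{
    EncryptionPolicy = new BlobEncryptionPolicy(symmetricKey, null)
};
// Pass the options as the 5th argument (accessCondition, options, operationContext)
blobReference.UploadFromByteArray(migrationPackageFile.Contents, 0, migrationPackageFile.Contents.Length, null, blobRequestOptions, null);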

How to stream videos stored in Azure Storage blobs using Xamarin

I have uploaded video files to Azure Blob storage (containers). I want to access them in the mobile app via streaming. The files have the .mp4 extension. I have written code to download from the blob, store the file on the local drive, and then play it with the default player, but I want to give the user an option to stream instead of downloading.
I have used this method
var credentials = new StorageCredentials("myaccountname", "mysecretkey");
var account = new CloudStorageAccount(credentials, true);
var container = account.CreateCloudBlobClient().GetContainerReference("yourcontainername");
var blob = container.GetBlockBlobReference("yourmp4filename");
var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1), // Set this date/time according to your requirements
});
var urlToBePlayed = string.Format("{0}{1}", blob.Uri, sas); // This is the URI which should be embedded in your video player.
Problem: if I browse the URL (the blob URL), it downloads the file instead of playing it directly. And in the app, nothing appears, just a blank screen.
I am using
<WebView Source="{Binding VideoUrl}" HeightRequest="200" WidthRequest="200"/>
In the VM:
VideoUrl = url;
First, change the content type, as @Zhaoxing Lu - Microsoft said:
public async Task ChangeContentTypeAsync()
{
    try
    {
        UserDialogs.Instance.ShowLoading();
        var storageAccount = CloudStorageAccount.Parse(storageConnectionString);
        var blobClient = storageAccount.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference("videos");
        BlobContinuationToken blobContinuationToken = null;
        do
        {
            // List one segment at a time and resume with the continuation token
            var segment = await container.ListBlobsSegmentedAsync(null, blobContinuationToken);
            blobContinuationToken = segment.ContinuationToken;
            foreach (var item in segment.Results)
            {
                var blob = item as CloudBlockBlob;
                if (blob != null && Path.GetExtension(blob.Uri.AbsoluteUri) == ".mp4")
                {
                    blob.Properties.ContentType = "video/mp4";
                    await blob.SetPropertiesAsync();
                }
            }
        } while (blobContinuationToken != null);
        UserDialogs.Instance.HideLoading();
    }
    catch (Exception ex)
    {
        var m = ex.Message;
    }
}
Then use this method:
private async Task StreamVideo(string filename)
{
    try
    {
        UserDialogs.Instance.ShowLoading();
        await azureBlob.ChangeContentTypeAsync();
        var secretkey = "xxxx";
        var credentials = new StorageCredentials("chatstorageblob", secretkey);
        var account = new CloudStorageAccount(credentials, true);
        var container = account.CreateCloudBlobClient().GetContainerReference("videos");
        var blob = container.GetBlockBlobReference(filename);
        var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1), // Set this date/time according to your requirements
        });
        var urlToBePlayed = string.Format("{0}{1}", blob.Uri, sas); // This is the URI which should be embedded in your video player.
        await Navigation.PushAsync(new VideoPlayerPage(urlToBePlayed));
        UserDialogs.Instance.HideLoading();
    }
    catch (Exception ex)
    {
        var m = ex.Message;
    }
}

CSOM CreateSPAsyncReadJob stays in Queue state

I am referring to the Migration Asynchronous Read API that allows creating a read job on SharePoint using CSOM. I am able to create the read job successfully, but unfortunately the job has stayed in the Queued state for a long time.
The function returns an object that includes UniqueJobID, AzureContainerManifestUri, AzureQueueReportUri, and EncryptionKey.
By using the clientContext.Site.GetMigrationJobStatus method I can check the read job status, which always returns Queued.
Here is the sample code for reference:
using (var clientContext = new ClientContext(siteUrl))
{
    clientContext.Credentials = new SharePointOnlineCredentials(userName, password);
    var result = clientContext.Site.CreateSPAsyncReadJob($"{siteUrl}/List/MyList", new AsyncReadOptions { });
    clientContext.ExecuteQuery();
    MigrationJobState state;
    do
    {
        var status = clientContext.Site.GetMigrationJobStatus(result[0].JobId);
        clientContext.ExecuteQuery();
        state = status.Value;
    } while (state == MigrationJobState.Queued);
}
I have also tried to connect to the AzureQueueReportUri queue, which contains messages with encrypted content. I am not sure how we can decrypt the content to make it human-readable. Here is a sample message:
{
    "Label": "Encrypted",
    "JobId": "079ece4a-cfd2-4676-a27d-2662beb5bb0a",
    "IV": "RYc+ZA2feX1hnAcVWR1R+w==",
    "Content": "qbjTBbb2N+DkNumLoCJSAAfwj8etDLgjxp+b2T9k03L9WfRJKlFBIZO457q+CbHA+8DHJS7VbPzVMoW6ybo2GxgteTYVP+yVUOPPvz57VGQJyzg2gss+Bsjn73GTWWUfwC/W+oWnEpt8PawZysCjSNf6A4HKZKewkskCshN/pND8ZpevrGt2qq0dTt0NkTIkuYv5AvIP7DSWjdl7nN/W5x4c2nR0sPFqKYom41a4tIqrruzwCDEEjWLFtuXAQ+UN2TMV9PWabRFe9n/P1RHrAJaNU+JjJiJm+lE1dQChz+7OuQoJsYnbjYTbqEE8CnIB0/E0zTrc3zLc6th8MBsKpZJjd31ovqr/Xez6zCnvMKotSdScFtTgQqHxmVDBMfMgi2mm8cKQpdKwRufP/YhaDQlvFkmj2FQN0KAMNxwFBh/MWCVhz5uCJ50CGhChcn4h"
}
I am also not able to connect to the AzureContainerManifestUri blob container. It fails with the error "Authentication Error. Signature did not match."
Can anyone please guide me on how to proceed?
The method parameters have been changed. Here is the latest updated documentation: https://learn.microsoft.com/en-us/sharepoint/dev/apis/export-amr-api
Sample Code: https://gist.github.com/techmadness/484e7de0a7c51e5faf952a79f1eacb85
using System;
using System.Linq;
using System.Security;
using System.Threading;
using Microsoft.SharePoint.Client;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Queue;
namespace ConsoleApp1
{
    internal class Program
    {
        private static void Main(string[] args)
        {
            var userName = "admin@tenant.onmicrosoft.com";
            var password = GetSecurePassword("password");
            var siteUrl = "https://tenant.sharepoint.com/sites/testsite";
            var listUrl = $"{siteUrl}/testlist";
            var azStorageConnectionString = "DefaultEndpointsProtocol=https;AccountName=abcd;AccountKey=xyz";
            using (var clientContext = new ClientContext(siteUrl))
            {
                clientContext.Credentials = new SharePointOnlineCredentials(userName, password);
                var azManifestContainer = CreateContainerIfNotExists(azStorageConnectionString, "spread-manifest-container");
                var azReportQueue = CreateQueueIfNotExists(azStorageConnectionString, "spread-report-queue");
                var azManifestContainerUrl = GetSASUrl(azManifestContainer);
                var azReportQueueUrl = GetSASUrl(azReportQueue);
                var output = clientContext.Site.CreateSPAsyncReadJob(
                    listUrl,
                    new AsyncReadOptions
                    {
                        IncludeDirectDescendantsOnly = true,
                        IncludeSecurity = true,
                    },
                    null,
                    azManifestContainerUrl,
                    azReportQueueUrl);
                clientContext.ExecuteQuery();
                CloudQueueMessage message;
                do
                {
                    Thread.Sleep(TimeSpan.FromSeconds(10));
                    message = azReportQueue.GetMessage();
                    if (message != null)
                    {
                        Console.WriteLine(message.AsString);
                        azReportQueue.DeleteMessage(message);
                    }
                } while (message != null);
                Console.ReadLine();
            }
        }
        private static SecureString GetSecurePassword(string pwd)
        {
            SecureString securePassword = new SecureString();
            foreach (var ch in pwd.ToArray())
            {
                securePassword.AppendChar(ch);
            }
            return securePassword;
        }
        private static CloudBlobContainer CreateContainerIfNotExists(string storageConnectionString, string containerName)
        {
            var storageAccount = CloudStorageAccount.Parse(storageConnectionString);
            var blobClient = storageAccount.CreateCloudBlobClient();
            var container = blobClient.GetContainerReference(containerName);
            container.CreateIfNotExistsAsync().GetAwaiter().GetResult();
            return container;
        }
        private static CloudQueue CreateQueueIfNotExists(string storageConnectionString, string queueName)
        {
            var cloudStorageAccount = CloudStorageAccount.Parse(storageConnectionString);
            var queueClient = cloudStorageAccount.CreateCloudQueueClient();
            var queue = queueClient.GetQueueReference(queueName);
            queue.CreateIfNotExistsAsync().GetAwaiter().GetResult();
            return queue;
        }
        public static string GetSASUrl(CloudBlobContainer container)
        {
            var sharedAccessSignature = container.GetSharedAccessSignature(new SharedAccessBlobPolicy
            {
                Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write,
                SharedAccessStartTime = DateTime.UtcNow.AddDays(-1),
                SharedAccessExpiryTime = DateTime.UtcNow.AddDays(7),
            });
            return container.StorageUri.PrimaryUri + sharedAccessSignature;
        }
        public static string GetSASUrl(CloudQueue queue)
        {
            var sharedAccessSignature = queue.GetSharedAccessSignature(new SharedAccessQueuePolicy
            {
                Permissions = SharedAccessQueuePermissions.Add | SharedAccessQueuePermissions.Read,
                SharedAccessStartTime = DateTime.UtcNow.AddDays(-1),
                SharedAccessExpiryTime = DateTime.UtcNow.AddDays(7)
            });
            return queue.StorageUri.PrimaryUri + sharedAccessSignature;
        }
    }
}

Azure File Share REST API for Xamarin

In a Xamarin client app, I want to access Azure Files using a SAS token from a Portable Class Library. It seems I cannot do it using the latest WindowsAzure.Storage NuGet package, as it may only work with Blob, Table, etc., and it requires lots of dependencies.
Is there any way to accomplish this?
I ended up using the Azure File Storage REST API.
Basically, we first request a SAS token generated from the Azure share, then use that SAS token in the URL to send HTTP requests to Azure File Storage:
https://[yourshare].file.core.windows.net/[yourdirectory]/[yourfile]?[your_sas_token]
I have created a class to help the client do some basic operations, as below (it is a portable class, so it can be used anywhere on the client side):
public class AzureFileREST
{
    private AzureSASToken _azureShareToken;
    public AzureFileREST(AzureSASToken azureShareToken)
    {
        _azureShareToken = azureShareToken;
    }
    public async Task CreateIfNotExist(string directoryName)
    {
        var existed = await CheckDirectoryExists(directoryName);
        if (!existed)
        {
            await CreateDirectory(directoryName);
        }
    }
    public async Task<bool> CheckDirectoryExists(string directoryName)
    {
        using (var client = new HttpClient())
        {
            //Get directory (https://msdn.microsoft.com/en-us/library/azure/dn194272.aspx)
            var azureCreateDirUrl = _azureShareToken.Url + directoryName + _azureShareToken.SASToken + "&restype=directory";
            var response = await client.GetAsync(azureCreateDirUrl).ConfigureAwait(false);
            return (response.StatusCode != System.Net.HttpStatusCode.NotFound);
        }
    }
    public async Task CreateDirectory(string directoryName)
    {
        using (var client = new HttpClient())
        {
            //Create directory (https://msdn.microsoft.com/en-us/library/azure/dn166993.aspx)
            var azureCreateDirUrl = _azureShareToken.Url + directoryName + _azureShareToken.SASToken + "&restype=directory";
            var response = await client.PutAsync(azureCreateDirUrl, null).ConfigureAwait(false);
            response.EnsureSuccessStatusCode();
        }
    }
    public async Task UploadFile(string fileName, byte[] fileBytes)
    {
        using (var client = new HttpClient())
        {
            //Create empty file first (https://msdn.microsoft.com/en-us/library/azure/dn194271.aspx)
            var azureCreateFileUrl = _azureShareToken.Url + fileName + _azureShareToken.SASToken;
            client.DefaultRequestHeaders.Add("x-ms-type", "file");
            client.DefaultRequestHeaders.Add("x-ms-content-length", fileBytes.Length.ToString());
            var response = await client.PutAsync(azureCreateFileUrl, null).ConfigureAwait(false);
            response.EnsureSuccessStatusCode();
            //Then upload file (https://msdn.microsoft.com/en-us/library/azure/dn194276.aspx)
            var azureUploadFileUrl = azureCreateFileUrl + "&comp=range";
            client.DefaultRequestHeaders.Clear();
            client.DefaultRequestHeaders.Add("x-ms-write", "update");
            client.DefaultRequestHeaders.Add("x-ms-range", String.Format("bytes=0-{0}", (fileBytes.Length - 1).ToString()));
            var byteArrayContent = new ByteArrayContent(fileBytes);
            response = await client.PutAsync(azureUploadFileUrl, byteArrayContent).ConfigureAwait(false);
            response.EnsureSuccessStatusCode();
        }
    }
}
On the server side, use the following function to generate a SAS token from the share:
public AzureSASToken GetSASFromShare(string shareName)
{
    var share = _fileclient.GetShareReference(shareName);
    share.CreateIfNotExists();
    string policyName = "UPARSharePolicy";
    // Create a new shared access policy and define its constraints.
    var sharedPolicy = new SharedAccessFilePolicy()
    {
        SharedAccessExpiryTime = DateTime.UtcNow.AddDays(15),
        Permissions = SharedAccessFilePermissions.Read | SharedAccessFilePermissions.Write
    };
    // Get existing permissions for the share.
    var permissions = share.GetPermissions();
    // Add the shared access policy to the share's policies.
    // Note that each policy must have a unique name.
    // Maximum 5 policies for each share!
    if (!permissions.SharedAccessPolicies.Keys.Contains(policyName))
    {
        if (permissions.SharedAccessPolicies.Count > 4)
        {
            var lastAddedPolicyName = permissions.SharedAccessPolicies.Keys.Last();
            permissions.SharedAccessPolicies.Remove(lastAddedPolicyName);
        }
        permissions.SharedAccessPolicies.Add(policyName, sharedPolicy);
        share.SetPermissions(permissions);
    }
    var sasToken = share.GetSharedAccessSignature(sharedPolicy);
    //fileSasUri = new Uri(share.StorageUri.PrimaryUri.ToString() + sasToken);
    return new AzureSASToken()
    {
        Name = shareName,
        Url = share.StorageUri.PrimaryUri.ToString() + "/",
        SASToken = sasToken
    };
}
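The AzureSASToken type isn't shown in the post; a minimal shape consistent with how it is used above would be:
// Minimal AzureSASToken DTO inferred from the usage above (a sketch, not from the original post).
public class AzureSASToken
{
    public string Name { get; set; }     // share name
    public string Url { get; set; }      // e.g. https://[account].file.core.windows.net/[share]/
    public string SASToken { get; set; } // "?sv=..." query string from GetSharedAccessSignature
}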
Finally, use the class like this:
var azureFileRest = new AzureFileREST(sasToken);
await azureFileRest.CreateIfNotExist(directoryName);
await azureFileRest.UploadFile(directoryName + "/" + fileName, bytes);
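If you also need downloads, a method along the same lines could be added to the class above. This is a sketch (not part of the original post) that uses the File service's Get File operation with the same SAS URL convention:
// Sketch: download a file with the File service's Get File operation.
// Belongs inside the AzureFileREST class above (it uses _azureShareToken).
public async Task<byte[]> DownloadFile(string fileName)
{
    using (var client = new HttpClient())
    {
        var azureGetFileUrl = _azureShareToken.Url + fileName + _azureShareToken.SASToken;
        var response = await client.GetAsync(azureGetFileUrl).ConfigureAwait(false);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsByteArrayAsync().ConfigureAwait(false);
    }
}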

How to show a progress bar while uploading data to an Azure blob in a Windows Store app

I have followed this post to upload to a blob in chunks:
How to track progress of async file upload to azure storage
But this is not working in a Windows Store / Windows 8.1 app, as there is no support for MemoryStream in a Windows 8.1 app.
Anyhow, I have modified the code from the above thread and come up with something like the following:
CloudBlobClient myBlobClient = storageAccount.CreateCloudBlobClient();
var filePicker = new FileOpenPicker();
filePicker.FileTypeFilter.Add("*");
var file = await filePicker.PickSingleFileAsync();
string output = string.Empty;
var fileName = file.Name;
myBlobClient.SingleBlobUploadThresholdInBytes = 1024 * 1024;
CloudBlobContainer container = myBlobClient.GetContainerReference("files");
//container.CreateIfNotExists();
CloudBlockBlob myBlob = container.GetBlockBlobReference(fileName);
var blockSize = 256 * 1024;
myBlob.StreamWriteSizeInBytes = blockSize;
var fileProp = await file.GetBasicPropertiesAsync();
var bytesToUpload = fileProp.Size;
var fileSize = bytesToUpload;
if (bytesToUpload < Convert.ToUInt64(blockSize))
{
    CancellationToken ca = new CancellationToken();
    var ado = myBlob.UploadFromFileAsync(file).AsTask();
    //Console.WriteLine(ado.Status); //Does Not Help Much
    await ado.ContinueWith(t =>
    {
        //Console.WriteLine("Status = " + t.Status);
        //Console.WriteLine("It is over"); //this is working OK
    });
}
else
{
    List<string> blockIds = new List<string>();
    int index = 1;
    ulong startPosition = 0;
    ulong bytesUploaded = 0;
    do
    {
        var bytesToRead = Math.Min(Convert.ToUInt64(blockSize), bytesToUpload);
        var blobContents = new byte[bytesToRead];
        using (Stream fs = await file.OpenStreamForWriteAsync())
        {
            fs.Position = Convert.ToInt64(startPosition);
            fs.Read(blobContents, 0, (int)bytesToRead);
            //var iStream = fs.AsInputStream();
            ManualResetEvent mre = new ManualResetEvent(false);
            var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
            //Console.WriteLine("Now uploading block # " + index.ToString("d6"));
            blockIds.Add(blockId);
            var ado = myBlob.PutBlockAsync(blockId, fs.AsRandomAccessStream(), null).AsTask();
            await ado.ContinueWith(t =>
            {
                bytesUploaded += bytesToRead;
                bytesToUpload -= bytesToRead;
                startPosition += bytesToRead;
                index++;
                double percentComplete = (double)bytesUploaded / (double)fileSize;
                output += (percentComplete * 100).ToString() + ",";
                // AppModel.TasksFormObj.SetProgress(percentComplete * 100);
                // Console.WriteLine("Percent complete = " + percentComplete.ToString("P"));
                mre.Set();
            });
            mre.WaitOne();
        }
    }
    while (bytesToUpload > 0);
    //Console.WriteLine("Now committing block list");
    var pbl = myBlob.PutBlockListAsync(blockIds).AsTask();
    pbl.ContinueWith(t =>
    {
        //Console.WriteLine("Blob uploaded completely.");
    });
}
}
The above code saves the file to the blob and the progress is also fine, but the saved file in the blob is always 0 bytes. After debugging I found there was an error thrown after var ado = myBlob.PutBlockAsync(blockId, fs.AsRandomAccessStream(), null).AsTask(); for the last block transfer:
<?xml version="1.0" encoding="utf-16"?>
<!--An exception has occurred. For more information please deserialize this message via RequestResult.TranslateFromExceptionMessage.-->
<RequestResult>
  <HTTPStatusCode>400</HTTPStatusCode>
  <HttpStatusMessage>The value for one of the HTTP headers is not in the correct format.</HttpStatusMessage>
  <TargetLocation>Primary</TargetLocation>
  <ServiceRequestID>13633308-0001-0031-060b-7249ea000000</ServiceRequestID>
  <ContentMd5 />
  <Etag />
  <RequestDate>Sun, 28 Feb 2016 10:31:44 GMT</RequestDate>
  <StartTime>Sun, 28 Feb 2016 09:31:43 GMT</StartTime>
  <EndTime>Sun, 28 Feb 2016 09:31:44 GMT</EndTime>
  <Error>
    <Code>InvalidHeaderValue</Code>
    <Message>The value for one of the HTTP headers is not in the correct format.
RequestId:13633308-0001-0031-060b-7249ea000000
Time:2016-02-28T09:34:18.3545085Z</Message>
    <HeaderName>Content-Length</HeaderName>
    <HeaderValue>0</HeaderValue>
  </Error>
  <ExceptionInfo>
    <Type>StorageException</Type>
    <HResult>-2147467259</HResult>
    <Message>The value for one of the HTTP headers is not in the correct format.</Message>
    <Source>Microsoft.WindowsAzure.Storage</Source>
    <StackTrace> at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.<ExecuteAsyncInternal>d__c`1.MoveNext()</StackTrace>
  </ExceptionInfo>
</RequestResult>
Then on the final commit, the error thrown after myBlob.PutBlockListAsync(blockIds) is "The specified block list is invalid".
So please, can someone help me figure out where I am going wrong, or what the possible solution is to make it work 100%?
Use AsTask() like this.
CancellationTokenSource _cts;
_cts = new CancellationTokenSource();//<--In Constructor
var progress = new Progress<double>(TranscodeProgress);
await myBlob.UploadFromFileAsync(file).AsTask(_cts.Token, progress);
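The TranscodeProgress callback named above isn't shown, and the progress-reporting AsTask overload is only available when the underlying async operation actually reports progress, so treat the following handler as a sketch:
// Sketch of the progress callback referenced above (UploadProgressBar is a hypothetical control).
// Progress<T> posts callbacks to the SynchronizationContext it was created on,
// so constructing it on the UI thread makes it safe to update UI state here.
private void TranscodeProgress(double percent)
{
    UploadProgressBar.Value = percent; // scale/clamp as needed for your ProgressBar
}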
