Upload photo from Windows 8 to Azure Blob Storage

The Azure SDK doesn't work in a Windows 8 app.
How can I upload a photo to Azure Blob Storage from a Windows 8 app?
I need a working code sample.

You don't need the Windows Azure SDK to upload photos from a Windows 8 application to Blob Storage. A plain HttpClient works fine as well:
using (var client = new HttpClient())
{
    // Let the user capture a photo (or video) with the built-in camera UI.
    CameraCaptureUI cameraCapture = new CameraCaptureUI();
    StorageFile media = await cameraCapture.CaptureFileAsync(CameraCaptureUIMode.PhotoOrVideo);

    using (var fileStream = await media.OpenStreamForReadAsync())
    {
        var content = new StreamContent(fileStream);
        content.Headers.Add("Content-Type", media.ContentType);
        // Required by the Blob service when creating a blob with a plain PUT.
        content.Headers.Add("x-ms-blob-type", "BlockBlob");

        var uploadResponse = await client.PutAsync(
            new Uri(blobUriWithSAS), content);
    }
}
The only thing you'll need to do is get the URL to the blob together with a Shared Access Signature (SAS). Nick Harris explains how you can do this using Mobile Services.
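For illustration, here is a minimal server-side sketch of building such a URL with the classic WindowsAzure.Storage SDK; the connection string, container name, and blob name are placeholder assumptions:
// Sketch: create a short-lived, write-only SAS URL for one blob.
// "connectionString", "photos" and "photo1.jpg" are placeholders.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("photos");
CloudBlockBlob blob = container.GetBlockBlobReference("photo1.jpg");

string sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Write,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
});

// This combined value is what the client code above uses as blobUriWithSAS.
string blobUriWithSAS = blob.Uri.AbsoluteUri + sasToken;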

Related

Azure Data Lake query acceleration error - One or more errors occurred. (XML specified is not syntactically valid.)

I am trying to filter data from an Azure storage account using an ADLS query against Azure Data Lake Storage Gen2. I am not able to filter the data and get the result. I have been stuck on this issue, and even Microsoft support has not been able to crack it. Any help is greatly appreciated.
Tutorial Link: https://www.c-sharpcorner.com/article/azure-data-lake-storage-gen2-query-acceleration/
Solution - .NET Core 3.1 Console App
Error: One or more errors occurred. (XML specified is not syntactically valid.)
Status: 400 (XML specified is not syntactically valid.)
private static async Task MainAsync()
{
    var connectionString = "DefaultEndpointsProtocol=https;AccountName=gfsdlstestgen2;AccountKey=0AOkFckONVYkTh9Kpr/VRozBrhWYrLoH7y0mW5wrw==;EndpointSuffix=core.windows.net";
    var blobServiceClient = new BlobServiceClient(connectionString);
    var containerClient = blobServiceClient.GetBlobContainerClient("test");

    await foreach (var blobItem in containerClient.GetBlobsAsync(BlobTraits.Metadata, BlobStates.None, "ds_measuringpoint.json"))
    {
        // GetBlockBlobClient is an extension method from Azure.Storage.Blobs.Specialized.
        var blobClient = containerClient.GetBlockBlobClient(blobItem.Name);
        var options = new BlobQueryOptions
        {
            InputTextConfiguration = new BlobQueryJsonTextOptions(),
            OutputTextConfiguration = new BlobQueryJsonTextOptions()
        };

        var result = await blobClient.QueryAsync(@"SELECT * FROM BlobStorage WHERE measuringpointid = 547", options);
        var jsonString = await new StreamReader(result.Value.Content).ReadToEndAsync();

        Console.WriteLine(jsonString);
        Console.ReadLine();
    }
}
After looking everywhere and testing almost all variations of the ADLS query for .NET, Microsoft support mentioned that Azure.Storage.Blobs version 12.10 is a broken version and that we had to downgrade to 12.8.0.
Downgrading the package to 12.8.0 worked.
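If you manage packages with the .NET CLI, pinning to the working version looks like this:
dotnet add package Azure.Storage.Blobs --version 12.8.0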

Managing Azure blobs using React Native

I want to update images on Azure Blob Storage. I downloaded @azure/storage-blob and @azure/identity, but even after all these downloads I am getting errors, and after I downloaded @azure/logger it showed the error
"exit with node 1"
The code is as below.
var AzureStorage = require('azure-storage');

const account = { name: "x", sas: "x" };
var blobUri = 'https://' + account.name + '.blob.core.windows.net';
var blobService = AzureStorage.Blob.createBlobServiceWithSas(blobUri, account.sas);
console.log(AzureStorage);

blobService.createBlockBlobFromBrowserFile('aic', "task1", data.sampleImgData, function(error, result, response) {
    finishedOrError = true;
    if (error) {
        console.log(error);
    }
});
I am using version 0.61 of React Native. Please let me know the solution if you have one. Thanks in advance.
There are some mistakes in your description and code, as below.
The latest package is @azure/storage-blob (version >= 10), but it does not include the function createBlockBlobFromBrowserFile, which belongs to the V2 SDK. There is a similar SO thread, Upload BlockBlob to Azure Storage using React, which you can refer to.
Your current code seems to be from the sample Azure Storage JavaScript Client Library Sample for Blob Operations, but you should get the variable AzureStorage via <script src="azure-storage.blob.js"></script> in an HTML page, not via var AzureStorage = require('azure-storage'); on a Node server.
If you want to use the latest SDK @azure/storage-blob in the browser, please see the sample code at https://github.com/Azure/azure-sdk-for-js/blob/master/sdk/storage/storage-blob/samples/browserSamples/largeFileUploads.js.
So, first of all, you should choose an SDK version as the solution for your current needs.
I have uploaded an image to an Azure blob using the code listed below. I used an image picker to get the image, and after that:
{
    let blobUri = `https://containername.blob.core.windows.net`;
    let sas = "generate SAS token from storage account and give expiry limit of 1 year";

    let response = RNFetchBlob.fetch('PUT', `${blobUri}/aic/${data.sampleImgData.uri.fileName}?${sas}`, {
        'x-ms-blob-type': 'BlockBlob',
        'content-type': 'application/octet-stream',
        'x-ms-blob-content-type': data.sampleImgData.uri.type,
    }, data.sampleImgData.uri.data);
    response.then(res => console.log(res)).catch(err => console.log(err));

    console.log(`${blobUri}/aic/${data.sampleImgData.uri.fileName}`);
    azureimageurl = `${blobUri}/aic/${data.sampleImgData.uri.fileName}`;
}

Duplicating the file upload process - ASP.NET Web API

I created a Web API which allows users to send files and upload them to Azure Storage. The way it works is that the client app connects to the API to send one or more files to the file upload controller, and the controller takes care of the rest, such as:
Upload the file to Azure Storage
Update the database
It works great, but I don't think it is the right way to do this, because there are now two different processes:
Upload the file from the client's file system to my Web API (server)
Upload the file to Azure Storage from the API (server)
It gives me the feeling that I am duplicating the upload process, as the same file first travels from the client (file system) to the API (server) and then on to Azure (destination). I feel I'd need to show the client two progress bars for the upload (from client to server, then from server to Azure), and that just doesn't make sense to me, so I feel that my approach is incorrect.
My API accepts files of up to 250 MB, so you can imagine the load.
What do you guys think?
//// API Controller
if (!Request.Content.IsMimeMultipartContent("form-data"))
{
    throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
}

var provider = new RestrictiveMultipartMemoryStreamProvider();
var contents = await Request.Content.ReadAsMultipartAsync(provider);
int Total_Files = contents.Contents.Count();

foreach (HttpContent ctnt in contents.Contents)
{
    await storageManager.AddBlob(ctnt);
}
////// Stream
#region StreamHelper
public class RestrictiveMultipartMemoryStreamProvider : MultipartMemoryStreamProvider
{
    public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
    {
        // Only buffer files with an allowed extension; everything else is discarded.
        var extensions = new[] { "pdf", "doc", "docx", "cab", "zip" };
        var filename = headers.ContentDisposition.FileName.Replace("\"", string.Empty);
        if (filename.IndexOf('.') < 0)
            return Stream.Null;
        var extension = filename.Split('.').Last();
        return extensions.Any(i => i.Equals(extension, StringComparison.InvariantCultureIgnoreCase))
            ? base.GetStream(parent, headers)
            : Stream.Null;
    }
}
#endregion StreamHelper
///// AddBlob
public async Task<string> AddBlob(HttpContent _Payload)
{
    CloudStorageAccount cloudStorageAccount = KeyVault.AzureStorage.GetConnectionString();
    CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
    // Note: container names must be lowercase.
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("somecontainer");
    await cloudBlobContainer.CreateIfNotExistsAsync();
    try
    {
        // Await instead of .Result to avoid blocking the request thread.
        byte[] fileContentBytes = await _Payload.ReadAsByteArrayAsync();
        CloudBlockBlob blob = cloudBlobContainer.GetBlockBlobReference("SomeBlob");
        blob.Properties.ContentType = _Payload.Headers.ContentType.MediaType;
        await blob.UploadFromByteArrayAsync(fileContentBytes, 0, fileContentBytes.Length);

        var B = await blob.CreateSnapshotAsync();
        await B.FetchAttributesAsync();
        return "Snapshot ETAG: " + B.Properties.ETag.Replace("\"", "");
    }
    catch (Exception X)
    {
        return "Error: " + X.Message;
    }
}
It gives me the feeling that I am duplicating the upload process as
the same file first travels to API (server) and then Azure
(destination) from the client (file system).
I think you're correct. One possible solution would be to have your API generate a Shared Access Signature (SAS) token and return that SAS token/URI to the client whenever a client wishes to upload a file.
Using this SAS URI, your client can upload the file directly to Azure Storage without sending it to your API first. Once the file has been uploaded successfully, the client can send a message to the API to update the database. A rough sketch of such an endpoint follows.
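As a minimal sketch, assuming the classic storage SDK already used in the question (the route, container name, and expiry below are illustrative assumptions, not a definitive implementation):
// Hypothetical Web API action: returns a short-lived, write-only SAS URI
// that the client can PUT the file to directly. Names are illustrative.
[HttpGet]
[Route("api/uploads/sas")]
public IHttpActionResult GetUploadSas(string fileName)
{
    CloudStorageAccount cloudStorageAccount = KeyVault.AzureStorage.GetConnectionString();
    CloudBlobContainer container = cloudStorageAccount.CreateCloudBlobClient().GetContainerReference("uploads");
    CloudBlockBlob blob = container.GetBlockBlobReference(fileName);

    string sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Write,
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
    });

    // The client PUTs the file to this URI, then calls back to update the database.
    return Ok(blob.Uri.AbsoluteUri + sasToken);
}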
You can read more about SAS here: https://learn.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1.
I have also written a blog post on using SAS a long time back that you may find useful: https://gauravmantri.com/2013/02/13/revisiting-windows-azure-shared-access-signature/.

Why can't I download an Azure blob using an ASP.NET Core application published to an Azure server?

I am trying to download a Blob from an Azure storage account container. When I run the application locally, I get the correct "Download" folder C:\Users\xxxx\Downloads. When I publish the application to Azure and try to download the file, I get an error. I have tried various "Knownfolders", and some return empty strings, others return the folders on the Azure server. I am able to upload files fine, list the files in a container, but am struggling with downloading a file.
string conn = configuration.GetValue<string>("AppSettings:AzureContainerConn");
CloudStorageAccount storageAcct = CloudStorageAccount.Parse(conn);
CloudBlobClient blobClient = storageAcct.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(containerName);

Uri uriObj = new Uri(uri);
string filename = Path.GetFileName(uriObj.LocalPath);

// get block blob reference
CloudBlockBlob blockBlob = container.GetBlockBlobReference(filename);
Stream blobStream = await blockBlob.OpenReadAsync();

string _filepath = _knownfolder.Path + "\\projectfiles\\";
Directory.CreateDirectory(_filepath);
_filepath = _filepath + filename;

Stream _file = new MemoryStream();
try
{
    _file = File.Open(_filepath, FileMode.Create, FileAccess.Write);
    await blobStream.CopyToAsync(_file);
}
finally
{
    _file.Dispose();
}
The expected end result is that the file ends up in the user's "Downloads" folder.
Since you're talking about publishing to Azure, the code is probably from a web application, right? And the code for the web application runs on the server. Which means the code is trying to download the blob to the server running the web application.
To present a download link to the user so they can download the file, use the FileStreamResult, which
Represents an ActionResult that when executed will write a file from a stream to the response.
A (pseudo code) example:
[HttpGet]
public FileStreamResult GetFile()
{
    var stream = new MemoryStream();
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(filename);
    blockBlob.DownloadToStream(stream);
    // Rewind the stream before handing it to the response.
    stream.Seek(0, SeekOrigin.Begin);

    return new FileStreamResult(stream, new MediaTypeHeaderValue("text/plain"))
    {
        FileDownloadName = "someFile.txt"
    };
}

What is the best way to upload a large number of files to Azure File Storage?

I want File Storage specifically, not Blob Storage (I think). This is the code for my Azure Function, and I just have a bunch of stuff in my node_modules folder.
What I would like to do is zip the entire app, upload that single archive, and have Azure unpack it in a given folder. Is this possible?
Right now I'm essentially iterating over all of my files and calling:
var fileStream = new stream.Readable();
fileStream.push(myFileBuffer);
fileStream.push(null);

fileService.createFileFromStream('taskshare', 'taskdirectory', 'taskfile', fileStream, myFileBuffer.length, function(error, result, response) {
    if (!error) {
        // file uploaded
    }
});
And this works, it's just too slow. So I'm wondering if there is a faster way to upload a bunch of files for use in apps.
And this works, it's just too slow. So I'm wondering if there is a faster way to upload a bunch of files for use in apps.
If the Microsoft Azure Storage Data Movement Library is acceptable, please give it a try. The library is designed for high-performance uploading, downloading, and copying of Azure Storage blobs and files, and it is based on the core data movement framework that powers AzCopy.
We can also get demo code from the GitHub documentation:
string storageConnectionString = "myStorageConnectionString";
CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("mycontainer");
blobContainer.CreateIfNotExists();

string sourcePath = "path\\to\\test.txt";
CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference("myblob");

// Set the number of concurrent operations
TransferManager.Configurations.ParallelOperations = 64;

// Set up the transfer context and track the upload progress
SingleTransferContext context = new SingleTransferContext();
context.ProgressHandler = new Progress<TransferStatus>((progress) =>
{
    Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred);
});

// Upload a local file to a blob
var task = TransferManager.UploadAsync(
    sourcePath, destBlob, null, context, CancellationToken.None);
task.Wait();
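The demo above targets Blob Storage, but since the question asks about File Storage specifically, the same library can also transfer to an Azure file share. A minimal sketch under that assumption, reusing the share, directory, and file names from the question (all placeholders):
// Sketch: upload one local file to an Azure file share with the
// Data Movement Library; names mirror the question and are placeholders.
CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudFileClient fileClient = account.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("taskshare");
share.CreateIfNotExists();

CloudFileDirectory dir = share.GetRootDirectoryReference().GetDirectoryReference("taskdirectory");
dir.CreateIfNotExists();
CloudFile destFile = dir.GetFileReference("taskfile");

TransferManager.Configurations.ParallelOperations = 64;
TransferManager.UploadAsync(
    "path\\to\\test.txt", destFile, null, new SingleTransferContext(), CancellationToken.None).Wait();
For many small files, TransferManager.UploadDirectoryAsync can also transfer a whole local directory in one call, which avoids the per-file round trips of your own loop.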
