Managing Azure Blob Storage using React Native

I want to update images on Azure Blob Storage. I installed @azure/storage-blob and @azure/identity, but even after all these installs I am getting errors, and after I also installed @azure/logger it showed the error
"exit with node 1"
The code is as below.
var AzureStorage = require('azure-storage'); // V2 SDK
const account = { name: "x", sas: "x" };
var blobUri = 'https://' + account.name + '.blob.core.windows.net';
var blobService = AzureStorage.Blob.createBlobServiceWithSas(blobUri, account.sas);
console.log(AzureStorage);
blobService.createBlockBlobFromBrowserFile('aic', "task1", data.sampleImgData, function (error, result, response) {
  finishedOrError = true;
  if (error) {
    console.log(error);
  }
});
I am using React Native 0.61. Please let me know the solution if you have one. Thanks in advance.

There are some mistakes in your description and code, as below.
The latest package is @azure/storage-blob (version >= 10), but it does not include the function createBlockBlobFromBrowserFile, which belongs to the V2 SDK. There is a similar SO thread, Upload BlockBlob to Azure Storage using React, which you can refer to.
Your current code seems to come from the sample Azure Storage JavaScript Client Library Sample for Blob Operations, but there the variable AzureStorage is obtained via <script src="azure-storage.blob.js"></script> in an HTML page, not via var AzureStorage = require('azure-storage') on a Node server.
If you want to use the latest SDK @azure/storage-blob in the browser, please see the sample code https://github.com/Azure/azure-sdk-for-js/blob/master/sdk/storage/storage-blob/samples/browserSamples/largeFileUploads.js.
So first of all, you should choose one SDK version as the basis for your solution.
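Whichever SDK version you pick, both ultimately issue the same "Put Blob" REST request, so it can help to see what is actually sent. Below is a minimal sketch of building that request; putBlobRequest is a hypothetical helper of mine, and the account/container/SAS values are placeholders:

```javascript
// Builds the URL and headers for a "Put Blob" REST call to Azure Blob Storage.
// putBlobRequest is a hypothetical helper; all argument values are placeholders.
function putBlobRequest(account, container, blobName, sas, contentType) {
  return {
    method: 'PUT',
    url: `https://${account}.blob.core.windows.net/${container}/${encodeURIComponent(blobName)}?${sas}`,
    headers: {
      'x-ms-blob-type': 'BlockBlob',         // required for block blob uploads
      'x-ms-blob-content-type': contentType  // stored as the blob's Content-Type
    }
  };
}

const req = putBlobRequest('x', 'aic', 'task1.jpg', 'sv=...&sig=...', 'image/jpeg');
console.log(req.url); // https://x.blob.core.windows.net/aic/task1.jpg?sv=...&sig=...
```

The x-ms-blob-type header is mandatory for raw REST uploads; omitting it is a common cause of 400 responses.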

I have uploaded an image to an Azure blob using the code below. I used an image picker to get the image, and then:
{
  let blobUri = `https://containername.blob.core.windows.net`;
  let sas = "generate a SAS token from the storage account and give it an expiry limit, e.g. 1 year";
  let response = RNFetchBlob.fetch('PUT', `${blobUri}/aic/${data.sampleImgData.uri.fileName}?${sas}`, {
    'x-ms-blob-type': 'BlockBlob',
    'content-type': 'application/octet-stream',
    'x-ms-blob-content-type': data.sampleImgData.uri.type,
  }, data.sampleImgData.uri.data);
  response.then(res => console.log(res)).catch(err => console.log(err));
  console.log(`${blobUri}/aic/${data.sampleImgData.uri.fileName}`);
  azureimageurl = `${blobUri}/aic/${data.sampleImgData.uri.fileName}`;
}

Related

Azure Data Lake query acceleration error - One or more errors occurred. (XML specified is not syntactically valid.)

I am trying to filter data from an Azure storage account using an ADLS query, with Azure Data Lake Storage Gen2. I am not able to filter the data and get the result. I've been stuck on this issue; even Microsoft support has not been able to crack it. Any help is greatly appreciated.
Tutorial Link: https://www.c-sharpcorner.com/article/azure-data-lake-storage-gen2-query-acceleration/
Solution - .Net Core 3.1 Console App
Error: One or more errors occurred. (XML specified is not syntactically valid.)
Status: 400 (XML specified is not syntactically valid.)
private static async Task MainAsync()
{
    var connectionString = "DefaultEndpointsProtocol=https;AccountName=gfsdlstestgen2;AccountKey=0AOkFckONVYkTh9Kpr/VRozBrhWYrLoH7y0mW5wrw==;EndpointSuffix=core.windows.net";
    var blobServiceClient = new BlobServiceClient(connectionString);
    var containerClient = blobServiceClient.GetBlobContainerClient("test");
    await foreach (var blobItem in containerClient.GetBlobsAsync(BlobTraits.Metadata, BlobStates.None, "ds_measuringpoint.json"))
    {
        var blobClient = containerClient.GetBlockBlobClient(blobItem.Name);
        var options = new BlobQueryOptions
        {
            InputTextConfiguration = new BlobQueryJsonTextOptions(),
            OutputTextConfiguration = new BlobQueryJsonTextOptions()
        };
        var result = await blobClient.QueryAsync(@"SELECT * FROM BlobStorage WHERE measuringpointid = 547", options);
        var jsonString = await new StreamReader(result.Value.Content).ReadToEndAsync();
        Console.WriteLine(jsonString);
        Console.ReadLine();
    }
}
After looking everywhere and testing almost all variations of the ADLS query for .NET, Microsoft support mentioned:
Azure.Storage.Blobs version 12.10 is a broken version. We had to downgrade to 12.8.0.
Downgrading this package to 12.8.0 worked.
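If you run into the same behaviour, the workaround is simply pinning the package version in the project file; a sketch of the relevant csproj fragment (assuming the .NET Core 3.1 console app setup from the question):

```xml
<ItemGroup>
  <!-- 12.10 was reported broken for query acceleration; pin to 12.8.0 as per the answer -->
  <PackageReference Include="Azure.Storage.Blobs" Version="12.8.0" />
</ItemGroup>
```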

Firebase Storage and Cloud functions: how to load content from url and save into storage?

I have a Realtime Database where a user can write the URL of an image.
On create/update I am already able to trigger a Cloud Function that reads the image URL from the Realtime Database.
I now need to download the image from the web (so it's not an upload) and save it to Firebase Storage.
I cannot find a single example of fetching a web resource and storing it in Firebase Storage.
Can you please point me to the right solution?
My idea was to react to the create/update of the URL in the DB, then fetch the resource (can I use the node-fetch npm package?) and then save the fetched content into the storage bucket, using the URL as the key.
But fetch + save the fetched data is what I am not able to do right now.
Before someone closes this question because it is 'off-topic', I'll write my own solution.
const axios = require('axios');
const bucket = admin.storage().bucket(); // firebase-admin is initialized elsewhere

const response = await axios.post(BASE_URL, data_to_post, config);
console.log("Response.status", response.status);
const cache_file_name = `page-cache/page-${pageNumber}.html`;
const cache_file_options = {
  metadata: {
    contentType: 'text/html'
  }
};
const cache_file = bucket.file(cache_file_name);
await cache_file.save(response.data, cache_file_options);
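The question also proposed using the source URL as the storage key; a raw URL is not a safe object name as-is, so it needs sanitizing first. A minimal sketch with a hypothetical helper (the name and replacement rules are my own assumptions):

```javascript
// Turns an arbitrary URL into a Storage-safe object path (hypothetical helper).
function storageKeyForUrl(url, prefix = 'remote-images') {
  // Strip the scheme and replace characters that are awkward in object names.
  const safe = url
    .replace(/^https?:\/\//, '')
    .replace(/[^a-zA-Z0-9._-]+/g, '_');
  return `${prefix}/${safe}`;
}

console.log(storageKeyForUrl('https://example.com/cat.png?size=big'));
// remote-images/example.com_cat.png_size_big
```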

Using Firebase parameters with Google Cloud Storage under node.js

There is no Node.js Firebase Storage client at the moment (too bad...), so I'm turning to gcloud-node with the parameters found in the Firebase console.
I'm trying:
var firebase = require('firebase');
var gcloud = require('gcloud')({
  keyFilename: process.env.FB_JSON_PATH,
  projectId: process.env.FB_PROJECT_ID
});
firebase.initializeApp({
  serviceAccount: process.env.FB_JSON_PATH,
  databaseURL: process.env.FB_DATABASE_URL
});
var fb = firebase.database().ref();
var gcs = gcloud.storage();
var bucket = gcs.bucket(process.env.FB_PROJECT_ID);
bucket.exists(function(err, exists) {
  console.log('err', err);
  console.log('exists', exists);
});
Where:
FB_JSON_PATH is the path to the JSON file generated in order to use the Firebase Server SDK
FB_DATABASE_URL is something like https://app-a36e5.firebaseio.com/
FB_PROJECT_ID is the name of the Firebase project in Google's console: "app-a36e5"
The id of the bucket is FB_PROJECT_ID (in Firebase's console the Storage tab displays gs://app-a36e5.appspot.com)
When I run this code I get :
err null
exists false
But no other errors.
I'm expecting exists true at least.
Some additional info : I can query the database (so I imagine the JSON file is correct), and I have set the storage rules as follow :
service firebase.storage {
  match /b/app-a36e5.appspot.com/o {
    match /{allPaths=**} {
      allow read: if true;
      allow write: if request.auth != null;
    }
  }
}
So that everything on the storage is readable.
Any ideas how to get it to work ? Thank you.
The issue here is that you aren't naming your storage bucket correctly. The bucket initialization should be:
var bucket = gcs.bucket('app-a36e5.appspot.com'); // full name of the bucket includes the .appspot.com
I would assume that process.env.FB_PROJECT_ID is just the project id, and you'd need the full bucket name, not just the project id (the bucket name is likely process.env.FB_PROJECT_ID + '.appspot.com').
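That convention can be captured in a one-line helper so the bucket lookup never uses the bare project id; defaultBucketName is my own name for it:

```javascript
// Default Firebase Storage bucket for a project: "<projectId>.appspot.com".
function defaultBucketName(projectId) {
  return `${projectId}.appspot.com`;
}

console.log(defaultBucketName('app-a36e5')); // app-a36e5.appspot.com
```

The result is what you would pass to gcs.bucket(...) instead of the raw project id.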
Also, sorry about not providing Storage integrated with Firebase--GCS has a high quality library that you've already found (gcloud-node), and we figured that this provides the best story for developers (Firebase for mobile, Google Cloud Platform for server side development), and didn't want to muddy the waters further.

How can I set the contentType of a blob on Azure using azure-cli?

I am trying to update the contentType of an uploaded file or at least be able to reupload a file with the correct contentType.
In my case I am uploading css but it gives it a content type of application/octet-stream by default.
The command-line reference doesn't show how to manage the properties of a blob, as far as I can tell.
Edit
If you just create the file apparently you can use
azure storage blob create -f {file_name} -p contentType=text/css
But I still haven't found a way to edit one yet.
You can now set blob properties using az storage blob update from the Azure CLI (source on GitHub). For example, to set the Content-Type of a blob named $blob in a container named $container to text/css:
az storage blob update -c "$container" -n "$blob" --content-type text/css
Looking at the source code here, I don't think it is possible to update blob's properties using azure-cli.
If you're interested, you can use Node SDK for Azure Storage and update blob properties. For example, look at the sample code below:
var AZURE = require('azure-storage');
var blobService = AZURE.createBlobService("<account name>", "<account key>");
var container = '<blob container name>';
var blob = '<blob name>';
var newContentType = '<new content type e.g. text/css>';
blobService.getBlobProperties(container, blob, function(error, result, response) {
  if (!error) {
    // Read the current properties so they are preserved when we overwrite.
    var contentType = result.contentType;
    var cacheControl = result.cacheControl;
    var contentEncoding = result.contentEncoding;
    var contentMD5 = result.contentMD5;
    var contentLanguage = result.contentLanguage;
    var options = {
      'contentType': newContentType,
      'cacheControl': cacheControl,
      'contentEncoding': contentEncoding,
      'contentMD5': contentMD5,
      'contentLanguage': contentLanguage,
    };
    blobService.setBlobProperties(container, blob, options, function(error, result, response) {
      if (!error) {
        console.log('Properties updated successfully!');
      } else {
        console.log(error);
      }
    });
  } else {
    console.log(error);
  }
});
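For comparison, in the current v12 JavaScript SDK (@azure/storage-blob) the equivalent operation is BlobClient.setHTTPHeaders, where the property is named blobContentType. A small sketch with a hypothetical extension-to-type helper (the mapping table is my own assumption):

```javascript
// Hypothetical helper: choose the Content-Type to set from the blob name's extension.
function contentTypeFor(blobName) {
  const types = { '.css': 'text/css', '.html': 'text/html', '.js': 'application/javascript' };
  const dot = blobName.lastIndexOf('.');
  const ext = dot >= 0 ? blobName.slice(dot) : '';
  return types[ext] || 'application/octet-stream';
}

// With @azure/storage-blob v12 (not run here; requires the package and a SAS URL):
//   const { BlobClient } = require('@azure/storage-blob');
//   await new BlobClient(blobUrlWithSas).setHTTPHeaders({ blobContentType: contentTypeFor(blobName) });

console.log(contentTypeFor('site.css')); // text/css
```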

Upload photo from Windows 8 to Azure Blob Storage

The Azure SDK doesn't work in a Windows 8 app.
How can I upload a photo to Azure Blob Storage from a Windows 8 app?
I need a working code sample.
You don't need the Windows Azure SDK to upload photos from Windows 8 applications to Blob Storage. The HttpClient will also work fine:
using (var client = new HttpClient())
{
    CameraCaptureUI cameraCapture = new CameraCaptureUI();
    StorageFile media = await cameraCapture.CaptureFileAsync(CameraCaptureUIMode.PhotoOrVideo);
    using (var fileStream = await media.OpenStreamForReadAsync())
    {
        var content = new StreamContent(fileStream);
        content.Headers.Add("Content-Type", media.ContentType);
        content.Headers.Add("x-ms-blob-type", "BlockBlob");
        var uploadResponse = await client.PutAsync(
            new Uri(blobUriWithSAS), content);
    }
}
The only thing you'll need to do is get the URL of the blob together with the Shared Access Signature. Nick Harris explains how you can do this using Mobile Services.
