Get an Azure Blob (image) and POST to an external API - node.js

I have an app that is written and working in Google Apps Script. It grabs an image from Google Drive, converts it to a blob, and sends it to an external API via the request body.
I am converting this app to Node.js and using an Azure Storage Account to store the images now, but still calling the same external API.
In Google Apps Script, to get the Google Drive image blob to send to the external API, I use DriveApp.getFileById().getBlob(). This is not a string, but the blob itself.
What do I use in node.js for Azure blob storage?
I have tried to use getBlobToText, before understanding that's not what the API wants. It doesn't want a string, but the blob itself. I have looked at getBlobToStream and others as well, but none of them seem to actually get the blob itself.
I have read many, many Stack Overflow Q&As, along with many other articles and sites, but can't find any talking about using a downloaded Azure blob to send to an external API.

With the Azure SDK, you can download the blob as a stream or a string; that is how it works in enterprise applications as well. You can download a blob like below:
// Get blob content from position 0 to the end
// In Node.js, get downloaded data by accessing downloadBlockBlobResponse.readableStreamBody
// In browsers, get downloaded data by accessing downloadBlockBlobResponse.blobBody
const downloadBlockBlobResponse = await blockBlobClient.download(0);
console.log('\nDownloaded blob content...');
console.log('\t', await streamToString(downloadBlockBlobResponse.readableStreamBody));
Helper function to convert to string:
// A helper function used to read a Node.js readable stream into a string
async function streamToString(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data.toString());
    });
    readableStream.on("end", () => {
      resolve(chunks.join(""));
    });
    readableStream.on("error", reject);
  });
}
You can then send this response to your external API.
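Note that joining string chunks can corrupt binary data such as an image, so if the external API expects raw bytes, a Buffer-based variation may be closer to what the question asks. A minimal sketch, reusing the blockBlobClient from above (the API URL and content type are placeholders, and axios is just one option):

const axios = require("axios");

// A helper that reads a Node.js readable stream into a Buffer, preserving binary data
async function streamToBuffer(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => resolve(Buffer.concat(chunks)));
    readableStream.on("error", reject);
  });
}

async function sendBlobToApi() {
  const downloadResponse = await blockBlobClient.download(0);
  const imageBuffer = await streamToBuffer(downloadResponse.readableStreamBody);

  // POST the raw bytes in the request body (URL and content type are placeholders)
  await axios.post("https://external-api.example.com/upload", imageBuffer, {
    headers: { "Content-Type": "image/jpeg" }
  });
}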
I am not sure about your requirements, but this may not be the right approach, as it raises security concerns. You could instead grant the external API a Reader role on your Azure Blob storage via RBAC and let it read the blob directly from storage.

Related

Firebase Storage Admin SDK: upload from image url

In the web SDK of Firebase Storage you can upload an image from Blob data. I have a Node.js app and want to upload images from a URL to my storage bucket. In the docs it's recommended to use the Admin SDK if running in a Node server environment, but I cannot find this feature in the Firebase Storage Admin documentation.
Here's my code:
const admin = require('firebase-admin');
const serviceAccount = require(`./credentials/my-keyfile.json`);
const app = admin.initializeApp({
  credential: admin.credential.cert(serviceAccount)
});
const storage = app.storage();

// Get Blob data from an external image URL
const axios = require("axios");
const getBlobFromUrl = async (url) => {
  const response = await axios.get(url, { responseType: "blob" });
  return response.data;
}
// note: this returns a Promise and must be awaited before use
const blobData = getBlobFromUrl("https://images.unsplash.com/photo-1591476315349-faa1c8e6f567?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max")
// MY QUESTION -> how can I upload that blob data to my storage?
I was recently trying to do something similar, writing an in-memory file from a Node server (with the Firebase Admin SDK) to Firebase Storage.
I solved it by realizing that I needed to use Google Cloud Storage's Node API to do so (mentioned in the docs). This is the relevant doc page to do streaming transfers with Google Cloud Storage: https://cloud.google.com/storage/docs/streaming#storage-stream-upload-object-nodejs.
Here are the basic steps:
Set the appropriate storageBucket address (e.g. gs://APP-NAME.appspot.com/), which you can get from your Firebase Storage console. Also, get the appropriate Firebase storage and bucket JS objects. Follow the example at https://firebase.google.com/docs/storage/admin/start
Get the stream of byte data from your blob/file: https://developer.mozilla.org/en-US/docs/Web/API/Blob/stream
Use Google Cloud Storage's API to pipe/write that stream to a specified location in your bucket: https://cloud.google.com/storage/docs/streaming#storage-stream-upload-object-nodejs
So, following the docs above, you'd do something like:
// Create a reference to a file object.
// myBucket is the bucket obtained in step 1 above
const file = myBucket.file("directoryName/imageName.png");

// step 2: get ReadableStream from Blob
const stream = blobData.stream();

// step 3: asynchronously pipe/write to file in cloud
async function streamFileUpload() {
  stream.pipe(file.createWriteStream()).on('finish', () => {
    // The file upload is complete
  });
}
streamFileUpload().catch(console.error);
After this you'll see your file uploaded to Firebase Storage under directoryName/imageName.png.
The first link you have posted on uploading an image with Blob data contains information on how you would do it with the client SDK.
When you use the Admin SDK, the use case is for the backend, for example Cloud Functions; what you have shown in your post applies to that use case.
For implementations on how to upload an image using the client SDK, check out some of the Quickstart guides as well as the code labs.
If you're working primarily with Blobs, you can check out libraries like busboy and send HTTP requests instead.
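For instance, a minimal busboy sketch that collects an uploaded file into a Buffer in a plain Node.js HTTP server (the port and the storage call in the comment are illustrative):

const http = require("http");
const busboy = require("busboy");

http.createServer((req, res) => {
  const bb = busboy({ headers: req.headers });
  bb.on("file", (name, file, info) => {
    const chunks = [];
    file.on("data", (chunk) => chunks.push(chunk));
    file.on("end", () => {
      const buffer = Buffer.concat(chunks);
      // upload `buffer` to storage here, e.g. bucket.file(info.filename).save(buffer)
    });
  });
  bb.on("close", () => res.end("done"));
  req.pipe(bb);
}).listen(3000);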

Serve private files directly from Azure Blob Storage

My web app allows users to upload files.
I want to use Azure Blob Storage (cloud) for this.
Since downloading will be very frequent (more so than uploading), I would like to save server computing time and bandwidth and serve files directly from the Azure blob server.
I believe this is possible on Google Cloud with Firebase (Firebase Storage), where you can upload and download directly from the client. (I know authentication and authorization are also managed by Firebase, so it makes things easier.)
Does any similar mechanisms/service exist on Azure?
For example
When a user clicks an Azure Storage download link, a trigger would check the JWT for authorization, and data would be sent directly to the client from Azure Storage.
A similar option is available with Azure Blob Storage as well. You can use the Storage SDK to access the containers and list/download the blobs.
With a JavaScript backend, you can use a SAS token; the Azure Storage JavaScript Client Library also supports creating a BlobService based on a storage account key for authentication, besides a SAS token. However, for security reasons, use a limited-time SAS token generated by a backend web server using a Stored Access Policy.
Example here
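As an illustration, a minimal sketch of generating such a short-lived, read-only SAS token on the backend with the @azure/storage-blob package (the account name, key, container, and blob name are placeholders):

const {
  BlobSASPermissions,
  generateBlobSASQueryParameters,
  StorageSharedKeyCredential
} = require("@azure/storage-blob");

const credential = new StorageSharedKeyCredential("myaccount", "my account key");
const sasToken = generateBlobSASQueryParameters({
  containerName: "uploads",
  blobName: "photo.jpg",
  permissions: BlobSASPermissions.parse("r"),        // read-only
  expiresOn: new Date(Date.now() + 15 * 60 * 1000)   // valid for 15 minutes
}, credential).toString();

// The client can download directly from Azure Storage with this URL
const downloadUrl = `https://myaccount.blob.core.windows.net/uploads/photo.jpg?${sasToken}`;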
EDIT:
I have not answered the question completely above. However, if you want to access blob storage or download any files from it, you can make a normal HTTP GET request with a SAS token from any JavaScript application.
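For example (a sketch; the blob URL and SAS token are placeholders):

const response = await fetch(`https://myaccount.blob.core.windows.net/uploads/photo.jpg?${sasToken}`);
const imageBlob = await response.blob();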
With Angular:
uploadToBLob(files) {
  let formData: FormData = new FormData();
  formData.append("asset", files[0], files[0].name);
  this.http.post(this.baseUrl + 'insertfile', formData)
    .subscribe(result => console.log(result));
}

downloadFile(fileName: string) {
  return this.http.get(this.baseUrl + 'DownloadBlob/' + fileName, { responseType: "blob" })
    .subscribe((result: any) => {
      if (result) {
        var blob = new Blob([result]);
        let saveAs = require('file-saver');
        let file = fileName;
        saveAs(blob, file);
        this.fileDownloadInitiated = false;
      }
    }, err => this.errorMessage = err);
}
However, the best practice (considering security) is to have a backend API or Azure Function handle the file upload.
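A minimal sketch of such a backend endpoint, assuming Express, multer, and @azure/storage-blob (the route and form field name match the Angular snippet above; the container name and connection string are placeholders):

const express = require("express");
const multer = require("multer");
const { BlobServiceClient } = require("@azure/storage-blob");

const app = express();
const upload = multer({ storage: multer.memoryStorage() });
const containerClient = BlobServiceClient
  .fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING)
  .getContainerClient("uploads");

// Receives the "asset" form field posted by uploadToBLob() above
app.post("/insertfile", upload.single("asset"), async (req, res) => {
  const blockBlobClient = containerClient.getBlockBlobClient(req.file.originalname);
  await blockBlobClient.uploadData(req.file.buffer);
  res.json({ uploaded: req.file.originalname });
});

app.listen(3000);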

How do I pass-through non-multipart file content from a client to Blob Storage without buffering it in a Web API app?

In my Web API, I need to receive files and then save them in Blob Storage. Clients are not allowed to access the Blob Storage and are not aware of it.
I'm trying to avoid buffering files, which could be up to 300 MB in size. I've seen this post...
How do I pass a Stream from a Web API to Azure Blob Storage without temp files?
... but the solution described in the post is not going to work for me because it assumes multipart content, which in turn allows for custom providers.
Clients that I need to deal with are not sending files using multipart content. Instead, they simply send file content in message bodies.
Here is what works for me now (with buffering):
using (var inStream = await this.Request.Content.ReadAsStreamAsync())
{
    var blob = container.GetBlockBlobReference(fileName);
    var outStream = await blob.OpenWriteAsync();
    await inStream.CopyToAsync(outStream);
    outStream.Close();
}
Is there a way to connect a Request's Stream with a Blob Stream without the former being buffered?

How to upload an image in BLOB format to Firebase Storage?

I have an image in the SQLite database as a BLOB type. I want to upload that image into Firebase Storage, get the download URL and dimensions of the image, and add them to the Firebase Realtime Database. I've successfully uploaded an image from the local drive with bucket.upload, but I have no idea how I can upload a blob.
I am using the Admin SDK on the server side (Node.js); there is a "put" function, but that's in the client-side SDK.
bucket.upload("./file.jpeg", options).then(result => {
  const file = result[0];
  console.log(file)
}).then(results => {
  console.log(results);
}).catch(error => {
  console.error(error);
});
You have to use streaming transfers if you want to send a blob directly. There is a third-party library providing this functionality: https://cloud.google.com/storage/docs/boto-plugin#streaming-transfers
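Alternatively, since the Admin SDK's bucket object exposes the Google Cloud Storage Node.js API, a Buffer can be written directly with file.save(). A minimal sketch, assuming the SQLite BLOB has been read into a Node.js Buffer (the destination path and content type are assumptions):

// Assuming `imageBuffer` is the BLOB column read from SQLite as a Buffer
const file = bucket.file("images/photo.jpeg");
await file.save(imageBuffer, {
  contentType: "image/jpeg",
  resumable: false // simpler for small, in-memory uploads
});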

Upload file to Azure Blob Storage directly from browser?

Is it possible to create an HTML form to allow web users to upload files directly to Azure Blob Storage without using another server as an intermediary? S3 and GAE Blobstore both allow this, but I can't find any support for Azure Blob Storage.
EDIT November 2019
You can now refer to the official documentation:
Azure Storage JavaScript Client Library Sample for Blob Operations
Azure Storage client library for JavaScript
Initial answer
There is a New Azure Storage JavaScript client library for browsers (Preview).
(Everything from this post comes from the original article above)
The JavaScript Client Library for Azure Storage enables many web development scenarios using storage services like Blob, Table, Queue, and File, and is compatible with modern browsers.
The new JavaScript Client Library for Browsers supports all the storage features available in the latest REST API version 2016-05-31, since it is built with Browserify using the Azure Storage Client Library for Node.js.
We highly recommend use of SAS tokens to authenticate with Azure Storage since the JavaScript Client Library will expose the authentication token to the user in the browser. A SAS token with limited scope and time is highly recommended. In an ideal web application it is expected that the backend application will authenticate users when they log on, and will then provide a SAS token to the client for authorizing access to the Storage account. This removes the need to authenticate using an account key. Check out the Azure Function sample in our Github repository that generates a SAS token upon an HTTP POST request.
Code sample:
Insert the following script tags in your HTML code. Make sure the JavaScript files are located in the same folder.
<script src="azure-storage.common.js"></script>
<script src="azure-storage.blob.js"></script>
Let’s now add a few items to the page to initiate the transfer. Add the following tags inside the BODY tag. Notice that the button calls the uploadBlobFromText method when clicked. We will define this method in the next step.
<input type="text" id="text" name="text" value="Hello World!" />
<button id="upload-button" onclick="uploadBlobFromText()">Upload</button>
So far, we have included the client library and added the HTML code to show the user a text input and a button to initiate the transfer. When the user clicks on the upload button, uploadBlobFromText will be called. Let’s define that now:
<script>
function uploadBlobFromText() {
  // your account and SAS information
  var sasKey = "....";
  var blobUri = "http://<accountname>.blob.core.windows.net";
  var blobService = AzureStorage.createBlobServiceWithSas(blobUri, sasKey).withFilter(new AzureStorage.ExponentialRetryPolicyFilter());

  var text = document.getElementById('text');
  var btn = document.getElementById("upload-button");

  blobService.createBlockBlobFromText('mycontainer', 'myblob', text.value, function(error, result, response){
    if (error) {
      alert('Upload failed, open browser console for more detailed info.');
      console.log(error);
    } else {
      alert('Upload succeeded!');
    }
  });
}
</script>
Do take a look at these blog posts for uploading files directly from the browser to blob storage:
http://coderead.wordpress.com/2012/11/21/uploading-files-directly-to-blob-storage-from-the-browser/
http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript
The 2nd post (written by me) makes use of HTML 5 File API and thus would not work in all browsers.
The basic idea is to create a Shared Access Signature (SAS) for a blob container. The SAS should have Write permission. Since Windows Azure Blob Storage does not support CORS yet (which is supported by both Amazon S3 and Google), you would need to host the HTML page in the blob storage where you want your users to upload the file. Then you can use jQuery's Ajax functionality.
Now that Windows Azure storage services support CORS, you can do this. You can see the announcement here: Windows Azure Storage Release - Introducing CORS, JSON, Minute Metrics, and More.
I have a simple example that illustrates this scenario here: http://www.contentmaster.com/azure/windows-azure-storage-cors/
The example shows how to upload and download directly from a private blob using jQuery.ajax. This example still requires a server component to generate the shared access signature: this avoids the need to expose the storage account key in the client code.
You can use the HTML5 File API, AJAX, and MVC 3 to build a robust file upload control that uploads huge files securely and reliably to Windows Azure Blob Storage, with support for monitoring operation progress and cancelling the operation. The solution works as below:
Client-side JavaScript that accepts and processes a file uploaded by the user.
Server-side code that processes file chunks sent by JavaScript.
Client-side UI that invokes JavaScript.
Get the sample code here: Reliable Uploads to Windows Azure Blob Storage via an HTML5 Control
I have written a blog post with an example on how to do this; the code is on GitHub.
It is based on Gaurav Mantri's post and works by hosting the JavaScript on the Blob Storage itself.
Configure a proper CORS rule on your storage account.
Generate a Shared Access Signature from your target container.
Install the blob storage SDK: npm install @azure/storage-blob.
Assuming your file is a Blob/Buffer/ArrayBuffer, you can do something like this in your code:
import { ContainerClient } from "@azure/storage-blob";

const account = "your storage account name";
const container = "your container name";
const sas = "your shared access signature";

const containerClient = new ContainerClient(
  `https://${account}.blob.core.windows.net/${container}${sas}`
);

async function upload(fileName, file) {
  const blockBlobClient = containerClient.getBlockBlobClient(fileName);
  const result = await blockBlobClient.uploadData(file);
  console.log("uploaded", result);
}
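As a usage sketch in the browser (the input element id is an assumption): a File picked by the user is a Blob, so it can be passed to uploadData directly:

document.querySelector("#file-input").addEventListener("change", (event) => {
  const file = event.target.files[0];
  upload(file.name, file).catch(console.error);
});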
