With Node.js, why can I read from one Azure blob container but not from another?

I have a Node.js app running as a Worker Role in Azure Cloud Services, with two blob storage containers that the app reads from when responding to certain user requests.
Here's my setup:
Using the azure-storage package to interface with my blob storage.
Two containers, each holding files of different types that the user may ask for at some point.
And I use the following code to stream the files to the HTTP response:
exports.getBlobToStream = function(containerName, fileName, res) {
    var blobService = azure.createBlobService();
    blobService.getBlobProperties(containerName, fileName, function(error, properties, status) {
        if (error || !status.isSuccessful) {
            res.header('Content-Type', "text/plain");
            res.status(404).send("File " + fileName + " not found");
        } else {
            res.header('Content-Type', properties.contentType);
            res.header('Content-Disposition', 'attachment; filename=' + fileName);
            blobService.createReadStream(containerName, fileName).pipe(res);
        }
    });
};
One important note: the app runs with the EMULATED environment variable set.
In the past I've had no issues reading from either container. While researching the problem, I found an identical (but outdated) issue on the all-encompassing azure-sdk-for-node repository: https://github.com/Azure/azure-sdk-for-node/issues/434. The solution that fixed that problem also fixed mine, but I can't understand why, particularly when I can read from the other container from within the same app, using the same code, without any issues.
I can live with the solution but want to understand what's going on. Any thoughts or suggestions?

#winsome, thanks for raising this issue to us. If you set EMULATED, the code will call the local storage emulator rather than the real storage account, and it is expected to fail. As for why one container still works under EMULATED: a guess is that your local storage emulator also has a container with the same name. Please check.
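If the goal is to always hit the real storage account regardless of the EMULATED variable, one option (a minimal sketch; the AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_ACCESS_KEY variable names are assumptions, not part of the question) is to pass the credentials to createBlobService explicitly:
var azure = require('azure-storage');

// Passing the account name and key explicitly means the client never
// falls back to the local emulator, whatever EMULATED is set to.
// These env var names are assumptions made for this sketch.
var blobService = azure.createBlobService(
    process.env.AZURE_STORAGE_ACCOUNT,
    process.env.AZURE_STORAGE_ACCESS_KEY
);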

Related

Can I avoid using live Firebase Storage when using emulators?

As I patiently wait for Firebase Storage to be added to the emulators, I was wondering: is there a way I can avoid modifying live storage files and folders when running hosting/functions in the emulator?
For example, I use the following code to delete all the files in a folder. Last night someone accidentally deleted all the documents in our emulator as part of a test, and it deleted all the LIVE storage folders, since we use an import of real documents in our emulator 🤦
async function deleteStorageFolder(path: string) {
  const bucket = admin.storage().bucket();
  return bucket.deleteFiles({
    prefix: path
  });
}
Is there any way I can tell Firebase to avoid using the production storage APIs when the emulators are running?
I have used the following condition in my function to prevent calls to the Firebase Storage API when running in the emulator:
if (process.env.FUNCTIONS_EMULATOR === "true") {
    console.log(`Running in emulator, won't call firebase storage`)
} else {
    // Code goes here to run storage APIs
}
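Applied to the delete helper from the question, the guard might look like this (a sketch; the early return and log message are illustrative):
async function deleteStorageFolder(path: string) {
  // Skip the live storage call entirely when running under the emulator.
  if (process.env.FUNCTIONS_EMULATOR === "true") {
    console.log(`Running in emulator, won't delete ${path}`);
    return;
  }
  const bucket = admin.storage().bucket();
  return bucket.deleteFiles({ prefix: path });
}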

Read content from Azure blob storage in a Node API

I am new to Azure and working on the storage account for one of my applications. Basically, I have JSON files stored in Azure blob storage.
I want to read the data from these files in a Node.js application and do some filtering on the data, which is eventually exposed as a secured REST endpoint so the UI/client can view the data in the HTTP response.
I have gone through the docs about the different operations on blob storage exposed by the Node SDK; they can be found at the link below:
https://github.com/Azure/azure-storage-node
But the question I have is: how do I read the JSON files? I see one method, getBlobToStream. Is this going to give me the JSON content in the callback, so that I can do further processing on the data and send it as a response to the clients who requested it?
Please can someone explain how to do this in a better way, or is this the only option we have?
Thanks for the help.
To use getBlobToStream, you have to define a writable stream.
So I recommend you use getBlobToText to avoid trouble.
If no error occurs, this method will return the blob content as text in the callback. You can then parse it as JSON. A simple example is below.
blobService.getBlobToText(container, blobname, function(error, text) {
    if (error) {
        console.error(error);
        res.status(500).send('Fail to download blob');
    } else {
        var data = JSON.parse(text);
        res.status(200).send('Filtered Data you want to send back');
    }
});
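If the blob holds a JSON array, the filtering step might look like the sketch below (the "active" field is an assumption; substitute your real schema):
var data = JSON.parse(text);
// Illustrative filter, assuming each record carries an "active" flag.
var filtered = data.filter(function(item) {
    return item.active === true;
});
res.status(200).json(filtered);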

GCloud storage download incomplete on GCE with NodeJS

I'm using gcloud version 0.18.0. On my dev PC, files download from Google Storage just fine. But once the script runs on a Google Compute Engine instance, the download starts and reports that it's complete, but it isn't. It's a log file, and when I check the downloaded file, only the beginning of it is there.
I've searched a lot, and nobody seems to have my problem.
var gcloud = require('gcloud')({projectId: project_id});
var bucket = gcloud.storage().bucket('logs');
var log_one = {filename: 'test.log'};
bucket.file(log_one.filename).download({destination: './tmp/processing_' + log_one.filename}, function(err) {
    if (err) {
        console.log(err);
        console.log("Error after bucket.file.download");
        return;
    }
    console.log("Downloaded file " + log_one.filename + " completed");
    // Do stuff on the log file <-- this is where the file is incomplete on GCE
});
The instance can talk to all the Google APIs, and in any case it does download part of the log file.
I also tried adding a timer to give it time to finish writing the file, or something like that. No success. I never ask for help on forums because I always find a solution, but for this one I need help.
Since 0.18.0, we've run into a few issues related to streams & complete events. Upgrading to the latest version, 0.24.1, will likely magically resolve the issue.
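If upgrading alone doesn't help, another option is to stream the file manually and only touch it after the local write stream has finished (a sketch reusing the bucket and log_one names from the question):
var fs = require('fs');

// Stream the remote file to disk and wait for the local write stream's
// 'finish' event before processing the file.
var local = fs.createWriteStream('./tmp/processing_' + log_one.filename);
bucket.file(log_one.filename).createReadStream()
    .on('error', function(err) {
        console.log(err);
    })
    .pipe(local);
local.on('finish', function() {
    console.log("Downloaded file " + log_one.filename + " completed");
    // Safe to do stuff on the log file here.
});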

How to get a blob to a string in Node.js with Azure

I am trying to download a blob from Windows Azure blob storage straight into a string. Currently I can get it into a file, but I cannot find any documentation on how to get the data straight into a variable.
blobService.getBlobToStream('ratings'
    , '0001'
    , fs.createWriteStream('output.txt')
    , function(error) {
        if (!error) {
            // Wrote blob to stream
            console.log("blob written");
        }
    });
Any Ideas?
There's a function called getBlobToText in the Node.js SDK (source code here: https://github.com/Azure/azure-sdk-for-node/blob/master/lib/services/blob/blobservice.js). Give that a try.
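A minimal sketch with the container and blob names from the question (error handling kept deliberately simple):
blobService.getBlobToText('ratings', '0001', function(error, text) {
    if (!error) {
        // "text" now holds the blob's contents as a string.
        console.log(text);
    }
});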

Azure Blob storage Download

I am learning Azure. I have successfully uploaded and listed files in my containers. When I run the code below on my home PC everything works fine, no exceptions; however, when I run it on my work PC I catch an exception that states:
Blob data corrupted. Incorrect number of bytes received '12288' / '-1'
The file seems to download to my local drive just fine; I just cannot figure out why the exact same code behaves differently on two different PCs.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("My connection string");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlockBlob blockBlob = container.GetBlockBlobReference("ARCS.TXT");
using (var fileStream = System.IO.File.OpenWrite(@"c:\a\ARCS.txt"))
{
    blockBlob.DownloadToStream(fileStream);
}
Maybe your organization's firewall is blocking a specific port. I have written a blog post which discusses similar port-related issues. Please check it: http://nabaruns.blogspot.in/2012/11/common-port-related-troubleshoot-in.html
Regards
Nabarun
Your code looks correct.
That is a weird issue, all the more so because the file gets downloaded properly even after the error. I would recommend you run Azure Storage Explorer on both of your machines.
If Azure Storage Explorer works fine on both machines, the next step would be to check the SDK version on each; there is a chance of such errors with older versions of the SDK.
You may also want to try the command-line downloader to troubleshoot your issue.
Note - Azure Storage Explorer and the command-line downloader are open source. If downloads through them work fine, you can take their code and debug through it as well.
I'd recommend trying CloudBlob.DownloadToFile or CloudBlob.DownloadToStream instead of CloudBlockBlob.
