Editing a file in Azure Blob Storage with an API - node.js

Here is my scenario.
I have placed a config file (.xml) into an Azure Blob Storage container.
I want to edit that XML file and update/add content to it.
I want to deploy an API to an Azure App Service that will do that.
I built an API that runs locally and handles this, but that isn't exactly going to cut it as a cloud application. This particular iteration is a Node.js API that uses the Cheerio and File System (fs) modules to manipulate and read the file, respectively.
How can I retool this to work with a file that lives in Azure Blob Storage?
Note: Are Azure blobs even the best place to start with the file? Is there a better place to put it?
I found this, but it isn't exactly what I am after: Azure Edit blob

Considering the data stored in the blob is XML (in other words, a string), instead of using the getBlobToStream method you can use getBlobToText, manipulate the string, and then upload the updated string using createBlockBlobFromText.
Here's the pseudo code:

var azure = require('azure-storage');
var blobService = azure.createBlobService(); // picks up AZURE_STORAGE_CONNECTION_STRING

blobService.getBlobToText('mycontainer', 'taskblob', function(error, result, response) {
  if (error) {
    console.log('Error in reading blob');
    console.error(error);
  } else {
    var blobText = result; // the blob's content as a string
    var xmlContent = someMethodToConvertStringToXml(blobText); // convert the string to XML if that is easier to manipulate
    var updatedBlobText = someMethodToEditXmlContentAndReturnString(xmlContent);
    // Re-upload the blob
    blobService.createBlockBlobFromText('mycontainer', 'taskblob', updatedBlobText, function(error, result, response) {
      if (error) {
        console.log('Error in updating blob');
        console.error(error);
      } else {
        console.log('Blob updated successfully');
      }
    });
  }
});
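Since the question already uses Cheerio, here is a minimal concrete sketch of the same flow with the placeholder methods filled in. It assumes Cheerio's XML mode and an <appSettings> element in the config file; the element, key, and value are illustrative only:

var azure = require('azure-storage');
var cheerio = require('cheerio');
var blobService = azure.createBlobService(); // reads AZURE_STORAGE_CONNECTION_STRING

blobService.getBlobToText('mycontainer', 'config.xml', function(error, result) {
  if (error) {
    return console.error(error);
  }
  // Parse the blob's text as XML rather than HTML
  var $ = cheerio.load(result, { xmlMode: true });
  // Hypothetical edit: append a child element to <appSettings>
  $('appSettings').append('<add key="newKey" value="newValue"/>');
  // Serialize back to a string and overwrite the blob
  blobService.createBlockBlobFromText('mycontainer', 'config.xml', $.xml(), function(error) {
    if (error) {
      return console.error(error);
    }
    console.log('config.xml updated');
  });
});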

Simply refactor your code to use the Azure Storage SDK for Node.js: https://github.com/Azure/azure-storage-node

Related

How to update file when hosting in Google App Engine?

I have a Node.js service running on Google Cloud App Engine.
There is a JSON file in the assets folder of the project that needs to be updated by the process.
I was able to read the file and the configs inside it. But when writing to the file, I get a read-only file system error from GAE.
Is there a way I could write the information to the file without using the Cloud Storage option?
It's a very small file, and using Cloud Storage for it would be like using a very big drill machine for an Allen wrench screw.
Thanks
Nope, in App Engine Standard there is no such writable file system. The docs mention the following:
The runtime includes a full filesystem. The filesystem is read-only except for the location /tmp, which is a virtual disk storing data in your App Engine instance's RAM.
With this in mind, you can write to /tmp, but I suggest Cloud Storage, because if scaling shuts down all the instances, the data will be lost.
You could also consider App Engine Flex, which offers a persistent disk (because its backend is a VM), but the minimum size is 10 GB, so it would be worse than using Storage.
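For completeness, a minimal sketch of writing to /tmp on App Engine Standard; the file name and contents are illustrative, and remember the data lives in RAM and disappears when the instance is recycled:

const fs = require('fs');
const path = require('path');

// /tmp is the only writable location on App Engine Standard
const tmpFile = path.join('/tmp', 'config.json');

fs.writeFileSync(tmpFile, JSON.stringify({ updated: true }));
const config = JSON.parse(fs.readFileSync(tmpFile, 'utf8'));
console.log(config);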
Thanks for steering me away from wasting time on a hack solution to the problem.
Anyway, there was no clear code showing how to use the /tmp directory and download/upload the file from an App Engine hosted Node.js application.
Here is the code if someone needs it:
const {
  Storage
} = require('@google-cloud/storage');
const path = require('path');

class gStorage {
  constructor() {
    this.storage = new Storage({
      keyFilename: 'Please add path to your key file'
    });
    this.bucket = this.storage.bucket('your-bucket-name');
    this.fileName = 'YourFileDetails';
    // I am using the same file path and same file to download and upload
    this.filePath = path.join('/tmp', this.fileName);
  }

  async uploadFile() {
    try {
      await this.bucket.upload(this.filePath, {
        contentType: 'application/json'
      });
    } catch (error) {
      throw new Error(`Error when saving the config. Message: ${error.message}`);
    }
  }

  async downloadFile() {
    try {
      await this.bucket.file(this.fileName).download({
        destination: this.filePath
      });
    } catch (error) {
      throw new Error(`Error when reading the config. Message: ${error.message}`);
    }
  }
}
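A quick usage sketch of the class above; the modification step is hypothetical:

const fs = require('fs');

async function updateConfig() {
  const store = new gStorage();
  await store.downloadFile(); // pull the current JSON into /tmp
  const config = JSON.parse(fs.readFileSync(store.filePath, 'utf8'));
  config.updatedAt = new Date().toISOString(); // hypothetical edit
  fs.writeFileSync(store.filePath, JSON.stringify(config, null, 2));
  await store.uploadFile(); // push it back to the bucket
}

updateConfig().catch(console.error);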

How to upload Google Cloud text to speech API's response to Cloud Storage [Node.js]

I am making a simple audio-creating web app using a Node.js server. I would like to create audio using the Cloud Text-to-Speech API and then upload that audio to Cloud Storage.
(I use Windows 10, Windows Subsystems for Linux, Debian 10.3 and Google Chrome browser. )
This is the code in Node.js server.
const textToSpeech = require('@google-cloud/text-to-speech');
const client = new textToSpeech.TextToSpeechClient();

async function quickStart() {
  // The text to synthesize
  const text = 'hello, world!';
  // Construct the request
  const request = {
    input: { text: text },
    // Select the language and SSML voice gender (optional)
    voice: { languageCode: 'en-US', ssmlGender: 'NEUTRAL' },
    // Select the type of audio encoding
    audioConfig: { audioEncoding: 'MP3' },
  };
  // Performs the text-to-speech request
  const [response] = await client.synthesizeSpeech(request);
  // response.audioContent holds the binary audio content
  console.log(response);
}

quickStart();
I would like to upload the response to Cloud Storage.
Can I upload the response to Cloud Storage directly? Or do I have to save the response on the Node.js server and then upload it to Cloud Storage?
I searched the Internet but couldn't find a way to upload the response to Cloud Storage directly. So, if you have a hint, please tell me. Thank you in advance.
You should be able to do that with all your code in the same file. The best way to achieve it is by using a Cloud Function, which will be the one sending the file to Cloud Storage. But yes, you will need to save your file using Node.js, and then you will upload it to Cloud Storage.
To achieve that, you will need to save your file locally and then upload it to Cloud Storage. As you can check in the complete tutorial in this other post here, you need to construct the file, save it locally, and then upload it. The code below is the main part you will need to add to your code.
...
// "file" is assumed to be a Cloud Storage File object created earlier (see the linked post)
const options = { // construct the file to write
  metadata: {
    contentType: 'audio/mpeg',
    metadata: {
      source: 'Google Text-to-Speech'
    }
  }
};

// copied from https://cloud.google.com/text-to-speech/docs/quickstart-client-libraries#client-libraries-usage-nodejs
const [response] = await client.synthesizeSpeech(request);
// Write the binary audio content to a local file
// response.audioContent is the downloaded file
return await file.save(response.audioContent, options)
  .then(() => {
    console.log("File written to Firebase Storage.")
    return;
  })
  .catch((error) => {
    console.error(error);
  });
...
Once you have this part implemented, the file saved locally will be ready to be uploaded. I would recommend you take a closer look at the other post I mentioned, in case you have more doubts about how to achieve it.
Let me know if the information helped you!
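Putting the two snippets together, here is a minimal end-to-end sketch that synthesizes the audio and writes the buffer straight to a bucket with File#save; the bucket and object names are placeholders:

const textToSpeech = require('@google-cloud/text-to-speech');
const { Storage } = require('@google-cloud/storage');

async function synthesizeToBucket() {
  const client = new textToSpeech.TextToSpeechClient();
  const storage = new Storage();

  // Synthesize the audio (same request as in the question)
  const [response] = await client.synthesizeSpeech({
    input: { text: 'hello, world!' },
    voice: { languageCode: 'en-US', ssmlGender: 'NEUTRAL' },
    audioConfig: { audioEncoding: 'MP3' },
  });

  // response.audioContent is a buffer; save it directly, no local file needed
  const file = storage.bucket('your-bucket-name').file('hello.mp3');
  await file.save(response.audioContent, {
    metadata: { contentType: 'audio/mpeg' },
  });
  console.log('Audio uploaded to Cloud Storage');
}

synthesizeToBucket().catch(console.error);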

strange behaviour for azure blob

I am using azure-storage to read blob files from a container in my Node.js code.
My code:
blobService.listBlobsSegmented(containerName, token, { maxResults: 10 }, function(err, result) {
  if (err) {
    console.log("Couldn't list blobs for container %s", containerName);
    console.error(err);
  } else {
    // do things here
  }
});
It works fine, but when I increase the blob limit from 10 to 500, my network stops working. What can be the issue here?
If you do an HTTP traffic capture with a network analyzer like Fiddler or Wireshark, you will find this SDK (azure-storage) is just a REST API wrapper, so it doesn't have much control over the network. If you still have this problem, I would recommend checking your computer or router network settings.
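As a side note, if the problem is the size of a single response, listBlobsSegmented supports paging: keep maxResults modest and follow result.continuationToken until it comes back null. A minimal sketch:

function listAllBlobs(containerName, token, entries, done) {
  blobService.listBlobsSegmented(containerName, token, { maxResults: 50 }, function(err, result) {
    if (err) {
      return done(err);
    }
    entries.push.apply(entries, result.entries);
    if (result.continuationToken) {
      // More blobs remain; fetch the next page
      return listAllBlobs(containerName, result.continuationToken, entries, done);
    }
    done(null, entries);
  });
}

listAllBlobs(containerName, null, [], function(err, blobs) {
  if (err) { return console.error(err); }
  console.log('Listed %d blobs', blobs.length);
});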

Azure Functions: Nodejs, What are restrictions / limitations when using file system?

I have not been able to get an Azure Function working that uses the Node file system module.
I created a brand new function app with the most basic HTTP trigger function and included the 'fs' module:
var fs = require('fs');

module.exports = function (context, req, res) {
  context.log('function triggered');
  context.log(req);
  context.done();
}
This works fine. I see the full request in live streaming logs, and in the function invocation list.
However, as soon as I add the code that actually uses the file system, it seems to crash the Azure Function. It neither completes nor throws an error. It also doesn't show up in the Azure Functions invocation list, which is scary, since this is a loss of failure information: I might think my service was running fine when there were actually crashes.
var fs = require('fs');

module.exports = function (context, req, res) {
  context.log('function triggered');
  context.log(req);
  fs.writeFile('message.txt', 'Hello Node.js', (err) => {
    if (err) throw err;
    console.log('It\'s saved!');
    context.done();
  });
}
The fs.writeFile code is taken directly from the Node.js website:
https://nodejs.org/dist/latest-v4.x/docs/api/fs.html#fs_fs_writefile_file_data_options_callback
I added the context.done() call in the callback, but that snippet should work without issue in a normal development environment.
This brings up the questions:
Is it possible to use the file system when using Azure Functions?
If so, what are the restrictions?
If there are no restrictions, are developers required to keep track and perform cleanup, or is this taken care of by some sandboxing?
From my understanding, even though this is considered serverless computing, there is still a VM / Azure App Service underneath which has a file system.
I can use the Kudu console and navigate around and see all the files in /wwwroot and the /home/functions/secrets files.
Imagine a scenario where an Azure Function writes a file with a unique name and does not perform cleanup; it would eventually take up all the disk space on the host VM and degrade performance. This could happen accidentally by a developer and possibly go unnoticed until it's too late.
This makes me wonder: is it by design not to use the file system, or is my function just written wrong?
Yes, you can use the file system, with some restrictions as described here. That page describes some directories you can access, like D:\HOME and D:\LOCAL\TEMP. I've modified your code below to write to the temp dir, and it works:
var fs = require('fs');

module.exports = function (context, input) {
  fs.writeFile('D:/local/Temp/message.txt', input, (err) => {
    if (err) {
      context.log(err);
      throw err;
    }
    context.log('It\'s saved!');
    context.done();
  });
}
Your initial code was failing because it was trying to write to D:\Windows\system32 which is not allowed.
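As a small addition (not from the original answer), Node's standard os.tmpdir() should resolve to a writable temp directory without hard-coding the Windows path, assuming the Functions host sets the usual TEMP environment variable:

var fs = require('fs');
var os = require('os');
var path = require('path');

module.exports = function (context, input) {
  // os.tmpdir() avoids hard-coding D:/local/Temp
  var target = path.join(os.tmpdir(), 'message.txt');
  fs.writeFile(target, input, (err) => {
    if (err) {
      context.log(err);
      throw err;
    }
    context.log('Saved to ' + target);
    context.done();
  });
};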

Get upload progress using Azure Storage Node.JS SDK

Looking at this, we have some code that looks like:
var azure = require('azure-storage');
var blobSvc = azure.createBlobService();

blobSvc.createBlockBlobFromLocalFile('mycontainer', 'giantFile', 'giantFile.bak', function(error, result, response) {
  if (!error) {
    // file uploaded
  }
});
This "work" but we have no idea about the status of each upload until it returns. We'd like to print the progress of the upload to the console since it's a sequence of very large files that can take several hours.
I'm afraid I don't think it's possible out of the box with the SDK (see this MSDN article). Apparently such an API hasn't been implemented yet.
