How to download files from Azure Blob Storage into a local folder using Node.js

I am using Node.js to download Azure Blob Storage files to our local machine. I am able to download a blob into my project path, but not into an arbitrary folder on my local machine. I am using HTML, Express, and Node.js, currently working on localhost only. How can I download it?
Below is the code that I am using to download a blob file to a local folder.
app.get("/downloadImage", function (req, res) {
    var fileName = req.query.fileName;
    var downloadedImageName = util.format('CopyOf%s', fileName);
    blobService.getBlobToLocalFile(containerName, fileName, downloadedImageName, function (error, serverBlob) {
    });
});
I am able to download it to my project folder, but I want to download it to my Downloads folder. Please help me with this.

To download a file from Azure Blob Storage:
connectionString: connection string for the blob storage account
blobContainer: blob container name
sourceFile: name of the file in the container, e.g. sample-file.zip
destinationFilePath: path to save the file to, e.g. ${appRoot}/download/${sourceFile}
const azure = require('azure-storage');
const logger = console; // substitute your preferred logger

async function downloadFromBlob(
    connectionString,
    blobContainer,
    sourceFile,
    destinationFilePath,
) {
    logger.info('Downloading file from blob');
    const blobService = azure.createBlobService(connectionString);
    return new Promise((resolve, reject) => {
        blobService.getBlobToLocalFile(blobContainer, sourceFile, destinationFilePath, (error, serverBlob) => {
            if (!error) {
                logger.info(`File downloaded successfully. ${destinationFilePath}`);
                resolve(serverBlob);
            } else {
                logger.error(`An error occurred while downloading the file. ${error}`);
                reject(error);
            }
        });
    });
}
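For reference, a minimal usage sketch of the function above; the environment variable, container name, and file names are illustrative assumptions, not values from the answer:
// Hypothetical usage; all names below are placeholders.
const path = require('path');

downloadFromBlob(
    process.env.AZURE_STORAGE_CONNECTION_STRING, // assumed to hold the connection string
    'my-container',
    'sample-file.zip',
    path.join(__dirname, 'download', 'sample-file.zip'),
).then(() => console.log('download complete'))
 .catch((err) => console.error(err));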

According to the reference for the method blobService.getBlobToLocalFile, quoted below, the value of the parameter localFileName should be the local file path, including its directory.
localFileName string The local path to the file to be downloaded.
So I created a directory named downloadImages and changed your code as below.
var downloadDirPath = 'downloadImages'; // or an absolute dir path like `D:/downloadImages`
app.get("/downloadImage", function (req, res) {
    var fileName = req.query.fileName;
    var downloadedImageName = util.format('%s/CopyOf%s', downloadDirPath, fileName);
    blobService.getBlobToLocalFile(containerName, fileName, downloadedImageName, function (error, serverBlob) {
    });
});
It works for me: the image file was downloaded into my downloadImages directory, not into the directory where my node app.js was started.
Note: If you want to deploy it on an Azure Web App later, you must use an absolute directory path like D:/home/site/wwwroot/<your defined directory for downloading images>, because a relative directory path is always resolved against the path where IIS started node.
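A hedged sketch of that note: resolve the download directory to an absolute path and create it if it is missing before calling getBlobToLocalFile (the directory name is carried over from the answer, the rest is illustrative):
var fs = require('fs');
var path = require('path');

// Resolve to an absolute path so behavior is the same locally and on Azure.
var downloadDirPath = path.resolve(__dirname, 'downloadImages');

// Create the directory if it does not exist yet (Node 10.12+ for `recursive`).
if (!fs.existsSync(downloadDirPath)) {
    fs.mkdirSync(downloadDirPath, { recursive: true });
}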

Related

Is fs.mkdir() required to create a sub directory within /tmp in Firebase Cloud Functions?

Let's look at the code below. If I wanted to save a file to /tmp/new_folder, should I use Node's fs.mkdir() function, or can I just give it the path as a string even though the sub-directory does not exist yet?
Also, is it a requirement to use path.join() over concatenating strings to create the destination path?
// Download file from bucket.
const bucket = gcs.bucket(fileBucket);
const tempFilePath = path.join(os.tmpdir(), fileName);
const metadata = {
    contentType: contentType,
};
return bucket.file(filePath).download({
    destination: tempFilePath,
});
In the Cloud Functions runtime, /tmp already exists, so there is no need to try to create it before you write a file there. If you want to create a subdirectory under /tmp, you will have to create that on your own (and delete it when your function is done).
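(As for the second part of the question, path.join() is not strictly required over string concatenation, but it handles path separators portably.) A hedged sketch of creating and cleaning up a subdirectory under /tmp; the folder name new_folder is taken from the question, the rest is illustrative:
const fs = require('fs');
const os = require('os');
const path = require('path');

// /tmp itself already exists in the Cloud Functions runtime.
const subDir = path.join(os.tmpdir(), 'new_folder');

// Create the subdirectory before writing into it.
fs.mkdirSync(subDir, { recursive: true });

// ... write files under it, e.g. path.join(subDir, fileName) ...

// Clean it up when the function is done; /tmp is an in-memory
// filesystem, so leftover files count against function memory.
fs.rmSync(subDir, { recursive: true, force: true }); // Node 14.14+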

How to upload an image to Backblaze using hapi.js

I am trying to upload an image to Backblaze B2 using hapi.js. I am using a plugin called easy-backblaze, but while using it I need to mention the path of the file, and while receiving the file through hapi.js I am not able to understand how to get that path.
This is the code I have written; here in b2.uploadFile I need to mention the path of the file on the local drive:
var B2 = require('easy-backblaze');
var b2 = new B2('accountId', 'applicationKey');

b2.uploadFile('C:/Users/Lovika/Desktop/addDriver.png', {
    name: 'addDriver.png', // Optional, renames file
    bucket: 'testBucket',  // Optional, defaults to first bucket
}, function(err, res) {
    console.log('Done!', err, res);
});
Do I need to upload the file to the server first and then to Backblaze, or is there any way to upload the file directly to Backblaze?
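No answer is quoted in this thread, but one common approach, sketched here under the assumption of hapi v17+ and a multipart form field named file, is to let hapi write the upload to a temporary file and hand that path to easy-backblaze:
// Hedged sketch, not from the original thread. Assumes hapi v17+,
// a multipart form field named "file", and the `b2` client from above.
server.route({
    method: 'POST',
    path: '/upload',
    options: {
        payload: {
            output: 'file', // hapi writes the upload to a temp file on disk
            parse: true,
            multipart: true,
        },
    },
    handler: (request, h) => {
        const tempPath = request.payload.file.path; // path of the temp file
        return new Promise((resolve, reject) => {
            b2.uploadFile(tempPath, { name: 'addDriver.png' }, (err, res) => {
                if (err) return reject(err);
                resolve(res);
            });
        });
    },
});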

Retrieving a Firebase storage image link via Cloud Function

How do I retrieve the download links to stored images in Firebase via a cloud function?
I've tried all kinds of variations, including this one:
exports.getImgs = functions.https.onRequest((req, res) => {
    var storage = require('@google-cloud/storage')();
    var storageRef = storage.ref;
    console.log(storageRef);
    storageRef.child('users/user1/avatar.jpg').getDownloadURL().then(function(url) {
    });
});
It annoyed me, so I will put the solution here with a straightforward explanation for those who are looking for it.
First, install Google Cloud Storage in your functions folder:
npm install --save @google-cloud/storage
Cloud function code:
const gcs = require('@google-cloud/storage')({keyFilename: 'service-account.json'});
const bucket = gcs.bucket('name-of-bucket.appspot.com');
const file = bucket.file('users/user1/avatar.jpg');

return file.getSignedUrl({
    action: 'read',
    expires: '03-09-2491'
}).then(signedUrls => {
    console.log('signed URL', signedUrls[0]); // this will contain the picture's url
});
The name of your bucket can be found in the Firebase console under the Storage section.
The 'service-account.json' file can be created and downloaded from here:
https://console.firebase.google.com/project/_/settings/serviceaccounts/adminsdk
It should be stored locally in your Firebase folder under the functions folder (or elsewhere, as long as you change the path in the code above).
That's it.
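To tie this back to the question's HTTPS function, a hedged sketch that returns the signed URL in the response; the function name and object path are carried over from the question, the wiring is an assumption:
// Hedged sketch: wires the snippet above into the question's HTTPS function.
exports.getImgs = functions.https.onRequest((req, res) => {
    const file = bucket.file('users/user1/avatar.jpg');
    file.getSignedUrl({ action: 'read', expires: '03-09-2491' })
        .then(signedUrls => res.status(200).send(signedUrls[0]))
        .catch(err => res.status(500).send(err.toString()));
});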

Move/rename folder in Google Cloud Storage using nodejs gcloud api

I am trying to rename or move a folder in Google Cloud Storage using the gcloud API.
A similar question explains how to delete a folder:
Delete folder in Google Cloud Storage using nodejs gcloud api
But how can one rename a folder, or move it to another path?
You can try something like this:
'use strict'

var async = require('async')
var storage = require('@google-cloud/storage')()
var bucket = storage.bucket('stephen-has-a-new-bucket')

bucket.renameFolder = function(source, dest, callback) {
    bucket.getFiles({ prefix: source }, function(err, files) {
        if (err) return callback(err)
        async.eachLimit(files, 5, function(file, next) {
            file.move(file.name.replace(source, dest), next)
        }, callback)
    })
}

bucket.renameFolder('photos/cats', 'photos/dogs', console.log)
There are no folders. There is simply a collection of objects that all happen to have the same key prefix, for example photos/animals/cat.png and photos/animals/dog.png both have a common prefix photos/animals/ and that's what makes them appear to be in the same folder.
You will need to copy (or move) each of the objects to its new key, for example move photos/animals/cat.png to photos/pets/cat.png and move photos/animals/dog.png to photos/pets/dog.png.
That said, Google Cloud provides a way to do this from the command line using gsutil mv.
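For example, a hedged one-liner using the prefixes from the answer above (the bucket name is a placeholder):
gsutil mv gs://my-bucket/photos/cats gs://my-bucket/photos/dogs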

Get source url for image stored in Bluemix Object Storage container using Node.js app

I have an Object Storage instance on Bluemix where I am storing images in a container. I need a source URL for the images stored there so that I can use them. To do this, I'm thinking of creating a Node.js app with a POST call that takes the name of an image in Object Storage as the request and returns the image URL as the response.
Is this possible or not? If possible, can anyone suggest whether there are any npm modules that provide this functionality? If not, are there any other suggestions for getting the URL of an image?
Any help is appreciated. Thanks!
Start the server with the command node app.js. You also need the package pkgcloud to perform this operation. You can get the Object Storage credentials by creating a key on the IBM console inside the Storage module.
Inside app.js, insert a new route for the download:
var objectStorageHandler = require("./lib/objectStorageHandler.js");

app.get('/download', function(req, res) {
    (new objectStorageHandler()).download('YourContainerName', 'imagenamewithextension', function(download){
        console.log(res);
        download.pipe(res);
    });
});
Inside a lib folder, create a module named objectStorageHandler.js with the following code:
var pkgcloud = require('pkgcloud');

var objectStorageHandler = function(){
};

objectStorageHandler.prototype.download = function(container, file, callback)
{
    var config = {
        provider: 'openstack',
        useServiceCatalog: true,
        useInternal: false,
        keystoneAuthVersion: 'v3',
        authUrl: 'https://identity.open.softlayer.com',
        tenantId: 'YOURPROJECTID', // projectId from credentials
        domainId: 'YOURDOMAINID',
        username: 'YOURUSRNAME',
        password: 'YOURPASSWORD',
        region: 'dallas'           // dallas or london region
    };

    var client = pkgcloud.storage.createClient(config);

    client.auth(function (error) {
        if (error) {
            console.error("Authorization error for storage client (pkgcloud): ", error);
        }
        else {
            var request = client.download({
                container: container,
                remote: file
            });
            callback(request);
        }
    });
};

module.exports = objectStorageHandler;
After the server has started (say on port 3000), simply call localhost:3000/download and it will download the image. We can also pass the image name as a parameter to download images dynamically, as sketched below.
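A hedged sketch of that dynamic variant; the query parameter name file is an assumption for illustration:
// Hedged sketch: reads the object name from a query parameter, e.g.
// localhost:3000/download?file=avatar.png (the parameter name is an assumption).
app.get('/download', function(req, res) {
    var fileName = req.query.file;
    if (!fileName) {
        return res.status(400).send('Missing "file" query parameter');
    }
    (new objectStorageHandler()).download('YourContainerName', fileName, function(download){
        download.pipe(res);
    });
});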
