how to upload image to backblaze using hapijs - node.js

I am trying to upload an image to Backblaze B2 using hapi.js. I am using a plugin called easy-backblaze, but to use it I need to pass the path of the file on disk. When the file arrives through a hapi.js request, I can't work out how to get that path.
This is the code I have written; in b2.uploadFile I need to pass the path of the file on the local drive:
var B2 = require('easy-backblaze');
var b2 = new B2('accountId', 'applicationKey');
b2.uploadFile('C:/Users/Lovika/Desktop/addDriver.png', {
    name: 'addDriver.png', // Optional, renames file
    bucket: 'testBucket', // Optional, defaults to first bucket
}, function(err, res) {
    console.log('Done!', err, res);
});
Do I need to upload the file to the server first and then to Backblaze, or is there a way to upload the file directly to Backblaze?
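One way to get such a path is to let hapi buffer the upload to a temporary file for you. A minimal sketch, assuming hapi v19 or later and that the image is posted as multipart/form-data in a field named file (the route path and field name are assumptions, and server is your existing Hapi server):
var B2 = require('easy-backblaze');
var b2 = new B2('accountId', 'applicationKey');

server.route({
    method: 'POST',
    path: '/upload',
    options: {
        payload: {
            output: 'file',        // hapi writes the upload to a temp file and exposes its path
            parse: true,
            multipart: true,
            maxBytes: 10 * 1024 * 1024
        }
    },
    handler: function (request, h) {
        var upload = request.payload.file;   // { path, bytes, filename, headers }
        return new Promise(function (resolve, reject) {
            b2.uploadFile(upload.path, {
                name: upload.filename,
                bucket: 'testBucket'
            }, function (err, res) {
                if (err) return reject(err);
                resolve(res);
            });
        });
    }
});
So the file does land on the server, but only as a short-lived temp file; easy-backblaze needs a path on disk, so a truly direct browser-to-B2 upload would require Backblaze's own upload-URL API instead of this plugin.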

Related

Create a read stream for a pdf file to upload to s3 bucket

I have an Express service that takes a PDF file from my front end and saves it to an S3 bucket. I'm running into issues trying to take the file and create a stream from it so that I can pass it to the S3 upload function. I'm trying to avoid writing the file to disk, so I don't think I can use fs.createReadStream(), but I can't seem to find an alternative way to do it.
router.post('/upload', upload.single('my-pdf'), async (req, res, next) => {
    const file = req.file;
    // Needs a file path, not an actual file
    const stream = fs.createReadStream(file);
    return s3.upload(file).promise();
});
Any help or advice on how to get around this would be greatly appreciated.
Assuming that req.file.<name_of_upload_field> is a Buffer holding the file contents, you can convert that to a readable stream via a PassThrough:
var stream = require('stream');

var str = new stream.PassThrough();
str.end(req.file.<name_of_upload_field>);
return s3.upload({ Bucket: 'my-bucket', Key: 'my-file.pdf', Body: str }).promise(); // Bucket and Key are placeholders
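With the AWS SDK for JavaScript v2, Body can also be a Buffer directly, so the PassThrough wrapper is optional. A minimal sketch, assuming multer's memoryStorage (bucket name and key are placeholders):
// Assumes upload = multer({ storage: multer.memoryStorage() }), which keeps the
// whole upload in memory at req.file.buffer.
return s3.upload({
    Bucket: 'my-bucket',                // placeholder
    Key: req.file.originalname,         // placeholder key naming scheme
    Body: req.file.buffer,
    ContentType: 'application/pdf'
}).promise();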

Generating rss.xml for Angular 8 app locally works fine, but not on prod

I am trying to generate an rss.xml file for my Angular 8 app with SSR + Firebase + GCP, served from the same domain.
I've created an RssComponent which can be reached at the /rss route. There I call the getNews() method and receive an array of objects. Then I make an HTTP request to /api/rss, and in server.ts I handle that request:
app.post('/api/rss', (req, res) => {
    const data = req.body.data;
    const feedOptions = {}; // defining options here
    const feed = new RSS(feedOptions);
    data.forEach((item) => {
        feed.item({
            title: item.headingUa,
            description: item.data[0].dataUa,
            url: item.rssLink,
            guid: item.id,
            date: item.utcDate,
            enclosure: {url: item.mainImg.url.toString().replace('&', '&amp;'), type: 'image/jpeg'}
        });
    });
    const xml = feed.xml({indent: true});
    fs.chmod('dist/browser/rss.xml', 0o600, () => {
        fs.writeFile('dist/browser/rss.xml', xml, 'utf8', function() {
            res.status(200).end();
        });
    });
});
Finally, on the response, I open the freshly generated rss.xml file in the RssComponent. Locally everything works fine, but on Google Cloud Platform the file is not generated.
As explained in the Cloud Functions docs:
The only writeable part of the filesystem is the /tmp directory
Try changing the path to the file to the /tmp directory.
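For example, a minimal adjustment of the write above (the chmod call is dropped, since the temp file is created fresh on each invocation):
// Cloud Functions: /tmp is the only writable path
fs.writeFile('/tmp/rss.xml', xml, 'utf8', () => {
    res.status(200).end();
});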
Nevertheless, using local files in a serverless environment is a really bad idea. You should assume the instance handling the following request will not be the same as the previous one.
The best way to handle this would be to avoid writing local files and instead storing the generated file in GCP Storage or Firebase Storage, and then retrieving it from there when needed.
This will ensure your functions stay stateless across instances and comply with best practices.
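A minimal sketch of that approach with the @google-cloud/storage client (the bucket name is a placeholder; the feed-building code is unchanged from above):
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

app.post('/api/rss', async (req, res) => {
    // ... build `xml` with the rss package exactly as before ...
    const file = storage.bucket('my-app-assets').file('rss.xml'); // bucket name is a placeholder
    await file.save(xml, { contentType: 'application/rss+xml' });
    res.status(200).end();
});
The RssComponent can then load the feed from the bucket's public URL (or through a small proxy route) instead of reading a file from the instance's local disk.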

How to download files in azure blob storage into local folder using node js

I am using Node.js to download Azure Blob Storage files onto my local machine. I am able to download them into my project path, but not into another folder on my local machine. I am using HTML, Express, and Node.js, and I am currently working on localhost only. How can I download them?
Below is the code that I am using to download a blob file to a local folder.
app.get("/downloadImage", function (req, res) {
var fileName = req.query.fileName;
var downloadedImageName = util.format('CopyOf%s', fileName);
blobService.getBlobToLocalFile(containerName, fileName, downloadedImageName, function (error, serverBlob) {
});
});
I am able to download it to my project folder, but I want to download it to my Downloads folder. Please help me with this.
To download a file from Azure Blob Storage:
connectionString: connection string to blob storage
blobContainer: blob container name
sourceFile: name of the file in the container, e.g. sample-file.zip
destinationFilePath: path to save the file, e.g. ${appRoot}/download/${sourceFile}
const azure = require('azure-storage');

async function downloadFromBlob(
  connectionString,
  blobContainer,
  sourceFile,
  destinationFilePath,
) {
  logger.info('Downloading file from blob');
  const blobService = azure.createBlobService(connectionString);
  const blobName = blobContainer;
  return new Promise((resolve, reject) => {
    blobService.getBlobToLocalFile(blobName, sourceFile, destinationFilePath, (error, serverBlob) => {
      if (!error) {
        logger.info(`File downloaded successfully. ${destinationFilePath}`);
        return resolve(serverBlob);
      }
      logger.info(`An error occurred while downloading the file. ${error}`);
      reject(error);
    });
  });
}
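Usage might look like this (the connection string, container, and blob name are placeholders; appRoot is whatever your app uses as its root path):
const path = require('path');

// Call from inside an async function
await downloadFromBlob(
  process.env.AZURE_STORAGE_CONNECTION_STRING,
  'my-container',
  'sample-file.zip',
  path.join(appRoot, 'download', 'sample-file.zip'),
);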
According to the reference for the method blobService.getBlobToLocalFile, quoted below, the localFileName parameter should be a local file path that includes the directory:
localFileName string The local path to the file to be downloaded.
So I created a directory named downloadImages and changed your code as below.
var downloadDirPath = 'downloadImages'; // Or an absolute dir path like `D:/downloadImages`
app.get("/downloadImage", function (req, res) {
    var fileName = req.query.fileName;
    var downloadedImageName = util.format('%s/CopyOf%s', downloadDirPath, fileName);
    blobService.getBlobToLocalFile(containerName, fileName, downloadedImageName, function (error, serverBlob) {
    });
});
It works for me: the image file was downloaded into my downloadImages directory, not into the directory my Node app.js was running from.
Note: If you want to deploy this to an Azure Web App later, you must use an absolute directory path like D:/home/site/wwwroot/<your defined directory for downloading images>, because a relative directory path is always resolved relative to the path from which IIS started Node.
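If the goal is really to get the file into the browser user's Downloads folder, an alternative (a sketch, not part of the original answer) is to skip the server-side copy entirely and stream the blob into the HTTP response with getBlobToStream:
app.get("/downloadImage", function (req, res) {
    var fileName = req.query.fileName;
    // Content-Disposition tells the browser to save the response as a download
    res.setHeader('Content-Disposition', 'attachment; filename=CopyOf' + fileName);
    blobService.getBlobToStream(containerName, fileName, res, function (error, serverBlob) {
        if (error && !res.headersSent) {
            res.status(500).send(error.message);
        }
    });
});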

Rename file before upload using remote hooks in loopback component storage

I am having difficulty renaming a file before upload in loopback-component-storage; LoopBack doesn't seem to provide a built-in option for this. When uploading from an Angular form, I used the Angular uploader's before-upload hook to change the filename like this:
this.fileExtension = '.' + item.file.name.split('.').pop();
item.file.name = Math.random().toString(36).substring(7) + new Date().getTime() + this.fileExtension;
Is it possible to perform the same operation in a before remote hook of the upload method in loopback-component-storage? My intention is to apply the same filename change to API requests coming from mobile devices. If a remote hook cannot do this, is there any other way to achieve the same result? Thanks in advance!
Say you have a storage datasource defined in datasources.json.
You can do it in a boot script :
// server/boot/any.js
module.exports = function(app) {
    app.dataSources.storage.connector.getFilename = function (file, req, res) {
        // file.name is the original uploaded filename
        var filename = req.query.filename || 'general.ext';
        return filename;
    };
};
and add the filename to the upload URL,
e.g.: /containers/my-container/upload?filename=profile.jpg
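If you'd rather not make clients pass a filename (for example, the mobile API requests mentioned in the question), the same hook can generate a random name on the server, mirroring the Angular snippet above. A sketch (the boot script name is arbitrary):
// server/boot/rename-upload.js
module.exports = function(app) {
    app.dataSources.storage.connector.getFilename = function (file, req, res) {
        // Keep the original extension, randomize the rest (same scheme as the Angular code)
        var ext = '.' + file.name.split('.').pop();
        return Math.random().toString(36).substring(7) + new Date().getTime() + ext;
    };
};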

Use Cordova's Filetransfer plugin with express/blob storage

I am using Typescript and Cordova 4.0.
I have the following sample code:
uploadImages(imageUris: Array<string>): any {
    var fileTransfer = new FileTransfer();
    for (var i = 0; i < imageUris.length; i++) {
        fileTransfer.upload(imageUris[i], encodeURI('http://3187cf3.ngrok.com/test/photos'), (success) => {
            alert('success');
        }, (err) => {
            alert('error');
        });
    }
}
This corresponds to an express route:
var router = express.Router(),
    test = test.controller;
router
    .post('/test/photos', bind(test.uploadPhotos, test));
Which corresponds to a controller method:
uploadPhotos(req: express.Request, res: express.Response) {
    console.log(req);
}
I can't seem to figure out how to grab the "file" or image I'm posting to my server with FileTransfer from inside my controller. It's not on req.body or req.query, and when I look through the entire req I can't locate the file. The app flow works well enough to actually make the POST request to /test/photos; I just don't know how, or whether, I can access the file at that point.
How does Filetransfer work, and how can I access the data I need in my controller so that I can push it to Azure Blob Storage?
It looks like you have everything set up correctly to send the data through to your controller. The issue is that you need middleware to put the file on the request: Express does not parse the multipart/form-data body that Cordova's FileTransfer plugin sends by default.
You can do that with the popular library multer.
Run npm install multer --save to install multer and save it to your package.json file (use --save rather than --save-dev, since multer is needed at runtime).
In your Express config file, add something like the following:
var multer = require('multer');
app.use(multer({ dest: path.resolve(config.root, 'public/img/') }))
'public/img/' is the path where you would like your image to be saved.
Now your req will have files on it. To upload a single file, you would use req.files.file. You'll want to use this variable to send your file to Azure's blob storage with something like blobSvc.createBlockBlobFromLocalFile(containerName, fileName, filePath).
Since you're using Azure for remote storage, chances are you will want to remove the local file that multer has saved. I'd recommend using fs or rimraf to remove the file stored in public/img/, or whatever you set the path to in your Express config. If you are using fs, you'll want to use the .unlink method.
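With current releases of multer (1.x), the middleware is created once and applied per route rather than with app.use. A sketch of the whole flow under that API (the container name 'photos' and the field name 'file' are assumptions; 'file' happens to be FileTransfer's default fileKey):
var express = require('express');
var multer = require('multer');
var fs = require('fs');
var azure = require('azure-storage');

var upload = multer({ dest: 'public/img/' });   // multer 1.x: build the middleware, attach it per route
var blobSvc = azure.createBlobService();        // reads the storage connection string from the environment
var router = express.Router();

router.post('/test/photos', upload.single('file'), function (req, res) {
    // multer 1.x puts the uploaded file's metadata on req.file
    var localPath = req.file.path;
    blobSvc.createBlockBlobFromLocalFile('photos', req.file.originalname, localPath, function (err) {
        // Remove the temp file multer wrote, whether or not the blob upload succeeded
        fs.unlink(localPath, function () {});
        if (err) {
            return res.status(500).send(err.message);
        }
        res.sendStatus(200);
    });
});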
