GCP Cloud Storage: What is the difference between bucket.upload and file.save methods? - node.js

Is there a better option for uploading a file?
In my case I need to upload a lot of small files (PDF or text) and I have to decide on the best option, but in any case: are there differences between these two methods?
Here are two examples taken directly from the documentation.
Save method: (Docs)
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const myBucket = storage.bucket('my-bucket');
const file = myBucket.file('my-file');
const contents = 'This is the contents of the file.';
file.save(contents, function(err) {
  if (!err) {
    // File written successfully.
  }
});
//-
// If the callback is omitted, we'll return a Promise.
//-
file.save(contents).then(function() {});
Upload method: (Docs)
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('albums');
//-
// Upload a file from a local path.
//-
bucket.upload('/local/path/image.png', function(err, file, apiResponse) {
  // Your bucket now contains:
  // - "image.png" (with the contents of `/local/path/image.png`)
  // `file` is an instance of a File object that refers to your new file.
});

At its core, both of these functions do the same thing. They upload a file to a bucket. One is just a function on bucket and the other a function on file. They both end up calling file.createWriteStream, so they have the same performance as well.
The functions behave differently in terms of upload type. file.save will default to a resumable upload unless you specify otherwise (you can set the resumable boolean on SaveOptions to false). bucket.upload will perform a multipart upload if the file is smaller than 5MB and a resumable upload otherwise. For bucket.upload, you can force a resumable or multipart upload by modifying the resumable boolean on UploadOptions.
Note that in the upcoming major version release (https://github.com/googleapis/nodejs-storage/pull/1876), this behavior will be unified: both functions will default to a resumable upload regardless of file size.
For small files, multipart uploads are recommended.
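For example, forcing multipart (non-resumable) uploads for a batch of small files is just a matter of passing the resumable option to either method. A minimal sketch (the bucket and file names are placeholders):
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('my-bucket');

// file.save: disable the resumable upload for small payloads
bucket.file('notes/small.txt').save('contents', { resumable: false }, function(err) {
  if (!err) {
    // File written with a single multipart request.
  }
});

// bucket.upload: the same option applies
bucket.upload('/local/path/small.pdf', { resumable: false }, function(err, file, apiResponse) {
  // `file` refers to the newly created object.
});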

Related

Create a read stream for a pdf file to upload to s3 bucket

I have an express service that takes a PDF file from my front-end and saves it to an S3 bucket. I'm running into issues trying to take the file and create a stream from it so that I can pass it to the S3 upload function. I'm trying to avoid writing the file to disk, so I don't think I can use fs.createReadStream(), but I can't seem to find an alternative way to do it.
router.post('/upload', upload.single('my-pdf'), async (req, res, next) => {
  const file = req.file;
  // Needs a file path, not an actual file
  const stream = fs.createReadStream(file);
  return s3.upload(file).promise();
});
Any help or advice on how to get around this would be greatly appreciated.
Assuming that req.file.<name_of_upload_field> is a buffer holding the file contents, you can convert that to a readable stream via
const stream = require('stream');

var str = new stream.PassThrough();
str.end(req.file.<name_of_upload_field>);
return s3.upload(str).promise();
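Putting that together with the route from the question, a minimal sketch could look like the following (this assumes multer's memoryStorage so the upload lands in req.file.buffer; the bucket name and object key are placeholders):
const { PassThrough } = require('stream');

router.post('/upload', upload.single('my-pdf'), async (req, res, next) => {
  const body = new PassThrough();
  body.end(req.file.buffer); // push the in-memory PDF into the stream

  const result = await s3.upload({
    Bucket: 'my-bucket',          // placeholder bucket name
    Key: req.file.originalname,   // placeholder object key
    Body: body,
  }).promise();

  res.json(result);
});
Note that s3.upload also accepts a Buffer directly as Body, so the PassThrough stream is only needed if the rest of your code expects a stream.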

How do you add a header to wav file?

I am sending audio data stored as a blob to my backend (Node/Express). When I save the file as .wav and attempt to use it in the SpeechRecognition package in Python, it throws an error saying the "file does not start with RIFF id". So how can I add the headers to my blob before I save it, so that it is a correctly formatted .wav file? I can provide the code if necessary.
node.js file
var multer = require('multer');
var fs = require('fs'); //use the file system so we can save files
var uniqid = require('uniqid');
var spawn = require('child_process').spawn;
const storage = multer.memoryStorage()
var upload = multer({ storage: storage });
router.post('/api/test', upload.single('upl'), function (req, res) {
  console.log(req.file);
  console.log(req.file.buffer);
  var id = uniqid();
  fs.writeFileSync(id + ".wav", Buffer.from(new Uint8Array(req.file.buffer))); // write file to server as a .wav file
  const scriptPath = 'handleAudio.py';
  const process = spawn('python3', [__dirname + "/../" + scriptPath, "/home/bitnami/projects/sample/" + id + ".wav", req.file.originalname, 'True']); // throws error about header in .wav
});
Also, I had this same example working with a PHP endpoint that just saved the blob to a file with a .wav extension, and the Python script accepted it. What could be different between move_uploaded_file in PHP and what I am doing above with Node?
Every .wav file needs a header specified by the WAVE file format. While it's fine to build the header yourself, it's much easier to use a proper library to do the work for you.
One example is node-wav, which has a nice API to write WAVE files from raw PCM data (what you have at the moment). Example code is provided by the node-wav documentation.
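As a rough sketch (assuming the uploaded buffer really is raw 32-bit float PCM and the sample rate is known; 16000 here is a placeholder), encoding with node-wav could look like this:
const fs = require('fs');
const wav = require('node-wav');

// Interpret the raw upload as one channel of 32-bit float PCM samples.
const samples = new Float32Array(req.file.buffer.buffer, req.file.buffer.byteOffset, req.file.buffer.length / 4);

// wav.encode takes an array with one Float32Array per channel and returns a Buffer with a valid RIFF header.
const wavBuffer = wav.encode([samples], { sampleRate: 16000, float: true, bitDepth: 32 });

fs.writeFileSync(id + '.wav', wavBuffer);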

Is fs.mkdir() required to create a sub directory within /tmp in Firebase Cloud Functions?

Let's look at the code below. If I wanted to save a file to /tmp/new_folder should I use node's fs.mkdir() function or can I just give it the path as a string even though the sub-directory does not exist yet?
Also, is it a requirement to use path.join() over concatenating strings to create the destination path?
// Download file from bucket.
const bucket = gcs.bucket(fileBucket);
const tempFilePath = path.join(os.tmpdir(), fileName);
const metadata = {
  contentType: contentType,
};
return bucket.file(filePath).download({
  destination: tempFilePath,
});
In the Cloud Functions runtime, /tmp already exists, so there is no need to try to create it before you write a file there. If you want to create a subdirectory under /tmp, you will have to create that on your own (and delete it when your function is done).
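(path.join() is not strictly required over string concatenation, but it builds the path without you having to worry about separators.) A minimal sketch of creating and cleaning up a subdirectory, using a hypothetical new_folder name:
const os = require('os');
const path = require('path');
const fs = require('fs');

const workingDir = path.join(os.tmpdir(), 'new_folder'); // subdirectory under /tmp
fs.mkdirSync(workingDir, { recursive: true });           // must be created explicitly

const tempFilePath = path.join(workingDir, fileName);
return bucket.file(filePath).download({ destination: tempFilePath })
  .then(() => {
    // ... process the file ...
    fs.rmSync(workingDir, { recursive: true, force: true }); // clean up when the function is done
  });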

how to upload image to backblaze using hapijs

I am trying to upload an image to Backblaze B2 using hapi.js. I am using a plugin called easy-backblaze, but it requires the path of the file on disk, and when receiving the file through Hapi I don't understand how to get that path.
This is the code I have written; in b2.uploadFile I need to pass the path of the file on the local drive:
var B2 = require('easy-backblaze');
var b2 = new B2('accountId', 'applicationKey');
b2.uploadFile('C:/Users/Lovika/Desktop/addDriver.png', {
  name: 'addDriver.png',  // Optional, renames file
  bucket: 'testBucket',   // Optional, defaults to first bucket
}, function(err, res) {
  console.log('Done!', err, res);
});
Do I need to upload the file to the server first and then to Backblaze, or is there a way to upload the file directly to Backblaze?

Convert multer file to string

I'm using multer to read in multi-part form data. However, I don't actually want to upload it. I want to put its contents into a string. Is there a simple way to do this?
Non-file fields are not stored on disk when you use multer's DiskStorage; only the uploaded files are written out. However, if you want the files to be kept in memory as well, you need to use multer's MemoryStorage, which stores each file as a Buffer that you can then convert to a string if you like:
var storage = multer.memoryStorage();
var upload = multer({ storage: storage });
// ...
app.post('/profile', upload.single('aboutme'), function(req, res) {
  console.log(req.file.buffer);
});
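Getting the contents as a string is then just a matter of decoding the Buffer (assuming the uploaded file is UTF-8 text):
var contents = req.file.buffer.toString('utf8'); // the file's contents as a string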
