Streaming upload from NodeJS to Dropbox

Our system needs to run our internal security checks when interacting with Dropbox, so we cannot use the client-side SDK for Dropbox.
We would rather upload to our own endpoint, apply the security checks, and then stream the incoming request to Dropbox.
I am coming up short here: there was an older NodeJS Dropbox SDK which supported pipes, but the new SDK does not.
Old SDK:
https://www.npmjs.com/package/dropbox-node
We want to take the incoming upload request and forward it to Dropbox as it comes in (and thus prevent the upload from taking twice as long, as it would if we first uploaded the entire file to our server and then uploaded it to Dropbox).
Is there any way to solve this?

My Dropbox NPM module (dropbox-v2-api) supports streaming. It's based on the HTTP API, so you can take advantage of streams. Example? I see it this way:
const fs = require('fs');
const dropboxV2Api = require('dropbox-v2-api');

// authenticate once; token handling is up to you
const dropbox = dropboxV2Api.authenticate({ token: 'your token' });

const contentStream = fs.createReadStream('file.txt');
const securityChecks = ... // your security checks (e.g. a Transform stream)
const uploadStream = dropbox({
    resource: 'files/upload',
    parameters: { path: '/target/file/path' }
}, (err, result, response) => {
    // upload finished
});

contentStream
    .pipe(securityChecks)
    .pipe(uploadStream);
Full stream support example here.
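For the original use case (forwarding the incoming upload rather than a local file), a rough sketch along the same lines might look like the following. The Express route, the token handling and the checkUpload transform are assumptions for illustration, not part of dropbox-v2-api:

const express = require('express');
const { Transform } = require('stream');
const dropboxV2Api = require('dropbox-v2-api');

const app = express();
const dropbox = dropboxV2Api.authenticate({ token: process.env.DROPBOX_TOKEN });

// Hypothetical pass-through security check implemented as a Transform stream.
const checkUpload = () => new Transform({
    transform(chunk, encoding, callback) {
        // ...inspect the chunk here, call callback(err) to abort the upload...
        callback(null, chunk);
    }
});

app.post('/upload', (req, res) => {
    const uploadStream = dropbox({
        resource: 'files/upload',
        parameters: { path: '/target/file/path' }
    }, (err, result) => {
        if (err) return res.status(500).json(err);
        res.json(result);
    });

    // Forward the request body to Dropbox as it arrives.
    req.pipe(checkUpload()).pipe(uploadStream);
});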

Related

How to upload Google Cloud text to speech API's response to Cloud Storage [Node.js]

I am making a simple audio-creation web app using a Node.js server. I would like to create audio using the Cloud Text-to-Speech API and then upload that audio to Cloud Storage.
(I use Windows 10, Windows Subsystem for Linux, Debian 10.3 and the Google Chrome browser.)
This is the code in the Node.js server.
const textToSpeech = require('@google-cloud/text-to-speech');

const client = new textToSpeech.TextToSpeechClient();

async function quickStart() {
    // The text to synthesize
    const text = 'hello, world!';

    // Construct the request
    const request = {
        input: {text: text},
        // Select the language and SSML voice gender (optional)
        voice: {languageCode: 'en-US', ssmlGender: 'NEUTRAL'},
        // Select the type of audio encoding
        audioConfig: {audioEncoding: 'MP3'},
    };

    // Performs the text-to-speech request
    const [response] = await client.synthesizeSpeech(request);
    // Write the binary audio content to a local file
    console.log(response);
}

quickStart();
I would like to upload the response to Cloud Storage.
Can I upload the response to Cloud Storage directly? Or do I have to save the response on the Node.js server and then upload it to Cloud Storage?
I searched the Internet but couldn't find a way to upload the response to Cloud Storage directly. So, if you have a hint, please tell me. Thank you in advance.
You should be able to do that, with all your code in the same file. The best way to achieve it is by using a Cloud Function, which will be the one sending the file to Cloud Storage. But yes, you will need to save the file with Node.js first, and then upload it to Cloud Storage.
To achieve that, you will need to save the file locally and then upload it to Cloud Storage. As you can check in the complete tutorial in this other post here, you need to construct the file, save it locally and then upload it. The code below is the main part you will need to add to your code.
...
const options = { // construct the file to write
    metadata: {
        contentType: 'audio/mpeg',
        metadata: {
            source: 'Google Text-to-Speech'
        }
    }
};

// copied from https://cloud.google.com/text-to-speech/docs/quickstart-client-libraries#client-libraries-usage-nodejs
const [response] = await client.synthesizeSpeech(request);
// Write the binary audio content to a local file
// response.audioContent is the downloaded file
return await file.save(response.audioContent, options)
    .then(() => {
        console.log("File written to Firebase Storage.")
        return;
    })
    .catch((error) => {
        console.error(error);
    });
...
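In the snippet above, file is not defined. A minimal sketch of how the bucket and file handles might be obtained with the @google-cloud/storage client (the bucket and object names are placeholders; in a Firebase Cloud Function you could equally get the bucket from admin.storage()):

const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
// Placeholder bucket and object names; replace with your own.
const bucket = storage.bucket('your-bucket-name');
const file = bucket.file('audio/output.mp3');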
Once you have this part implemented, the file will be saved locally and ready to be uploaded. I would recommend taking a closer look at the other post I mentioned, in case you have more doubts about how to achieve it.
Let me know if the information helped you!

NodeJS stream out of AWS Lambda function

We are trying to migrate our zip microservice from a regular Node.js Express application to AWS API Gateway integrated with AWS Lambda.
Our current application sends a request to our API, gets a list of attachments, then visits those attachments and pipes their content back to the user in the form of a zip archive. It looks something like this:
module.exports = function requestHandler(req, res) {
    //...
    //irrelevant code
    //...
    return getFileList(params, token).then(function(fileList) {
        const filename = `attachments_${params.id}`;
        res.set('Content-Disposition', `attachment; filename=${filename}.zip`);
        streamFiles(fileList, filename).pipe(res); // <-- here magic happens
    }, function(error) {
        errors[error](req, res);
    });
};
I have managed to do everything except the part where I have to stream content out of the Lambda function.
I think one possible solution is to use aws-serverless-express, but I'd like a more elegant solution.
Does anyone have any ideas? Is it even possible to stream out of Lambda?
Unfortunately, Lambda does not support streams as events or return values. (It's hard to find this mentioned explicitly in the documentation, except by noting how invocation and contexts/callbacks are described in the docs.)
In the case of your example, you will have to await streamFiles and then return the completed result.
(aws-serverless-express would not help here; if you check the code, they wait for your pipe to finish before returning: https://github.com/awslabs/aws-serverless-express/blob/master/src/index.js#L68)
N.B. There's a nuance here: many of the language SDKs support streaming for requests/responses, but that means streaming on the transport, e.g. downloading the complete response from the Lambda as a stream, not listening to a stream emitted from the Lambda.
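For illustration, a minimal sketch of that "buffer, then return" approach (assuming an API Gateway proxy integration with binary media types enabled; getFileList and streamFiles are the asker's own helpers, and the event parsing is a placeholder):

// Sketch: buffer the archive in memory, then return it base64-encoded.
exports.handler = async (event) => {
    const params = JSON.parse(event.body || '{}');
    const fileList = await getFileList(params, params.token);

    // Collect the zip stream into a single Buffer before returning.
    const chunks = [];
    await new Promise((resolve, reject) => {
        streamFiles(fileList, `attachments_${params.id}`)
            .on('data', (chunk) => chunks.push(chunk))
            .on('end', resolve)
            .on('error', reject);
    });

    return {
        statusCode: 200,
        headers: {
            'Content-Type': 'application/zip',
            'Content-Disposition': `attachment; filename=attachments_${params.id}.zip`
        },
        isBase64Encoded: true,
        body: Buffer.concat(chunks).toString('base64')
    };
};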
Had the same issue; not sure how you can do stream/pipe via native Lambda + API Gateway directly... but it's technically possible.
We used the Serverless Framework and were able to use XX.pipe(res) with this starter kit (https://github.com/serverless/examples/tree/v3/aws-node-express-dynamodb-api).
What's interesting is that this just wraps over native Lambda + API Gateway, so technically it is possible, as they have done it.
Good luck
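For illustration, a minimal sketch of that wrapping, assuming a serverless-http style wrapper like the ones such Express-on-Lambda starter kits are typically built on (streamFiles is the asker's own helper, called here with placeholder arguments):

const serverless = require('serverless-http');
const express = require('express');

const app = express();

app.get('/zip', (req, res) => {
    // Inside the wrapped Express app, .pipe(res) works as usual, but the wrapper
    // still buffers the full response before handing it to API Gateway, so this
    // is not true streaming out of the Lambda.
    res.set('Content-Disposition', 'attachment; filename=attachments.zip');
    streamFiles([], 'attachments').pipe(res);
});

// serverless-http turns the Express app into a Lambda handler.
module.exports.handler = serverless(app);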

export firebase data node to pdf report

What I have is a mobile app emergency message system which uses Firebase as a backend. When an emergency event ends, I would like to capture the message log in a PDF document. I have not been able to find any report editors that work with Firebase, which means I may have to export this to PHP/MySQL. The Firebase PHP SDK looks to be overkill for this task. I have been googling "php get from firebase" and most responses have to do with using the Firebase PHP SDK. Is this the only way it can be accomplished?
You could use PDFKit (...) on Cloud Functions (it's all Node.js; no PHP available there).
On npmjs.com there are several packages for Firebase operations, googleapis and @google-cloud.
In order to read from Firebase and write to a Storage bucket or Datastore, that example script would still require a database reference and a storage destination, so it can render the PDF content (possibly from a template) and put it where it belongs. Also see firebase/functions-samples (especially the package.json, which defines the dependencies). npm install -g firebase-tools installs the tools required for deployment; the required modules also need to be installed locally so they are known (quite like Composer, while remotely they are made known during the deployment process).
You'd need a) a Firebase onUpdate() event as the trigger, b) to check the endTime of the returned DeltaSnapshot for a value, and c) to then render & store the PDF document. The code may vary; this is just to provide a coarse idea of how it works within the given environment:
'use strict';

const admin = require('firebase-admin');
const functions = require('firebase-functions');
const PDFDocument = require('pdfkit');
const gcs = require('@google-cloud/storage')();
const bucket = gcs.bucket('some-bucket');
const fs = require('fs');

// TODO: obtain a handle to the delta snapshot
// TODO: render the report
var pdf = new PDFDocument({
    size: 'A4',
    info: {Title: 'Title of File', Author: 'Author'}
});
pdf.text('Emergency Incident Report');
pdf.pipe(
    // TODO: figure out how / where to store the file
    fs.createWriteStream('./path/to/file.pdf')
).on('finish', function () {
    console.log('PDF closed');
});
pdf.end();
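To sketch one way of resolving the storage TODO: instead of the fs.createWriteStream pipe and pdf.end() above, the PDF stream could be piped straight into the bucket declared earlier via a Cloud Storage write stream (the object path is a placeholder):

// Alternative pipe destination: upload directly to the Cloud Storage bucket.
// 'reports/incident.pdf' is a placeholder object path.
pdf.pipe(
    bucket.file('reports/incident.pdf').createWriteStream({
        metadata: { contentType: 'application/pdf' }
    })
).on('finish', function () {
    console.log('PDF uploaded to Cloud Storage');
});
pdf.end();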
Externally running PHP code is, in this case, nevertheless not run on the server side. The problem with it is that an external server won't deliver any realtime trigger, and therefore the file will not appear instantly upon the time-stamp update (as one would expect from a Realtime Database). One could also add external web-hooks (or interface them with PHP), e.g. to obtain these PDF files through HTTPS (or even generate them upon an HTTPS request, for externally triggered generation). For local testing one can use the command firebase serve, which saves much time vs. firebase deploy.
The point is that one can teach a Cloud Function what the PDF files shall look like, when they shall be created and where to put them, as a micro-service which does nothing else but render these files. Scripting one such script should still be within acceptable range, given all the clues provided.

Best NodeJS Workflow for team development

I'm trying to implement NodeJS and Socket.io for real-time communication between two devices (PC & smartphones) in my company's product.
Basically, what I want to achieve is sending a notification to all online users when somebody changes something in a file.
All the basic functionality for saving the updates is already there, so when everything is stored and calculated I send a POST request to my Node server saying that something changed and it needs to notify the users.
The problem now is that when I want to change some code in the NodeJS scripts, as long as I work alone I can just upload the new files via FTP and restart the pm2 service; but when my colleagues start working with me on this story, we will have problems merging our changes without overwriting each other's work.
Launching a local server is also not possible because we need the connection between our current server and the Node machine, and since our server is online it cannot access our localhosts.
Is there a way for a team to work together on the same Node server without overlapping each other?
Implement changes using some option other than FTP. For example:
You can use webdav-fs in authenticated or non-authenticated mode:
// Using authentication:
var wfs = require("webdav-fs")(
    "http://example.com/webdav/",
    "username",
    "password"
);

wfs.readdir("/Work", function(err, contents) {
    if (!err) {
        console.log("Dir contents:", contents);
    } else {
        console.log("Error:", err.message);
    }
});
putFileContents(remotePath, data [, options])
Put some data in a remote file at remotePath from a Buffer or String. data is a Buffer or a String. options has a property called format which can be "binary" (default) or "text".
var fs = require("fs");

var imageData = fs.readFileSync("someImage.jpg");

// `client` is a webdav client instance (see the webdav package below).
client
    .putFileContents("/folder/myImage.jpg", imageData, { format: "binary" })
    .catch(function(err) {
        console.error(err);
    });
And use callbacks to notify your team, or lock the files via the callback.
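A rough sketch of the locking idea with the lockfile package listed in the references below (the lock path and the deploy() helper are placeholders):

var lockFile = require("lockfile");

// Placeholder lock path and deploy() helper.
lockFile.lock("deploy.lock", { wait: 10000 }, function (err) {
    if (err) {
        return console.error("Someone else is deploying:", err.message);
    }
    deploy(function () {
        lockFile.unlock("deploy.lock", function (unlockErr) {
            if (unlockErr) console.error(unlockErr);
        });
    });
});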
References
webdav-fs
webdav
lockfile

What "streams and pipe-capable" means in pkgcloud in NodeJS

My issue is getting image uploading to Amazon working.
I was looking for a solution that doesn't save the file on the server first and then upload it to Amazon.
Googling, I found pkgcloud, and its README.md says:
Special attention has been paid so that methods are streams and
pipe-capable.
Can someone explain what that means and whether it is what I am looking for?
Yup, that means you've found the right kind of S3 library.
What it means is that this library exposes "streams". Here is the API that defines a stream: http://nodejs.org/api/stream.html
Using Node's stream interface, you can pipe any readable stream (in this case the POST's body) to any writable stream (in this case the S3 upload).
Here is an example of how to pipe a file upload directly to another kind of library that supports streams: How to handle POSTed files in Express.js without doing a disk write
EDIT: Here is an example
var pkgcloud = require('pkgcloud'),
    fs = require('fs');

var s3client = pkgcloud.storage.createClient({ /* ... */ });

app.post('/upload', function(req, res) {
    var s3upload = s3client.upload({
        container: 'a-container',
        remote: 'remote-file-name.txt'
    });

    // pipe the image data directly to S3
    req.pipe(s3upload);
});
EDIT: To finish answering the questions that came up in the chat:
req.end() will automatically call s3upload.end() thanks to stream magic. If the OP wants to do anything else on req's end, he can do so easily: req.on('end', function () { res.send("done!"); })
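To round that off, a small sketch of signalling completion back to the uploader using the 'success' and 'error' events that the pkgcloud upload stream emits (the route and response bodies are illustrative):

app.post('/upload', function(req, res) {
    var s3upload = s3client.upload({
        container: 'a-container',
        remote: 'remote-file-name.txt'
    });

    s3upload.on('success', function(file) {
        // S3 has acknowledged the object; tell the uploader we're done.
        res.send('done!');
    });

    s3upload.on('error', function(err) {
        console.error(err);
        res.status(500).send('upload failed');
    });

    req.pipe(s3upload);
});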
