Possible to use cloud function to download video onto Firebase server? - node.js

I want to write a cloud function to download a video file. I will do something like the code below to save the file as an mp4.
const http = require('http');
const fs = require('fs');
const file = fs.createWriteStream("video.mp4");
const request = http.get("http://my-video-server.com/nice-day.mp4", function(response) {
  response.pipe(file);
});
But I am wondering where the file gets saved. Is it somewhere under /tmp on the Firebase server? Can I specify the path the file is downloaded to? Also, if the video is several GB in size, does Firebase have any limit on how we use its servers?
The reason I want to download the video file is to upload it to YouTube via YouTube's API. They don't support uploading from a link; it has to be a physical file.

The only writable path in Cloud Functions is /tmp. It's a memory-based filesystem, so you will only have a very limited amount of space available. I suggest reading the documentation on the Cloud Functions execution environment.
You can specify the exact destination path of the download using the API of the modules you're using for the download.
Cloud Functions is not a suitable product for work that requires large amounts of disk space or memory.
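As a rough illustration (not an official recipe), a download into /tmp could look like the sketch below; the URL and filename are placeholders, and keep in mind that anything written to /tmp counts against the function's memory allocation.
const fs = require('fs');
const os = require('os');
const path = require('path');
const http = require('http');

// os.tmpdir() resolves to /tmp, the only writable location in Cloud Functions.
const tempFilePath = path.join(os.tmpdir(), 'video.mp4');

function downloadToTmp(url) {
  return new Promise((resolve, reject) => {
    const file = fs.createWriteStream(tempFilePath);
    http.get(url, (response) => {
      response.pipe(file);
      file.on('finish', () => file.close(() => resolve(tempFilePath)));
    }).on('error', reject);
  });
}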

Related

Suitable approach for serving HLS playlist from GCP using node.js server

I have a dilemma in serving dynamic HLS playlists to users, since all the HLS files have to be served from GCP Storage. Since the index file is static, I could simply serve it using express.static as outlined in one answer here, but that is limited to a file directory (in my opinion). For streaming an HLS playlist from GCP, I tried the following approach.
For serving dynamic index files from GCP
router.get('/segments-list', (req, res) => {
  const { playlistName } = req.query;
  // Download the playlist (.m3u8) from the GCS bucket and return its contents
  const remoteFile = hlsPlaylistBucket.file(`${playlistName}.m3u8`);
  remoteFile.download().then((data) => {
    const contents = data[0];
    res.send(contents);
  });
});
For serving individual segment files (.ts) to users
router.get('/segments/:segment', (req, res) => {
  const { segment } = req.params;
  //* Gets the specific segment file (.ts) from the GCP bucket for each requested segment
  const remoteFile = hlsPlaylistBucket.file(`${segment}`);
  remoteFile.download().then((data) => {
    const contents = data[0];
    res.send(contents);
  });
});
Here is the manifest file
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:8
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:8.341667,
segments/file000.ts
#EXTINF:8.341667,
segments/file001.ts
#EXTINF:8.341667,
segments/file002.ts
#EXTINF:8.341667,
segments/file003.ts
#EXTINF:5.572233,
segments/file004.ts
#EXT-X-ENDLIST
Now this approach seems to work, but it is flawed because each requested file is downloaded into memory in small chunks every time a user asks for it. Serving signed URLs for every segment file is not suitable either, as one video might contain 200 segment files. Is my approach correct, or is there a better way to serve HLS playlists on GCP? The real problem is serving the segment files referenced by the base URL. I have been stuck on this for the past couple of days and cannot find a suitable way to serve VOD files to users via an HLS playlist.
Any help will be appreciated
I think the best option here is to serve them from a GCP bucket using CORS (Cross-Origin Resource Sharing); here you can find a guide on how to configure CORS for a GCP bucket [1].
Going further, you could offload the conversion and packaging of the original video files to the Transcoder API Google service [2], which can generate the HLS manifest and segment files and store them in a GCP bucket, to later be used by any video streaming player; take a look at this quickstart guide [3].
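As a minimal sketch of that CORS setup with the @google-cloud/storage client (the bucket name and origin below are placeholders, not values from the question):
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// Allow a web player on your domain to fetch the .m3u8 and .ts files directly
// from the bucket instead of proxying every segment through the Express server.
async function configureCors() {
  await storage.bucket('my-hls-bucket').setCorsConfiguration([
    {
      origin: ['https://my-player-domain.com'],
      method: ['GET', 'HEAD'],
      responseHeader: ['Content-Type'],
      maxAgeSeconds: 3600,
    },
  ]);
}

configureCors().catch(console.error);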

puppeteer fileChooser: how to use URL instead of local file path?

I have the http link of a video file (cloud storage) and I want to select that file in filechooser.
I am using cloud hosting (glitch), so I don't want to store that file to local storage.
const [fileChooser] = await Promise.all([
  page.waitForFileChooser(),
  page.click('#select-files-button')
]);
await fileChooser.accept(["https://examle.mp4"]);
It only seems to accept a local file path. Can anyone help me?
As you guessed, it accepts only local file paths. You can download the desired file first (with the request module or even the built-in http/https module) and then pass the local path to puppeteer.
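A minimal sketch of that approach, assuming the built-in https module; the URL, temporary path, and button selector are placeholders:
const fs = require('fs');
const https = require('https');
const os = require('os');
const path = require('path');

// Download the remote video to a temporary local file first.
function download(url, dest) {
  return new Promise((resolve, reject) => {
    const file = fs.createWriteStream(dest);
    https.get(url, (response) => {
      response.pipe(file);
      file.on('finish', () => file.close(() => resolve(dest)));
    }).on('error', reject);
  });
}

async function selectRemoteVideo(page) {
  const localPath = path.join(os.tmpdir(), 'video.mp4');
  await download('https://example.com/video.mp4', localPath);

  const [fileChooser] = await Promise.all([
    page.waitForFileChooser(),
    page.click('#select-files-button'),
  ]);
  await fileChooser.accept([localPath]); // accept() expects local file paths
}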

Upload large files to Google Cloud Storage using Google App Engine

I would like to upload files up to 1GB to Google Cloud Storage. I'm using Google App Engine Flexible. From what I understand, GAE has a 32MB limit on file uploads, which means I have to either upload directly to GCS or break the file into chunks.
This answer from several years ago suggests using the Blobstore API, however there doesn't seem to be an option for Node.js and the documentation also recommends using GCS instead of Blobstore to store files.
After doing some searching, it seems like using signed urls to upload directly to GCS may be the best option, but I'm having trouble finding any example code on how to do this. Is this the best way and are there any examples of how to do this using App Engine with Node.js?
Your best bet would be to use the Cloud Storage client library for Node.js to create a resumable upload.
Here's the official code example on how to create the session URI:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const myBucket = storage.bucket('my-bucket');
const file = myBucket.file('my-file');
file.createResumableUpload(function(err, uri) {
  if (!err) {
    // `uri` can be used to PUT data to.
  }
});

//-
// If the callback is omitted, we'll return a Promise.
//-
file.createResumableUpload().then(function(data) {
  const uri = data[0];
});
Edit: It seems you can nowadays use the createWriteStream method to perform the upload without having to create the session URI yourself.
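A rough sketch of that createWriteStream approach (the local path, bucket, and object name are placeholders):
const { Storage } = require('@google-cloud/storage');
const fs = require('fs');

const storage = new Storage();
const bucket = storage.bucket('my-bucket');

// Stream the local file to GCS; the client library manages the resumable
// upload session internally, which suits files larger than 32MB.
fs.createReadStream('/path/to/large-file.bin')
  .pipe(bucket.file('large-file.bin').createWriteStream({ resumable: true }))
  .on('error', (err) => console.error(err))
  .on('finish', () => console.log('Upload complete'));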

How to use aws s3 image url in node js lambda?

I am trying to use an AWS S3 image in a Lambda Node.js function, but it throws the error 'no such file or directory'. I have made that image public and all permissions are granted.
fs = require('fs');
exports.handler = function( event, context ) {
var img = fs.readFileSync('https://s3-us-west-2.amazonaws.com/php-7/pic_6.png');
res.writeHead(200, {'Content-Type': 'image/png' });
res.end(img, 'binary');
};
fs is the Node.js file system core module. It is for writing and reading files on the local machine. That is why it gives you that error.
There are multiple things wrong with your code.
fs is a core module used for file operations and can't be used to access S3.
You seem to be using Express.js code in your example. In Lambda, there is no built-in res object defined (unless you define it yourself) that you can use to send a response.
You need to use the methods on context or the newer callback mechanism. The context methods are used on the older Lambda Node.js runtime (0.10.42). You should be using a newer runtime (4.3.2 or 6.10), which returns the response via the callback parameter.
It seems like you are also using API Gateway, so assuming that, I'll give a few suggestions. If the client needs access to the S3 object, these are some of your options:
Read the image from S3 using the AWS SDK and return the image with the appropriate binary media type. AWS recently added support for binary data in API Gateway. See this link. OR
Send the public S3 URL to the client in your JSON response. Consider whether the S3 objects really need to be public. OR
Use the S3 SDK to generate pre-signed URLs that are valid for a configured duration and return them to the client.
I like the pre-signed URL approach; I think you should check that out. You might also want to check the AWS Lambda documentation.
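For the pre-signed URL option, a minimal sketch with the aws-sdk module, using the bucket and key from your URL and assuming an API Gateway proxy integration (the expiry value is arbitrary):
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = (event, context, callback) => {
  // Generate a temporary URL the client can use to fetch the object directly from S3.
  const url = s3.getSignedUrl('getObject', {
    Bucket: 'php-7',
    Key: 'pic_6.png',
    Expires: 300 // seconds the URL stays valid
  });

  callback(null, {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ url: url })
  });
};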
To get a file from S3, you need to use the path that S3 gives you. The base path is https://s3.amazonaws.com/{your-bucket-name}/{your-file-name}.
In your code, you must replace the following line:
var img = fs.readFileSync('https://s3.amazonaws.com/{your-bucket-name}/pic_6.png');
If you don't have a bucket, you should create one and give it the appropriate permissions.

How to transfer base64 image from client to server or download binary / base64 from s3 bucket?

In my app, I'm sending photos directly from the client to S3, using something similar to this suggested Heroku recommendation: https://devcenter.heroku.com/articles/s3-upload-node
The main benefit is that it saves server cost (I'm assuming because the chunks aren't being sent to the server using something such as multipart form data).
However, I wish to be able to share these images to twitter also, which states this requirement:
Ensure the POST is a multipart/form-data request. Either upload the raw binary (media parameter) of the file, or its base64-encoded contents (media_data parameter). Use raw binary when possible, because base64 encoding results in larger file sizes
I've tried sending the base64 needed for the client-side S3 upload back to the server, but depending on the photo size, I often get an error that it's too big to send back.
TLDR
Do I need to send my photos using multiparty / multipart form data to my server, so I can have the needed base64 / binary to share a photo to Twitter, or can I keep sending photos from my client to S3?
Then, somehow, efficiently obtain the needed base64 / binary on the server (possibly using the request module), so I can then send the image to Twitter?
One fairly easy way to do this without changing your client code a whole lot would be to use S3 events. S3 events can trigger a Lambda function in AWS that can post the image to Twitter. You can use any library inside the Lambda function to do efficient posting to Twitter. Not sure if you want to use Lambda or stick to Heroku.
If you are directly uploading documents from the client to S3, you are exposing your AWS secret/private keys to the client. A more secure way would be uploading the images to Node and having Node in turn upload them to S3. A recommended way to upload images to the Node server would be using multipart/form-data with the Multer middleware, as in the rough sketch below.
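A minimal sketch of that Multer route (the route path, field name, and bucket are placeholders; assumes the multer and aws-sdk modules):
const express = require('express');
const multer = require('multer');
const fs = require('fs');
const AWS = require('aws-sdk');

const app = express();
const s3 = new AWS.S3();
const upload = multer({ dest: '/tmp/uploads' }); // Multer writes the multipart file to disk

app.post('/photos', upload.single('photo'), (req, res) => {
  // Stream the uploaded file on to S3 so the raw binary is also available server-side.
  s3.upload(
    {
      Bucket: 'my-photo-bucket',
      Key: req.file.originalname,
      Body: fs.createReadStream(req.file.path),
      ContentType: req.file.mimetype
    },
    (err, data) => {
      if (err) return res.status(500).send(err.message);
      res.json({ location: data.Location });
    }
  );
});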
Regardless of the upload method, you can use the following code to serve images to Twitter. This code uses the aws-sdk module.
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

var filename = req.query.filename;
var params = {
  Bucket: <bucketname>,
  Key: <image path>
};

// Set the response content type based on the file extension
var extension = filename.split('.').pop().toLowerCase();
if (extension === "jpg" || extension === "jpeg") {
  res.setHeader('Content-Type', 'image/jpeg');
} else if (extension === "png") {
  res.setHeader('Content-Type', 'image/png');
}

// Stream the object from S3 straight into the response
s3.getObject(params).createReadStream().pipe(res);
This method can scale easily, like any other Express app.
