GCP Cloud Function reading files from Cloud Storage - node.js

I'm new to GCP, Cloud Functions and NodeJS ecosystem. Any pointers would be very helpful.
I want to write a GCP Cloud Function that does the following:
Read contents of file (sample.txt) saved in Google Cloud Storage.
Copy it to local file system (or just console.log() it)
Run this code using functions-emulator locally for testing
Result: a 500 INTERNAL error with the message 'function crashed'. The function logs give the following message:
2019-01-21T20:24:45.647Z - info: User function triggered, starting execution
2019-01-21T20:24:46.066Z - info: Execution took 861 ms, finished with status: 'crash'
Below is my code, picked mostly from GCP NodeJS sample code and documentation.
exports.list_files = (req, res) => {
    const fs = require('fs');
    const {Storage} = require('@google-cloud/storage');
    const storage = new Storage();
    const bucket = storage.bucket('curl-tests');
    bucket.setUserProject("cf-nodejs");
    const file = bucket.file('sample.txt'); // file has couple of lines of text
    const localFilename = '/Users/<username>/sample_copy.txt';
    file.createReadStream()
        .on('error', function (err) { })
        .on('response', function (response) {
            // Server connected and responded with the specified status and headers.
        })
        .on('end', function () {
            // The file is fully downloaded.
        })
        .pipe(fs.createWriteStream(localFilename));
}
I run it like this:
functions call list_files --trigger-http
ExecutionId: 4a722196-d94d-43c8-9151-498a9bb26997
Error: { error:
{ code: 500,
status: 'INTERNAL',
message: 'function crashed',
errors: [ 'socket hang up' ] } }
Eventually, I want to have certificates and keys saved in Storage buckets and use them to authenticate with a service outside of GCP. This is the bigger problem I'm trying to solve. But for now, focusing on resolving the crash.

Start your development and debugging on your desktop using node and not an emulator. Once you have your code working without warnings and errors, then start working with the emulator and then finally with Cloud Functions.
Let's take your code and fix parts of it.
bucket.setUserProject("cf-nodejs");
I doubt that your project is cf-nodejs. Enter the correct project ID.
const localFilename = '/Users/<username>/sample_copy.txt';
This won't work. You do not have the directory /Users/<username> in cloud functions. The only directory that you can write to is /tmp. For testing purposes change this line to:
const localFilename = '/tmp/sample_copy.txt';
You are not doing anything for errors:
.on('error', function (err) { })
Change this line to at least print something:
.on('error', function (err) { console.log(err); })
You will then be able to view the output in the Google Cloud Console -> Stackdriver -> Logs. In Stackdriver Logging you can select "Cloud Functions" -> "Your function name" to see your debug output.
Last tip: wrap your code in a try/catch block and console.log the error message in the catch block. This way you will at least have a log entry when your program crashes in the cloud.
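Putting those fixes together, here is a minimal sketch of how the function could look. The bucket and file names are taken from the question, 'your-project-id' is a placeholder, and the response handling at the end is just one way to finish the request:
const fs = require('fs');
const { Storage } = require('@google-cloud/storage');

exports.list_files = (req, res) => {
    try {
        const storage = new Storage();
        const bucket = storage.bucket('curl-tests');
        bucket.setUserProject('your-project-id'); // replace with your real project ID
        const file = bucket.file('sample.txt');
        const localFilename = '/tmp/sample_copy.txt'; // /tmp is the only writable directory

        const writeStream = fs.createWriteStream(localFilename);
        writeStream.on('finish', () => {
            // The local copy is complete; log it and end the request.
            console.log(fs.readFileSync(localFilename, 'utf8'));
            res.status(200).send('done');
        });

        file.createReadStream()
            .on('error', (err) => {
                console.log(err);
                res.status(500).send(err.message);
            })
            .pipe(writeStream);
    } catch (err) {
        console.log(err);
        res.status(500).send(err.message);
    }
};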

Related

Google Cloud res.send is not a function

The function is triggered by a cron-scheduled Pub/Sub message to pull data from an API and update a BigQuery table.
It works when tested locally (npx @google-cloud/functions-framework --target pullElevate),
but results in an error (TypeError: res.send is not a function at exports.pullElevate (/workspace/index.js:261:9)) when uploaded and triggered by 'Test Function' from the console.
I can't see why the res object would have lost its functions?
exports.pullElevate = async function (req, res) {
    // get CSV from Elevate API -> Write it to bucket -> copy from bucket to table...
    try {
        const payload = await getCSV()
        const filename = await putInBucket(payload.data)
        await writeCSVtoTable(filename)
        await mergeToMainTable()
    } catch (err) {
        console.error(err);
        res.status(500).send(err)
        return Promise.reject(err)
    }
    res.send("OK")
    return
}
I understand (and guess from your description) that you bound your Cloud Function to a Pub/Sub topic to pull the message. In that case you haven't created an HTTP function but a background function, and the signature isn't the same.
The second argument is a context object, and it obviously has no send function. Change your deployment: deploy an HTTP function and create a Pub/Sub push subscription to your Cloud Function if you want (or need) to send a response to a received message (from Pub/Sub or elsewhere).
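For reference, a background function bound to a Pub/Sub topic receives (message, context) and signals success or failure through its returned promise rather than through an HTTP response. A minimal sketch, reusing the helper names from the question (getCSV, putInBucket, writeCSVtoTable and mergeToMainTable are assumed to exist):
exports.pullElevate = async (message, context) => {
    try {
        const payload = await getCSV();
        const filename = await putInBucket(payload.data);
        await writeCSVtoTable(filename);
        await mergeToMainTable();
    } catch (err) {
        console.error(err);
        throw err; // rejecting marks the execution as failed; there is no res to respond with
    }
};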

AWS lambda function issue with FormData file upload

I have Node.js code which uploads files to an S3 bucket.
I use the koa web framework, and the following are the dependencies:
"#types/koa": "^2.0.48",
"#types/koa-router": "^7.0.40",
"koa": "^2.7.0",
"koa-body": "^4.1.0",
"koa-router": "^7.4.0",
following is my sample router code:
import Router from "koa-router";
const router = new Router({ prefix: '/' })
router.post('file/upload', upload)
async function upload(ctx: any, next: any) {
const files = ctx.request.files
if(files && files.file) {
const extension = path.extname(files.file.name)
const type = files.file.type
const size = files.file.size
console.log("file Size--------->:: " + size);
sendToS3();
}
}
function sendToS3() {
    const params = {
        Bucket: bName,
        Key: kName,
        Body: imageBody,
        ACL: 'public-read',
        ContentType: fileType
    };
    s3.upload(params, function (error: any, data: any) {
        if (error) {
            console.log("error", error);
            return;
        }
        console.log('s3Response', data);
        return;
    });
}
The request body is sent as FormData.
Now when I run this code locally and hit the request, the file gets uploaded to my S3 bucket and can be viewed.
In the console the file size is logged, and it matches the actual size of the file.
But when I deploy this code as a Lambda function and hit the request, the CloudWatch logs show that the file size has suddenly increased.
The file still gets uploaded to S3, but when I open it, it shows an error.
I also tried to find out whether this behaviour persisted on a standalone instance on AWS, but it did not. So the problem occurs only when the code is deployed as a serverless Lambda function.
I tried with postman as well as my own front end app. But the issue remains.
I don't know whether I have overlooked any configuration when setting up the lambda function that handles such scenarios.
I haven't encountered an issue like this before and would really like to know if anyone else has. I'm also not able to debug why the file size is increasing; I can only assume that some kind of encoding or padding is applied to the file when it reaches the service.
I was finally able to fix this issue: I had to add a "Binary Media Type" in AWS API Gateway.
The following steps helped.
AWS API Gateway console -> "API" -> "Settings" -> "Binary Media Types" section.
Added following media type:
multipart/form-data
Save changes.
Deploy api.
Info location: https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-payload-encodings-configure-with-console.html
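If you prefer to script the same change rather than use the console, here is a rough sketch using the AWS SDK for JavaScript (v2). The restApiId is a placeholder, and '~1' is how API Gateway escapes '/' in patch paths; treat this as an assumption-laden example, not a verified recipe:
const AWS = require('aws-sdk');
const apigateway = new AWS.APIGateway();

// Add multipart/form-data as a binary media type on the REST API.
apigateway.updateRestApi({
    restApiId: 'your-rest-api-id', // placeholder
    patchOperations: [
        { op: 'add', path: '/binaryMediaTypes/multipart~1form-data' }
    ]
}).promise()
    .then(() => console.log('Binary media type added; redeploy the API for it to take effect'))
    .catch((err) => console.error(err));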

Uploading data from firebase functions to firebase storage?

I have a website running with node.js, with the backend running on Firebase Functions. I want to store a bunch of JSON to Firebase Storage. The snippet below works just fine when I'm running on localhost, but when I upload it to Firebase Functions, it says Error: EROFS: read-only file system, open 'export-stock-trades.json'. Anyone know how to get around this?
fs.writeFile(fileNameToReadWrite, JSON.stringify(jsonObjToUploadAsFile), function(err){
    bucket.upload(fileNameToReadWrite, {
        destination: destinationPath,
    });
    res.send({success: true});
});
I can't tell for sure, since much of the context of your function is missing, but it looks like your function is attempting to write a file to local disk first (fs.writeFile), then upload it (bucket.upload).
On Cloud Functions, code you write only has write access to /tmp, which is os.tmpdir() in node. Read more about that in the documentation:
The only writeable part of the filesystem is the /tmp directory, which you can use to store temporary files in a function instance. This is a local disk mount point known as a "tmpfs" volume in which data written to the volume is stored in memory. Note that it will consume memory resources provisioned for the function.
This is probably what's causing your code to fail.
Incidentally, if the data you want to upload is in memory, you don't have to write it to a file first as you're doing now. You can instead use file.save() for that.
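For example, a minimal sketch of uploading the in-memory JSON directly with file.save(), assuming the bucket, destinationPath and jsonObjToUploadAsFile variables from the question:
// Upload the JSON straight from memory; no local file needed.
const file = bucket.file(destinationPath);
file.save(JSON.stringify(jsonObjToUploadAsFile), { contentType: 'application/json' })
    .then(() => res.send({ success: true }))
    .catch((err) => res.status(500).send(err.message));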
Another way I feel this could work is to convert the JSON file into a buffer and then perform an action like this (the code snippet below). I wrote an article on how you can do this using Google Cloud Storage but it works fine with Firebase storage. The only thing you need to change is the "service-account-key.json" file.
The link to the article can be found here: Link to article on medium
const { format } = require('util')
const gc = require('./config/')
const bucket = gc.bucket('all-mighti') // should be your bucket name

/**
 *
 * @param { File } object file object that will be uploaded
 * @description - This function does the following
 * - It uploads a file to the image bucket on Google Cloud
 * - It accepts an object as an argument with the
 *   "originalname" and "buffer" as keys
 */
export const uploadImage = (file) => new Promise((resolve, reject) => {
    const { originalname, buffer } = file
    const blob = bucket.file(originalname.replace(/ /g, "_"))
    const blobStream = blob.createWriteStream({
        resumable: false
    })
    blobStream.on('finish', () => {
        const publicUrl = format(
            `https://storage.googleapis.com/${bucket.name}/${blob.name}`
        )
        resolve(publicUrl)
    })
    .on('error', () => {
        reject(`Unable to upload image, something went wrong`)
    })
    .end(buffer)
})

Client Callable Firebase Function fails when deployed to regions other than us-central1

Client Callable Firebase Function fails with "Error: The data couldn’t be read because it isn’t in the correct format." when deployed to regions other than us-central1 (tried Europe West and Asia).
Server Code
exports.getName = functions.region('europe-west1').https.onCall((data, context) => {
    return { name: "Testopoulos" };
});
Client Code (React Native Firebase)
const httpsCallableUserName = firebase.functions().httpsCallable('getName');

getUserName = () => {
    httpsCallableUserName()
        .then(({ data }) => {
            console.log(data.name);
        })
        .catch(httpsError => {
            console.log(httpsError);
        })
}
There is a note in the docs saying that "[when] using HTTP functions to serve dynamic content for hosting, you must use us-central1" but I am just returning an object to our frontend React Native client and not using Firebase Hosting. I read other posts about the regions but I don't think it is the same case. Any ideas?
According to the documentation, "the client can also specify a region, and must do so if the function runs in any region other than us-central1."
As you have found (see comments above), it has to be done with:
var functions = firebase.initializeApp(config).functions('europe-west1');
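In other words, the client has to build its callable from a functions instance pinned to the same region as the deployment. A minimal sketch, assuming the getName function from the question and an already-initialized Firebase app:
// Client side: pin the functions instance to the deployed region.
const regionalFunctions = firebase.app().functions('europe-west1');
const httpsCallableUserName = regionalFunctions.httpsCallable('getName');

httpsCallableUserName()
    .then(({ data }) => console.log(data.name))
    .catch(httpsError => console.log(httpsError));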

Cloud Functions for Firebase - Merge multiple PDFs into one via Nodejs/Cloud Function

I'm running into an issue where I'm trying to combine a bunch of PDFs via a Cloud Function and then have that merged PDF be downloaded to the user's computer.
I have a function in my provider that calls the cloud function and passes an array of URLs that point to the pdfs like so:
mergePDFs(pdfs) {
    // Create array of URLs that point to PDFs in Firebase storage
    let pdfArr = [];
    for (let key in pdfs) {
        pdfArr.push(pdfs[key].url);
    }
    // Call Firebase Cloud function to merge the PDFs into one
    this.http.get('https://us-central1-rosebud-9b0ed.cloudfunctions.net/mergePDFs', {
    }).subscribe(result => {
        console.log('merge result?', result);
    });
}
Has anyone had any success utilizing pdf-merge NPM module to accomplish this?
I have a basic cloud function stubbed like so:
exports.mergePDFs = functions.https.onRequest((request, response) => {
    cors(request, response, () => {
        console.log('request', request);
        response.status(200).send('test');
    })
});
Though when I try to call it (it's stubbed to just send 'test' so I can see if it's being called), I get error code 500.
Has anyone else gotten the pdf merging functionality working via nodejs/Cloud Functions for Firebase?
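I can't speak for pdf-merge specifically (it shells out to the pdftk binary, which isn't part of the default Cloud Functions runtime), but here is a rough sketch of the same idea using pdf-lib, a pure-JavaScript library. The request body shape, bucket paths and use of the default storage bucket are assumptions, not the poster's setup, and CORS handling is omitted:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const { PDFDocument } = require('pdf-lib');

admin.initializeApp();

exports.mergePDFs = functions.https.onRequest(async (request, response) => {
    try {
        // Assumed body shape: { "paths": ["a.pdf", "b.pdf"] }, paths inside the default bucket.
        const paths = request.body.paths || [];
        const bucket = admin.storage().bucket();

        const merged = await PDFDocument.create();
        for (const path of paths) {
            const [bytes] = await bucket.file(path).download();
            const src = await PDFDocument.load(bytes);
            const pages = await merged.copyPages(src, src.getPageIndices());
            pages.forEach((page) => merged.addPage(page));
        }

        const mergedBytes = await merged.save();
        response.set('Content-Type', 'application/pdf');
        response.status(200).send(Buffer.from(mergedBytes));
    } catch (err) {
        console.error(err);
        response.status(500).send('Failed to merge PDFs');
    }
});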
