Google Cloud res.send is not a function - node.js

The function is triggered by a cron-scheduled PubSub message to pull data from an API and update a BigQuery table.
It works when tested locally (npx @google-cloud/functions-framework --target pullElevate), but results in an error (TypeError: res.send is not a function at exports.pullElevate (/workspace/index.js:261:9)) when uploaded and triggered by 'Test Function' from the console.
I can't see why the res object would have lost its functions.
exports.pullElevate = async function (req, res) {
  // get CSV from Elevate API -> Write it to bucket -> copy from bucket to table...
  try {
    const payload = await getCSV()
    const filename = await putInBucket(payload.data)
    await writeCSVtoTable(filename)
    await mergeToMainTable()
  } catch (err) {
    console.error(err);
    res.status(500).send(err)
    return Promise.reject(err)
  }
  res.send("OK")
  return
}

I understand from your description that you bind your Cloud Function to a PubSub topic to pull the message. That way you haven't created an HTTP function but a background function, and the signature isn't the same.
The second argument is a context object, and it obviously has no send function. Change your deployment: deploy an HTTP function and create a PubSub push subscription to your Cloud Function if you want (or need) to send a response to a received message (from PubSub or elsewhere!).
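For reference, a background function bound to a PubSub topic receives (message, context) instead of (req, res). Here is a minimal sketch of what the question's pipeline would look like with that signature; the base64 decode assumes the standard PubSub message envelope:
// Background (PubSub-triggered) signature: no res object, so no res.send().
exports.pullElevate = async (message, context) => {
  // message.data, when present, is the base64-encoded PubSub payload.
  if (message.data) {
    console.log('Payload:', Buffer.from(message.data, 'base64').toString());
  }
  const payload = await getCSV();
  const filename = await putInBucket(payload.data);
  await writeCSVtoTable(filename);
  await mergeToMainTable();
  // Resolving signals success; throwing signals failure (and allows retry).
};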

Slack delayed message integration with Node TypeScript and Lambda

I started implementing a slash command which kept evolving and eventually might hit the 3-second Slack response limit. I am using serverless-stack with Node and TypeScript. With sst (and the VS Code launch file) it hooks and attaches the debugger to the lambda function, which is pretty neat for debugging.
When hitting the API endpoint I tried various methods to send back an acknowledgement to Slack, do my thing, and send a delayed message back, without success. I didn't have much luck finding info on this, but one good source was this SO answer; unfortunately it didn't work. I didn't use request-promise since it's deprecated and tried to implement it with vanilla methods (maybe that's where I failed?). But also invoking a second lambda function from within (like in the first example of the post) didn't seem to stay within the 3s limit.
I am wondering if I am doing something wrong or if attaching the debugger is just taking too long, etc.
However, before attempting to send a delayed message it was fine, including accessing and scanning DynamoDB records, manipulating the results, and then responding back to Slack with the debugger attached, without hitting the timeout.
Attempting to use a POST
export const answer: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2, context, callback
) => {
  const slack = decodeQueryStringAs<SlackRequest>(event.body);
  axios.post(slack.response_url, {
    text: "completed",
    response_type: "ephemeral",
    replace_original: "true"
  });
  return { statusCode: 200, body: '' };
}
The promise never resolved; I guess that once the function returns, the lambda gets disposed, and the pending promise along with it?
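That guess matches how the Lambda Node runtime behaves: once the handler returns, the container can be frozen with the POST still in flight. A minimal illustration with the types elided (decodeQueryStringAs and axios are the ones from the snippet above):
export const answer = async (event) => {
  const slack = decodeQueryStringAs(event.body);
  // Awaiting ensures the POST finishes before the handler returns;
  // otherwise the runtime may freeze the container with it still pending.
  await axios.post(slack.response_url, {
    text: "completed",
    response_type: "ephemeral",
    replace_original: "true"
  });
  return { statusCode: 200, body: '' };
};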
Invoking 2nd Lambda function
export const v2: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2, context, callback
): Promise<any> => {
  // tried with CB here and without
  // callback(null, { statusCode: 200, body: 'processing' });
  const slack = decodeQueryStringAs<SlackRequest>(event.body);
  const originalMessage = slack.text;
  const responseInfo = url.parse(slack.response_url)
  const data = JSON.stringify({
    ...slack,
  })
  const lambda = new AWS.Lambda()
  const params = {
    FunctionName: 'dev-****-FC******SmE7',
    InvocationType: 'Event', // Ensures asynchronous execution
    Payload: data
  }
  return lambda.invoke(params).promise() // Returns 200 immediately after invoking the second lambda, not waiting for the result
    .then(() => callback(null, { statusCode: 200, body: 'working on it' }))
};
Looking at the debugger logs it does send the 200 code and invokes the new lambda function, though Slack still times out.
Nothing special happens logic-wise; the current non-delayed-message implementation does much more (accessing the DB and manipulating result data) and manages not to time out.
Any suggestions or help are welcome.
Quick side note, I used request-promise in the linked SO question's answer since the JS native Promise object was not yet available on AWS Lambda's containers at the time.
There's a fundamental difference between the orchestration of the functions in the linked question and your own from what I understand, but I think you have the same goal:
> Invoke an asynchronous operation from Slack which posts back to Slack once it has a result
Here's the problem with your current approach: Slack sends a request to your (1st) lambda function, which returns a response to Slack, and then invokes the second lambda function.
The Slack event is no longer accepting responses once your first lambda returns the 200. Here lies the difference between your approach and the linked SO question.
The desired approach would sequentially look like this:
1. Slack sends a request to Lambda no. 1.
2. Lambda no. 1 returns a 200 response to Slack.
3. Lambda no. 1 invokes Lambda no. 2.
4. Lambda no. 2 sends a POST request to a Slack webhook URL (look up incoming webhooks for Slack).
5. Slack receives the POST request and displays it in the channel you chose for your webhook.
Code-wise this would look like the following (without request-promise lol):
Lambda 1
module.exports = async (event, context) => {
  // Invoking the second lambda function
  const AWS = require('aws-sdk')
  const lambda = new AWS.Lambda()
  const params = {
    FunctionName: 'YOUR_SECOND_FUNCTION_NAME',
    InvocationType: 'Event', // Ensures asynchronous execution
    Payload: JSON.stringify({
      // ... your payload for lambda 2 ...
    })
  }
  await lambda.invoke(params).promise() // Starts Lambda 2
  return {
    text: "working...",
    response_type: "ephemeral",
    replace_original: "true"
  }
}
Lambda 2
module.exports = async (event, context) => {
  // Use event (payload sent from Lambda 1) and do what you need to do
  const axios = require('axios')
  return axios.post('YOUR_INCOMING_WEBHOOK_URL', {
    text: 'this will be sent to slack'
  });
}

Firebase cloud functions throws timeout exception but standalone works fine

I am trying to call a third-party API from my Firebase cloud functions. I have billing enabled and all my other functions are working fine.
However, I have one method that throws a timeout exception when it tries to call the third-party API. The interesting thing is, when I run the same method from a standalone Node.js file, it works fine. But when I deploy it to Firebase or start the function locally, it shows a timeout error.
Following is my function:
exports.fetchDemo = functions.https.onRequest(async (req, response) => {
  var res = {};
  res.started = true;
  await myMethod();
  res.ended = true;
  response.status(200).json({ data: res });
});

async function myMethod() {
  var url = 'my third party URL';
  console.log('Line 1');
  const res = await fetch(url);
  console.log('Line 2'); // never prints when run with cloud functions
  var data = await res.text();
  console.log(`Line 3: ${data}`);
}
Just now I also noticed that when I hit the same URL in the browser, it gives the following exception. It means it works only with standalone Node:
<errorDTO>
<code>INTERNAL_SERVER_ERROR</code>
<uid>c0bb83ab-233c-4fe4-9a9e-3f10063e129d</uid>
</errorDTO>
Any help will be appreciated...
It turned out that one of my colleagues had written a new method named fetch. I was not aware of it, so when my method called fetch, it was actually calling the method he had written further down the file. I had just pulled the latest changes from git and did not notice that he had added this method.
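For anyone hitting something similar, here is a hypothetical reconstruction of the pitfall; the names are made up, and the point is that a local function declaration shadows whatever fetch you meant to call:
// index.js: hypothetical reconstruction of the name collision
async function myMethod() {
  // Intended to hit the third-party URL, but this resolves to the local
  // fetch() declared below (function declarations are hoisted), not to
  // node-fetch or any global fetch.
  const res = await fetch('https://example.com/api');
  return res.text();
}

// Added later in the same file by a colleague; every fetch() call in
// this module now lands here.
function fetch(endpoint) {
  // ...entirely different behavior, e.g. a promise that never settles...
  return new Promise(() => {});
}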

GCP Cloud Function reading files from Cloud Storage

I'm new to GCP, Cloud Functions and NodeJS ecosystem. Any pointers would be very helpful.
I want to write a GCP Cloud Function that does the following:
1. Read the contents of a file (sample.txt) saved in Google Cloud Storage.
2. Copy it to the local file system (or just console.log() it).
3. Run this code using functions-emulator locally for testing.
Result: a 500 INTERNAL error with the message 'function crashed'. The function logs give the following message:
2019-01-21T20:24:45.647Z - info: User function triggered, starting execution
2019-01-21T20:24:46.066Z - info: Execution took 861 ms, finished with status: 'crash'
Below is my code, picked mostly from GCP NodeJS sample code and documentation.
exports.list_files = (req, res) => {
  const fs = require('fs');
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  const bucket = storage.bucket('curl-tests');
  bucket.setUserProject("cf-nodejs");
  const file = bucket.file('sample.txt'); // file has a couple of lines of text
  const localFilename = '/Users/<username>/sample_copy.txt';
  file.createReadStream()
    .on('error', function (err) { })
    .on('response', function (response) {
      // Server connected and responded with the specified status and headers.
    })
    .on('end', function () {
      // The file is fully downloaded.
    })
    .pipe(fs.createWriteStream(localFilename));
}
I run like this:
functions call list_files --trigger-http
ExecutionId: 4a722196-d94d-43c8-9151-498a9bb26997
Error: { error:
{ code: 500,
status: 'INTERNAL',
message: 'function crashed',
errors: [ 'socket hang up' ] } }
Eventually, I want to have certificates and keys saved in Storage buckets and use them to authenticate with a service outside of GCP. This is the bigger problem I'm trying to solve. But for now, focusing on resolving the crash.
Start your development and debugging on your desktop using node and not an emulator. Once you have your code working without warnings and errors, then start working with the emulator and then finally with Cloud Functions.
Let's take your code and fix parts of it.
bucket.setUserProject("cf-nodejs");
I doubt that your project is cf-nodejs. Enter the correct project ID.
const localFilename = '/Users/<username>/sample_copy.txt';
This won't work. You do not have the directory /Users/<username> in cloud functions. The only directory that you can write to is /tmp. For testing purposes change this line to:
const localFilename = '/tmp/sample_copy.txt';
You are not doing anything with errors:
.on('error', function (err) { })
Change this line to at least print something:
.on('error', function (err) { console.log(err); })
You will then be able to view the output in Google Cloud Console -> Stackdriver -> Logs. Stackdriver lets you select "Cloud Functions" -> "Your function name" so that you can see your debug output.
Last tip: wrap your code in a try/catch block and console.log the error message in the catch block. This way you will at least have a log entry when your program crashes in the cloud.
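Putting those fixes together, the function might look like the sketch below. This is just an illustration: the project ID is a placeholder, and echoing the file contents back in the response is one way to verify the copy worked.
exports.list_files = (req, res) => {
  const fs = require('fs');
  const { Storage } = require('@google-cloud/storage');

  try {
    const storage = new Storage();
    const bucket = storage.bucket('curl-tests');
    bucket.setUserProject('YOUR_PROJECT_ID'); // your real project ID, per the first fix
    const file = bucket.file('sample.txt');
    const localFilename = '/tmp/sample_copy.txt'; // /tmp is the only writable directory

    const writeStream = fs.createWriteStream(localFilename);
    writeStream.on('finish', () => {
      // The copy is complete; read it back and respond.
      res.status(200).send(fs.readFileSync(localFilename, 'utf8'));
    });

    file.createReadStream()
      .on('error', (err) => {
        console.log(err); // visible in Stackdriver -> Logs
        res.status(500).send(err.message);
      })
      .pipe(writeStream);
  } catch (err) {
    console.log(err);
    res.status(500).send(err.message);
  }
};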

How to get external API data using the Inline Editor in Dialogflow

I've got a Dialogflow agent for which I'm using the Inline Editor (powered by Cloud Functions for Firebase). When I try to get external API data using request-promise-native, I keep getting Ignoring exception from a finished function in my Firebase console.
function video(agent) {
  agent.add(`You are now being handled by the productivity intent`);
  const url = "https://reqres.in/api/users?page=2";
  return request.get(url)
    .then(jsonBody => {
      var body = JSON.parse(jsonBody);
      agent.add(body.data[0].first_name)
      return Promise.resolve(agent);
    });
}
Your code looks correct. The exception in this case might be that you're not using a paid account, so network access outside Google is blocked. You can probably see the exact exception by adding a catch block:
function video(agent) {
  agent.add(`You are now being handled by the productivity intent`);
  const url = "https://reqres.in/api/users?page=2";
  return request.get(url)
    .then(jsonBody => {
      var body = JSON.parse(jsonBody);
      agent.add(body.data[0].first_name)
      return Promise.resolve(agent);
    })
    .catch(err => {
      console.error('Problem making network call', err);
      agent.add('Unable to get result');
      return Promise.resolve(agent);
    });
}
(If you do this, you may want to update your question with the exact error from the logs.)
The Inline Editor uses Cloud Functions for Firebase. If you do not have a paid Firebase account, you will not be able to access external (non-Google) APIs.

Cloud Functions for Firebase - Merge multiple PDFs into one via Nodejs/Cloud Function

I'm running into an issue where I'm trying to combine a bunch of PDFs via a Cloud Function and then have that merged PDF downloaded to the user's computer.
I have a function in my provider that calls the cloud function and passes an array of URLs that point to the pdfs like so:
mergePDFs(pdfs) {
  // Create array of URLs that point to PDFs in Firebase storage
  let pdfArr = [];
  for (let key in pdfs) {
    pdfArr.push(pdfs[key].url);
  }
  // Call Firebase Cloud function to merge the PDFs into one
  this.http.get('https://us-central1-rosebud-9b0ed.cloudfunctions.net/mergePDFs', {
  }).subscribe(result => {
    console.log('merge result?', result);
  });
}
Has anyone had any success using the pdf-merge NPM module to accomplish this?
I have a basic cloud function stubbed like so:
exports.mergePDFs = functions.https.onRequest((request, response) => {
cors(request, response, () => {
console.log('request', request);
response.status(200).send('test');
})
});
I stubbed it to send 'test' just so I could see whether it was being called, but when I try to call it, I get error code 500.
Has anyone else gotten the pdf merging functionality working via nodejs/Cloud Functions for Firebase?
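One possible starting point: a minimal sketch of the merging step using pdf-lib rather than pdf-merge (pdf-lib is pure JavaScript, so it avoids native binaries in the Cloud Functions runtime). The request shape ({ urls: [...] } in a POST body) and the axios download are assumptions, not part of the question's code:
const { PDFDocument } = require('pdf-lib');
const axios = require('axios');

// Hypothetical helper: downloads each PDF and appends its pages to one document.
async function mergePdfUrls(urls) {
  const merged = await PDFDocument.create();
  for (const url of urls) {
    const { data } = await axios.get(url, { responseType: 'arraybuffer' });
    const src = await PDFDocument.load(data);
    const pages = await merged.copyPages(src, src.getPageIndices());
    pages.forEach((page) => merged.addPage(page));
  }
  return merged.save(); // Uint8Array of the merged PDF bytes
}

exports.mergePDFs = functions.https.onRequest(async (request, response) => {
  // Assumes the client POSTs { urls: [...] } instead of the empty GET above.
  const bytes = await mergePdfUrls(request.body.urls);
  response.set('Content-Type', 'application/pdf');
  response.status(200).send(Buffer.from(bytes));
});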
