I have a Node app that serves a React app and makes requests to Google Cloud Storage. The app works perfectly locally, but after deploying it to Heroku I get the following error whenever I make requests to any of my endpoints:
2017-05-26T21:53:34.426234+00:00 app[web.1]: app.post /upload_url Endpoint
2017-05-26T21:53:34.484393+00:00 app[web.1]: (node:34) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): RangeError: Invalid status code: 0
The first line is a console log to check if the endpoint is reached, the second line is the error. Here's the code for the /upload_url endpoint:
app.post('/upload_url', (req, res) => {
  console.log("app.post /upload_url Endpoint");

  var originalFileName = req.body["originalFileName"];
  var buf = crypto.randomBytes(16);
  var uniqueFilename = buf.toString('hex') + '_' + originalFileName;
  var file = bucket.file(uniqueFilename);

  file.createResumableUpload(function (err, uri) {
    if (!err) {
      var json = JSON.stringify({
        uri: uri,
        uniqueFilename: uniqueFilename,
        originalFileName: originalFileName
      });
      res.writeHead(200);
      res.end(json);
    } else {
      res.writeHead(err.code);
      res.end();
    }
  });
});
This endpoint is called by the react front end with the following fetch call:
function upload(file) {
  var data = new FormData();
  data.append('file', file);

  return fetch(`upload_url`, {
    method: 'POST',
    headers: new Headers({
      "Content-Type": "application/json",
    }),
    body: JSON.stringify({
      originalFileName: file.name
    })
  });
}
Again, this works fine in development but not after deploying to Heroku. I've tried Heroku's suggestion of adding concurrency to the app (detailed here) without any luck. Any thoughts or suggestions on how to solve this problem would be very much appreciated.
EDIT:
bucket is a Google Cloud Storage bucket and is defined like this:
const gcs = require('@google-cloud/storage')({
  projectId: 'my-project',
  keyFilename: process.env.GCS_KEYFILE
});
var bucket = gcs.bucket('my-bucket');
ANSWER:
While this didn't solve my issue entirely, by handling response error codes more appropriately I was able to determine that my actual problem was related to Google Cloud authentication. Here's my updated upload_url endpoint:
app.post('/upload_url', (req, res) => {
  console.log("app.post /upload_url Endpoint");

  var originalFileName = req.body["originalFileName"];
  var buf = crypto.randomBytes(16);
  var uniqueFilename = buf.toString('hex') + '_' + originalFileName;
  var file = bucket.file(uniqueFilename);

  file.createResumableUpload(function (err, uri) {
    if (!err) {
      var json = JSON.stringify({
        uri: uri,
        uniqueFilename: uniqueFilename,
        originalFileName: originalFileName
      });
      res.writeHead(200);
      res.end(json);
    } else if (err.code >= 100 && err.code < 600) {
      // err.code is a valid HTTP status code, so it is safe to send
      console.error(err);
      res.writeHead(err.code);
      res.end();
    } else {
      // err.code is not a usable HTTP status (e.g. 0), so fall back to 500
      console.error(err);
      res.writeHead(500);
      res.end();
    }
  });
});
Refer to this answer: https://stackoverflow.com/a/38258590/4348875, and make sure err.code is a valid HTTP status code.
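For example, a minimal sketch of a guard helper (toHttpStatus is a hypothetical name, not part of the original code) that clamps anything outside the valid range to a 500:

function toHttpStatus(code) {
  // Accept only integers in the valid HTTP status range; otherwise use 500.
  return Number.isInteger(code) && code >= 100 && code < 600 ? code : 500;
}

// usage in the error branch:
res.writeHead(toHttpStatus(err.code));
res.end();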
Related
I am using React + Node.js with Axios and have been trying to send a POST request, but I'm experiencing difficulties.
The request seems to post successfully, but on the Node.js server the data value always comes through as undefined, even though the console shows the data being passed successfully.
REACT
const fireAction = (data1, data2) => {
  const data = JSON.stringify({data1, data2})

  const url = `http://localhost:5000/data/corr/fire`;
  const config = {
    headers: {
      'Content-Type': 'application/x-www-form-urlencoded',
      'Authorization': 'AUTHCODE',
    }
  }

  axios.post(url, data, config)
    .then(function (response) {
      console.log(response);
    })
    .catch(function (error) {
      console.log(error);
    });
}
fireAction("Oklahoma", "Small apartment")
NODE
app.post('/data/corr/fire', async (req, res) => {
  try {
    const data = req.body.data1;
    console.log(data)
  } catch(e) {
    res.send({success: "none", error: e.message})
  }
});
Result of node: "undefined"
I have added the following body parser:
app.use(express.json());
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({extended: true}));
I am not sure why this error is happening. I see there are similar questions to mine; however, none of them apply, since I'm already using both express and body-parser, which is what they suggest.
You're POSTing JSON with a content-type meant for forms. There's no need to set Content-Type manually if you're sending JSON, but if you do want to override it, use 'Content-Type': 'application/json' and read the parsed body in your route via req.body. If the endpoint really does need form-encoded data, you'll need to build the form:
const params = new URLSearchParams();
params.append('data1', data1);
params.append('data2', data2);
axios.post(url, params, config);
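For the JSON route, a minimal sketch (assuming the same endpoint and the app.use(express.json()) middleware already shown above): passing a plain object lets axios serialize it and set the application/json content-type itself.

// Client: axios serializes the object and sets Content-Type: application/json.
axios.post(url, { data1, data2 })
  .then(response => console.log(response.data))
  .catch(error => console.log(error));

// Server: the parsed body is available on req.body.
app.post('/data/corr/fire', (req, res) => {
  const { data1, data2 } = req.body;
  console.log(data1, data2); // "Oklahoma", "Small apartment"
  res.send({ success: "ok" });
});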
My app works fine locally, and I have a much more complicated app already running on Heroku. But this one consistently gets a 503 / H12 timeout when it tries to run a POST request. It takes two seconds locally. Any ideas?
The POST route in question (it uses a Node.js SDK for the Aylien NLP API):
app.post('/apiRequest', (req, res) => {
  const reqUrl = req.body.url;

  aylienApi.combined({
    'url': reqUrl,
    'endpoint': ['language', 'sentiment', 'summarize']
  }, function(error, result) {
    if (error === null){
      res.send(result.results)
    }
    // note: if error is non-null, no response is ever sent,
    // so the request hangs until Heroku kills it with an H12
  })
})
The async POST request:
const postData = async (url = '', data = {}) => {
  const res = await fetch(url, {
    method: 'POST',
    credentials: 'same-origin',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(data)
  })

  try {
    const articleData = await res.json();
    return articleData;
  } catch (error) {
    console.log('error', error)
  }
}
I've read in the Heroku documentation that this can be caused by large payloads, but I'm only grabbing a small amount of JSON and sending a URL.
It turns out I had not uploaded my .env file with the API keys etc., as it was still in the .gitignore file.
Silly me. But it turns out the Aylien NLP API doesn't warn you if the key isn't set.
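One way to catch this class of mistake earlier is to fail fast at startup instead of hanging silently on the first request. A minimal sketch (the variable names are hypothetical, not taken from the original code):

// Verify required configuration before the app starts serving requests.
const required = ['AYLIEN_APP_ID', 'AYLIEN_API_KEY'];
for (const name of required) {
  if (!process.env[name]) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
}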
I have an AWS Lambda function which triggers https request to Google API. I want the function to be awaitable, so that it does not end immediately, but only after getting response from Google API.
Yes, I know I pay for the execution, but this will not be called often, so it is fine.
The problem is that the http request does not seem to fire correctly. The callback is never executed.
I have made sure that the async/await works as expected by using setTimeout in a Promise. So the issue is somewhere in the https.request.
Also note that I am using Pulumi to deploy to AWS, so there might be some hidden problem in there. I just can't figure out where.
The relevant code:
AWS Lambda which calls the Google API
import config from '../../config';
import { IUserInfo } from '../../interfaces';

const https = require('https');

function sendHttpsRequest(options: any): Promise<any> {
  console.log(`sending request to ${options.host}`);
  console.log(`Options are ${JSON.stringify(options)}`);

  return new Promise(function (resolve, reject) {
    console.log(` request to ${options.host} has been sent A`);
    let body = new Array<Buffer>();

    const request = https.request(options, function (res: any) {
      console.log('statusCode:', res.statusCode);
      console.log('headers:', res.headers);

      if (res.statusCode != 200) {
        reject(res.statusCode);
      }

      res.on('data', (data: any) => {
        console.log(`body length is ${body.length}`);
        console.log('data arrived', data);
        body.push(data);
        console.log('pushed to array');
        console.log(data.toString());
      });
    });

    request.on('end', () => {
      console.error('Request ended');
      // at this point, `body` has the entire request body stored in it as a string
      let result = Buffer.concat(body).toString();
      resolve(result);
    });

    request.on('error', async (err: Error) => {
      console.error('Errooooorrrr', err.stack);
      console.error('Errooooorrrr request failed');
      reject(err);
    });

    request.end();
    console.log(` request to ${options.host} has been sent B`);
  });
}
/**
 * AWS Lambda to create new Google account in TopMonks domain
 */
export default async function googleLambdaImplementation(userInfo: IUserInfo) {
  const payload = JSON.stringify({
    "primaryEmail": userInfo.topmonksEmail,
    "name": {
      "givenName": userInfo.firstName,
      "familyName": userInfo.lastName
    },
    "password": config.defaultPassword,
    "changePasswordAtNextLogin": true
  });

  const resultResponse: Response = {
    statusCode: 200,
    body: 'Default response. This should not come back to users'
  }

  console.log('Calling google api via post request');

  try {
    const options = {
      host: 'www.googleapis.com',
      path: '/admin/directory/v1/users',
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Content-Length': payload.length.toString()
      },
      form: payload
    }

    const responseFromGoogle = await sendHttpsRequest(options);
    console.log('responseFromGoogle', JSON.stringify(responseFromGoogle));
  }
  catch (err) {
    console.log('Calling google api failed with error', err);
    resultResponse.statusCode = 503;
    resultResponse.body = `Error creating new Google Account for ${userInfo.topmonksEmail}.`;
    return resultResponse;
  }

  console.log('request to google sent');
  return resultResponse;
}
The problem is that the http request does not seem to fire correctly. The callback is never executed.
I believe this part of the issue is related to some combination of (a) potentially not actually sending the https request and (b) not using the correct callback signature for https.request. See the documentation at https://nodejs.org/api/https.html#https_https_request_options_callback for details on both of these.
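For reference, a minimal corrected sketch (my reading of the docs, not the original author's fix): the 'data'/'end' events belong on the response object, not the request, and the body has to be written explicitly, since https.request has no form option.

const https = require('https');

function sendHttpsRequest(options, payload) {
  return new Promise((resolve, reject) => {
    const request = https.request(options, (res) => {
      const body = [];
      // 'data' and 'end' fire on the response, not on the request
      res.on('data', (chunk) => body.push(chunk));
      res.on('end', () => resolve(Buffer.concat(body).toString()));
    });
    request.on('error', reject);
    // write the JSON payload explicitly; `form` is ignored by https.request
    request.write(payload);
    request.end();
  });
}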
Use node-fetch package
The following example works for me using node-fetch:
import * as aws from "@pulumi/aws";
import fetch from "node-fetch";

const api = new aws.apigateway.x.API("api", {
  routes: [{
    method: "GET", path: "/", eventHandler: async (ev) => {
      const resp = await fetch("https://www.google.com");
      const body = await resp.text();
      return {
        statusCode: resp.status,
        body: body,
      }
    },
  }],
})

export const url = api.url;
Pulumi complains with something like "Cannot serialize native function". The problematic part is that node-fetch relies on Symbol.iterator.
As noted in the comments, some of the conditions that can lead to this are documented at https://pulumi.io/reference/serializing-functions.html. However, I don't see any clear reason why this code would hit any of those limitations. There may be details of how this is used outside the context of the snippet shared above which lead to this.
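If you do hit those serialization limits, one workaround that is sometimes suggested is to load the module inside the handler, so Pulumi captures only the require call rather than the module's internals. A sketch under that assumption, not a confirmed fix for this case:

const api = new aws.apigateway.x.API("api", {
  routes: [{
    method: "GET", path: "/", eventHandler: async (ev) => {
      // require at call time, inside the serialized function
      const fetch = require("node-fetch");
      const resp = await fetch("https://www.google.com");
      return { statusCode: resp.status, body: await resp.text() };
    },
  }],
});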
I am creating a Google Cloud Function. However, when I try to deploy it to Google Cloud Platform, I am getting this error:
ERROR: (gcloud.beta.functions.deploy) OperationError: code=3, message=Function load error: Code in file index.js can't be loaded.
Did you list all required modules in the package.json dependencies?
Detailed stack trace: Error: Cannot find module 'request'
How do I upload/install the 'request' library on Google Cloud Platform?
Code Snippet
'use strict';

const https = require('https');
const host = 'https://www.example.com';
const clientId = 'qpopMIGtVdeIdVk3oEtr2LGbn8vTeTWz';
const clientSecret = 'eUnsWQ8y3AuiFHJu';
const grant_type = 'client_credentials';
const resource = 'b.microsoft.com/4fa4b4a7-d34f-49af-8781-c8b39f0cf770';
const request = require("request");

exports.oauthtoken = (req, res) => {
  // Call the Apigee API
  callGetOAuthToken().then((output) => {
    // Return the results from Apigee to DialogFlow
    res.setHeader('Content-Type', 'application/json');
    res.send(JSON.stringify({ 'speech': output, 'displayText': output }));
  }).catch((error) => {
    // If there is an error let the user know
    res.setHeader('Content-Type', 'application/json');
    res.send(JSON.stringify({ 'speech': error, 'displayText': error }));
  });
};

function callGetOAuthToken () {
  return new Promise((resolve, reject) => {
    let path = '/customers/v1/accesstoken';
    var authHeader = Buffer.from(clientId + ':' + clientSecret).toString('base64');
    var post_options = {
      url: host + path,
      method: 'POST',
      headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
        'Authorization': 'Basic ' + authHeader,
        'grant_type': grant_type
      }
    };

    // Make the HTTP request to get the OAuth token
    request(post_options, function(err, res, body) {
      let output = JSON.parse(body);
      console.log(output);
      resolve(output);
    });
  });
}
Read through the Google Cloud documentation regarding dependencies:
https://cloud.google.com/functions/docs/writing/dependencies
If you deploy with the gcloud CLI, list the 'request' module as a dependency in your package.json file.
Or, run 'npm install --save request' in the folder containing your Cloud Function and upload your pre-installed dependencies as part of your ZIP file.
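For the first option, the relevant part of package.json would look something like this (the name and version are examples, not prescriptive):

{
  "name": "oauthtoken-function",
  "version": "1.0.0",
  "dependencies": {
    "request": "^2.88.0"
  }
}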
I have a JavaScript function, shown below, which I host on S3:
function myFunction() {
  var xhttp = new XMLHttpRequest();
  xhttp.onreadystatechange = function(){
    if(this.readyState == 4 && this.status == 200){
      document.getElementById("testing").innerHTML = this.responseText;
    }
  };
  xhttp.open("GET", "https://id.execute-api.ap-southeast-1.amazonaws.com/prod/lambdafunction", true);
  xhttp.send();
}
And this lambdafunction is written in Node.js as below:
'use strict';
console.log('Loading function');

exports.handler = (event, context, callback) => {
  let response = {
    statusCode: '200',
    body: JSON.stringify({ error: 'you messed up!' }),
    headers: {
      'Content-Type': 'application/json',
    }
  };
  context.succeed(response);
  //callback(null, context); // Echo back the first key value
  //callback('Something went wrong');
};
What I expected was that the div with id testing would be replaced by error: 'you messed up!', but nothing happened. May I know which part may have gone wrong?
It looks like you are using the API for the (very) old Node v0.10.42.
It seems more likely that you are using a newer version, so you should have:
callback(null, JSON.stringify({ error: 'you messed up!' }));
// if you are not using proxy integration
or
callback(null, response)
// if you set up the function with proxy integration
If this doesn't help, it would be useful to know what you get when you access the url directly and if you are seeing anything in the AWS logs. You should also be able to directly invoke the lambda function from the AWS console, which makes testing easier.
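For example, a minimal handler for a newer runtime with proxy integration might look like this (a sketch reusing the response shape from the question; note that statusCode is a number, not a string):

exports.handler = (event, context, callback) => {
  const response = {
    statusCode: 200, // a number, not a string
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ error: 'you messed up!' })
  };
  // with proxy integration, API Gateway expects this whole object
  callback(null, response);
};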