Passing params from a Google Cloud Function GET Request to BigQuery - node.js

I have successfully deployed a Google Cloud Function that takes in parameters from a POST request. I am now trying to change it to take the parameters from a GET request instead, since they don't contain any private data.
It seems I am getting the passed-in parameters correctly, but when I then pass them to BigQuery it tells me I have missing parameters for my query. I know my query code is correct, because if I hard-code the parameter values it works, e.g.:
bigQuery.createQueryJob({
  query,
  params: {
    "make": "acura",
    "model": "mdx",
    "modelYear": 2005
  }
}).then...
I also know I am receiving the parameters correctly, because if I change my Cloud Function to simply return the passed-in query-string params, it returns them correctly (see the commented-out lines below). If I change the Cloud Function to use req.body instead of req.query and make it a POST request, it also works fine.
I'm at a loss as to why "params" is not being passed to createQueryJob correctly. Any help would be much appreciated. Here is the code (I had to remove the actual query for privacy reasons):
package.json:
{
  "name": "sample-http",
  "version": "0.0.1",
  "dependencies": {
    "@google-cloud/bigquery": "^2.0.6"
  }
}
index.js:
const { BigQuery } = require("@google-cloud/bigquery");
/**
 * Responds to any HTTP request.
 *
 * @param {!express:Request} req HTTP request context.
 * @param {!express:Response} res HTTP response context.
 */
exports.getRecallDataByVehicleInfo = (req, res) => {
  res.set('Access-Control-Allow-Origin', "*");
  res.set('Access-Control-Allow-Methods', 'GET, POST');
  res.setHeader(
    "Access-Control-Allow-Headers",
    "X-Requested-With,content-type"
  );
  const params = req.query;
  // res.status(200).send("make is - " + params.make + ", model is - " + params.model + ", model year is -" + params.modelYear);
  // return;
  const bigQuery = new BigQuery();
  const query = `myQuery
    where Make = @make
    and Model = @model
    and ModelYear = @modelYear`;
  bigQuery.createQueryJob({
    query,
    params
  }).then(results => {
    const job = results[0];
    return job.getQueryResults({
      autoPaginate: false,
      timeoutMs: 1000000
    },
    callback());
  });
  const callback = () => (err, rows) => {
    if (err) {
      res.status(401).send(JSON.stringify(err));
    }
    else {
      res.status(200).send(rows);
    }
  };
}

You aren't using the right method. Have a look at the createQueryJob definition:
there is no params field; that method is for creating a query as a job. If you look at the official (admittedly bad) example, you have to use the query method instead. Here is its definition.
Note: why is the example bad?
In the official example, the argument provided to the query method is named options. In the documentation of the query method, the first (and mandatory) argument is named query, and you can optionally add an options argument. So the naming is confusing.
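For illustration, here is a minimal sketch of calling the query method with named parameters. The dataset/table name is a placeholder and the parameter names are taken from the question:

const { BigQuery } = require("@google-cloud/bigquery");

const bigQuery = new BigQuery();
const query = `SELECT * FROM my_dataset.recalls   -- placeholder table
  WHERE Make = @make AND Model = @model AND ModelYear = @modelYear`;

// query() creates the job and waits for its results in a single call.
bigQuery.query({
  query,
  params: { make: "acura", model: "mdx", modelYear: 2005 }
}).then(([rows]) => {
  console.log(rows);
});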

Thank you @guillaume blaquiere for the help! I changed my code to the following and it appears to be working now:
const { BigQuery } = require("@google-cloud/bigquery");
/**
 * Responds to any HTTP request.
 *
 * @param {!express:Request} req HTTP request context.
 * @param {!express:Response} res HTTP response context.
 */
exports.getRecallDataByVehicleInfoTest = (req, res) => {
  res.set('Access-Control-Allow-Origin', "*");
  res.set('Access-Control-Allow-Methods', 'GET, POST');
  res.setHeader(
    "Access-Control-Allow-Headers",
    "X-Requested-With,content-type"
  );
  const params = req.query;
  const query = `query
    where lower(Make) = @make
    and lower(Model) = @model
    and CAST(ModelYear as String) = @modelYear`;
  const queryObj = {query, params};
  const options = {
    autoPaginate: false,
    timeoutMs: 1000000
  };
  const bigQuery = new BigQuery();
  bigQuery.query(queryObj, options, function(err, rows){
    if (err) {
      res.status(401).send(JSON.stringify(err));
    }
    else {
      res.status(200).send(rows);
    }
  });
}
UPDATE
The issue wasn't that I was using an old package, or that I was using createQueryJob instead of query. The reason a POST worked, and why hard-coded params worked, was that they sent modelYear over as a number. When you GET the params from the query string, modelYear arrives as a string (of course), which breaks the BigQuery SQL comparison in the WHERE clause.
The UI for testing Cloud Functions assumes a POST request, so it told me I wasn't passing in the make param, whereas the actual call in the browser just errored out without a helpful message, because I wasn't catching and returning the error properly (I only had a .then and no .catch on my createQueryJob). So I didn't know what the actual issue was.
Bottom line: both createQueryJob and query can accept params, even though the documentation only shows it for query, and there is no difference between GET and POST in how Cloud Functions pass params into these methods; a query-string value just needs to be coerced to the right type first (see the sketch below).
Here is the code that shows createQueryJob also handles params:
https://github.com/googleapis/nodejs-bigquery/blob/master/src/bigquery.ts#L1139
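As a minimal sketch of that fix, the query-string values can be coerced before being handed to BigQuery (same handler shape as above; only modelYear needs to become a number):

exports.getRecallDataByVehicleInfo = (req, res) => {
  // req.query values are always strings, so convert the numeric one explicitly.
  const params = {
    make: req.query.make,
    model: req.query.model,
    modelYear: Number(req.query.modelYear)
  };
  // ...pass `params` to bigQuery.query() / createQueryJob() as before.
};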

Related

Cannot invoke the Firebase Cloud Function where I have a 3rd-party API request

I have a function to get vehicle images from a 3rd-party API called evox.
The following is the function I use in my Cloud Function (edited version):
const axios = require('axios').default;
// firebase-functions is required for functions.config() below.
const functions = require('firebase-functions');
/**
 *
 * @param {Number} vifNum vif / evox number of the requested vehicle
 * @returns {Promise} { image: '' }
 */
module.exports.makeImagesRequest = (vifNum) => {
  return new Promise(async (resolve, reject) => {
    try {
      var returnData = { image: "" };
      var productId = 0;
      var productType = 0;
      // Set headers for request
      var imagesReqUrl = 'http://api.evoximages.com/api/v1/vehicles/' + vifNum + '/products/' + productId + '/' + productType;
      const imagesRequestOptions = {
        url: imagesReqUrl,
        method: 'GET',
        headers: {
          'x-api-key': functions.config().evox.key,
          'Accept': 'application/json',
          'Content-Type': 'application/json;charset=UTF-8'
        }
      };
      // Send request and set the returned data
      const response = await axios(imagesRequestOptions);
      returnData.image = response.data.urls[0];
      resolve(returnData);
    }
    catch (error) {
      reject({ error, helperMsg: "Vif number is not valid or connection problem" });
    }
  });
};
I use Postman to send requests to the API in the following format:
{
  "data": {
    "vifNum": 99999
  }
}
This is where I export the Cloud Function (edited version):
module.exports.getVehicleImages = functions.https.onCall((data, context) => {
  return new Promise(async (resolve, reject) => {
    try {
      const val = await makeImagesRequest(data.vifNum);
      resolve(val);
    }
    catch (error) {
      reject({ error, msg: "Error while getting vehicle image by evox" });
    }
  });
});
The weird thing is that when I use the emulators to test my functions, it works fine: no errors, and the information is correct and consistent. But when I deploy my functions and call that function via the API URL from Postman, with exactly the same data, it now gives an error inside this makeImagesRequest function.
IMPORTANT EDIT: I tried using another 3rd-party API, VIN, and that one worked in Cloud Functions. The way I built the evox and VIN functions is almost exactly the same. The only notable difference is that evox requires that 'x-api-key' in the header, while VIN does not need any API key. Could the error be related to this? PS: functions.config() is properly set; I checked that by pulling a .runtimeconfig.json file from Firebase.
(Postman and Firebase error screenshots omitted.)
It turned out that the problem was with my functions.config(): even though evox.key appeared when I ran the command firebase functions:config:get > .runtimeconfig.json, the value was not valid inside my deployed Cloud Function.
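As a sketch of one way to catch this earlier, the key could be validated before the request is made. The evox.key path is the one used in the question; the helper name is made up for illustration:

const functions = require('firebase-functions');

// Fail fast with a clear message if the deployed config is missing or empty.
function getEvoxApiKey() {
  const config = functions.config();
  const key = config.evox && config.evox.key;
  if (!key) {
    throw new Error('Missing evox.key - set it with "firebase functions:config:set evox.key=..." and redeploy');
  }
  return key;
}

The request options would then use 'x-api-key': getEvoxApiKey() instead of reading the config inline.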

Google Cloud Storage NodeJS getFilesStream Async

I have a Cloud Function that never returns; it just hits the Cloud Functions timeout limit. It works fine locally, finishing in under 60 seconds. Not sure what the issue might be. Code is below:
/**
 * Responds to any HTTP request.
 *
 * @param {!express:Request} req HTTP request context.
 * @param {!express:Response} res HTTP response context.
 */
const {Storage} = require('@google-cloud/storage');
exports.main = async (req, res) => {
  const storage = new Storage({projectId: 'our-project'});
  const store = storage.bucket('our-bucket');
  const incomplete = {
    LT04: [],
    LT05: [],
    LE07: [],
    LC08: []
  };
  store.getFilesStream({prefix: 'prefixToMatch', autoPaginate: true})
    .on('error', (err) => {
      return console.error(err.toString());
    })
    .on('data', (file) => {
      // Find small/bad files
      if (file.metadata.size === 162) {
        const split = file.name.split('/');
        const prefix = split[2].substr(0, 4);
        incomplete[prefix].push(file.name);
      }
    })
    .on('end', () => {
      return JSON.stringify(incomplete, false, ' ');
    });
};
Your code seems OK, but you need to take a few additional details into account.
Is your Cloud Function's memory enough for this? You could increase the memory allocated to your function.
Are you sure this is a timeout issue? If you haven't looked at the logs yet, you can do so in the Error Reporting section.
If you have already confirmed that, another option is to increase the timeout duration.
I think the issue was that I needed to call res.send instead of returning a Promise.resolve. I also needed to remove the async before the function.
Thanks for the quick response and the guidelines; the error was simpler than that, apparently.
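A minimal sketch of that change, based on the description above (bucket name, prefix, and size check are the placeholders from the question): the response is sent from the stream's 'end' handler, so the function only finishes once the data has actually been returned.

const {Storage} = require('@google-cloud/storage');

exports.main = (req, res) => {
  const storage = new Storage({projectId: 'our-project'});
  const store = storage.bucket('our-bucket');
  const incomplete = { LT04: [], LT05: [], LE07: [], LC08: [] };
  store.getFilesStream({ prefix: 'prefixToMatch', autoPaginate: true })
    .on('error', (err) => res.status(500).send(err.toString()))
    .on('data', (file) => {
      // Collect small/bad files under their prefix, as in the question.
      if (file.metadata.size === 162) {
        const prefix = file.name.split('/')[2].substr(0, 4);
        incomplete[prefix].push(file.name);
      }
    })
    .on('end', () => {
      // Sending the response here is what actually ends the function.
      res.status(200).send(JSON.stringify(incomplete, null, ' '));
    });
};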

Getting data from API inside a graphQL resolver deployed on Azure node.js function

I am trying to deploy a GraphQL server on the Node.js platform using Azure Functions. I have been able to deploy a basic hello-world app.
However, I need to get data from a backend API in the resolver. I am not able to get either the fetch or the request package to work in Azure Functions.
Below is my code:
var { graphql, buildSchema } = require('graphql');
var fetch = require('node-fetch');
var request = require('request');
var schema = buildSchema(`
  type Query {
    myObject: MyObject
  }
  type MyObject {
    someId (data: String) : String
  }
`);
var root = {
  myObject: () => {
    return {
      someId: (args) => {
        // Code enters till this point.
        // I can see context.info messages from here.
        // return "hello"; <--- This works perfectly fine.
        return request('http://example.com', function (error, response, body) {
          // -----> Code never enters here.
          return body;
        });
      }
    }
  }
};
module.exports = function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');
  graphql(schema, req.body, root)
    .then(response => {
      context.res = {
        body: JSON.stringify(response)
      };
      context.done();
    });
};
I have tried both the fetch and request modules, but with both of them I see the same behavior: the response never returns, and the request eventually times out after 5 minutes. If, instead of fetch or request, I return some dummy value, the response comes back to the query correctly. With fetch, I never see the then block or the catch block execute.
Note: I have tried both http and https URLs in the request URIs but none of them seem to return any data.
Is it an issue with the way I have implemented the fetch/request or is it an issue with Azure functions in general?
Answering my own question:
It seems that node-fetch and request weren't giving the resolver a promise to wait on. Wrapping the request call in a Promise solves the problem, similar to this answer.
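Here is a minimal sketch of that idea, using the resolver shape and the placeholder URL from the question (graphql-js will wait for a promise returned from a resolver):

var { graphql, buildSchema } = require('graphql');
var request = require('request');

var root = {
  myObject: () => ({
    someId: (args) => new Promise((resolve, reject) => {
      // Wrap the callback-based request call in a Promise so GraphQL
      // waits for the HTTP response before resolving the field.
      request('http://example.com', function (error, response, body) {
        if (error) {
          reject(error);
        } else {
          resolve(body);
        }
      });
    })
  })
};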

Issue with Sequelize Query and PUT errors in Chrome

STUDENT QUESTION!
I'm learning about Node.js/Express and MySQL databases using the Sequelize ORM. Traditionally in our simple applications, after querying a MySQL database with Sequelize, we issue a res.redirect('/') inside the route handler's .then callback, similar to this:
app.post("/", function (req, res) {
db.Burgers.create({
burger_name: req.body.burger_name
}).then(function () {
res.redirect('/');
});
});
I'm running into a problem when creating a Sequelize query using the findOrCreate() method. Namely, I'm struggling to find where to place the res.redirect statement for an AJAX PUT request. For some reason, when I have res.redirect('/') inside the Express PUT route, I see duplicate PUT requests in the Chrome Network inspector. The first PUT request is shown as (localhost:3000/devour, type: text/plain, status: 302).
The PUT request is received by the Express server and the Sequelize query succeeds, updating the proper tables in the MySQL database. However, the redirect on the Express route does not succeed: the Chrome Inspector shows the error "PUT http://localhost:3000/ 404 (Not Found)", and in the Network tab I see a second PUT request (localhost:3000/, type: xhr, status: 404).
This is the Express PUT route:
app.put("/devour", function (req, res) {
var customerId;
// Check to see if the customer name entered already exists, if not create
db.Customers.findOrCreate({
where: {
customer_name: req.body.customer_name
}
})
.spread((user, created) => {
if (created) console.log("User created");
customerId = user.id;
var update = {
"devoured": req.body.devoured,
"CustomerId": customerId
};
db.Burgers.update(update, {
where: {
id: req.body.id
}
}).then(function () {
res.redirect('/');
});
})
});
What is generating this second PUT request? Is it a response rather than a request?
This is my first Sequelize query using the findOrCreate() method, so perhaps I'm misunderstanding the use of .spread().
If I comment out the res.redirect in the PUT Express route, the error does not occur, but then I have to manually refresh the page to see the updated data from the MySQL database.
UPDATE
I added the 303 status code to my Express PUT route: res.redirect(303, '/'). This eliminated the 404 error on the redirect; however, the HTML page was not refreshing with a GET request to reload the page with the updates from the PUT request.
Then I looked at my AJAX call and realized that perhaps I needed to add code there to handle the response from the server:
$.ajax({
  url: URL,
  type: 'PUT',
  contentType: 'application/json',
  data: JSON.stringify(dataObject)
})
So I added a .done promise callback and the page successfully refreshes following the PUT request:
$.ajax({
  url: URL,
  type: 'PUT',
  contentType: 'application/json',
  data: JSON.stringify(dataObject)
}).done(function(){
  window.location.href = window.location.origin + '/';
})
I guess I'm a bit confused about why the server-side res.redirect(303, '/'), alone, doesn't result in the page refresh on the client-side. What is the point of providing the '/' path as an argument?
Thank you!
You can read more about what a redirect response does to a PUT request in "Why POST redirects to GET and PUT redirects to PUT?".
Long story short, your PUT request remains a PUT request; it just gets redirected to /.
Your app should work properly if you provide the 303 status code:
res.redirect(303, '/');

xhttp GET request to an express.js server - nothing returned

I am trying to do a simple xhttp GET request to an express.js server. Unfortunately I get no response data with this code. The connection is fine as I have successfully used "res.send" to send a body back from the server.
I am not sure if my use of "findOne" on the server is incorrect or if my use of xhttp on the client is incorrect. I suspect it is the client.
I'd appreciate any advice.
* CLIENT CODE *
function getfood() {
  var xhttp = new XMLHttpRequest();
  xhttp.open("GET", "http://localhost:3000/clientfood", true);
  xhttp.send();
}
* SERVER CODE - Express.js / Node *
app.get('/clientfood', cors(), (req, res) => {
  //res.send('test'); //this works at least
  db.collection('quotes').findOne({
    "_id": ObjectId("12345")
  },
  {
    name: 1,
    quote: 1
  })
})
Your server code does not return a response. You need to do something like res.send(...) or res.json(...) to return a response back to the caller and you need to do that in the callback that your database provides for communicating back the result of a query (in most DBs, you can either use a plain callback or a promise).
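For the server side, here is a minimal sketch of sending the query result back. It assumes app, db, cors, and ObjectId are already set up as in the question, and a MongoDB Node driver version where findOne returns a promise when no callback is given (the projection option name may differ in older driver versions):

app.get('/clientfood', cors(), (req, res) => {
  db.collection('quotes')
    .findOne({ "_id": ObjectId("12345") }, { projection: { name: 1, quote: 1 } })
    .then(doc => {
      // Send the found document (or null) back to the caller.
      res.json(doc);
    })
    .catch(err => {
      res.status(500).json(err);
    });
});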
Your client code does not listen for a response. An example of how to do that is shown here on MDN; it would typically look like this:
function getfood() {
  var xhttp = new XMLHttpRequest();
  xhttp.addEventListener("load", function() {
    if (xhttp.status === 200) {
      // have data in xhttp.responseText, process it here
    } else {
      // got some other response here
    }
  });
  xhttp.open("GET", "http://localhost:3000/clientfood", true);
  xhttp.send();
}
Thanks so much, especially @jfriend00. I have a lot to learn about how these frameworks work. After taking your advice about send, I had a little trouble seeing the result on my front end: I got the message "promise pending". I fixed that with the code suggested in this post:
Express - Promise pending when loop queries
I also modified my findOne call to grab the entire object for my id.
Final code:
app.get('/clientfood', cors(), (req, res) => {
  const mydata = db.collection('quotes').findOne({
    "_id": ObjectId("12345")
  });
  // promise code
  Promise.all([mydata]).then(listOfResults => {
    res.send(JSON.stringify(listOfResults)); // for example
  }, err => {
    res.send(500, JSON.stringify(err)); // for example
  });
});
