How to run a server on Google Cloud Functions with Node.js

I have Node.js code which works perfectly locally (127.0.0.1:CUSTOM_PORT), but now I would like to set it up to run on Google Cloud Functions.
This is the code I'm using to run it locally:
function connect_to_server() {
    const PORT = process.env.PORT || 8080;
    app.listen(PORT, '127.0.0.1', function () {
        console.log('---> SERVER IS RUNNING <---')
    })
}
Does anyone know how to set up a running server with Google Cloud Functions?
What port and URL should I use inside the Node.js code? Or do I not need them at all, since GCF already sets up a server for me?
GCF provides a trigger URL which can be hit, but it still does not work.
Full function, without app.listen():
// CONFIGURATION
const express = require('express')
const app = express()
const config = require('./config')
const bodyParser = require('body-parser')
const moment = require('moment')
const sql = require("mssql")
const jwt = require('jwt-simple')
const compression = require('compression')

function token(token) {
    var secret = Buffer.from('xxx', 'hex')
    return jwt.decode(token, secret)
}

function sql_puller(res, req) {
    sql.connect(config, function (err) {
        if (err) {
            console.log(err)
            res.send(err.code)
            return
        }
        const request = new sql.PreparedStatement()
        const { x } = req.body
        let newProps = {}
        x.forEach(filters => {
            newProps[filters.x] = filters.x
        })
        const isValidInput = validateInput(x, x, x, res)
        if (!isValidInput) {
            return
        }
        request.input('1', sql.VarChar(1))
        request.input('2', sql.VarChar(1))
        request.input('3', sql.VarChar(1))
        const sqlQuery = `XXXXXX`
        request.prepare(sqlQuery, err => {
            if (err) {
                console.log(err)
                res.send(err.code)
                return
            }
            request.execute({
                iso: x,
                start: x,
                end: x
            }, (err, recordset) => {
                request.unprepare(err => {
                    if (err) {
                        console.log(err)
                        res.send(err.code)
                        return
                    }
                })
                if (err) {
                    console.log(err)
                    res.send(err.code)
                    return
                }
                res.json(recordset)
                sql.close()
            })
        })
    })
    sql.on('close', function (err) {
        console.log('SQL Connection Closed.', err)
    })
    sql.on('error', function (err) {
        sql.close()
        console.log('SQL error occurred.', err)
    })
}

exports.main = function main() {
    app.use(compression())
    app.use(bodyParser.json())
    app.post('/', function (req, res) {
        try {
            res.setHeader('Cache-Control', 'public, max-age=3600')
            var decodedToken = token(req.body['Token'])
            console.log(req.body)
            console.log('Successfully connected - token accepted')
            // connect to your database
            if (decodedToken == "XXXXXX") {
                sql_puller(res, req)
            } else {
                console.log('Incorrect Token')
            }
        } catch (err) {
            if (err) {
                console.log(err)
                res.send('Invalid Token')
                return
            }
        }
    })
}

You cannot do it the way you have designed it. Google Cloud Functions has a maximum runtime, after which the function is terminated; as of today this limit is 540 seconds. Cloud Functions are invoked by an outside process; they do not sit and wait for someone to connect to them (i.e. they don't listen on a port). The exception is the HTTP trigger, which is not usable for presenting a website but can be used for actions.
There are companies that run their entire website using Cloud Functions, Cloud Datastore and Cloud Storage. The magic is an API gateway product. The API gateway provides the URL, www.example.com, that customers go to, and then invokes a Cloud Function to handle each request. You create similar mappings from each page of your serverless website to Cloud Functions.
Many developers use Google App Engine to accomplish what you are trying to do: very low cost and very easy to develop for. Another excellent Google product to consider is Google Firebase. Google has many other products that are not serverless, such as containers on Compute Engine and Kubernetes.
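To make the HTTP-trigger case concrete: an HTTP-triggered function does not call app.listen() at all and needs no port or host. The platform runs the HTTP server and passes Express-style (req, res) objects to the exported function, so an Express app can itself be exported as the handler. A minimal sketch of how the question's code could be adapted (the entry-point name main is carried over from the question, and the route body is elided):

const express = require('express')
const bodyParser = require('body-parser')
const compression = require('compression')

const app = express()
app.use(compression())
app.use(bodyParser.json())

app.post('/', function (req, res) {
    // token check and sql_puller(res, req) as in the question
    res.json({ ok: true })
})

// No app.listen(): an Express app is itself a (req, res) handler,
// so it can be exported directly as the function entry point.
exports.main = app

Deployed with an HTTP trigger, the trigger URL GCF gives you takes the place of 127.0.0.1:PORT.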

Related

How to efficiently forward requests to multiple endpoints using Node.js?

I built a Node.js server to act as an adapter server: upon receiving a POST request containing some data, it extracts the data from the request body and forwards it to a few other external servers. Finally, my server sends a response consisting of the responses from each of the external servers (success/fail).
If there's only one endpoint to forward to, it seems fairly straightforward. However, when I have to forward to more than one server, I have to rely on things like Promise.all(), which has fail-fast behaviour. That means if one promise is rejected (an external server is down), all other promises are also rejected immediately and the rest of the servers will not receive my data.
This may not be the exact solution, but what I am posting could be a workaround for your problem.
A few days back I had the same problem when I wanted to implement API versioning. Here is the solution I implemented; please have a look.
[Architecture diagram]
Let me explain this diagram.
The diagram shows the initial configuration of the server, set up as usual: every API request that comes in is passed to the index.js file inside the release directory.
index.js (in the release directory):
const express = require('express');
const fid = require('./core/file.helper');
const router = express.Router();

fid.getFiles(__dirname, './release').then(releases => {
    releases.forEach(release => {
        // release = release.replace(/.js/g, '');
        router.use(`/${release}`, require(`./release/${release}/index`))
    })
})

module.exports = router
Code snippet for file.helper.js:
// requiring path and fs modules
const path = require('path');
const fs = require('fs');

module.exports = {
    getFiles: (presentDirectory, directoryName) => {
        return new Promise((resolve, reject) => {
            // joining path of directory
            const directoryPath = path.join(presentDirectory, directoryName);
            // passing directoryPath and callback function
            fs.readdir(directoryPath, function (err, files) {
                // handling error
                if (err) {
                    console.log('Unable to scan directory: ' + err);
                    return reject(err)
                }
                // listing all files using forEach
                // files.forEach(function (file) {
                //     // Do whatever you want to do with the file
                //     console.log(file);
                // });
                resolve(files)
            });
        })
    }
}
Now, from this index file, the index.js inside each version folder is mapped.
Here is the code for the index.js inside v1, v2, and so on:
const express = require('express');
const mongoose = require('mongoose');
const fid = require('../../core/file.helper');
const dbconf = require('./config/datastore');
const router = express.Router();

// const connection_string = `mongodb+srv://${dbconf.atlas.username}:${dbconf.atlas.password}@${dbconf.atlas.host}/${dbconf.atlas.database}`;
const connection_string = `mongodb://${dbconf.default.username}:${dbconf.default.password}@${dbconf.default.host}:${dbconf.default.port}/${dbconf.default.database}`;

mongoose.connect(connection_string, {
    useCreateIndex: true,
    useNewUrlParser: true
}).then(status => {
    console.log(`Database connected to mongodb://${dbconf.default.username}@${dbconf.default.host}/${dbconf.default.database}`);
    fid.getFiles(__dirname, './endpoints').then(files => {
        files.forEach(file => {
            file = file.replace(/.js/g, '');
            router.use(`/${file}`, require(`./endpoints/${file}`))
        });
    })
}).catch(err => {
    console.log(`Error connecting database ${err}`);
})

module.exports = router
Each of these per-version index.js files is in turn mapped to the endpoints inside that version's endpoints folder.
Code for one of the endpoints is given below:
const express = require('express');
const router = express.Router();
const userCtrl = require('../controllers/users');
router.post('/signup', userCtrl.signup);
router.post('/login', userCtrl.login);
module.exports = router;
In this file we connect the endpoints to their controllers.
Another answer sketches doing the relaying manually, so that one failing target does not stop the others:
var config = {
    'targets': [
        'https://abc.api.xxx',
        'https://xyz.abc',
        'https://stackoverflow.net'
    ]
};

relay(req, resp, config);

function relay(req, resp, config) {
    doRelay(req, resp, config['targets'], relayOne);
}

function doRelay(req, resp, servers, relayOne) {
    var finalresponses = [];
    if (servers.length > 0) {
        var loop = function (servers, index, relayOne, done) {
            relayOne(req, servers[index], function (response) {
                finalresponses.push(response);
                if (++index < servers.length) {
                    setTimeout(function () {
                        loop(servers, index, relayOne, done);
                    }, 0);
                } else {
                    done(resp, finalresponses);
                }
            });
        };
        loop(servers, 0, relayOne, done);
    } else {
        done(resp, finalresponses);
    }
}

function relayOne(req, targetserver, relaydone) {
    // call the targetserver and return the response data
    /* return relaydone(response data); */
}

function done(resp, finalresponses) {
    console.log('ended');
    resp.writeHead(200, 'OK', {
        'Content-Type': 'text/plain'
    });
    resp.end(finalresponses.join('\n'));
    return;
}
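The answer leaves relayOne as a stub. One possible implementation, sketched here with Node's built-in https module (the POST method, error marker, and streaming behaviour below are assumptions, not part of the original answer):

const https = require('https');

function relayOne(req, targetserver, relaydone) {
    // Forward the incoming request body to targetserver and hand the
    // collected response text (or an error marker) to relaydone.
    var outbound = https.request(targetserver, { method: 'POST' }, function (res) {
        var chunks = [];
        res.on('data', function (chunk) { chunks.push(chunk); });
        res.on('end', function () { relaydone(Buffer.concat(chunks).toString()); });
    });
    outbound.on('error', function (err) { relaydone('error: ' + err.message); });
    req.pipe(outbound); // stream the original request body through
}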
It sounds like you are trying to design a reverse proxy. If you are struggling to get custom code to work, there is a free npm library which is very robust.
I would recommend node-http-proxy
I have posted a link below that leads directly to the "modify a response" section, since you mentioned modifying the API format in your question. Be sure to read the entire page, though.
https://github.com/http-party/node-http-proxy#modify-a-response-from-a-proxied-server
Note: this library is also very good because it supports SSL, and it proxies both to localhost (servers on the same machine) and to servers on other machines (remote).
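For reference, a minimal sketch of that response-modification pattern (the target URL, ports, and the uppercasing transform are placeholders):

const http = require('http');
const httpProxy = require('http-proxy');

// selfHandleResponse lets us rewrite the proxied response before sending it on.
const proxy = httpProxy.createProxyServer({ selfHandleResponse: true });

proxy.on('proxyRes', function (proxyRes, req, res) {
    var body = [];
    proxyRes.on('data', function (chunk) { body.push(chunk); });
    proxyRes.on('end', function () {
        // Placeholder transformation: adjust the API format here.
        res.end(Buffer.concat(body).toString().toUpperCase());
    });
});

http.createServer(function (req, res) {
    proxy.web(req, res, { target: 'http://localhost:9000' });
}).listen(8000);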
From MDN on Promise.all():
"It rejects with the reason of the first promise that rejects."
To overcome the problem, you'll need to catch() each request you've made, e.g.:
Promise.all([
    request('<url 1>').catch(err => ({ err })), // handle/sanitize the error
    request('<url 2>').catch(err => ({ err })),
    request('<url 3>').catch(err => ({ err }))
]).then(([result1, result2, result3]) => {
    if (result1.err) { /* url 1 failed */ }
    if (result2.err) { /* url 2 failed */ }
    if (result3.err) { /* url 3 failed */ }
})
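On Node 12.9+ the same effect is available out of the box with Promise.allSettled(), which never fails fast. A small sketch (request stands for whatever promise-returning HTTP call you use):

Promise.allSettled([
    request('<url 1>'),
    request('<url 2>'),
    request('<url 3>')
]).then(results => {
    results.forEach((result, i) => {
        if (result.status === 'fulfilled') {
            console.log('target', i, 'succeeded', result.value);
        } else {
            console.log('target', i, 'failed', result.reason);
        }
    });
})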

How to connect multiple apps in Node.js (Node.js in Action)

let api = connect()
    .use(users.users)
    .use(pets.pets)
    .use(errorHandler.errorHandler);

let app = connect()
    .use(hello.hello)
    .use('/api', api)
    .use(errorPage.errorPage)
    .listen(3000);
This is source code from Node.js in Action, but it doesn't work: 'api' is never called, and nothing happens when the URL is /api.
How can I fix it?
pets.js
module.exports.pets = function pets(req, res, next) {
    if (req.url.match(/^\/pet\/(.+)/)) {
        foo();
    } else {
        next();
    }
}
users.js
let db = {
    users: [
        { name: 'tobi' },
        { name: 'loki' },
        { name: 'jane' }
    ]
};

module.exports.users = function users(req, res, next) {
    let match = req.url.match(/^\/user\/(.+)/);
    if (match) {
        let user;
        db.users.map(function (value) {
            if (value.name == match[1])
                user = match[1];
        });
        if (user) {
            res.setHeader('Content-Type', 'application/json');
            res.end(JSON.stringify(user));
        } else {
            let err = new Error('User not found');
            err.notFound = true;
            next(err);
        }
    } else {
        next();
    }
};
The connect version is "connect": "^3.6.6".
Is it possible to do connect(app)?
You shouldn't instantiate two connect servers. What you want to do is chain those middlewares:
.use('/api', users.users)
.use('/api', pets.pets)
The first middleware will pass the request on to pets.pets via next().
You can read more at this link. Sadly, connect doesn't support this type of chaining:
.use('/api', [users.users, pets.pets]);
which would be a neat solution to your problem, but Express supports it, as sketched below.
So if you're looking into Node.js, you should definitely get familiar with Express. Connect is a good starting tool, but it's as simple as it gets and offers little functionality without some 'hacking' on your side.
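For illustration, a minimal sketch of the Express equivalent, assuming the users.js and pets.js files from the question sit next to it:

const express = require('express');
const users = require('./users');
const pets = require('./pets');

const app = express();

// Express accepts an array of middleware for a single mount path;
// each handler runs in order until one responds or calls next().
app.use('/api', [users.users, pets.pets]);

app.listen(3000);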

How do I preload a value into a variable in Node.js?

I am just learning Node.js, and I come from a Java/Scala background.
I am writing a service that communicates with Amazon SNS and handles endpoints/tokens.
Basically there is a list of SNS applications that I have in my environment, and this list is rarely modified, so I would like to pre-load its values into a variable or constant on server startup.
The SNS-SDK provided by Amazon has this function for listing the applications:
listPlatformApplications(params, callback)
So what I naively tried to do was this:
var applications = [];

var loadApplications = function () {
    sns.listPlatformApplications({}, function (err, data) {
        if (err) {
            console.log(err);
        } else {
            return data['PlatformApplications'].map(function (app) {
                return app['PlatformApplicationArn']
            });
        }
    });
}

loadApplications();
Basically what happens is that some calls come in before this callback finishes, while the list is still empty.
How would I go about pre-loading this data, or any other data, before the server starts responding to requests?
Or maybe my reasoning is wrong and there is a more idiomatic way to handle this in Node.js?
If you absolutely must use a callback instead of a promise (spoiler alert: you don't), you have to call the server startup command within the SNS callback. So you would do something like this:
var startServer = require('my-server.js');
var applications = [];

var loadApplications = function () {
    sns.listPlatformApplications({}, function (err, data) {
        if (err) {
            console.log(err);
        } else {
            var snsData = data['PlatformApplications'].map(function (app) {
                return app['PlatformApplicationArn']
            });
            startServer(snsData)
        }
    });
}

loadApplications();
== RECOMMENDED SOLUTION ==
If instead, you can use promises (which you absolutely should!), you could start the SNS request at server start and await the result whenever needed. Your code would look something like this:
const startServer = require('my-server.js');

const loadApplications = async () => {
    const data = await sns.listPlatformApplications({}).promise();
    const snsData = data['PlatformApplications'].map(function (app) {
        return app['PlatformApplicationArn']
    });
    return snsData;
};

const applications = loadApplications();
startServer(applications);

// inside my-server.js
const startServer = async (applications) => {
    const doSomethingWithApplications = await applications;
    ...
}

Increase the number of responses per second

I have an Android game that has 40,000 users online, and each user sends a request to the server every 5 seconds.
I wrote this code to test requests:
const express = require('express')
const app = express()
const pg = require('pg')

const conString = 'postgres://postgres:123456@localhost/dbtest'

app.get('/', function (req, res, next) {
    pg.connect(conString, function (err, client, done) {
        if (err) {
            return next(err)
        }
        client.query('SELECT name, age FROM users limit 1;', [], function (err, result) {
            done()
            if (err) {
                return next(err)
            }
            res.json(result.rows)
        })
    })
})

app.listen(3000)
And to test this code with 40,000 requests, I wrote this Ajax code:
for (var i = 0; i < 40000; i++) {
    var j = 1;
    $.ajax({
        url: "http://85.185.161.139:3001/",
        success: function (response) {
            var d = new Date();
            console.log(j++, d.getHours() + ":" + d.getMinutes() + ":" + d.getSeconds());
        }
    });
}
[Server details (I know this is a poor machine)]
Questions:
This code (Node.js) only responds to 200 requests per second. How can I improve my code to increase the number of responses per second?
Is this Ajax approach a correct way to simulate 40,000 online users?
Would using sockets be better?
You should take a divide-and-conquer approach to such problems: find the most resource-inefficient operation and try to replace it or reduce the number of calls to it.
The main problem that I see here is that the server opens a new connection to the database on each request, which likely takes most of the time and resources.
I suggest opening the connection when the server boots up and reusing it across requests.
const express = require('express')
const app = express()
const pg = require('pg')

const conString = 'postgres://postgres:123456@localhost/dbtest'

let pgClient
pg.connect(conString, function (err, client, done) {
    if (err) {
        throw err
    }
    pgClient = client
})

app.get('/', function (req, res, next) {
    pgClient.query('SELECT name, age FROM users limit 1;', [], function (err, result) {
        if (err) {
            return next(err)
        }
        res.json(result.rows)
    })
})

app.listen(3000)
For proper stress-load testing, it is better to use specialized utilities such as ab from Apache. Finally, sockets are better for rapid, small data transfers, but remember that they have scaling problems and in most cases become very inefficient at 10K+ simultaneous connections.
EDIT: As @robertklep pointed out, it is better to use client pooling in this case and retrieve clients from the pool, as sketched below.
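A minimal sketch of that pooled variant using pg's built-in Pool (the pool size is an assumption to tune for your hardware):

const express = require('express')
const { Pool } = require('pg')
const app = express()

// One pool for the whole process; pool.query() checks a client out
// and returns it to the pool automatically when the query finishes.
const pool = new Pool({
    connectionString: 'postgres://postgres:123456@localhost/dbtest',
    max: 20 // assumed pool size
})

app.get('/', function (req, res, next) {
    pool.query('SELECT name, age FROM users limit 1;', [], function (err, result) {
        if (err) {
            return next(err)
        }
        res.json(result.rows)
    })
})

app.listen(3000)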

Initialization of DB connection - Node.js

I want to use gridfs-stream in a Node.js application.
A simple example is given in the documentation:
var mongoose = require('mongoose');
var Grid = require('gridfs-stream');
Grid.mongo = mongoose.mongo;
mongoose.connect('mongodb://localhost:27017/test');
// make sure the db instance is open before passing into `Grid`
mongoose.connection.once('open', function () {
    var gfs = Grid(mongoose.connection);
    // all set!
})
My problem is described by that comment:
make sure the db instance is open before passing into Grid
I try to use gfs in a POST request, but when the code gets initialized, the gfs variable is not yet defined:
api.post('/upload', function (req, res) {
    req.pipe(gfs.createWriteStream({
        filename: 'test'
    }).on('close', function (savedFile) {
        console.log('file saved', savedFile);
        return res.json({ file: savedFile });
    }));
})
Initializing my route from a callback seems kind of odd.
I read in this post (Asynchronous initialization of Node.js module) that require('') is performed synchronously, and since I rely on the connection being established, I'm kind of forced to wait.
Basically, I'm not sure whether I should use an async pattern on startup, or whether I'm just missing a more elegant way to solve this.
I have a very similar problem with my server. In my case I am reading HTTPS certs and the software version from git asynchronously, and I want to make sure I have it all together by the time the user comes to log in, so I can pass the software version back as a reply to the login.
The solution is to use promises. Create the promises at startup for each activity. Then, in the code where you need everything to be ready, call .then() on the promise itself or on Promise.all(array of promises).
Here is an example of what I am doing to read the SSL certs and start the server:
const fs = require('fs');
const path = require('path');

class Web {
    constructor(manager, logger) {
        var self = this;
        this.server = false;
        this.logger = logger;
        var key = new Promise((resolve, reject) => {
            fs.readFile(path.resolve(__dirname, 'key.pem'), (err, data) => {
                if (err) {
                    reject(err);
                } else {
                    resolve(data);
                }
            });
        });
        var cert = new Promise((resolve, reject) => {
            fs.readFile(path.resolve(__dirname, 'certificate.pem'), (err, data) => {
                if (err) {
                    reject(err);
                } else {
                    resolve(data);
                }
            });
        });
        Promise.all([key, cert]).then(values => {
            var certs = {
                key: values[0],
                cert: values[1],
            };
            return certs;
        }).then(certs => {
            // createSecureServer, since we are passing TLS certs
            self.server = require('http2').createSecureServer(certs, (req, res) => {
                // NOW started and can do the rest of the stuff
            });
            self.server.listen(...);
        });
    }
}
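Applied back to the gridfs-stream question, the same promise idea looks roughly like this; a sketch only, with api standing for the router from the question:

var mongoose = require('mongoose');
var Grid = require('gridfs-stream');
Grid.mongo = mongoose.mongo;

mongoose.connect('mongodb://localhost:27017/test');

// Promise that resolves with a ready gfs once the connection is open.
var gfsReady = new Promise(function (resolve) {
    mongoose.connection.once('open', function () {
        resolve(Grid(mongoose.connection));
    });
});

api.post('/upload', function (req, res) {
    // Wait for readiness inside the handler instead of at module load.
    gfsReady.then(function (gfs) {
        req.pipe(gfs.createWriteStream({ filename: 'test' })
            .on('close', function (savedFile) {
                res.json({ file: savedFile });
            }));
    });
});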
