Connect Node.js and Meteor on same server

I want to create an API that makes a lot of HTTP requests and processes the data from them. I want to present the results using a Meteor app, but Node.js seems better suited for the former. Is it wise to have a Node.js application and a Meteor application running concurrently, or will there be an inevitable performance penalty? If it is OK, what would be the best way to connect them?
Make an HTTP request to the local Node.js server.
Have Meteor write an entry to a MongoDB database, which the Node.js application observes for changes.
Which of these is preferable, if either is at all, or are there other options?

There is an npm package for talking to a Meteor server over DDP from a Node application.
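For illustration, here is a minimal sketch using the ddp package from npm; the host, port, and method name are assumptions for this example:
var DDPClient = require('ddp');

var ddpclient = new DDPClient({ host: 'localhost', port: 3000 }); // assumes Meteor on its default port
ddpclient.connect(function (error) {
  if (error) return console.error(error);
  // call a hypothetical Meteor method and log the result
  ddpclient.call('things.count', [], function (err, result) {
    console.log(err || result);
  });
});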
But if you want to process HTTP requests, you can simply use Meteor's WebApp package to handle them, and it works in a traditional Node-like way: req is a Node request object, and res is a Node response object. Try something like this:
WebApp.connectHandlers.use('/api/v1/things', function (req, res, next) {
  if (req.method === 'GET') {
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.write(JSON.stringify(Things.find().fetch()));
    res.end();
  } else {
    // only pass the request on if we did not handle it here
    next();
  }
});
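On the Node.js side, consuming that endpoint is then a plain HTTP request; a minimal sketch, assuming the Meteor app listens on its default port 3000:
var http = require('http');

http.get('http://localhost:3000/api/v1/things', function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    console.log(JSON.parse(body)); // the array of Things documents
  });
});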

Here is an example of how to use Node.js and Meteor together; hopefully this is helpful.
You should create a Meteor package. It allows you to require npm modules with Npm.require('name'), and it is the Meteor way to manage code, similar to an npm package. http://docs.meteor.com/#/full/packagejs
Here is a simple package:
/package_demo
/package_demo/package.js
/package_demo/server.js
package.js
// Standard package.js file with some of the options
Package.describe({
  name: 'username:packagename',
  summary: 'what this does',
  version: '0.0.1'
});

// if you need any global/core module or anything from npmjs.com,
// declaring it here makes it available to us via Npm.require
// in the package files
Npm.depends({
  'request': '2.62.0'
});

Package.onUse(function (api) {
  // we are going to use mongo, so we need to specify it
  api.use('mongo', 'server');
  // adding package files
  api.addFiles('server.js', 'server');
  // exporting the mongo collection to make it available to the meteor app
  api.export('github', 'server');
});
server.js:
// we can require any npm modules we specified in package.js
var request = Npm.require('request');

// meteor code
// a new mongo collection; not needed if you defined it elsewhere
github = new Mongo.Collection('github');

// we wrap this with Meteor.startup to run after the meteor server finishes starting;
// it makes sure the github collection exists. If you have an external collection
// that you connect to with DDP, or you know it is available, you don't have to use it.
Meteor.startup(function () {
  console.log('server started');
  // node js code - you can use the full power of node async
  // query the github api for the "meteor" repo
  request({
    url: 'https://api.github.com/repos/meteor/meteor',
    headers: {
      'User-Agent': 'request'
    }
  },
  // a regular node js request callback,
  // made compatible with meteor via Meteor.bindEnvironment(callback);
  // it makes sure we have access to the github mongo collection inside the callback.
  // Always wrap callbacks to non-Meteor libraries with Meteor.bindEnvironment
  // if you need access to meteor functions/objects etc. If we just wanted to
  // console.log the information, it would work without Meteor.bindEnvironment().
  Meteor.bindEnvironment(function (error, response, body) {
    if (!error && (response.statusCode === 200 || response.statusCode === 304)) {
      var data = JSON.parse(body);
      console.log(data.stargazers_count + ' Stars');
      // meteor code - insert it into the meteor collection
      github.insert({ id: data.id, fullName: data.full_name, stars: data.stargazers_count });
    }
  }));
});
I saw you also wanted to require local npm modules. I think you could probably hack your way around it and include one with Npm.require(GETPATHSOMEHOW'file.js'), but I recommend against that: when you compile your project and go into production, that is not a reliable way to get the path and it might break. You don't really have to publish an npm module to require it; you can also just install it on the machine globally.
After you have run npm init and created the npm package, install it globally from your package root directory with npm install . -g. You can then verify that it exists globally with npm ls -g. After that you can include your own module inside the Meteor package the same way as above.
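As a minimal sketch of that flow, where the module name my-local-module and its contents are hypothetical:
// my-local-module/index.js - a local module installed globally via npm install . -g
module.exports = {
  greet: function (name) { return 'Hello ' + name + '!'; }
};
// then, inside the Meteor package's server files, per the answer above:
var myModule = Npm.require('my-local-module'); // assumes globally installed modules resolve here
console.log(myModule.greet('Meteor'));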
Also, you might not need to create a Node.js package at all. You can add more files to your Meteor package with api.addFiles(['server.js', 'server2.js'], 'server'), and the code inside can be Node.js code (with no Node.js exports support). If you need to export an object or anything else to be available globally in your Meteor app, you can use api.export().
This works because package files share scope, for example:
server2.js
// code between the package files is shared
githubAPI.query();
and adding this to server.js
githubAPI = {
  query: function () {
    request({
      url: 'https://api.github.com/repos/meteor/meteor',
      headers: {
        'User-Agent': 'request'
      }
    }, Meteor.bindEnvironment(function (error, response, body) {
      if (!error && (response.statusCode === 200 || response.statusCode === 304)) {
        var data = JSON.parse(body);
        console.log(data.stargazers_count + ' Stars');
        github.insert({ id: data.id, fullName: data.full_name, stars: data.stargazers_count });
      }
    }));
  }
};
Console output, logged twice and inserted into the database twice. So it works like require; you just add the files :)
28212 Stars
28212 Stars

Related

Automate Functional Testing in node.js

I checked some NPM libraries for testing web pages or web services, but all of them expect that the server is already running. Since I want to automate functional testing, how can I set up an NPM package in such a way that
It can start the server
Test the application
Stop the server
So that I can test it locally, as well as on online CI tools like travis-ci or circleci.
Case 1: Webservice
I wrote an NPM package which starts a Node.js HTTP(S) server. It can be started from the command line with $ stubmatic. Currently, I use 2 approaches to test it:
Manual: I manually start it from the command line, then run the tests.
Automatic: I use the exec module to run a unix command which starts the application, and a pkill command to kill it. But for this automation, my application needs to be installed on the testing machine.
Case 2: Website
I have created an NPM package, fast-xml-parser, and created a demo page within the repo so that it can be tested in the browser. To test the demo page, I currently start a local server manually using the http-server npm package, then test the application.
What would be a better way to write automated functional tests for Node.js applications?
Note:
I have never used task runners like gulp or grunt, so I'm not sure if they can help in this case.
In case 1, my application starts the Node.js native HTTP server. I'm not using any 3rd-party framework like express currently.
This question mentions a new Docker container system for Travis that could be duplicated locally. It might be a way: How to run travis-ci locally
Did you look at supertest (a SuperAgent-driven library for testing HTTP servers) and expect (an assertions library, documented here) with mocha (a test framework)?
I use them and have never had any kind of problem with all the tests I have written so far.
The documentation in the links contains all the information you need to build up your tests.
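One point that matters for this question: supertest can bind the app itself on an ephemeral port, so the server does not need to be running beforehand. A minimal sketch, where ../app exporting an http server or handler is an assumption:
var request = require('supertest');
var app = require('../app'); // hypothetical module exporting your http.Server or handler

describe('GET /ping', function () {
  it('responds with 200', function (done) {
    request(app) // supertest starts the app on an ephemeral port for the test
      .get('/ping')
      .expect(200, done);
  });
});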
Case 1: Webservice
Problem 1
As Node.js server.close() was not working, I copy-pasted this snippet into every test file which starts my web service.
try {
  server.setup(options);
  server.start();
} catch (err) {
  console.log(err);
}
Once all the tests are completed, the server stops.
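An alternative sketch that avoids duplicating that snippet in every file is mocha's global hooks; server.stop() is an assumption about the server's API:
var server = require('.././lib/server');

before(function () {
  server.setup(options); // same options as in the snippet above
  server.start();
});

after(function () {
  server.stop(); // hypothetical; use whatever shutdown call your server exposes
});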
Problem 2
I was using chai-http incorrectly. Here is the complete working solution.
// Needs to be placed before importing chai and chai-http
if (!global.Promise) {
  global.Promise = require('q');
}
var server = require('.././lib/server');
var chai = require('chai')
  , chaiHttp = require('chai-http');
var expect = require('expect');
chai.use(chaiHttp);

try {
  server.setup(someoptions);
  server.start();
} catch (err) {
  console.log(err);
}

describe('FT', function () {
  describe('scenario::', function () {
    it('responds to POST', function (done) {
      chai.request("http://localhost:9999")
        .post('/someurl')
        .then(res => {
          expect(res.status).toBe(200);
          //console.log(res.text);
          done();
        }).catch(err => {
          console.log(err);
          done();
        });
    });
  });
});
Case 2: Website
This was quite simple.
I used http-server to start the server so my html files can be accessed.
I used zombie js for browser testing. (There are many other options available for browser testing)
Here is the code
process.env.NODE_ENV = 'test';
const Browser = require('zombie');
const httpServer = require('http-server');

describe("DemoApp", function () {
  var browser = new Browser({ site: 'http://localhost:8080' });
  var server = httpServer.createServer();
  server.listen(8080);

  beforeEach(function (done) {
    browser.visit('/', done);
  });

  describe("Parse XML", function () {
    it("should parse xml to json", function (done) {
      browser.pressButton('#submit').then(function () {
        browser.assert.text('#result', 'some result text');
        done();
      }).catch(done);
    });
  });

  after(function () {
    server.close();
  });
});

Can/should we use/require node_modules included by my dependencies' node_modules?

Scenario:
Imagine you are building a web application using a framework such as hapi, and you know that when you npm install hapi --save it installs several "utilities" in its node_modules (e.g. boom, joi, hoek, etc.)
Question:
Can we avoid explicitly re-installing Joi in the project and instead use the module included in my_app/node_modules/hapi/node_modules/joi/lib/, e.g. in my server.js file:
var Hapi = require('hapi');
var Joi = require('./node_modules/hapi/node_modules/joi/lib/'); // good or bad idea? Why?

var server = new Hapi.Server();
server.connection({ port: 3000 });

server.route({
  method: 'GET',
  path: '/{name*}',
  config: {
    validate: { // validate using Joi
      params: {
        name: Joi.string().max(40).min(2).alphanum()
      }
    },
    handler: function (req, reply) {
      reply('Hello ' + req.params.name + '!');
    }
  }
});

server.start(function () {
  console.log('Now Visit: http://localhost:3000/YOUR_NAME_HERE');
});
While the idea of not installing the same node_modules multiple times appears logical on the surface, we would like to know the real-world pitfalls of doing this.
The clear advantage is that when you update your dependency on Hapi you will also get the latest versions of all of its dependencies; the potential downside is that you won't know when there's a breaking change in one of those dependencies... what else...?
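For contrast, the conventional approach is to declare the dependency at the top level so npm resolves your own pinned copy; a minimal sketch (the version pins are illustrative):
// package.json (excerpt): declare joi directly, e.g. via npm install joi --save
// "dependencies": { "hapi": "8.x.x", "joi": "6.x.x" }
var Joi = require('joi'); // resolved from my_app/node_modules/joi, independent of hapi's copy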
Note: I/we do not currently do this in any of our Node.js code, but someone asked me "why not?" today and I did not have a comprehensive answer for them...
"Because it's bad practice..." is not a good answer; we want to understand why, not settle for the 4-monkeys-in-a-room "reason".

Include Node.JS + Socket.IO in Symfony2 application

I have done a lot of research and it seems I can't find the proper solution. I am confident in PHP. I have also done some tutorials on Node.JS and Socket.IO, and I'm currently learning Symfony2, but I can't see how I can merge the two to achieve my goal.
My goal is to set up real-time notifications for back-end users of my app. This app is an e-commerce website, and I want the admin behind the scenes to be warned as soon as an order is made, by a visual notification in the upper right corner of the admin panel. My server runs FreeBSD.
My plan is to use Node.JS and Socket.IO to achieve this. If there is a better plan, I'm willing to hear about it. Otherwise, I cannot find proper resources telling me how I can include Node.JS and Socket.IO in a Symfony2 app. I use composer to install bundles, but I haven't used NPM with Symfony2.
I have found this question, this link and this other question to help me out, but none of these tell me how I can install Node.JS in a Symfony2 app.
If someone could help me with the steps to complete to start developing this feature, I'd be glad.
Thanks!
For those who might be interested in the answer:
$ su -
Install Node.JS
$ cd /usr/ports/www/node
$ make install clean
Install NPM
$ cd /usr/ports/www/npm
$ make install clean
Install Socket.IO
$ cd /path/to/your/project/js/public/files
$ npm install socket.io
Develop the code
app.js
var http = require('http');
var fs = require('fs');

var server = http.createServer(function (req, res) {
  fs.readFile('./index.html', 'utf-8', function (error, content) {
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(content);
  });
});

var io = require('socket.io').listen(server);

io.sockets.on('connection', function (socket) {
  socket.on('newOrder', function () {
    socket.broadcast.emit('message', 'Nouvelle commande');
  });
});

server.listen(4321);
Front-end
<script src="{{ asset('http://localhost:4321/socket.io/socket.io.js') }}"></script>
<script>
jQuery(function ($) {
  var socket = io.connect('http://localhost:4321');
  $('form').on('submit', function () {
    socket.emit('newOrder', '1');
  });
});
</script>
Back-End
<script>
jQuery(function ($) {
  var socket = io.connect('http://localhost:4321');
  socket.on('message', function (message) {
    alert(message);
  });
});
</script>
Launch server
$ node app.js
That's all!

Programmatically using npm from a nodejs build script

I have a large project which contains multiple node application endpoints, each with their own package.json file.
I have a main build script (written in jake) which sets up a given environment, runs tests, packages apps, etc.
So is there a way for the root build script to run "npm install" on the given directories?
I expect the pseudo code would be:
var npm = require("npm");
var package1Directory = "some-directory";
npm.install(package1Directory);
I cannot find any documentation around this though, so I'm not sure if it is possible... so is it?
Yes, have a look at the docs:
var npm = require("npm");
npm.load(myConfigObject, function (er) {
  if (er) return handleError(er);
  npm.commands.install(["some", "args"], function (er, data) {
    if (er) return commandFailed(er);
    // command succeeded, and data might have some info
  });
  npm.on("log", function (message) { /* .... */ });
});
Also have a look at this example, which gives some more insights on how to use npm programmatically.
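If you would rather not depend on npm's internal API, a minimal alternative sketch for the multi-directory build case is to shell out to the npm CLI; the directory names here are hypothetical:
var execSync = require('child_process').execSync;
var path = require('path');

// one entry per application endpoint that has its own package.json
['app1', 'app2', 'app3'].forEach(function (dir) {
  console.log('installing dependencies in ' + dir);
  execSync('npm install', { cwd: path.resolve(dir), stdio: 'inherit' });
});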

yeoman 1.0 - make development server accept POST calls

I'm using yeoman for my application, which consists of 2 parts - the client side with js/html/css, and the REST service.
During development I start rest service in Eclipse and start server for my static files with
grunt server
The problem is that I have to make a POST request to the root url '/' (it's a fake login POST request to make browsers prompt to save passwords).
It worked with yeoman 0.9 but after updating I get:
Cannot POST /
Is there a way to configure grunt server task to accept POST requests?
Thanks!
Leonti
I think you want the connect-rest middleware.
https://github.com/imrefazekas/connect-rest
npm install connect-rest --save-dev
Edit Gruntfile.js, at the top
var restSupport = require('connect-rest');
restSupport.post({ path: '/savequestion' }, function (req, content, next) {
  next(null, { result: 'OK' });
});
In your connect or livereload middleware section:
livereload: {
  options: {
    middleware: function (connect) {
      return [
        lrSnippet,
        mountFolder(connect, '.tmp'),
        mountFolder(connect, yeomanConfig.app),
        restSupport.rester({ 'context': '/forms' }),
        rewriteRulesSnippet // RewriteRules support
      ];
    }
  }
}
The key part is restSupport.rester(); remove the context if you don't want it.
This simple function should just reply with the JSON object {result: 'OK'} to everything you post to /forms/savequestion. It should at least let you build out scaffolding in grunt server :9000 mode before you have built your templates. Without this you would have to $.get() each $.post() and then change it during or after the build.
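For a quick manual check from the client during development, a hypothetical jQuery call against that stub might look like:
$.post('/forms/savequestion', { question: 'test' }, function (data) {
  console.log(data); // expects {result: 'OK'} from the stub endpoint above
});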
