I come from a completely non-web-development background, but having seen the traction that mean.js is picking up, I really wanted to give it a shot.
I've followed tutorials online, so I've basically started, run and modified the example app, but am now trying to do something that's off the tutorials. As a result I have a basic understanding of Express and Angular.
I've been trying to integrate the activator npm package (https://www.npmjs.org/package/activator) into the app, and while I've managed to fit in the Angular bits, I'm having trouble plugging in the Express bits. Which brings me to a very fundamental question I haven't really been able to find an answer to. I know that in MEAN, the Angular code connects to the Express code through REST APIs created in Express, and I believe that happens using Angular services. But I don't understand how. For instance, the users module has the following service defined:
angular.module('users').factory('Users', ['$resource',
    function($resource) {
        return $resource('users', {}, {
            update: {
                method: 'PUT'
            }
        });
    }
]);
Can anyone explain how this works?
Also, if I have some code on the Express side, say:
var sayHello = function(name) {
    return "Hello " + name;
};
How can I call this through Angular? I know we use $resource for that from the ngResource module, but I don't really understand how.
Any help would be much appreciated.
Connecting these things together can be a bit confusing. I think the thing to understand is that when using Express on the server side, you need to model your API around a route, and handle communication with the req and res objects you'll be handed.
So first, on the client side, taking a simple example: I generally use $resource as a way of wrapping the HTTP/Ajax details which I don't want to worry about. So I'll write my service as such:
"use strict";
angular.module("myModule").factory("UserService", ["$resource",
function($resource) {
var resource;
resource = $resource("/api/users", null, {
listUsers: {
method: "GET",
isArray: true
}
});
return resource;
}
]);
(Notice that I'm passing the isArray parameter to this resource since I expect an array of users to be returned -- which is not always the case with all APIs.)
Then to take advantage of that resource, perhaps in my controller I'll have code like this:
"use strict";
angular.module("myModule").controller("UserCtrl", ["$scope", "UserService",
function($scope, userService) {
$scope.loadUsers = function() {
userService.listUsers(function(resource, headers) {
// this function is called on success, with the result
// stored within the `resource` variable
// ...
}, function(response) {
// this function is called on error
// ...
});
};
}
]);
Now, assuming everything goes right on the server side, we'll receive our list of users to play around with, passed into the first callback function as resource.
On the server side, we'll need to configure our routes (wherever those are configured) to include our users controller, which will serve as our users API. So perhaps within this app we have a routes directory which contains all our Express routes (see the app.route documentation for more information on Express routes). We also have a controllers directory which contains all our Express controllers that handle the logic for our routes. Keeping with the "users" example, we'll have a route defined that matches the /api/users $resource route we defined above in our Angular code:
"use strict";
var controller = require("../controllers/user");
module.exports = function(app) {
app.route("/api/users").get(controller.listUsers);
};
This code takes in the Express app as input, and defines a single route for /api/users as a GET HTTP request (notice the .get function called). The logic for this route is defined in the user controller, which would be something like this:
"use strict";
exports.listUsers = function(req, res) {
var users;
// ...somehow populate the users to return...
res.send(users);
};
We've left out the details of how to populate that array of users, but hopefully this gives you the idea. The controller is passed the req (request) and res (response) HTTP objects as input, so it can query the request object for details on what the user passed in, and it must send some response back to complete the request/response loop. In this example I'm using the res.send function to simply send back our JavaScript array (which will be serialized as JSON).
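To tie this back to your sayHello example: you can't call a server-side function directly from Angular, but you can wrap it in a route and hit that route with a $resource. Here's a rough sketch (the /api/hello path and the HelloService name are made up for illustration):

// Express side: expose the greeting behind a route (hypothetical /api/hello endpoint)
app.route("/api/hello").get(function(req, res) {
    var name = req.query.name || "";
    res.send({ greeting: "Hello " + name });
});

// Angular side: a $resource pointing at that route
angular.module("myModule").factory("HelloService", ["$resource",
    function($resource) {
        return $resource("/api/hello");
    }
]);

// usage in a controller:
// HelloService.get({ name: "World" }, function(result) {
//     console.log(result.greeting); // "Hello World"
// });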
Does that make sense?
Related
I'm always coding backend APIs and I don't really get how Express does its bidding with my code. I know what the request and response objects offer, I just don't understand how they come to be.
Take this simplified code, for instance:
exports.getBlurts = function() {
    return function(req, res) {
        // build query...
        qry.exec(function(err, results) {
            res.json(results);
        });
    };
};
Then I'd call it in one of my routes:
app.get('/getblurts/', middleware.requireUser, routes.api.blurtapi.getBlurts());
I get that the function is called upon the route request. It's very abstract to me, though, and I don't understand the when, where, or how as it pertains to the req/res params being injected.
For instance, I use a CMS that modifies the request object by adding a user property, which is then available globally on all requests made, whether Ajax or otherwise, making it easy at all times to determine if a user is logged in.
Are the req and res objects just pre-cooked by Express, but with the freedom to modify them to your needs? When are they actually 'built'?
At its heart, Express is actually using Node's built-in http module and passing the Express application as a callback to the http.createServer function. The request and response objects are populated at that point, i.e. by Node itself for every incoming connection. See the Node.js documentation for more details on Node's http module and on what req/res are.
You might want to check out Express' source code, which shows how the Express application is passed as a callback to http.createServer.
https://github.com/expressjs/express/blob/master/lib/request.js and https://github.com/expressjs/express/blob/master/lib/response.js show how Node's request/response objects are extended with Express-specific functions.
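As a rough sketch of what that looks like in practice (this is just the standard Node/Express pattern, not anything specific to your CMS):

const http = require('http');
const express = require('express');

const app = express();

// Middleware like your CMS's can decorate req before your handlers run,
// e.g. by attaching a (hypothetical) user property:
app.use(function(req, res, next) {
    req.user = { loggedIn: false };
    next();
});

// An Express app is itself a request handler function, so it can be passed
// straight to Node's http.createServer. Node builds a fresh req/res pair for
// every incoming connection and hands them to this callback; Express then
// extends them with its own request/response prototypes before your
// middleware and routes ever see them. (app.listen does this internally.)
http.createServer(app).listen(3000);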
So we're trying to develop an application (or service) with Node.js that provides each user a custom API that can be called from {theirUserName}.ourwebsite.com. Users will be able to change/edit/remove the endpoints of the API within the application through our editor. They can add params to the endpoints, add auth, etc.
Now my question is: how can we bring the API online in the first place, and then how can we change the endpoints online without stopping the API application and running it again?
P.S.: The API configuration will be saved as JSON in the DB, and once the configuration changes an event will be raised that tells us the endpoints have changed.
Using Express, you can add routes after the server is listening, so that's not a problem. Beware of precedence, though, as new routes are added at the bottom of the stack.
I would advise having a DB that stores the routes, and, when starting the Node app (before listening), loading all the routes from the DB and adding them to the router. That way you can scale your app and restart it safely.
Then start listening, and have routes for adding, deleting and updating routes.
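For the startup part, a minimal sketch might look like this (Route is a hypothetical model over your routes collection, with method and path fields):

const express = require('express');
const app = express();

// hypothetical model holding documents like { method: 'get', path: '/' }
const Route = require('./models/route');

const someGenericHandler = (req, res) => res.json({ message: 'foobar' });

// register every stored route before the server starts accepting connections
Route.find({}).then((routes) => {
    routes.forEach((route) => {
        app[route.method](route.path, someGenericHandler);
    });
    app.listen(process.env.PORT);
});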
Here is a simple example of adding a route after listening:
const app = require('express')();
const bodyParser = require('body-parser');

app.use(bodyParser.json());

const someGenericHandler = function(req, res) {
    return res.json({ message: 'foobar' });
};

// this endpoint creates a new route at runtime
app.post('/routes', function(req, res) {
    try {
        const route = req.body;
        app[route.method](route.path, someGenericHandler);
        return res.json({ message: `route '${route.method} ${route.path}' added` });
    } catch (err) {
        return res.status(500).json({ message: err.message || 'an error occurred while adding the route' });
    }
});

app.listen(process.env.PORT);
You can try this code: paste it in a file, let's say index.js.
Run npm i express body-parser, then PORT=8080 node index.js, then send a POST request to http://localhost:8080/routes with a JSON payload (and the proper Content-Type header; Postman works well)
like this: { "method": "get", "path": "/" }, and then try your brand new route with a GET request at http://localhost:8080/
Note that if you expect to have hundreds of users and thousands of requests per minute, I would strongly advise having a single app per user and a main app for user registration, and maybe spawning a small VPS per app with some automation scripts when a user registers, or having some sort of request limit per user.
Hope this helps
I'm coding my first "solo" Node.js web app. It's based on a previous app (that I coded by following some kind of tutorial/course), which was an Express REST API that allows you to add/remove/update/list a todo list. I've also implemented user authentication using JWT/bcrypt. All this is stored in a MongoDB database.
Also note that all the endpoints return JSON.
I'm now trying to add a front-end to the app. The API endpoints are at /api/endpoint1, /api/endpoint2, etc., and the views are rendered at /view1, /view2, etc. I'm doing this on purpose so that I can either get the responses as plain JSON from the API or have them rendered in a web page.
I started by using jQuery's ajax to make the calls, but I realized this was not the way I wanted to do it. I removed all the JS scripts from my webpage and started working directly on the server, rendering the pages with the info fetched from the API.
This is what I have now:
server.js (main file) [sample]
// RENDER 'GET TODOs'
app.get('/todos', authenticate, (req, res) => {
    let auth = req.cookies['x-auth'];
    request({
        url: 'http://localhost:3000/api/todos',
        headers: {
            'x-auth': auth
        }
    }, function(error, response, body) {
        if (error || response.statusCode !== 200) {
            // response may be undefined when `error` is set
            return res.status((response && response.statusCode) || 500).send('Error'); // TODO
        }
        let bodyJSON = JSON.parse(body);
        res.render('todos', {
            title: 'Todo App - Todos',
            todos: bodyJSON.todos
        });
    });
});

// API endpoint to 'GET TODOs' (JSON)
app.get('/api/todos', authenticate, (req, res) => {
    Todo.find({
        _creator: req.user._id
    }).then((todos) => {
        res.send({ todos });
    }, (err) => {
        res.status(400).send(err);
    });
});
I don't know why, but all this looks weird to me. I'm wondering if this is how I'm supposed to do it. I mean, is this a good approach/practice for building an API + front-end Node app?
Also, I'm using the auth middleware twice: in the views and in the API itself. I guess this is OK?
It would probably be better to use React/Angular, but this is such a small app and I just wanted to make a really simple front-end.
Just keep things simple.
If you go with server-side HTML rendering, you don't need a REST API; just drop it. You only need an API in the case of an Ajax front-end or a mobile app.
If you need a combined approach (server-side rendering + mobile app, or server-side rendering with some Ajax), the very first step would be to isolate your database-querying code into a separate module (which is actually always a good idea) and use that module both from your API and from your views directly, avoiding API usage from server-side views, as sketched below.
This way you eliminate the duplicate auth and make debugging much easier; your code also becomes cleaner, and thus more maintainable.
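For example, a minimal sketch of that separation for your todos (the todos-service.js file name is illustrative; it reuses your existing Todo model):

// todos-service.js -- the shared data-access module
const { Todo } = require('./models/todo'); // adjust to wherever your model lives

exports.listTodos = function(userId) {
    return Todo.find({ _creator: userId });
};

// server.js -- both the view route and the API route call the module directly,
// so the view no longer makes an HTTP request to your own API
const todosService = require('./todos-service');

app.get('/todos', authenticate, (req, res) => {
    todosService.listTodos(req.user._id)
        .then((todos) => res.render('todos', { title: 'Todo App - Todos', todos }))
        .catch(() => res.status(500).send('Error'));
});

app.get('/api/todos', authenticate, (req, res) => {
    todosService.listTodos(req.user._id)
        .then((todos) => res.send({ todos }))
        .catch((err) => res.status(400).send(err));
});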
Also, React is not that complex; I would definitely give it a shot :)
I am building an app that requires a login which, if successful, passes you off to another page called events. However, Backbone works with the hash in the URL bar; therefore, the request that someone accessed that page is never sent to the Node.js server.
The thing is, someone without a login can access the page by just typing in http://www.mywebsite.com/#events
How can this be prevented?
Override the 'execute' function in the Backbone router.
From the docs:
router.execute(callback, args)
This method is called internally within the router, whenever a route matches and its corresponding callback is about to be executed. Override it to perform custom parsing or wrapping of your routes, for example, to parse query strings before handing them to your route callback, like so:
So, for example: (http://plnkr.co/edit/BqD4YfjQYz2RITWNhBKZ?p=preview)
var Router = Backbone.Router.extend({
    execute: function(callback, args) {
        if (!someLoginFunctionCheck()) {
            this.navigate('#');
        } else {
            if (callback) callback.apply(this, args);
        }
    }
});
What is the best way to send a POST request from a Node server which has received the request parameters from a client? The reason I am asking for best practice is that it should not affect the response time if multiple clients are calling the Node service.
Here is the Backbone model which sends the request to the Node server:
var LoginModel = Backbone.Model.extend({
    url: 'http://localhost:3000/login',
    defaults: {
        email: "",
        password: ""
    },
    parse: function(resp) {
        return resp;
    },
    login: function() {
        console.log('Here in the model' + JSON.stringify(this));
        this.save();
    }
});

var loginModel = new LoginModel();
Node Server
var http = require('http'),
    express = require('express');

var app = express();
app.listen(3000);

app.post('/login', [express.urlencoded(), express.json()], function(req, res) {
    console.log('You are here');
    console.log(JSON.stringify(req.body));
    // Send the post request to the third-party service.
});
Should I use something like requestify inside the app.post() handler and make the call to the third-party service from there?
I like superagent personally, but request is very popular. hyperquest is also worth considering, as it resolves some issues with using the Node core http module directly for this.
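As a rough sketch of forwarding the login data from inside your app.post handler using the request package (the third-party URL here is just a placeholder):

var request = require('request');

app.post('/login', [express.urlencoded(), express.json()], function(req, res) {
    request.post({
        url: 'https://third-party.example.com/auth', // hypothetical endpoint
        json: req.body // forwards the email/password payload as JSON
    }, function(err, response, body) {
        if (err) {
            return res.status(502).send({ error: 'upstream request failed' });
        }
        // relay the third-party response back to the Backbone client
        res.status(response.statusCode).send(body);
    });
});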
Regarding the reason you're asking for best practice (that it should not affect the response time if multiple clients are calling the Node service):
First, just get it working. After it's working, you can consider putting a cache somewhere in your stack, either between your clients and your API or between your server and the third-party API. I'm of the opinion that if you don't know exactly where you need a cache, exactly why, and exactly how it will benefit your application, then you don't need a cache, or at the very least you aren't prepared, instrumentation-wise, to understand whether your cache is helping or not.