Implementing "side effects" in Express routes? - node.js

I'm looking for some advice on how to achieve something the "proper" way in Express.
When routes on my API are hit, I need to send a bunch of "side-effect" data to all clients via a websocket. All the websocket stuff is done and working; my question is mostly conceptual. So, for example, when a POST is made to /message, after the route controller has handled the request and sent a response, I need to send some updated data about other data models via websocket to all clients.
I could, of course, just send the WS message from the route controller, but that feels haphazard and unstructured. I'm sure there must be a "proper" way to do it! I did wonder about creating a middleware that runs after the route controller and either examines the request and sends the appropriate updates, or takes something passed along by the route controller and uses that to determine what to send. Does anyone have any suggestions?
Thanks!
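For what it's worth, a minimal sketch of the middleware idea described in the question, assuming an existing, working WebSocket server wss; the broadcast() helper, the saveMessage() call and the res.locals.sideEffects name are purely illustrative:

const express = require('express');
const app = express();
app.use(express.json());

// Illustrative helper -- assumes `wss` is the already-working WebSocket server.
function broadcast(payload) {
  wss.clients.forEach((client) => client.send(JSON.stringify(payload)));
}

// Route controller: handle the request, respond, then describe its side effects.
app.post('/message', (req, res, next) => {
  const message = saveMessage(req.body);   // hypothetical persistence call
  res.locals.sideEffects = { model: 'message', action: 'created', data: message };
  res.status(201).json(message);
  next();                                  // hand off to the side-effect middleware
});

// Trailing middleware: runs after the controller and broadcasts whatever it left behind.
app.use((req, res, next) => {
  if (res.locals.sideEffects) broadcast(res.locals.sideEffects);
  next();
});

Keeping the controller responsible only for describing the side effect (via res.locals) and the middleware responsible for delivering it keeps the websocket concerns in one place.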

Related

QR code best approach for POST request from REST API

I'm setting up a website that will be mobile focused, and one of the features I want to implement is for users to be able to log an entry just by scanning a QR code.
From what I've read, it's not really possible to make a POST request directly from a QR code, so I was thinking of two different options:
1. Make a GET request and then redirect that inside my server to a POST route in my routes.
So the URL would be something like https://example.com/user/resources/someresourceid123/logs/new, which would then make a POST request to https://example.com/user/resources/someresourceid123/logs/, create the new entry, and send a response to the user. I'm not really sure this is the best approach, or whether it's possible at all.
My POST request only requires the resourceid which I should be able to get from req.params and the userid which I get from my req.user.
2. Do my logic and log the entry to my DB using the GET request to https://example.com/user/resources/someresourceid123/logs/new.
This would mean that my controller for that request does everything needed from the GET request, without having to make an additional POST request afterwards. I should be able to get both the resourceid and userid from the req object, but I'm not sure whether being a GET request limits what I can do with it.
If any of those are possible, which would be the best approach?
I'd propose going with the second option, simply for the sake of performance. But you need to make sure your requests are not cached by any proxy, which is a common issue with GET requests.
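A rough Express sketch of that second option (the route path matches the question; createLog() is a made-up DB helper), with caching explicitly disabled:

// GET handler that performs the write itself and opts out of caching (option 2).
app.get('/user/resources/:resourceid/logs/new', async (req, res) => {
  res.set('Cache-Control', 'no-store');                 // keep browsers/proxies from caching this GET
  const entry = await createLog(req.params.resourceid,  // hypothetical DB call
                                req.user.id);
  res.status(201).json(entry);
});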

OPTIONS Preflight request executes POST's code - is that standard?

If I understand correctly, a preflight OPTIONS request is sent as a way of asking "what's allowed here?". Then, once the response comes, if allowed, the calling site sends the POST request (or GET, but in my case it's a POST). I have figured out that, at least with Azure Function Apps, the OPTIONS request is executing the code that I expected only the POST to execute. I believe this to be the case because once I added some null checking (since the OPTIONS request doesn't have a payload in the body), everything worked fine.
I'm wondering if this is standard.
Seems to me that if I had written the API without using Azure Function Apps, I'd have the OPTIONS request sent down a path that would set the appropriate headers and return a 200 response. And the POST request would be sent down a different path that would expect a payload in the body. If that's how it usually works then that means I've just found an idiosyncrasy of the Azure functionality. But if not it means that I have something to learn about the OPTIONS preflight request.
Thanks in advance for your advice.
Denise
As sideshowbarker mentioned, the OPTIONS request is sent automatically by the browser to check if the cross-origin request can be made.
In the case of Azure Functions, this will be handled by Azure when running in the cloud.
If your function is being triggered, that means you have "options" listed as a supported method for your HTTP trigger:
in the HttpTrigger attribute for C# functions
in function.json for non-C# functions
If you want to customize the CORS responses and/or running functions in a container, you could always include "options" as supported and respond differently when the incoming HTTP method is OPTIONS.
Also, if you are using Azure API Management with Azure Functions, you could offload CORS handling to it instead or even use Functions Proxies as shown here.
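As a rough illustration only (not an official Azure sample), a Node function that has both "post" and "options" listed as supported methods could short-circuit the preflight like this:

// index.js of an HTTP-triggered Azure Function (Node), with "post" and "options" in function.json
module.exports = async function (context, req) {
  if (req.method === 'OPTIONS') {
    // Preflight: advertise what is allowed and return -- the POST logic never runs.
    context.res = {
      status: 204,
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'POST, OPTIONS',
        'Access-Control-Allow-Headers': 'Content-Type'
      }
    };
    return;
  }

  // POST path: it is now safe to assume a payload.
  context.res = { status: 200, body: { received: req.body } };
};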
Thanks y'all! Sorry I was unclear. And sorry it took me a while to get back. Things have been a bit crazy on this end.
Yes, the function being called is mine. And now I understand the browser doesn't have much choice as to whether or not it makes the OPTIONS call.
And yes, I could make my Azure function handle an options call differently and thanks for that suggestion too. That's sort of what I ended up doing but basically I did it by handling an empty payload. I didn't follow that best practice originally because I thought any valid request would have a payload. Accordingly, any request that did not have a payload was invalid and should be turned away as a failure of some sort. This was before I knew that the OPTIONS call was actually executing that function.
My remaining question is if I had NOT been using Azure... if I had rolled my own solution and hosted it somewhere, I'd have a class or at least methods that handle calls to this particular API. (This is something I'm new to so bear with me if my terms aren't quite right and please do correct me). So if I'd done my own API, I'd have one method to handle a POST call and a different method to handle an OPTIONS call, wouldn't I? And the method that handles the OPTIONS call would return information about what's legally do-able with this API. And the method that handles a POST call would handle the payload sent with it. And the method that handles the POST wouldn't get executed when an OPTIONS request is sent. At least that's how I figured it would work. And that's my question -- is that how it's done when not letting something like Azure handle some of the infrastructure?
I'm just trying to learn if the OPTIONS request executing a POST's function is a standard practice or if it's some kind of idiosyncrasy to working with Azure functions.
Thanks again for the advice and for helping me understand these questions.
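To the remaining question: yes, in a hand-rolled API the usual pattern is to register OPTIONS and POST as separate handlers, so the POST code never runs for a preflight. A minimal Express sketch (the /api/things path and createThing() are placeholders):

// Preflight handler: only says what is allowed; it never touches the POST logic.
app.options('/api/things', (req, res) => {
  res.set({
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Methods': 'POST, OPTIONS',
    'Access-Control-Allow-Headers': 'Content-Type'
  });
  res.sendStatus(204);
});

// POST handler: only ever sees real requests, so it can expect a payload.
app.post('/api/things', (req, res) => {
  res.status(201).json(createThing(req.body));   // createThing() is a stand-in
});

In practice many Express apps let a library such as the cors middleware register that OPTIONS handler for them, which is roughly what Azure is doing on your behalf unless you opt in to handling OPTIONS yourself.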

How do I handle a third party API callback in NodeJS?

Question
A NodeJS server is called by a client. This causes a further call to be made to a 3rd party API. The API then asynchronously calls back to the NodeJS server. How do I make the client aware that the asynchronous callback has completed?
Details
I have a NodeJS server with these two routes (code is CoffeeScript):
# Called by the AngularJS client to kick off authorisation
app.get '/security/login/application/authorise', ->
  applicationService.authorise()

# Called back later by the third-party security API
app.get '/security/login/application/callback', (req, res) ->
  applicationService.login req, res
The first route is called by my AngularJS client. Its purpose is to authorise the client and allow it to start-up (users will sign in later). The authorisation process involves making a call to a third party security API.
The security API does its thing and then calls back to the NodeJS server via the /callback route shown above. The information passed to the callback allows a further call to be made to the API that determines whether the original authorisation request passed or failed.
The problem is that the first call, to the /authorise route, is, of course, asynchronous and so returns to the client right away. The client is then left in limbo, not sure if the NodeJS server has authorised it or not.
Please note that I cannot just nest these calls (imo) because the first call to the API simply returns 200 OK regardless. The process only continues when, sometime in the future, the third party API starts a new conversation by calling back into the NodeJS server via the /callback route.
Options
It seems I have a number of unpalatable options:
Stay Asynchronous. Return control to the client and then have the client poll the server, presumably with some unique, temporary 'callback-id', to determine if the callback has been completed.
Go Synchronous. Block the return with a hacky loop of some sort. Maybe promises can clean this up a bit somehow.
Go Bidirectional. Use sockets to allow a push notification from the server (but what about old browsers like IE8, which I have to support).
I think I have probably over-cooked this problem and the solution is most likely easier than I imagine. Your help would be gratefully received.
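If it helps, here is a sketch of the first option (stay asynchronous and poll), written in plain JavaScript rather than CoffeeScript. It assumes a callbackId can be round-tripped through the third-party API (for example as a state parameter) and uses an in-memory map, which would need to become a shared store in a multi-process deployment:

const { randomUUID } = require('crypto');   // randomUUID needs Node 14.17+
const pending = new Map();                  // callbackId -> 'pending' | 'authorised' | 'denied'

// 1. Client asks to be authorised; kick off the third-party call and hand back an id.
app.get('/security/login/application/authorise', (req, res) => {
  const callbackId = randomUUID();
  pending.set(callbackId, 'pending');
  applicationService.authorise(callbackId);            // assumes the id can be passed through
  res.json({ callbackId });
});

// 2. The third-party API calls back; record the outcome for the waiting client.
app.get('/security/login/application/callback', async (req, res) => {
  const ok = await applicationService.login(req, res); // assumes login resolves with pass/fail
  pending.set(req.query.callbackId, ok ? 'authorised' : 'denied');
});

// 3. The client polls until the outcome is known.
app.get('/security/login/application/status/:callbackId', (req, res) => {
  res.json({ status: pending.get(req.params.callbackId) || 'unknown' });
});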

Express & Backbone Integration

Ok, I am new to web dev and here's a stupid question. I have been through a few tutorials for node, express and backbone individually, but I can't seem to wrap my head around how they are integrated. Particularly, consider this use case:
Person X opens the browser, types in a URL and hits enter -> Express responds to the request and sends some data back to the browser.
My question is, where does backbone come into the picture here? I know it's an MVC framework to organize your JS code. But I can't find a place in this use case where the server/browser interacts with backbone. The only thing I can think of is backbone saving the route and serving the page the next time. But what about the first time? It would be best if someone could explain to me how the request gets routed from the client browser to express/backbone and back to the browser again.
Also, am I correct in assuming response.send() or response.json() will send the result to backbone when model.fetch() is called? I mean, is there no additional code required? Being new to web dev, I'm not quite used to the idea of the framework 'taking care' of everything once you send the response back.
EDIT: Here's what I have understood so far. Feel free to correct me if I am wrong. When I access websites like gmail, the server first sends a big html file including backbone.js code in it. The backbone.js code listens for events like clicking on links in the html file and handles them if the links are defined in its routes (routes are always relative to the current route; accessing a completely different route sends a request to the server). So, if I click compose, my url remains the same because backbone handles the request. However, if I click the Maps/News services in the bar above, the server handles the request.
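That understanding is essentially right. As a tiny client-side sketch (the route and view names are made up): Backbone only intercepts navigation it has a route for, and everything else falls through to the server:

// Client-side: Backbone owns the "compose" route, so clicking Compose never hits Express.
var AppRouter = Backbone.Router.extend({
  routes: {
    'compose': 'showCompose'   // matches e.g. https://mail.example.com/#compose
  },
  showCompose: function () {
    // render the compose view in place -- no page reload, no request to the server
  }
});

new AppRouter();
Backbone.history.start();      // start listening for route changes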
There is no special integration between backbone and node.js.
If you use the standard backbone sync method then all you need to do is:
Use the static middleware in express to serve up your static html/js/... files.
Define RESTful routes in express that conform to what backbone is expecting.
Backbone does indeed make an http call when you do model.fetch. You could look in Chrome's network tab to see where it's sending the request and then implement that route in express.
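Put together, that answer amounts to something like the sketch below (the /api/messages path and the in-memory array are placeholders). On the client, a Backbone collection with url: '/api/messages' (or a model with the matching urlRoot) will then fetch() and save() against these routes with no extra glue code:

const express = require('express');
const app = express();

app.use(express.json());
app.use(express.static('public'));   // serves index.html, backbone.js and your app code

const messages = [];                  // placeholder store

// RESTful routes in the shape Backbone's default sync expects.
app.get('/api/messages', (req, res) => res.json(messages));
app.post('/api/messages', (req, res) => {
  const message = { id: messages.length, ...req.body };
  messages.push(message);
  res.status(201).json(message);
});

app.listen(3000);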

How can Socket.io and RESTFul work together?

(I'm not familiar with REST, so please correct me if my concept is wrong)
In a RESTful architecture, we map every action to a URL. If I click "post an article", it may actually be the URL http://example.com/ with some data like action=post&content=blahblah.
If I want to post without refreshing the whole web page, I can use JavaScript's XMLHttpRequest. I post it, then get its content and insert it into a div in my page. These actions are all asynchronous.
Then I learned there is something called WebSocket, and its wrapper socket.io. It uses "messages" to communicate between client and server. When I click "post", the client just calls socket.send(data) and waits for the server's client.send(data). It's magical. But what about URLs?
Is it possible to use both models without repeating myself? In other words, every action has its URL, and some of them can interact with the user in real time (via socket.io?).
Moreover, should I do this? In a very interactive web program (e.g. games), is REST still meaningful?
You're defining a handler for actions that map to REST over http. POST and GET generally refer to update and query over an entity. There's absolutely no reason you can't just define a handler for generic versions of these CRUD operations that can be used in both contexts. The way I generally do this is by introducing the concept of a 'route' to the real-time transport, and mapping those back to the same CRUD handlers.
You have a session, you can impose the same ACL, etc.
 +---------------------------------+
 |                                 |
 |      BROWSER                    |
 |                                 |
 +--+--^-------------------+---^---+
    |  |                   |   |
    |  |                   |   |
 +--v--+---+            +--v---+---+
 |         |            |          |
 | HTTP    |            | SOCKET.IO|
 +--+---^--+            +--+---^---+
    |   |                  |   |
 +--v---+------------------v---+---+
 |                                 |
 |        ROUTING/PUBSUB           |
 +-+--^-------+--^-------+--^------+
   |  |       |  |       |  |
 +-v--+--+  +-v--+--+  +-v--+-+
 |       |  |       |  |      |
 | USERS |  | ITEMS |  |ETC   |
 +-------+  +-------+  +------+
     ENTITY CRUD HANDLERS
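A rough sketch of that layering (the handler and event names are illustrative, not from any particular library): one set of entity handlers, reached from both the Express routes and a socket.io "route":

// Shared entity handlers -- the bottom boxes in the diagram.
const users = {
  create: async (data, session) => {
    // check the ACL against `session`, insert into the DB, return the new user (stubbed here)
    return { id: 1, ...data };
  }
};

// HTTP transport: the REST route just delegates.
app.post('/users', async (req, res) => {
  res.status(201).json(await users.create(req.body, req.session));
});

// Real-time transport: a socket.io "route" delegates to the same handler.
io.on('connection', (socket) => {
  socket.on('users/create', async (data, ack) => {
    // socket.request.session assumes the express-session middleware is shared with socket.io
    ack(await users.create(data, socket.request.session));
  });
});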
I posted this on my blog recently:
Designing a CRUD API for WebSockets
When building Weld, we are using both REST and WebSockets (Socket.io). Three observations on WebSockets:
Since WebSockets are so free-form, you can name events however you want, but it will eventually become impossible to debug.
WebSockets don’t have the request/response form of HTTP so sometimes it can be difficult to tell where an event is coming from, or going to.
It would be nice if the WebSockets could fit into the existing MVC structure in the app, preferably using the same controllers as the REST API.
My solution:
I have two routing files on my server: routes-rest.js and routes-sockets.js
My events look like this example: "AppServer/user/create".
I use forward slashes (“/”) to make the events look like routing paths.
The first string is the target (~”host name” if this actually was a path).
The second string is the model.
The third string is the CRUD verb: i.e. create, read, update, delete.
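A small sketch of what the routes-sockets.js side of that could look like, assuming controllers that expose the CRUD verbs as methods (socket.onAny needs Socket.IO v3+; the controller paths are made up):

// routes-sockets.js -- dispatch "Target/model/verb" events to the same controllers as REST.
const controllers = { user: require('./controllers/user') };    // assumed controller module

module.exports = function (io) {
  io.on('connection', (socket) => {
    socket.onAny((event, payload) => {
      const [target, model, verb] = event.split('/');            // e.g. "AppServer/user/create"
      const controller = controllers[model];
      if (target !== 'AppServer' || !controller || typeof controller[verb] !== 'function') return;
      Promise.resolve(controller[verb](payload, socket))
        .then((result) => socket.emit(`${event}/done`, result)); // reply on a mirrored event name
    });
  });
};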
