Cannot request Iron Router server route twice without refreshing the client - node.js

I am calling a meteor method, which generates a file using fs. I wait for a callback giving me the path where the file exists, and then I request the file with a server route. The code is very similar to this SO answer. I have also tried using createReadStream (as demonstrated here) instead of passing the file directly to response.write.
This all works well on the client's first click of my export/download button. However, if the user clicks the button a second time, the file is generated but never served by Iron Router. There are no errors on the client or server. If the user refreshes the page, the feature works again (once).
Why do I need to refresh the browser in order to request the same server route a second time? Am I doing something wrong?
Example Application

Does the URL change when they click the first download? If so, and the second route is the same, you will not get redirected as you are already there. If this is the case, can you use the router hooks to send the user back to the route they came from?
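As a minimal sketch of one workaround consistent with that diagnosis (the template, method and route names here are all hypothetical): because the browser treats a second visit to an identical URL as a no-op, you can make each download URL unique with a throwaway query parameter.

// client.js – hypothetical Meteor button handler
Template.exportPage.events({
  'click .js-download': function () {
    Meteor.call('generateFile', function (err, filePath) {
      if (err) return;
      // Appending a timestamp makes every click a distinct URL,
      // so the browser actually issues the request each time.
      window.location.href = '/download/' + filePath + '?t=' + Date.now();
    });
  }
});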

Related

node js rendered twice

I am using Node.js with the EJS template engine, and I can't figure out why my server runs every request from the browser twice. When I make the same request from Postman it is only handled once. If I look at the network tab I see this:
As you can see, the home page is loaded twice, so the request is made twice and the server processes the same request twice. Why?
Looking at the request headers sent by the browser, I see something strange: the header content changes. In my first (normal) request I get this:
but in the second request (the one that shouldn't be made):
You can see that the Accept header is different; it looks as if the browser is requesting an image from the static files, but I don't know what it could be. At first I thought the server was requesting the favicon.ico file, but it's not, because I already added a handler for that.
I'll leave the whole project's GitHub code here in case someone needs it and can help:
The main files are app.js and server.js; requests are handled by the routers folder, and each router hands off to its controller. It is an MVC architecture.
https://github.com/jazdevv/social-media-tw
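One way to pin down what that second request actually is (a sketch, assuming a standard Express setup where `app` is the express instance created in app.js) is to log the URL and Accept header of every incoming request before any routing happens:

// temporary debugging middleware – register it before all other routes
app.use(function (req, res, next) {
  console.log(req.method, req.url, '| Accept:', req.headers.accept);
  next();
});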

What is the right way to serve private routes?

I am trying to understand the following security aspect:
If I understand it right, in PHP the URL routes directly to specific files on the server, e.g. https://mypage.com/secretFunctions.php will route to secretFunctions.php. If I protect this route through basic authentication, the authentication happens before the file is processed and served to the client. As a result, only an authenticated user will ever get to see the rendered content.
In the case of a client-side node.js app, the whole App.js code is sent to the client and processed there. The URL always points at the same file. The routes defined in that file determine which code snippet is executed and rendered. Even if I protect a specific route with some kind of authentication, the user gets the whole code anyway.
Is there a way to prevent this? To prevent clients receiving the private code of private routes, as long as they are not authenticated?
In node.js the whole App.js code is sent to the client and processed there.
No, that is not how it works at all. When you run node.js code on your server, that code never goes to the client. It just runs on the server; the client has no access to it. If you run your node.js app with the command line node app.js, the node executable runs and executes app.js on the server.
In a node.js server, you would not typically define private routes intended to be used only by your own server with no client access. While it can be done (using some form of authorization token that only the server knows), you typically wouldn't use a private route on your public server at all. Instead of requesting a private route from yourself, you would just put the relevant code in a function and call that function directly. Then it's entirely private to the server implementation.
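As a minimal sketch of that point (the function and route names here are made up):

const app = require('express')();

// "Private" logic lives in a plain function – it has no URL at all.
function computeSecretReport() {
  return { total: 42 };
}

// A public route calls it directly; the function itself is never reachable.
app.get('/report', function (req, res) {
  res.json(computeSecretReport());
});

app.listen(80);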
Typically, app.js starts an http server process and defines the routes it wants that server to handle. When a client requests a route from the server, the app.js code (or other modules it loads) parses the requested route, maps it to the handler defined for that route, and executes just the node.js code that is supposed to process that route. When it has finished processing the route, it sends a response. Depending on the type of request, that response might be a web page that the browser parses and displays, a piece of JSON that is the result of the request, or any number of other data types.
The only code that is sent to the client is Javascript code that is embedded in a web page and specifically meant to run inside the browser. That would not be node.js code and is not your server code. That would be browser Javascript just like one would use with any back-end framework (node.js is no different than any other framework in that regard).
Is there a way to prevent this? To prevent clients receiving the private code of private routes, as long as they are not authenticated?
This part of the question is mostly just misguided, as it seems to be based on a misunderstanding of how a node.js back-end app server works. You prevent code that shouldn't go to the client from going there by never sending it in the first place. The client can't see anything on your server that you don't specifically write a route to handle and respond to. By default, a node.js web server sends NOTHING to the client. No pages are sent by default. The only thing ever sent to a client is a response that you specifically write code to send. Note that this is different from other back-ends (like Apache) that may be configured to automatically send lots of things that are requested.
If I understand it right, in PHP the URL is routing directly to specific files on the server. e.g. https://mypage.com/secretFunctions.php will route to secretFunctions.php.
That is PHP; that is not how node.js web servers work. For the URL https://mypage.com/secretFunctions.php to get any response at all, you have to define a route in your server code specifically coded to respond to the request /secretFunctions.php, and then define what code runs on the server when that request is received. No code is sent to the client unless you specifically write code in that route handler to send code you want to run in the browser.
Let me show you a very simple node.js app that responds to three routes (and only three routes). Let say this is app.js:
const app = require('express')();

// three route handlers – one per URL the server will respond to
app.get('/', function(req, res) {
    res.send("Hello");
});

app.get('/name', function(req, res) {
    res.send("My name is Bob");
});

app.get('/city', function(req, res) {
    res.send("I am in San Francisco");
});

// start the server
app.listen(80);
You start this server with node app.js. When it runs app.js, it initializes an instance of the express framework, registers three route handlers and then starts the web server.
Now, you have a running web server. Suppose you have a web browser running on the same server and you type this into the URL bar: http://localhost. That will trigger the '/' route handler and will display "Hello" in the browser.
Then, you type http://localhost/name into the browser. That will display "My name is Bob" in the browser.
If you type any route other than the three defined here, nothing is sent to the browser. It doesn't matter if there's a contacts.html file sitting in the same directory: if there isn't a specifically coded route to handle a request, nothing is sent. So, no private code is ever sent by default.
Now, there are ways to instruct your server to automatically send some static files, but that takes a special kind of route that is told exactly which route prefixes to look for and which directory on your server to search for files matching the requested URLs. This can be done for static files that you specifically place in a directory by themselves (like .css files, for example) that you intend to send to the client when requested. And you have to write the code to make that happen; by default it never does.
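With express this is typically one line (a sketch; the /static prefix and the public directory are just example names):

const express = require('express');
const app = express();

// Only files that actually exist under ./public are ever sent, and only
// when requested under the /static prefix (e.g. /static/main.css).
app.use('/static', express.static('public'));

app.listen(80);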

Refreshing with Browser History using React Router returns calls from my API

From what I understand, to use the browserHistory object in React Router properly, we need a wildcard * route so that the user is returned the single page application on every refresh. This becomes a problem when my client-side URL is the same as one of my API URLs.
For example, say I have a website called example.com, and one of the pages on this site is example.com/profile. If I refresh, the expected behavior is that it simply reloads example.com and routes correctly to /profile, since the wildcard route should return the single page application to the client. This breaks down if I also have an API route /profile, i.e. example.com/profile. Now when the user refreshes the page, instead of getting the profile page, they get a JSON response from my API.
Depending on the order in which the wildcard route is declared, I can either have the API response returned on refresh, or the single page app on refresh, but not both when the API url is the same as the client side browser url.
Are there any suggestions around this issue? My current solution is to migrate all of my API routes to have /api/ prepended to them.
You need to make sure your client routes do not collide with your API routes, and prefixing API routes with /api is a pretty standard way to handle this.
There's no real way "around" the issue other than making sure they don't collide. Various workarounds might be possible, but it will be clearer for everyone involved if routes are unambiguous.
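In Express terms, the standard layout looks roughly like this (paths and port are illustrative):

const express = require('express');
const path = require('path');
const app = express();

// API routes live under /api, so they can never collide with client routes.
app.get('/api/profile', function (req, res) {
  res.json({ name: 'Bob' });
});

// Everything else falls through to the SPA shell; React Router
// takes over once the page loads in the browser.
app.get('*', function (req, res) {
  res.sendFile(path.join(__dirname, 'public', 'index.html'));
});

app.listen(3000);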

Server side response to allow client side routing

I am developing a single page application with a client-side router, so the base URL to run the application is either http://example.com or http://example.com/index.html; skipping the domain name, that is the routes '/' and '/index.html'.
But somewhere in my application, because of my client-side router, I may call up a route such as '/appointments/20160113', and the client router will take me to the appropriate "Appointments Page" inside my SPA, passing today's date as a parameter.
But if the user navigates directly to http://example.com/appointments/20160113, I am led to believe the server should respond with /index.html so the browser doesn't get a 404.
I am building the server using nodejs (specifically the http2 module, though I don't think that matters much; my examples don't use https, although with http2 they would). I just tried changing the server so that if it is hit with an unknown URL it responds with the index.html file.
However, the browser sees this as its first response and then requests the rest of the attached files relative to that URL (so, for instance, it follows up with /appointments/20160113/styles/main.css). This leads to an infinite loop, as the server responds with another copy of index.html (and immediately gets a request back for /appointments/20160113/styles/styles/main.css).
It's too early in the page lifecycle for the JavaScript (and specifically the router) to be running yet, so clearly this approach is too simplistic.
I have also written the client side router (using the history api) so I can change things if I need to.
How should this situation be handled? I am thinking perhaps a 301 redirect to /index.html or something, with the router's initial dispatch knowing about it and doing a popstate or something. I ideally want to support passing URLs between users by external means, but until I actually tried to implement it I hadn't realised the implications.
I don't know if this is the best way or not, but having not received any answers on here, I decided to try a number of different ways and see which worked out the best.
Each approach involved doing a 301 redirect to /index.html and then providing the URL I was redirecting from via a different mechanism.
This is what I tried:
1. Setting a cookie with a short expiry date whose value was the URL
2. Adding a query string with a ?redirect= parameter carrying the URL
3. Adding a #fragment after /index.html with the URL (see the sketch after this list)
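A rough server-side sketch of approach 3 (shown here with the plain http module for brevity, and with static-file serving elided):

const http = require('http');

http.createServer(function (req, res) {
  if (req.url === '/' || req.url === '/index.html') {
    // ... serve the SPA shell (index.html) here ...
    res.end('<!doctype html>...');
  } else {
    // Unknown URL: redirect to the shell, carrying the original
    // path in the fragment for the client router to dispatch on.
    res.writeHead(301, { 'Location': '/index.html#' + req.url });
    res.end();
  }
}).listen(8080);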
In the end I rejected 1) because Chrome wasn't deleting the cookie after I had used it, and making the value short-lived depends on accurate timing between client and server. The solution appeared too fragile.
I tried 2) and it was looking good until I came to test it. Unfortunately, setting window.location.search causes a page reload, and I really struggled to find out what was happening. However, what I discovered in 3) about mocking could well be applied to a solution based on 2), so it remains an option. I still might return to this solution as it "feels" right to me.
I tried 3) and it worked quite well. However, I struggled with timing issues in testing, since my router element used the #fragment during initialisation, but I couldn't set window.location.hash until after the router was established in the test suite. I wanted to mock window.location.hash with sinon so I could control it, but it turns out you can't.
The solution to this was for the router to wrap its own calls to window.location.hash in a library, so that I could mock the library. And that is what I did in the end and it all worked.
I could go back to using a query string and wrapping window.location.search in a library call, so I could stub that call and avoid the problems of page reloading.
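The wrapper itself is tiny (a sketch; the module shape and file name are illustrative):

// location-lib.js – thin wrapper so tests can stub hash access
module.exports = {
  getHash: function () {
    return window.location.hash;
  },
  setHash: function (value) {
    window.location.hash = value;
  }
};

// in a test, the wrapper (not window.location) is stubbed:
//   sinon.stub(locationLib, 'getHash').returns('#/appointments/20160113');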

Express & Backbone Integration

Ok, I am new to web dev and here's a stupid question. I have been through a few tutorials for node, express and backbone individually, but I can't seem to wrap my head around how they are integrated. In particular, consider this use case:
Person X opens the browser, types in a URL and hits enter -> Express responds to the request and sends some data back to the browser.
My question is: where does backbone come into the picture here? I know it's an MVC framework for organizing your JS code, but I can't find a place in this use case where the server/browser interacts with backbone. The only thing I can think of is backbone saving the route and serving the page the next time. But what about the first time? It would be great if someone could explain how the request gets routed from the client browser to express/backbone and back to the browser.
Also, am I correct in assuming response.send() or response.json() will send the result to backbone when model.fetch() is called? I mean, is there no additional code required? Being new to web dev, I'm not quite used to the idea of the framework 'taking care' of everything once you send the response back.
EDIT: Here's what I have understood so far; feel free to correct me if I am wrong. When I access a website like Gmail, the server first sends a big HTML file that includes the backbone.js code. The backbone.js code listens for events like clicks on links in the HTML file and handles them if the links are defined in its routes (routes are always relative to the current route; accessing a completely different route sends a request to the server). So, if I click Compose, my URL stays the same because backbone handles the request. However, if I click the Maps/News services in the bar above, the server handles the request.
There is no special integration between backbone and node.js.
If you use the standard backbone sync method then all you need to do is:
1. Use the static middleware in express to serve up your static html/js/... files.
2. Define RESTful routes in express that conform to what backbone is expecting.
Backbone does indeed make an HTTP call when you do model.fetch. You can look in the Chrome network tab to see where it sends the request and then implement that route in express.
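Put together, a minimal version of those two steps might look like this (the model and route names are illustrative):

// server.js
const express = require('express');
const app = express();

// 1. Serve the static html/js (including backbone.js) from ./public
app.use(express.static('public'));

// 2. A RESTful route matching what Backbone's default sync expects
app.get('/todos/:id', function (req, res) {
  res.json({ id: req.params.id, title: 'example todo' });
});

app.listen(3000);

// client-side (runs in the browser):
//   var Todo = Backbone.Model.extend({ urlRoot: '/todos' });
//   new Todo({ id: 1 }).fetch();   // issues GET /todos/1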
