Ghost Blog - Check Request Headers - node.js

I have a self-hosted Ghost blog running. I want to check for the presence of a custom header, for example X-Den-Was-Here.
What I want to implement is a conditional check, where:
If the header is present - load the blog contents.
If the header is not present - return a 401 Unauthorized.
Where would be the most appropriate place to perform this check within the Ghost infra?

According to the Express 4.x API Reference, you can access headers with req.get(headerName) and check whether it returns undefined, e.g.:
app.get('/', function (req, res, next) {
  if (req.get(headerName) === undefined) {
    // header missing: do not load modules
  } else {
    loadModules();
  }
});

As it turns out, the solution (and I am open to someone validating it and showing me that I chose the wrong location) is to modify the caching layer to verify an inbound request header.
For that, you need core/server/middleware/cache-control.js. Within the cacheControlHeaders function, add the snippet below right before the next() call:
if (req.headers['den-was-here'] !== '1') {
  return res.sendStatus(401);
}
This effectively returns a 401 Unauthorized response for any request that does not carry the header.
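For reference, a minimal sketch of where the snippet ends up, assuming the (req, res, next) middleware shape of cacheControlHeaders in the Ghost source tree mentioned above:

function cacheControlHeaders(req, res, next) {
  // ...Ghost's existing cache-control logic runs here...

  // reject any request that does not carry the custom header
  if (req.headers['den-was-here'] !== '1') {
    return res.sendStatus(401);
  }

  next();
}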

Related

How to handle expressjs rest routes with empty path param with csurf returning 404 instead of 403?

I already asked the author of csurf (Douglas Wilson) about this via GitHub Issues, and he suggested creating a new post on Stack Overflow.
The original issue is here: https://github.com/expressjs/csurf/issues/161
My situation is:
I have some APIs in Express (POST, PUT, GET and DELETE), some of them with path params.
Some examples:
DELETE /api/users/2376213786213
POST /api/users (with a body)
and so on
I want to use csurf, but I also want to catch when someone calls an API with an empty path param and return a 404.
In short, if you call DELETE /api/users/ (with an empty id as path param) I want to return 404; otherwise, if you call DELETE /api/users/12121, I want to handle the CSRF token and return 403 if it is not valid.
Is it possible? How?
I created a middleware to handle CSRF errors like in the official csurf example:
// error handler
app.use(function (err, req, res, next) {
  if (err.code !== 'EBADCSRFTOKEN') return next(err)

  // handle CSRF token errors here
  res.status(403)
  res.send('form tampered with')
})
And after this middleware, I created another one to handle 404, nothing special.
However, this API DELETE /api/users/ (with an empty id as path param) is trapped by the first middleware and returns 403 instead of 404.
How can I fix this?
Thank you.
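One possible approach (a sketch under assumptions of my own, not taken from the csurf docs or the GitHub issue) is to answer the empty-param path with 404 before the CSRF protection is mounted, so csurf never gets a chance to reject that request; the cookie-based setup and route bodies below are illustrative:

var express = require('express');
var cookieParser = require('cookie-parser');
var csrf = require('csurf');

var app = express();
app.use(cookieParser());

// 1. Answer the empty-id variant with 404 before CSRF protection runs
app.delete('/api/users/', function (req, res) {
  res.sendStatus(404);
});

// 2. Mount CSRF protection afterwards
app.use(csrf({ cookie: true }));

// 3. Protected route with a real path param
app.delete('/api/users/:id', function (req, res) {
  // ... delete the user identified by req.params.id ...
  res.sendStatus(204);
});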

Change response cookies with node-http-proxy?

So, I am proxying my API requests through a node-http-proxy for several reasons.
The external API has a different origin than the actual client, so cookies are not being set correctly. The proxy obviously runs at the same origin, so I want to receive the response from the API, and inside the proxy, change the cookie value to reflect the proper origin.
Here's my current setup:
// Proxy to API server
app.use('/api', (req, res) => {
  proxy.web(req, res, { target: targetUrl })
})

proxy.on('proxyRes', function (proxyRes, req, res) {
  console.log('RAW Response from the target', JSON.stringify(proxyRes.headers, true, 2))
  console.log('The original request', req.headers.host)
})
Basically, I need to modify the cookie to req.headers.host, as this is the correct origin.
I've seen Harmon, but this looks very involved and changes how you instantiate your entire app, if I understand correctly.
Is there a way to simply modify the proxyRes after receiving it, in a synchronous fashion?
It seems very strange that there is a proxyReq event that allows you to alter the proxy request before it's sent, but not an equivalent that allows you to alter the response...
For anyone facing the same issue, I found a solution. They just merged a PR a few days ago that hasn't made it into a new release yet.
This PR introduces a new option called cookieDomainRewrite that does exactly what it sounds like. Simply include this in your config and it's all taken care of.
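A minimal sketch of what that configuration could look like, assuming a node-http-proxy build that already includes the cookieDomainRewrite option (the domain value below is illustrative):

var httpProxy = require('http-proxy');

// Rewrite the Domain attribute of Set-Cookie headers coming back from the API
var proxy = httpProxy.createProxyServer({
  target: targetUrl,                // same targetUrl as in the question
  changeOrigin: true,
  cookieDomainRewrite: 'localhost'  // rewrite cookie domains to the proxy's own host
});

app.use('/api', (req, res) => {
  proxy.web(req, res);
});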

Expose routes on different domains

I am struggling with something that doesn't look that hard: let's say I have 2 URLs to access my server:
http://localhost:80/
and an external url
http://domain.com/internal/
Is there a way to add a base path internal if the forwarded host is equal to the external URL's host?
Something like:
app.use(function(req, res, next) {
  if (req.headers['x-forwarded-host'] === 'domain.com') {
    app.use('/internal', routes);
  } else {
    next();
  }
})
There won't be a direct method or shortcut for this kind of personal use case.
I suggest this simple method though. Let's take the example of an app.get('/xyz') route.
This can be accessed locally via http://localhost:80/xyz, or via yourdomain.com/xyz from any application not hosted locally (unless you make a call using your domain name in your own application).
Add a header element with every request when the call is internal.
Now, whenever the /xyz route is called, check for that header with an if condition. If the request was made internally, the header will be there, and you can then use res.redirect or any other method you find useful (exporting a function from the current route, or anything else you find convenient).
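A small sketch of that check, assuming the internal caller adds a custom header such as X-Internal-Call (both the header name and the redirect target are illustrative):

app.get('/xyz', function (req, res) {
  // internal callers add this header to their requests
  if (req.get('X-Internal-Call') === '1') {
    // internal request: send it to the internal variant of the route
    return res.redirect('/internal/xyz');
  }

  // external request: serve the normal response
  res.send('xyz');
});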

How to push a sequence of html pages after one request using NodeJS and ExpressJS

I have been going around in circles on Stack Overflow without finding an answer to my question. I have been using Express.js for several days to make an access webpage that first returns an interstitial and then a webpage, depending on information I can get from the requester IP and so on.
My first idea for the interstitial was to use this piece of code:
var interstitial = function(req, res, next) {
  res.render('interstitial');
  next();
}
router.get('/', interstitial, nextPage);
setting a timeout on the nextPage callback function of router.get().
However, it looks like I can't do that. I got the error "Error: Can't set headers after they are sent." I suppose this is because res.render already sends a response to the request, and in the Express philosophy the next function passes the req and res arguments on so that another function can reply instead. Am I right?
In that case, is there a way to send several responses, with a timeout, to one request (a res.render, and after that a res.send in the next callback)?
Or is it mandatory to force the client to make another request to get another response (using JS on the client side for instance, or client-side timers, or maybe talking to a client script using socket.io)?
Thanks
Not sure I fully understand, but you should be placing all your deterministic logic within the function of the handler you're using for your endpoint.
Kinda like so:
router.get('/', function(req, res){
  // decide which page to send based on the requester's IP
  var origin = req.ip;
  if (origin === '11.22.33.44') {
    res.send('Interstitial Page.');
  } else {
    res.send('Home Page');
  }
});
You would replace the simple text responses with your actual pages, but the general idea is that once that endpoint is handled you can't next() it to a secondary handler.
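If the interstitial really has to appear first and the real page afterwards, the client-driven pattern the question mentions is the usual workaround: send the interstitial, and let it request the next page itself after a delay. A rough sketch, where the route names, delay, and view data are purely illustrative:

// One response per request: the interstitial view itself triggers the follow-up request,
// e.g. with <meta http-equiv="refresh" content="5;url=/home"> or a client-side timer.
router.get('/', function (req, res) {
  res.render('interstitial', { nextUrl: '/home', delaySeconds: 5 });
});

router.get('/home', function (req, res) {
  res.render('home');
});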

Express request is called twice

To learn Node.js, I'm creating a small app that gets some RSS feeds stored in MongoDB, processes them, and creates a single feed (ordered by date) from them.
It parses a list of ~50 RSS feeds with ~1000 blog items, so parsing everything takes quite a while. I therefore added req.connection.setTimeout(60*1000); to get a long enough timeout to fetch and parse all the feeds.
Everything runs fine, but the request is called twice. (I checked with Wireshark; I don't think it's about the favicon here.)
I really don't get it.
You can test yourself here : http://mighty-springs-9162.herokuapp.com/feed/mde/20 (it should create a rss feed with the last 20 articles about "mde").
The code is here: https://github.com/xseignard/rss-unify
And if we focus on the interesting bits:
I have a route defined like this: app.get('/feed/:name/:size?', topics.getFeed);
And topics.getFeed is like this:
function getFeed(req, res) {
  // 1 minute timeout to get enough time for the request to be processed
  req.connection.setTimeout(60*1000);

  var name = req.params.name;

  var callback = function(err, topic) {
    // if the topic has been found
    if (topic) {
      // aggregate the corresponding feeds
      rssAggregator.aggregate(topic, function(err, rssFeed) {
        if (err) {
          res.status(500).send({error: 'Error while creating feed'});
        }
        else {
          res.send(rssFeed);
        }
      },
      req);
    }
    else {
      res.status(404).send({error: 'Topic not found'});
    }
  };

  // look for the topic in the db
  findTopicByName(name, callback);
}
So nothing fancy, but still, this getFeed function is called twice.
What's wrong there? Any idea?
This annoyed me for a long time. It's most likely the Firebug extension which is sending a duplicate of each GET request in the background. Try turning off Firebug to make sure that's not the issue.
I faced the same issue while using the Google Cloud Functions Framework (which uses Express to handle requests) on my local machine. Each fetch request (in the browser console and within the web page) resulted in two requests to the server. The issue was related to CORS (because I was using different ports): Chrome made an OPTIONS method call before the actual call. Since the OPTIONS method was not necessary in my code, I used an if statement to return an empty response.
if (req.method === 'OPTIONS') {
  res.set('Access-Control-Allow-Origin', '*');
  res.set('Access-Control-Allow-Headers', 'Content-Type');
  res.status(204).send('');
}
Spent nearly 3hrs banging my head. Thanks to user105279's answer for hinting this.
If you have a favicon on your site, remove it and try again. If your problem is resolved, refactor your favicon URL.
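One way to keep the favicon request from ever reaching your route handlers is the serve-favicon middleware; a short sketch, where the file path is illustrative:

var express = require('express');
var path = require('path');
var favicon = require('serve-favicon');

var app = express();

// Serve the favicon before any routes so its request never reaches them
app.use(favicon(path.join(__dirname, 'public', 'favicon.ico')));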
I'm doing more or less the same thing now, and noticed the same thing.
I'm testing my server by entering the API address in Chrome like this:
http://127.0.0.1:1337/links/1
My Node.js server then responds with a JSON object depending on the id.
I set up a console.log in the GET handler and noticed that when I change the id in Chrome's address bar, it sends a request before I even hit Enter, and the server receives another request after I actually hit Enter. This happens with and without the Chrome dev console open.
IE 11 doesn't seem to work in the same way but I don't have Firefox installed right now.
Hope that helps someone even if this was a kind of old thread :)
/J
I managed to fix it with listen.setTimeout and axios.defaults.timeout = 36000000.
Node.js

var timeout = require('connect-timeout'); // express v4

// in cors, disable preflight continuation and use 200 as the options success status
app.use(cors({ preflightContinue: false, optionsSuccessStatus: 200 }));

// put this timeout middleware at the end of the middleware chain
app.use(timeout(36000000)); // 36,000,000 ms = 10 hours
app.use((req, res, next) => {
  if (!req.timedout) next();
});

var listen = app.listen(3333, () => console.log('running'));
listen.setTimeout(36000000); // 36,000,000 ms = 10 hours

React

import axios from 'axios';
axios.defaults.timeout = 36000000; // 36,000,000 ms = 10 hours

After 2 days of trying.
You might have to increase the timeout even more. I haven't looked at the Express source, but it sounds like the request is retried on timeout.
Ensure you call res.send(); the axios call expects a value from the server, and otherwise it sends the request again after 120 seconds.
I had the same issue doing this with Express 4. I believe it has to do with how it resolves request params. The solution is to ensure your params are resolved by, for example, checking them in an if block:
app.get('/:conversation', (req, res) => {
  let url = req.params.conversation;

  // Only handle the request when the param has resolved
  if (url) {
    res.redirect(301, 'http://' + url + '.com')
  }
})
In my case, my Axios POST requests were received twice by Express, the first one without a body, the second one with the correct payload. The same request sent from Postman was only received once, correctly. It turned out that Express was running on a different port, so my requests were cross-origin. This caused Chrome to send a preflight OPTIONS request to the same URL (the POST URL), and my app.all routing in Express processed that one too.
app.all('/api/:cmd', require('./api.js'));
Separating POST from OPTIONS solved the issue:
app.post('/api/:cmd', require('./api.js'));
app.options('/', (req, res) => res.send());
I met the same problem. I tried adding return, but it didn't work. It did work when I added return res.redirect('/path');
I had the same problem. Then I opened the Chrome dev tools and found out that favicon.ico was being requested from my Express.js application. I needed to fix the way I registered the middleware.
Screenshot of Chrome dev tools
I also had double requests. In my case it was the forwarding from the http to the https protocol. You can check whether that's the case by comparing
req.headers['x-forwarded-proto']
It will either be 'http' or 'https'.
I could fix my issue simply by adjusting the order in which my middlewares trigger.
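A sketch of what that middleware-ordering fix could look like, assuming the app sits behind a proxy that sets x-forwarded-proto (the status code and redirect logic are illustrative):

// Put the protocol check first, so later handlers only ever see one request
app.use((req, res, next) => {
  if (req.headers['x-forwarded-proto'] === 'http') {
    // redirect the plain-http request once instead of letting the routes run twice
    return res.redirect(301, 'https://' + req.headers.host + req.originalUrl);
  }
  next();
});

// ...the rest of the middleware and routes come after this check...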
