I want to create a Node.js server that acts as a proxy for downloading files: the user clicks a download button, which issues a GET request to the Node.js server; the Node.js server fetches the file from a different remote server and streams the download (potentially terabytes) through to the user. The terabyte-scale file must not be stored on the Node.js server before being sent.
Here is my attempt:
function (request, response) {
  // anything related to the remote server having the file
  var options = {
    path: "/bigData",
    hostname: "www.hugeFiles.net"
  };
  // get the file from the remote server hugefiles and pipe it to the user's response
  https.get(options, function (downFile) {
    downFile.pipe(response);
  });
}
Before, I was using res.download(file, function(err) {}), but then the file has to be downloaded completely from the remote server first.
You're very close: you're sending the right HTTP body, but with the wrong HTTP headers.
Here's a minimal working example:
const express = require('express');
const http = require('http');

// app1 plays the role of the remote server hosting the file
const app1 = express();
app1.get('/', function (req, res) {
  res.download('server.js');
});
app1.listen(8000);

// app2 is the proxy: it copies the headers that tell the browser to treat
// the body as a download, then streams the remote response straight through
const app2 = express();
app2.get('/', function (req, res) {
  http.get({ path: '/', hostname: 'localhost', port: 8000 }, function (resp) {
    res.setHeader('Content-Disposition', resp.headers['content-disposition']);
    res.setHeader('Content-Type', resp.headers['content-type']);
    resp.pipe(res);
  });
});
app2.listen(9000);
Though I would say you should take a look at modules like https://github.com/nodejitsu/node-http-proxy, which take care of the headers and other details for you.
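For comparison, here is a minimal sketch of the same idea built on node-http-proxy (the target www.hugeFiles.net is just the placeholder hostname from the question):
const http = require('http');
const httpProxy = require('http-proxy');

// the proxy streams the remote response straight through, headers included,
// without buffering the body on this server
const proxy = httpProxy.createProxyServer({});

http.createServer(function (req, res) {
  proxy.web(req, res, { target: 'https://www.hugeFiles.net', changeOrigin: true });
}).listen(9000);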
There is no way for your server to provide a file to the client without the server downloading it first.
What you may want to do instead is provide the client with a download link to the huge file. To make it seem automatic, you can create HTML which starts a download from the content provider automatically and serve that to the client.
In other words, in the scenario you are describing, the server is acting as a middleman between your client and the content provider. Unless the server needs to process the data or the client isn't allowed to retrieve the data themselves, it makes more sense to cut out the middleman.
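A minimal sketch of that approach, reusing the placeholder host and path from the question as the content provider: the route simply redirects, so the browser downloads straight from the source and the file never touches this server.
const express = require('express');
const app = express();

app.get('/download', function (req, res) {
  // send the client straight to the content provider
  res.redirect('https://www.hugeFiles.net/bigData');
});

app.listen(9000);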
Related
It seems that Express's default behaviour is to normalize URLs containing parent-directory segments (/../).
For the code below, if I request a URL like this:
http://localhost:8080/foo/../../bar
the request gets redirected to
http://localhost:8080/bar
I couldn't find any detailed documentation on this behavior.
My questions are:
Is this a guaranteed behavior?
If I am not serving from a file system, is there a way to preserve the original URL "path" for other processing?
Here is the code:
const express = require("express")
const app = express()
app.get("/*", (req, res) => {
  console.log("url:", req.url);
  console.log("path:", req.path);
  res.send('echo for url=' + req.url + '; path=' + req.path);
});
const port = 8080;
app.listen(port, () => {
  console.log(`listening on port ${port}`);
});
The request is not redirected; rather, the client rebuilds the URL before making the request:
>curl -v http://localhost:8080/foo/../../bar
* Rebuilt URL to: http://localhost:8080/bar
Therefore the server never sees the "original URL path".
This URL rebuilding is part of the resolution process. See also https://url.spec.whatwg.org/#concept-basic-url-parser.
A malicious client (e.g., a telnet client) could, however, send an HTTP request with an "unresolved" URL. The following middleware demonstrates how to rebuild the URL on the server:
function (req, res) {
  // parsing req.path against a dummy scheme makes the WHATWG URL parser
  // resolve the dot segments for us
  res.json({
    path: req.path,
    rebuilt_path: new URL("s://" + req.path).pathname
  });
}
The malicious request
GET /foo/../../bar HTTP/1.1
Host: localhost:8080
then returns
{"path": "/foo/../../bar", "rebuilt_path": "/bar"}
(Things get more complicated if req.baseUrl is also involved.)
I've been trying to deploy a Twitch-like application to Heroku using React, Redux, Node Media Server, and the json-server module. However, I keep running into an issue when trying to connect my React client and Express server via an API request in production.
I'm making the actual request through my action creators, using axios with a base URL of http://localhost:4000, but that only works on my local machine.
// my action creator
export const fetchStreams = () => async dispatch => {
  const response = await streams.get("/streams");
  dispatch({ type: FETCH_STREAMS, payload: response.data });
};
You can view my full repo at https://github.com/XorinNebulas/Streamy
You can also view my current deployed version of the site on Heroku at
https://streamy-app.herokuapp.com/
Here is my api/server.js file. My Express server will be listening on a random port equal to process.env.PORT, so I have no way of knowing how to make a network request from my action creators to that random port in production.
const path = require("path");
const cors = require("cors");
const jsonServer = require("json-server");
const server = jsonServer.create();
const router = jsonServer.router("db.json");
const middlewares = jsonServer.defaults({
static: "../client/build"
});
const PORT = process.env.PORT || 4000;
// Set default middlewares (logger, static, cors and no-cache)
server.use(cors());
server.use(middlewares);
if (process.env.NODE_ENV === "production") {
// Add custom routes before JSON Server router
server.get("*", (req, res) => {
res.sendFile(
path.resolve(__dirname, "../", "client", "build", "index.html")
);
});
}
// Use default router
server.use(router);
server.listen(PORT, () => {
console.log(`JSON Server is listening on port ${PORT}`);
});
I expected the request to go through and load up some data from api/db.json, with a request URL of https://streamy-app.herokuapp.com/streams, but instead I got a request URL of http://localhost:4000/streams, which of course leads to the CORS issue below:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://localhost:4000/streams. (Reason: CORS request did not succeed).
I would truly appreciate any suggestions; I've been working on this for days.
Alright, looks like I figured it out. I simply went into streams/client/package.json and added
"proxy": "http://localhost:4000"
I then went into streams\client\src and deleted the api folder, which contained my custom axios instance with a base URL, and used axios out of the box in my action creators instead:
export const fetchStreams = () => async dispatch => {
  const response = await axios.get("/streams");
  dispatch({ type: FETCH_STREAMS, payload: response.data });
};
Now while running locally in development mode I'm able to make a request to http://localhost:4000/streams, and after deploying my node app to Heroku I successfully make a request to https://streamy-app.herokuapp.com/streams.
Hope this helps someone with similar issues.
First, you should know that Heroku doesn't allow you to expose multiple ports, which means you should change the multiple-ports approach to something else (see this answer).
Second, the file client/src/apis/streams.js is hard-coded to send requests to http://localhost:4000/ - which is not a good idea.
Whatever approach you choose - even deploying to another host server - you will need to dynamically configure the API endpoint, per environment.
I would also recommend that you:
Change the way you deploy React, as explained here.
After doing the above, consider consolidating your API service with the static server, so that you don't need multiple ports; then everything becomes easier, as sketched below.
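A minimal sketch of that consolidation, assuming json-server as in the question (the /api mount point is an arbitrary choice for illustration):
const path = require("path");
const jsonServer = require("json-server");

const server = jsonServer.create();
const router = jsonServer.router("db.json");
const middlewares = jsonServer.defaults({ static: "../client/build" });
const PORT = process.env.PORT || 4000;

server.use(middlewares);
// mount the API under one port, before the catch-all
server.use("/api", router);
// everything else gets the React build
server.get("*", (req, res) => {
  res.sendFile(path.resolve(__dirname, "../", "client", "build", "index.html"));
});

server.listen(PORT);
With the API and the static build behind one port, the client can use relative URLs such as /api/streams in every environment.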
I have a node express server responding to http traffic:
const http = require("http");
const util = require("util");
const express = require("express");
const app = express();
const server = http.createServer(app);
app.use(function(req, res, next) {
  console.log(`logging: req: ${util.inspect(req)}`);
  next();
});
and all that works fine. I'd like to have a program on my node server inject emulated http traffic into the express stack, without a network connection. I can't just magic up a (req,res) pair and call a middleware function like the one in app.use above, because I don't have a next to give it, and my req and res will not be the ones next passes on to the next middleware in the stack.
Edit: What I actually have is a websocket connection sending data packets of a different format, different data contents from http traffic that can also carry the same information. I can take those websocket packets and build from those a request that is in the same format that the http traffic uses. I would like to pass that transformed request through the express http middleware stack and have it processed in the same way. Going all the way back to create an http request having just dealt with a ws request seems a bit far.
What's the simplest way to emulate some traffic, please? Can I call a function on app? Call some express middleware, or write a middleware of my own to inject traffic? Call a function on server?
Thanks!
Emulating traffic by calling Express.js internal functions isn't the right way. It's much easier to trigger the server with an HTTP request from the same process:
const http = require('http');
const util = require('util');
const express = require('express');
const app = express();
const server = http.createServer(app);
app.use(function(req, res, next) {
console.log(`logging: req: ${util.inspect(req)}`);
next();
});
const port = 8081;
server.listen(port, () => {
  // make the request only once the server is actually listening
  http.request({ port }).end();
});
From your question
I'd like to have a program on my node server inject emulated http traffic into the express stack, without a network connection
Can you clarify, why without a network connection?
A few things:
You need to make an endpoint
You need to host your server somewhere
You need something to send requests to your server
Express provides you with a way to receive requests (which might or might not come from a browser), perform some operations, and return responses to the requester.
The expression
app.use(function(req,res,next){
console.log(`logging: req: ${util.inspect(req)}`);
next();
});
is actually a middleware function. It will run for every request to your server, convert the request object created by Express into a string, and print it in your server log.
If you want a testable endpoint, add this to the bottom of the snippet you posted:
app.get('/test', function (req, res) {
  res.json({ success: true });
});
This tells your app to allow GET requests at the endpoint /test
Next you're going to need to host your express server somewhere you can send requests to it. Your local machine (localhost) is a good place to do that. That way, you don't need an internet connection.
Pick a port you want to host the server on, and then it will be reachable at http://localhost:<Your Port>.
Something like this will host a server on http://localhost:3000. Add this below the route we declared above:
server.listen(3000, function() {
console.log('Server running on port 3000');
});
Finally, you'll need a way to send requests to the server on localhost. Postman is a great tool for testing express routes.
I would suggest installing Postman and using that to emulate http traffic.
Once your server is running, open postman and send a GET request to your server by entering the server address and port, and hitting the blue send button (You'll be sending the request to http://localhost:3000/test).
If all goes well, Postman will display the JSON response from the endpoint: {"success":true}.
You should also see your middleware fire and print out the request object in your terminal.
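If you prefer the command line over Postman, curl can exercise the same route (assuming the /test endpoint and port 3000 from the snippets above):
curl -v http://localhost:3000/test
The response body should be {"success":true}.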
Good Luck!
My Node.js folder hierarchy looks like this (image omitted): app.js at the top level, with the routes and src directories next to it.
app.js is the Node.js main file, routes contains the Node.js route files, and src holds the client's public HTML files.
This is the code in app.js:
var express = require('express');
var app = express();
var server = require('http').createServer(app);
global.io = require('socket.io').listen(server);
var compression = require('compression');
var helmet = require('helmet');
var session = require('express-session');
var bodyParser = require('body-parser');
app.use(bodyParser.json()); // support json encoded bodies
app.use(bodyParser.urlencoded({ extended: true })); // support encoded bodies
app.use(express.static(__dirname + '/src'));
app.use(helmet());
app.use(compression());
app.use('/rides', require('./routes/ridesServer'));
app.use('/user', require('./routes/userServer'));
app.use('/offers', require('./routes/offersServer'));
app.use('/notifications', require('./routes/notificationsServer'));
server.listen("8080", function() {
console.log("Connected to db and listening on port 8080");
});
This is another API route, in the routes/userServer.js file:
var express = require('express');
var router = express.Router();

router.post('/verifytoken', function(req, res, next) {
  // some functions here
});

module.exports = router;
And this is another HTTP request I am making from the client side, in the page ride.js:
$.ajax({
method: "POST",
headers: {
"Content-Type": "application/json"
},
url: "user/verifytoken",
data: JSON.stringify(something),
success: function(response) {
// some code
},
error: function(error) {
// some code
}
});
As you can see, the client files and the Node.js server files are on the same server, and Node.js serves those static files via this command:
app.use(express.static(__dirname + '/src'));
I think this should be avoided and that there is a better way!
If you are a Node.js expert and familiar with best practices, please tell me whether the following way of working is correct, and if it is not, please correct me:
I thought about putting the static files in a public_html directory and the Node.js files in a server directory underneath public_html.
Then I would run pm2 start app.js --watch (or node app.js) on the app.js located in the server directory, not in public_html.
As a result, the index.html file would be served just like any other static file, without any relation to the Node.js server, and Node.js would live in its own folder, not dealing with any client-side code.
In other words: separate Node.js and the static files, and make the Node.js directory a subdirectory rather than the main directory.
Then the HTTP REQUEST will be looking like this:
$.ajax({
method: "POST",
headers: {
"Content-Type": "application/json"
},
url: "server/user/verifytoken",
data: JSON.stringify(something),
success: function(response) {
// some code
},
error: function(error) {
// some code
}
});
Please note that I have added the server directory.
Furthermore, I can exchange the
url: "server/user/verifytoken",
to an IP from a remote app (like Ionic):
url: "123.123.123.123:443/server/user/verifytoken",
Then my HTTP requests would be served via HTTPS (because I am sending to port 443), I could create multiple apps on the same server, and I would have no struggles with Express static folders.
What do you think?
Thanks!
First, let me say I'm not an expert. But I do have 3 years of continuous development of Node.js-based solutions.
In the past I have created solutions mixing client-side code and server-side code in the same project, and it worked. At least for a while. But in the long run it is a bad idea, for many possible reasons. Some of them are:
Client-side code and server-side code may require different processes to produce working code. For example, client-side code may require transpiling from ES6 to the more widely compatible ES5 using something like gulp or webpack. This is normally not the case for server-side code, because the runtime is more targeted.
Mixing client-side code and an API server may prevent you from horizontally scaling one of them without the other.
This is like a mono-repo, and having a mono-repo without a CI process tailored for this scenario may produce very long development times.
What we currently do at my work is as follows:
Create a separate API server project. This way you can concentrate on developing a good API while working on this specific project. Leave cross-cutting concerns (like authentication) outside the API server.
Create a separate project for your client-side code (an SPA, maybe). Set up your dev environment to proxy API requests to a running API server (which may be running locally); see the sketch after this list.
Create a separate project for the deployment of the entire solution. This project puts together the serving of the client code, the proxying of requests to the API, and the implementation of cross-cutting concerns like authentication.
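As an illustration of that dev-environment proxying, here is a minimal sketch using the http-proxy-middleware package (the package choice, the /api prefix, and port 4000 are assumptions for illustration, not part of the original setup):
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// during development, forward any /api request to the locally running API server
app.use('/api', createProxyMiddleware({ target: 'http://localhost:4000', changeOrigin: true }));

app.listen(3000);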
Having your code separated in this way makes each piece easy to develop and fast to evolve. But it may introduce some complexities:
This multi-project structure requires you to be able to trigger testing of the whole product each time one of the projects changes.
It surfaces the need for integration testing.
Some other considerations are:
The API server and the website server may run on the same machine but on different ports.
You may secure your API server using SSL (on Node, using the standard https module), but notice that in all cases you need another actor in front of the API server (a website proxying requests to the actual API server, or an API gateway that implements cross-cutting concerns like authentication, rate limiting, etc.). In the past I posed the same question you are asking about the appropriateness of using SSL in this scenario, and the answer is here. My answer is: it depends on the deployment conditions.
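For reference, a minimal sketch of serving an Express app over SSL with the standard https module (key.pem and cert.pem are placeholder paths for your TLS key and certificate):
const fs = require('fs');
const https = require('https');
const express = require('express');

const app = express();
app.get('/', (req, res) => res.send('hello over TLS'));

// key.pem and cert.pem are placeholders; point these at your real TLS files
const options = {
  key: fs.readFileSync('key.pem'),
  cert: fs.readFileSync('cert.pem')
};

https.createServer(options, app).listen(443);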
I have made a CouchDB design document which works perfectly at the following URL:
http://localhost:5984/db/_design/app/index.html
Now the problem is that when I try to fetch the page contents and display them from Node.js, only the HTML page is displayed; the linked CSS and JS files are not working. When I tried to narrow down the problem, I found that the requests for the CSS and JS files are supposed to carry the CouchDB login credentials and are not being linked. I even tried adding the auth header to the response parameter, but still no luck.
var http = require('http');
var json;
var root = new Buffer("admin:pass").toString('base64');
http.createServer(function(req, res) {
res.setHeader('Authorization', root);
res.writeHead(200, { 'Content-Type':'text/html' });
couchPage();
res.end(json);
}).listen(8080);
function couchPage() {
var options = {
hostname: 'localhost',
port: 5984,
path: '/db/_design/app/index.html',
auth: 'admin:pass',
method: 'GET'
};
var req = http.request(options, function(res) {
res.setEncoding('utf8');
res.on('data', function (chunk) {
json = chunk;
});
});
req.end();
}
Could anyone please guide me on where I am going wrong?
I think this has nothing to do with couchdb authorization. The problem is that you do not perform any routing on your nodejs server. That is, the browser makes a request to localhost:8080 and receives the content of /db/_design/app/index.html as an answer. Now, the browser detects a link to a stylesheet, say "style.css". It performs a request to localhost:8080/style.css but your nodejs server simply ignores the "style.css" part of the request. Instead, the client will receive the content of /db/_design/app/index.html again!
If you want to serve attachments of your design document through nodejs, you have to parse the request first and then retrieve the corresponding document from couchdb. However, I don't think that you actually want to do this. Either you want to use couchdb in a traditional way behind nodejs (not directly accessible from the client) and then you would just use it as a database with your html (or template) files stored on disk. Or you want to directly expose couchdb to the client and make nodejs listen to events via the couchdb _changes feed.
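If you did want to serve the design document's files through nodejs, here is a minimal sketch of that routing idea, forwarding each requested path to CouchDB and streaming the reply back (it reuses the placeholder credentials and paths from the question):
var http = require('http');

http.createServer(function (req, res) {
  // forward the requested path (index.html, style.css, ...) to CouchDB
  // (a real implementation would also map "/" to "/index.html")
  var options = {
    hostname: 'localhost',
    port: 5984,
    path: '/db/_design/app' + req.url,
    auth: 'admin:pass',
    method: 'GET'
  };
  http.request(options, function (couchRes) {
    // copy CouchDB's status and headers (content-type, etc.) and stream the body
    res.writeHead(couchRes.statusCode, couchRes.headers);
    couchRes.pipe(res);
  }).end();
}).listen(8080);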