Sniff/intercept and change http(s) request data - node.js

For the past few days I've been researching how I can listen for HTTPS requests from my browser and overwrite some of the request data (headers, body) before it reaches its final destination (the site).
I have tried many approaches, using the modules
mockttp, web-sniffer, and hoxy, among others.
Since the application I'm developing uses Selenium, I tried to create a proxy server with the modules mentioned above and route the browser's requests through my application before they are dispatched; however, none of these attempts were successful.
Note: I was able to listen to requests, change their responses, and so on. But I didn't get what I really wanted: to change the request data before it is sent to the site, something similar to the breakpoint functionality found in Fiddler, Charles, HTTP Toolkit, and other applications.
What am I really hoping for?
I'd like to listen to my browser's requests -> pause HTTPS requests (breakpoint) and change something -> then forward/dispatch the request.
One of the failed attempts:
The new header is not actually inserted into the request! (The browser was configured to use the proxy server, and I could see the requests being logged in the console.)
import mockttp from "mockttp";
import fs from "fs";
import * as Sniffer from "web-proxy-sniffer";

(async () => {
  // Generate a self-signed HTTPS CA certificate for the proxy:
  const ca = await mockttp.generateCACertificate();
  // Write synchronously so the files exist before they are read back below:
  fs.writeFileSync("key.pem", ca.key);
  fs.writeFileSync("cert.pem", ca.cert);

  const proxy = Sniffer.createServer({
    certAuthority: {
      key: fs.readFileSync("./key.pem"),
      cert: fs.readFileSync("./cert.pem"),
    },
  });

  proxy.intercept(
    {
      // Intercept before the request is sent
      phase: "request",
    },
    (request, response) => {
      // Try to add a custom header to the outgoing request
      request.headers.test = "a simple test";
      return request;
    }
  );

  proxy.listen(8001);
})();
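For reference, according to the mockttp documentation its rule API should also be able to rewrite a request before forwarding it, via a beforeRequest hook on thenPassThrough. The following is an untested sketch based on those docs (depending on the mockttp version the rule builder may be anyRequest() instead of forAnyRequest()), not something I have working with Selenium yet:

import mockttp from "mockttp";
import fs from "fs";

(async () => {
  // Untested sketch: let mockttp itself act as the intercepting HTTPS proxy
  const ca = await mockttp.generateCACertificate();
  fs.writeFileSync("key.pem", ca.key);
  fs.writeFileSync("cert.pem", ca.cert);

  const server = mockttp.getLocal({
    https: { keyPath: "./key.pem", certPath: "./cert.pem" },
  });
  await server.start(8001);

  // Forward every request upstream, but rewrite it first
  await server.forAnyRequest().thenPassThrough({
    beforeRequest: (request) => ({
      // Return only the fields to override; here we add a header to the
      // original set before the request leaves the proxy
      headers: { ...request.headers, "x-test": "a simple test" },
    }),
  });

  console.log(`mockttp proxy listening on port ${server.port}`);
})();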
Would this be possible with NodeJS?
(Sorry for my rough English.)

Related

Progress bar for express / react communicating with backend

I want to make a progress bar that tells the user where in the process of fetching data from the API my backend is. But it seems like every time I send a response it ends the request. How can I avoid this, and what should I google to learn more? I didn't find anything online.
React:
const { data, error, isError, isLoading } = useQuery('posts', fetchPosts)
if (isLoading) { return <p>Loading..</p> }
return (data && <p>{data}</p>)
Express:
app.get("api/v1/testData", async (req, res) => {
try {
const info = req.query.info
const sortByThis = req.query.sortBy;
if (info) {
let yourMessage = "Getting Data";
res.status(200).send(yourMessage);
const valueArray = await fetchData(info);
yourMessage = "Data retrived, now sorting";
res.status(200).send(yourMessage);
const sortedArray = valueArray.filter((item) => item.value === sortByThis);
yourMessage = "Sorting Done now creating geojson";
res.status(200).send(yourMessage);
createGeoJson(sortedArray)
res.status(200).send(geojson);
}
else { res.status(400) }
} catch (err) { console.log(err) res.status(500).send }
}
You can only send one response to a request in HTTP.
In case you want to have status updates using HTTP, the client needs to poll the server i.e. request status updates from the server. Keep in mind though that every request needs to be processed on the server side and will take resources away which are then not available for other (more important) requests from other clients. So don't poll too frequently.
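For example, a simple polling setup could look roughly like this; the /api/v1/progress route, the in-memory progress map, the job id, and the one-second interval are all made-up details for illustration:

// Server (sketch): do the work in the background and expose a progress route
const progress = new Map();

app.get("/api/v1/testData", (req, res) => {
  const jobId = Date.now().toString();
  progress.set(jobId, "Getting data");
  res.status(202).send({ jobId }); // respond once, immediately

  // Keep working after the response; update the progress entry as you go
  fetchData(req.query.info).then((valueArray) => {
    progress.set(jobId, "Data retrieved, now sorting");
    const sorted = valueArray.filter((item) => item.value === req.query.sortBy);
    progress.set(jobId, "Done");
    // store createGeoJson(sorted) somewhere the client can fetch it later
  });
});

app.get("/api/v1/progress/:jobId", (req, res) => {
  res.status(200).send({ status: progress.get(req.params.jobId) || "unknown job" });
});

// Client (sketch): poll the progress route with react-query
const { data } = useQuery(
  ["progress", jobId],
  () => fetch(`/api/v1/progress/${jobId}`).then((r) => r.json()),
  { refetchInterval: 1000, enabled: !!jobId } // poll every second once a job id exists
);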
If you want to support long-running operations over HTTP, have a look at the following API design pattern.
Alternatively, you could also use a WebSocket connection to push updates from the server to the client. I assume your computation on the backend will not be minutes long and you want to update the client in real time, so WebSockets will probably be the best option for you. Once established, a WebSocket connection has considerably less overhead than sending HTTP requests/responses back and forth between client and server.
Have a look at this thread, which discusses the above-mentioned and other possibilities.
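As a rough illustration of the WebSocket option, here is a sketch using the ws package; the port, the message shapes, and the reuse of fetchData/createGeoJson from the question are assumptions, not a drop-in solution:

// Server (sketch): push progress messages over a WebSocket
const { WebSocketServer } = require("ws");
const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", async (socket) => {
  socket.send(JSON.stringify({ status: "Getting data" }));
  const valueArray = await fetchData("someInfo");           // from the question
  socket.send(JSON.stringify({ status: "Data retrieved, now sorting" }));
  const geojson = createGeoJson(valueArray);                // from the question
  socket.send(JSON.stringify({ status: "Done", geojson })); // final result
});

// Client (sketch): update the progress bar as messages arrive
const socket = new WebSocket("ws://localhost:8080");
socket.onmessage = (event) => {
  const { status } = JSON.parse(event.data);
  console.log(status); // e.g. setProgressText(status) in a React component
};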

How can I get CORS to work for API calls with a Hapi v20 server with HTTPS?

Basic problem: you've followed the tutorial, you've fired up the Hapi server, it's running... but it doesn't work. A direct call via curl will get something, loading the API call directly in a web browser will get something... but use that API endpoint within an app, say Angular or React, and it bombs out with an error message like:
Access to XMLHttpRequest at 'https://localhost:3000/server/ping' from origin 'http://localhost:5000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.
And it's true: you check the headers, and Access-Control-Allow-Origin is not on that list at all. So your app, having gotten blocked here in the preflight request, isn't even going to make the actual GET/POST call.
Here's the full file of a fully working Hapi v20.2.0 server, in TypeScript:
'use strict'
import * as fs from 'fs'
import * as util from 'util'
import * as path from 'path'
import * as os from 'os'
import * as Hapi from '@hapi/hapi'
import * as Http2 from 'http2'

const strFullNameCert: string = path.resolve(
  os.homedir(),
  'ssl',
  'domain.crt')
const strFullNameKey: string = path.resolve(
  os.homedir(),
  'ssl',
  'domain.key')

const key: Buffer = fs.readFileSync(strFullNameKey)
const cert: Buffer = fs.readFileSync(strFullNameCert)
const sslDetails = { key, cert }

const server = Hapi.server({
  listener: Http2.createSecureServer(sslDetails), // optional: remove this line for http1.1
  host: 'localhost',
  port: 3000,
  tls: sslDetails,
  routes: {
    cors: true
  },
})

console.log(`Host : ${server.info.host}`)
console.log(`Port : ${server.info.port}`)
console.log(`Method : ${server.info.protocol}`)
console.log(`Hapi : v${server.version}`)

server.route({
  method: 'GET',
  path: '/server/ping',
  handler: async (request, reply) => {
    console.log(`>>>ROUTE>>> : ${request.route.path}`);
    const response = reply.response({ qSuccess: true, qResult: 'pong' })
    return response;
  }
})

server.start()
To reiterate, this code will "work", it will serve up a response if you load the /server/ping route in an independent way. If you were building a web server to serve pages, this would likely be sufficient to get going.
This code will still fail CORS validation in a web app. Why? Because the request to /server/ping is never even going to be made. The app will send a preflight OPTIONS request first, and there's nothing in this code to handle that. So nothing you do in the server.route area, messing with route options or adding headers, is going to fix this. Ten jillion different routes: cors setups in the main server instantiation won't fix it either, because they also don't address the actual problem.
I added this at the top of my middleware to respond to the OPTIONS (preflight) request. It might cause problems in other areas of the app that use OPTIONS, but it worked for my case/issue.
async function (request: Request, h: ResponseToolkit) {
  if (request.method === "options") {
    // Answer the preflight request directly and stop further processing
    const response = h.response({});
    response.header("Access-Control-Allow-Origin", "*");
    response.header("Access-Control-Allow-Headers", "*");
    return response.takeover();
  }
  // more checks....
},
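For context, in Hapi this kind of check is usually registered as an onRequest extension. Here is a rough sketch of how the snippet above might be wired up; the extension point and takeover() are standard Hapi, but treat the rest as an approximation rather than the original poster's exact setup:

// Sketch: answer preflight requests from an onRequest extension
server.ext('onRequest', (request, h) => {
  if (request.method === 'options') {
    const response = h.response({});
    response.header('Access-Control-Allow-Origin', '*');
    response.header('Access-Control-Allow-Headers', '*');
    // takeover() ends the request lifecycle and returns this response
    return response.takeover();
  }
  // Everything else continues through the normal lifecycle
  return h.continue;
});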
The problem is there isn't a route set up that's dealing with the OPTIONS request that Chrome/Firefox will send before they attempt the GET/POST to the API being served up by Hapi. Add this code to the above file, just above server.start()
server.route({
  method: 'OPTIONS',
  path: '/{any*}',
  handler: async (request, reply) => {
    console.log(`======options: ${request.route.path}`)
    const response = reply.response({})
    response.header('Access-Control-Allow-Origin', '*')
    response.header('Access-Control-Allow-Headers', '*')
    return response;
  }
})
Now you'll see that when you attempt to use the API from your app, it's going to make two calls: first, the options call, and then if that passes (and with these headers it now will) it will make the actual call you're interested in.
The "cors:true" route option, it really feels like it's going to "enable cors". I mean, yes... but really, no. Think of it more like it permits Hapi to do cors... you still have to do the work of "handling cors".

webRequest onAuthRequired never catch 407 Proxy Authentication Required

I'm trying to handle proxy authentication through a Chrome extension.
On the one hand, I have a Chrome extension (with all the required permissions) that handles the CONNECT request's authentication challenge with an onAuthRequired listener (background.js):
chrome.webRequest.onAuthRequired.addListener(
  (details, callback) => {
    console.log('onAuthRequired', details) // <- this has never been displayed
    callback({
      authCredentials: {
        username: 'someid',
        password: 'somepwd'
      }
    })
  },
  {
    urls: ['<all_urls>']
  },
  ['asyncBlocking']
)
const config = {
  mode: "pac_script",
  pacScript: {
    data: "function FindProxyForURL(url, host) {\n if (shExpMatch(host, \"*.pandora.com\"))\n return 'PROXY 127.0.0.1:8124';\n return 'DIRECT';\n }"
  }
}

chrome.proxy.settings.set({
  value: config,
  scope: 'regular',
}, function () {})
And on the other hand, I have a Node.js proxy server that always responds with the 407 status code, as described in the specification:
const http = require('http');
const proxy = http.createServer()

proxy.on('connect', (req, clientSocket, head) => {
  // Send the 407 challenge back over the raw socket (note the CRLF line endings
  // and the blank line that terminates the header block)
  clientSocket.write('HTTP/1.1 407 Proxy Authentication Required\r\n')
  clientSocket.write('Proxy-Authenticate: Basic realm="Access to site"\r\n\r\n')
});

proxy.listen(8124)
Finally, the browser returns ERR_PROXY_AUTH_UNSUPPORTED, which means the status code is being sent correctly...
The fact is that onAuthRequired is never triggered. Can anyone tell me why?
Thank you in advance.
Chrome aggressively caches authentication calls to proxy servers, so you will only see your console.log call once per browser session (you need to restart Chrome completely; if you've got more than one profile open, you'll need to close ALL instances of Chrome before the authentication cache is cleared).
I'm currently trying to figure out how to clear or empty said cache.
See also: How to delete Proxy-Authorization Cache on Chrome extension?

How to keep alive a connection behind proxy in node js

I am sending an HTTPS GET request to an external API.
First, I created the https agent as below:
import https from 'https';

const KeepAliveAgent = new https.Agent({
  keepAlive: true
});
then set the request options as below:
let options = {
  url: 'externalapiurl',
  method: "GET",
  qs: queryString,
  agent: KeepAliveAgent
};
I've just used sample strings for url and qs; in the original request I use the actual API URL and query string. Then I send the request as below:
console.time( "requestTime" );
request( options, ( err, response, body ) => {
if ( err ) {
logger.warn( err.message );
}
console.timeEnd( "requestTime" );
});
This works fine, but the response time I'm printing above is much longer when I send the request behind a proxy: without the proxy it takes less than half a second, but with the proxy it takes around 3 seconds. So it seems keep-alive is not working behind the proxy. How can I make this work?
I tried the same request using the https-proxy-agent node module, but the issue persists. I'd appreciate any help.
This article covers measuring the performance of HTTP/HTTPS proxies: https://blog.thousandeyes.com/measuring-performance-with-http-proxies/
The code you've given seems correct.
I suggest using socket.io in Node.js for the kind of implementation you are doing.
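Not a full answer, but it can help to verify whether keep-alive is actually taking effect. One way, assuming a reasonably recent Node version, is to check req.reusedSocket on the underlying request. A minimal sketch with Node's core https module and a keep-alive agent (the URL is a placeholder, and the proxy configuration itself is left out):

const https = require('https');

// One shared keep-alive agent so sockets can be pooled and reused
const keepAliveAgent = new https.Agent({ keepAlive: true });

function ping(callback) {
  const req = https.get('https://example.com/', { agent: keepAliveAgent }, (res) => {
    // reusedSocket is true when the request went out over a pooled connection
    console.log('status:', res.statusCode, 'reused socket:', req.reusedSocket);
    res.resume();            // drain the body so the socket returns to the pool
    res.on('end', callback);
  });
}

// The second request should report a reused socket if keep-alive is working
ping(() => ping(() => {}));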

Nodejs Request module -- how to set global keepalive

I am using the request npm module in my app to create an HTTP client, like this:
var request = require('request');
And each time I make a request to some server, I pass the options as below:
var options = {
  url: "whateverurl...",
  body: { /* some json data for POST ... */ }
}
request(options, function (e, r, body) {
  // handle response here...
})
This was working fine, until I started testing with high load, and I started getting errors indicating no address available (EADDRNOTAVAIL). It looks like I am running out of ephemeral ports, as there is no pooling or keep-alive enabled.
After that, I changed it to this:
var options = {
  url: "whateverurl...",
  body: { /* some json data for POST ... */ },
  forever: true
}
request(options, function (e, r, body) {
  // handle response here...
})
(Note the forever: true option.)
I tried looking up request module's documentation about how to set keep-alive. According to the documentation and this stackoverflow thread, I am supposed to add {forever:true} to my options.
It didn't seem to work for me, because when I checked the tcpdump, the server was still closing the connection. So, my question is:
Am I doing something wrong here?
Should I not be setting a global option on the request module when I require it, instead of passing {forever: true} each time I make an HTTP request? This is confusing to me.
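For what it's worth, the request module does let you bake options in once via request.defaults, so every call goes through the same wrapper instead of repeating forever: true. A rough sketch (whether forever or an explicit agentOptions behaves better may depend on your request version and on the server honouring keep-alive):

var request = require('request');

// Create a wrapper with keep-alive baked in, and use it instead of request()
var keepAliveRequest = request.defaults({
  forever: true,                      // use the keep-alive (forever) agent pool
  agentOptions: { keepAlive: true }   // also pass keepAlive through to the agent
});

keepAliveRequest({ url: "whateverurl..." }, function (e, r, body) {
  // handle response here...
});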
