I have a server which looks like this:
const http2 = require('http2');
const {
  HTTP2_HEADER_METHOD,
  HTTP2_HEADER_PATH,
  HTTP2_HEADER_STATUS,
  HTTP2_HEADER_CONTENT_TYPE
} = http2.constants;
const fs = require('fs');

const server = http2.createSecureServer({
  key: fs.readFileSync('./ssl/localhost-privkey.pem'),
  cert: fs.readFileSync('./ssl/localhost-cert.pem')
});

server.on('error', (err) => {
  console.error(err);
});
server.on('stream', (stream, headers, flags) => {
  stream.respond({
    [HTTP2_HEADER_STATUS]: 200,
    // the body below is HTML, so use a single, consistent content-type
    [HTTP2_HEADER_CONTENT_TYPE]: 'text/html'
  });
  stream.end('<h1>Hello World 2</h1>');
});
server.on('request', (msg) => {
  /* THIS IS BEING FIRED TWICE */
  console.log('request:' + JSON.stringify(msg));
});

server.on('session', (msg) => {
  /* THIS IS ALSO BEING FIRED TWICE */
  console.log('session:' + JSON.stringify(msg));
});
server.listen(8443);
From my browser I navigate to https://myserver:8443. On the server I can see the session event logged twice. Why is this happening when I am only making one request? Likewise, every time I refresh the page, the request event fires twice instead of once. I am using Node.js 11.0.0.
You should log the URL that is being requested with console.log(msg.url). You will likely find that one of the requests is for favicon.ico, since a browser requests it whenever it doesn't already have a favicon cached for that domain.
A web server has to look at the exact resource being requested and respond appropriately for that resource.
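For example, a quick sketch of that check (the http2 'request' event hands you compat-mode req/res objects, so req.url and res.statusCode work just like in the http module):

server.on('request', (req, res) => {
  console.log('request for:', req.url); // expect one log for '/' and one for '/favicon.ico'
  if (req.url === '/favicon.ico') {
    res.statusCode = 204; // no favicon to serve
    res.end();
  }
});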
Related
I'm trying to set up HTTP/2 for an Express app I've built. As I understand it, Express does not support the npm http2 module, so I'm using SPDY. Here's how I'm planning to go about it; I'd appreciate advice from anyone who has implemented something similar.
1) Server setup. I want to wrap my existing app with SPDY to keep my existing routes. options is just an object with a key and a cert for SSL.
const app = express();

// ...all existing Express setup, followed by:

spdy
  .createServer(options, app)
  .listen(CONFIG.port, (error) => {
    if (error) {
      console.error(error);
      return process.exit(1);
    } else {
      console.log('Listening on port: ' + CONFIG.port + '.');
    }
  });
2) At this point, I want to enhance some of my existing routes with a conditional PUSH response. I want to check whether there are any updates for the client making a request to the route (the client is called an endpoint, and the updates are an array of JSON objects called endpoint changes), and if so, push them to the client.
My idea is to write a function that takes res as one of its parameters, saves the endpoint changes to a file (I haven't found a way to push non-file data), adds that file to a push stream, and then deletes the file. Is this the right approach? I also notice that the push stream takes a second parameter, which is a req/res object; am I formatting it properly here?
const checkUpdates = async (obj, res) => {
  if (res.push) {
    const { endpointChanges } = await updateEndpoint(obj);
    if (endpointChanges) {
      const changePath = `../../cache/endpoint-updates${new Date().toISOString()}.json`;
      const savedChanges = await jsonfile(changePath, endpointChanges);
      if (savedChanges) {
        let stream = res.push(changePath, {
          req: { 'accept': '**/*' },
          res: { 'content-type': 'application/json' }
        });
        stream.on('error', function (err) {
          console.log(err);
        });
        stream.end();
        res.end();
        fs.unlinkSync(changePath);
      }
    }
  }
};
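As an aside: spdy's push stream is itself writable, so (assuming the option names documented in the spdy README: status, method, request, response) a sketch like the following could push the JSON payload directly, without the temp-file round trip:

const pushEndpointChanges = (res, endpointChanges) => {
  if (!res.push) return; // plain HTTP/1.x connection, nothing to push
  const stream = res.push('/endpoint-changes.json', {
    status: 200,
    method: 'GET',
    request: { accept: '*/*' },
    response: { 'content-type': 'application/json' }
  });
  stream.on('error', (err) => console.log(err));
  // write the serialized changes straight into the push stream
  stream.end(JSON.stringify(endpointChanges));
};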
3) Then, within my routes, I want to call the checkUpdates method with the relevant parameters, like this:
router.get('/somePath', async (req, res) => {
  await checkUpdates({ someInfo }, res);
  ReS(res, {
    message: 'keepalive succeeded'
  }, 200);
});
Is this the right way to implement HTTP2?
I'm trying to use SSE with Node + Express: I intercept requests using an Express route, then initiate an SSE session by writing the headers directly:
res.writeHead(200, {
  "content-type": "text/event-stream",
  "cache-control": "no-cache"
});
I then write intermittent payloads using res.write().
This works well with Chrome's EventSource, up until I call ".close()" to end the session. Then the connection keeps hanging: Chrome doesn't reuse the connection for additional EventSource requests (or any other requests), and Node never triggers a "close" event on the IncomingMessage instance.
My question is: How do I handle "eventSource.close()" properly using node's http API?
It's worth noting that:
Since I don't set a "content-length", Node automatically assumes "chunked" transfer encoding (this shouldn't be a problem AFAIK). It also defaults to "connection: keep-alive".
The session terminates OK when I'm using Firefox.
When the browser closes the event source, it does let the server side know. On the server side, the response's socket object (res.socket) will emit an end event followed by a close event. You can listen for these events and respond appropriately.
E.g.
res.socket.on('end', e => {
  console.log('event source closed');
  sseResponses = sseResponses.filter(x => x != res);
  res.end();
});
If your server tries to write to a socket that the browser has closed, it will not raise an error, but res.write will return false.
If both your server side code and client side code are hanging after you close the event source, you may have bugs on both sides.
Here is a more complete prototype, using your writeHead code from above:
var app = require('express')();
var responses = [];

app.get("/", (req, res) => {
  res.status(200).send(`
    <html>
      <script>
        var eventsource = null;
        function connect() {
          if (!eventsource) {
            eventsource = new EventSource("/sse");
            eventsource.onmessage = function(e) {
              var logArea = window.document.getElementById('log');
              logArea.value += e.data;
            };
          }
        }
        function disconnect() {
          if (eventsource) {
            var myeventsource = eventsource;
            eventsource = null;
            myeventsource.close();
          }
        }
      </script>
      <div>
        <span>
          <button onclick="connect()">Connect</button>
          <button onclick="disconnect()">Disconnect</button>
        </span>
      </div>
      <textarea id="log" style="width: 500px; height: 500px"></textarea>
    </html>`);
});

app.get("/sse", (req, res) => {
  res.writeHead(200, {
    "content-type": "text/event-stream",
    "cache-control": "no-cache"
  });
  res.socket.on('end', e => {
    responses = responses.filter(x => x != res);
    res.end();
  });
  responses.push(res);
});

app.listen(8080);

setInterval(() => {
  responses.forEach(res => {
    res.write('data: .\n\n');
  });
}, 100);
I have a Node.js server that manages a list of users. When a new user is created, all the clients should immediately display the added user in the list.
I know how to send data to clients without a request by using WebSocket, but in this implementation WebSocket is not allowed.
Is it possible to update every client's user list without using WebSocket when a new user is added on the server?
// Client side
const subscribe = function(callback) {
  var longPoll = function() {
    $.ajax({
      method: 'GET',
      url: '/messages',
      success: function(data) {
        callback(data);
      },
      complete: function() {
        longPoll();
      },
      timeout: 30000
    });
  };
  longPoll();
};
// Server Side
router.get('/messages', function(req, res) {
  var addMessageListener = function(res) {
    messageBus.once('message', function(data) {
      res.json(data);
    });
  };
  addMessageListener(res);
});
Long polling is where the client requests new data from the server, but the server does not respond until there is data. In the meantime, the client has an open connection to the server and is able to accept new data once the server has it ready to send.
Ref: http://hungtran.co/long-polling-and-websockets-on-nodejs/
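One caveat with the snippet above: the jQuery call times out after 30 seconds, but the server never answers until a message arrives, and the once listener is never removed when the client gives up. A sketch of a more defensive version of the same route (names reused from the post; the 25-second value is an assumption, chosen to beat the client's 30-second timeout):

router.get('/messages', function(req, res) {
  var onMessage = function(data) {
    clearTimeout(timer);
    res.json(data);
  };
  // answer with an empty result before the client's 30s ajax timeout fires
  var timer = setTimeout(function() {
    messageBus.removeListener('message', onMessage);
    res.json([]);
  }, 25000);
  // stop listening if the client disconnects early
  req.on('close', function() {
    clearTimeout(timer);
    messageBus.removeListener('message', onMessage);
  });
  messageBus.once('message', onMessage);
});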
There is a third way: push notifications.
Your application registers with a push notification service (public or proprietary), and your server can then send messages to clients asynchronously.
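For instance, with the Web Push protocol, a minimal server-side sketch using the web-push npm package could look like this (the VAPID keys, the mailto subject, and the subscription handling are all assumptions; the client has to obtain the subscription via pushManager.subscribe() and send it to the server first):

const webpush = require('web-push');

// Hypothetical keys; generate a real pair once with webpush.generateVAPIDKeys()
webpush.setVapidDetails(
  'mailto:admin@example.com',
  process.env.VAPID_PUBLIC_KEY,
  process.env.VAPID_PRIVATE_KEY
);

// `subscription` is the PushSubscription JSON a client previously registered
function notifyUserAdded(subscription, user) {
  return webpush
    .sendNotification(subscription, JSON.stringify({ type: 'user-added', user: user }))
    .catch(err => console.error('push failed:', err));
}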
You can use server-sent events with an implementation like sse-express:
// client
let eventSource = new EventSource('http://localhost:80/updates');
eventSource.addEventListener('connected', (e) => {
  // e.data arrives as a string, so parse it before reading fields
  console.log(JSON.parse(e.data).welcomeMsg);
  // => Hello world!
});

// server
let sseExpress = require('./sse-express');
// ...
app.get('/updates', sseExpress, function(req, res) {
  res.sse('connected', {
    welcomeMsg: 'Hello world!'
  });
});
I'm piping an HTTPS request to a file. It works for 99.9% of calls, but occasionally (maybe when the server or network is unavailable) it hangs indefinitely...
This obviously causes my application to stop working and requires a manual restart...
I have other HTTPS connections that used to hang occasionally; they now always complete since I added the following error-handling code on the request object, as suggested in the Node documentation:
request.on('socket', function(socket) {
  socket.setTimeout(10000);
  socket.on('timeout', function() { request.abort(); });
});
request.on('error', function(e) {
  // Handle the error...
  console.error("FAILED!");
});
... but it seems that timeouts on the request are ignored if the destination is piped to a file stream. Maybe I should handle an error with a timeout on the file stream object, but the documentation isn't clear about which event I should wait for other than 'finish'...
Here is the sample code, I hope someone can help me:
var https = require('https'),
    fs = require('fs');

var opts = {
  host: 'www.google.com',
  path: '/',
  method: 'GET',
  port: 443
};

var file = fs.createWriteStream('test.html');

var request = https.request(opts, function(response) {
  response.pipe(file);
  file.on('finish', function() {
    file.close(function() {
      console.log("OK!");
    });
  });
});

request.on('socket', function(socket) {
  socket.setTimeout(10000);
  socket.on('timeout', function() { request.abort(); });
});

request.on('error', function(e) {
  console.error("FAILED!");
});

request.end();
If you want to reproduce the hang, change the host and path to a huge file and disconnect the network cable during the transfer; it should time out after 10 seconds, but it doesn't...
I set up a demo Node.js HTTP server that sends a very slow answer, and a client similar to your sample code.
When I start the client and then stop the server while it is sending the response, I also don't get a timeout event on the socket, but I do get an end event on the response within the client:
var request = https.request(opts, function(response) {
  response.pipe(file);
  file.on('finish', function() {
    file.close(function() {
      console.log("OK!");
    });
  });
  response.on('end', function() {
    // this is printed when I stop the server
    console.log("response ended");
  });
});
Maybe you could listen to that event?
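Alongside that, you can attach the timeout to the request itself rather than the raw socket (http.ClientRequest has a setTimeout method), and listen for the response's aborted event to clean up the partially written file. A sketch reusing the names from your sample:

var request = https.request(opts, function(response) {
  response.pipe(file);
  response.on('end', function() {
    console.log('response ended');
  });
  response.on('aborted', function() {
    // the connection died mid-response; close and discard the partial file
    file.close(function() {
      fs.unlink('test.html', function() {});
    });
  });
});

// same effect as the socket listener above, in one call
request.setTimeout(10000, function() {
  request.abort();
});
request.end();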
node 5.4.0, socket.io ^1.3.7
I'm unsure how to initialize data for a stream of posts while also setting a listener for updates (new posts).
How I Currently Initialize The Stream:
In my example below, the client first requests the initial payload and, once it arrives, sets the listener to listen for updates. I designed it that way so that push events for updates wouldn't arrive until the initial payload has returned.
The Problem:
As the response returns from the server, there is a window of time during which potential updates are missed (especially if the volume of updates is large).
Question:
Is there a way to handle the initial payload request and set the update listener without the possibility of missing updates or receiving them prematurely?
Server side: sends out the initial payload, and broadcasts updates when they are posted.
io.on('connection', function(socket) {
  console.log('a user connected');
});

app.get('/initialData/', function(req, res) {
  var initialPayload = fetchPayload(); // fetch from db
  res.json(initialPayload);
});

app.post('/update', function(req, res) {
  var update = req.body.json;
  // persist the update, then broadcast it to every connected client
  // (saveUpdate stands in for this post's unnamed db-write helper)
  saveUpdate(update, function() {
    io.emit('update data', update);
    res.status(200).end();
  });
});
Client side: requests the payload and, when that succeeds, sets the listener.
import io from 'socket.io-client';

var getInitialPayload = function() {
  var socketURL = 'http://localhost:3000/';
  var socket = io(socketURL);
  return new Promise(function(resolve, reject) {
    var url = 'http://localhost:3000/initialData/';
    $.ajax({
      url: url,
      dataType: 'json',
      success: function(initialPayload) {
        ServerActionCreators.receiveAllPayload(initialPayload);
        // only now start listening for pushed updates
        socket.on('update data', function(data) {
          // handle a pushed update
        });
        resolve(initialPayload);
      },
      error: function(xhr, status, err) {
        reject(err); // report the error
      }
    });
  });
};
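One common way to close that window (a pattern sketch, not from the original post; applyUpdate is a hypothetical handler for a single pushed post): attach the socket listener before requesting the snapshot, buffer any updates that arrive while the snapshot is in flight, then replay the buffer once the snapshot has been applied. If each update carries an id or sequence number, duplicates can be dropped during the replay.

var subscribeThenFetch = function() {
  var socket = io('http://localhost:3000/');
  var buffer = [];
  var ready = false;

  // listen first, so nothing emitted during the fetch is lost
  socket.on('update data', function(update) {
    if (ready) {
      applyUpdate(update);
    } else {
      buffer.push(update); // snapshot still in flight; hold the update
    }
  });

  $.ajax({
    url: 'http://localhost:3000/initialData/',
    dataType: 'json',
    success: function(initialPayload) {
      ServerActionCreators.receiveAllPayload(initialPayload);
      // replay whatever arrived during the fetch; de-dupe by id here if needed
      buffer.forEach(applyUpdate);
      buffer = [];
      ready = true;
    }
  });
};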