AWS X-Ray tracing breaks on outgoing requests in Node.js

Hey, I'm trying to trace outgoing requests from an Express app, but I can't get it to work.
When I don't use the AWSXRay.captureHTTPsGlobal function, everything works fine for incoming requests: I can see my application in the Service Map and my incoming request traces arriving in AWS. But I also want to trace outgoing requests, and as soon as I add AWSXRay.captureHTTPsGlobal nothing works anymore. I get no exception or any other error, and my daemon no longer prints the usual "Successfully sent batch of 1 segments (0.058 seconds)".
This is my code:
var AWSXRay = require('aws-xray-sdk');
const express = require("express");
var app = express();

app.use(AWSXRay.express.openSegment('MyApp'));

AWSXRay.captureHTTPsGlobal(require('https')); // works when I comment this out
var http = require('https');

app.get('/', function (req, res) {
  http.get("https://google.com", (resp) => {
    res.send("googlefetched");
  });
  // res.send("hello world")
});

app.use(AWSXRay.express.closeSegment());

app.listen(3000, () => console.log('Example app listening on port 3000!'));

Could you share which Node runtime version your code is running on and which X-Ray SDK version you are using, so we can try to reproduce this issue on our side?
In the meantime I would like to point to a previous issue that has been fixed since v1.2.0 (https://github.com/aws/aws-xray-sdk-node/issues/18), where the segment is never flushed to the daemon if the response body of the outgoing request is not consumed.
Please let me know.
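For reference, here is a minimal sketch of your route with the response body consumed (same route and URL as in your snippet); once the body is drained, the HTTPS call can complete and the segment gets flushed to the daemon:
app.get('/', function (req, res) {
  http.get("https://google.com", (resp) => {
    resp.resume(); // consume (discard) the response body so 'end' can fire
    resp.on('end', () => res.send("googlefetched"));
  });
});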

Related

Express-Socket.IO App isn't working with my Azure WebApp

For educational purposes I'm trying to deploy an Express server that uses Socket.IO. The server should deliver a static HTML site built with React, answer with a "Hello Azure!" message whenever I make a GET REST call to http://localhost:4000/api/azure, and, whenever a new client connects to the site, broadcast a message to all other clients announcing the new client.
const path = require('path');
const express = require('express');
const app = express();
const server = require('http').createServer(app);
const io = require('socket.io')(server);
const router = require('./api/azure');

const PORT = process.env.PORT || 4000;

io.on('connection', () => {
  console.log('A new user has connected!');
  io.emit('broadcast', 'A new user has connected');
});

app.use(express.json());
app.use('/api/azure', router);
app.use(express.static(path.join(__dirname, 'build')));
app.use(express.static('public'));
app.use('/', (_, res) => {
  res.sendFile(path.join(__dirname, 'build', 'index.html'));
});

server.listen(PORT, () => {
  console.log(`Listening to http://localhost:${PORT}`);
});
All of these tasks work without problems on localhost. The problems begin after the app is uploaded to one of my Azure WebApps.
Instead of delivering the "Hello Azure!" message when I call https://mydomain.azurewebsites.net/api/azure, it responds with the HTML file.
The typical Socket.IO GET request for polling,
https://mydomain.azurewebsites.net/socket.io/?EIO=4&transport=polling&t=SomeString
responds with the HTML file, too.
Every URL path that I request gives me back the HTML file.
I barely know the basics about WebApps. Maybe there is a configuration I am forgetting? By the way, I haven't changed anything in the configuration except that I enabled WebSockets in the WebApp config.
This never happened before. The only difference is that right now I am using a free tier just to test. Could it be that? If not, what am I doing wrong?
Thank you for your time!
To begin with, try turning the WebSockets setting off, as it applies to an IIS-level setting that tends to conflict with the Node.js WebSocket implementation.
If this doesn't help, try to force the transport layer to use WebSockets and SSL. With the legacy Socket.IO 0.x API that looks like:
io.configure(function () {
  // Force websocket
  io.set('transports', ['websocket']);
  // Force SSL
  io.set('match origin protocol', true);
});
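Note that io.configure()/io.set() only exist in those old 0.x releases. Since your polling URL shows EIO=4 you are on Socket.IO 3.x/4.x, where, as far as I know, the transport restriction is passed as an option to the Server constructor instead, roughly like this:
const io = require('socket.io')(server, {
  // force the WebSocket transport and skip HTTP long-polling
  transports: ['websocket']
});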
Also, you cannot use arbitrary ports (port 4000 in your case) on services like App Service. Your app is assigned a port via process.env.PORT, so make sure your log message refers to the port actually in use. You should be able to see these messages in your log stream.
Also note that Azure has launched a fully managed service called Web PubSub to power your apps with WebSockets. The App Service WebSocket implementation does not scale horizontally, which is where Web PubSub helps.
https://azure.microsoft.com/en-in/blog/easily-build-realtime-apps-with-websockets-and-azure-web-pubsub-now-in-preview/

Google Chrome queues requests one by one when making multiple requests to a localhost Node application

Recently, I was testing the asynchronous behaviour of a Node.js Express web application. My code was very simple:
const express = require('express');
const app = express();
const port = 3000;

app.get('/', (req, res) => {
  console.log(`hello main start`);
  setTimeout(() => {
    const date = new Date();
    console.log(date);
    res.send(`hello work done at ${date}!`);
  }, 20000);
  console.log(`hello main end`);
});

app.listen(port, () => {
  console.log(`Example app listening on port ${port}`);
});
I found that if I just open two tabs to the endpoint URL http://localhost:3000/ in Chrome at the same time, without the developer tools open, the requests are fired one by one. I can tell they are fired one by one because, watching the server console log, the second request only logs its start after the first request finishes. Hence I need 40 seconds to complete my two requests.
I didn't expect that behaviour, so I tried the same thing with Postman.
This time Postman fired the two requests simultaneously, and my server logged both requests immediately as well.
What's even stranger is that if I open the two tabs with the Chrome developer tools open, the behaviour is the same as with Postman.
Can anyone explain this behaviour in Chrome? Is Google doing this on purpose?
Yes, this is Chrome's behaviour: identical requests to the same URL are stalled until the first one completes. This is apparent if you watch the Network tab in the developer tools.
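If you want both tabs to hit the server at the same time without opening the developer tools, one thing you can try (a sketch; my assumption is that Chrome holds the duplicate request because the response might be reusable from cache) is to mark the response as non-cacheable:
app.get('/', (req, res) => {
  // never cacheable, so Chrome has no reason to stall the duplicate GET
  res.set('Cache-Control', 'no-store');
  setTimeout(() => {
    res.send(`hello work done at ${new Date()}!`);
  }, 20000);
});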

Express JS route continues to execute endlessly

I am new to Express JS and Node in general, and I am having a slight problem whose cause I don't understand.
So basically, this is my code for index.js:
const express = require('express');
const app = express();

app.get('/', (req, res) => {
  console.log("Continuous message!");
  res.send("Hello World");
});

app.listen(3000, () => console.log('Example app listening on port 3000!'));
Everything works great except for one thing: the message "Continuous message!" keeps showing up endlessly in my console. Am I doing anything wrong here? Or is this the normal behavior?
Thank you!
Update: the cause turned out to be the redirection from port 80 to port 3000. I edited the server's sites-available config and the repetition no longer occurs. The question is now: how do I configure my AWS EC2 server to redirect traffic from port 80 to port 3000 without the previous problem?
Update: I tried the EC2 Load Balancer and iptables commands. The problem is still there. The console message and route code continue to execute roughly every 5 seconds, which causes problems with my code flow.
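To narrow down what keeps hitting the route every few seconds, I am now logging each caller (a quick sketch; req.ip and the User-Agent header are standard Express/HTTP fields, nothing specific to my setup):
app.get('/', (req, res) => {
  // log who is calling, so the source of the repeated requests shows up in the console
  console.log(`${new Date().toISOString()} ${req.ip} ${req.get('User-Agent')}`);
  res.send("Hello World");
});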

Google Cloud Pub/Sub: unable to receive PUSH request at GAE endpoint URL

I have deployed myapp to Google App Engine with the settings
runtime: nodejs
env: flex
My app domain is
https://myapp.appspot.com
but it is automatically redirected to
https://myapp.appspot-preview.com
I have also created a Google Pub/Sub topic, added a subscription, and set the push endpoint URL to
https://myapp.appspot-preview.com/_ah/push-handlers/sample
I tested this endpoint URL with Postman and it works fine; however, when I publish a message to Pub/Sub, nothing happens.
MyApp (Node.js) endpoint handler:
var cors = require('cors');
var express = require('express');
var app = express();
var bodyParser = require('body-parser');
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());
app.use(cors());

app.post('/_ah/push-handlers/sample', function (req, res) {
  console.log('PUBSUB_____', req.body); // this should be printed out
  res.status(200).send();
});

// Listener
io.on('connection', function (socket) {
  // do something
});

var server = http.listen(8080, function () {
  console.log('App listening on port %s', server.address().port);
});
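For reference, this is roughly how the handler would decode an actual push delivery; the envelope shape ({ message: { data: <base64>, ... }, subscription }) follows the Pub/Sub push documentation rather than anything specific to my app:
app.post('/_ah/push-handlers/sample', function (req, res) {
  // req.body.message.data is the base64-encoded payload of the published message
  var payload = Buffer.from(req.body.message.data, 'base64').toString('utf8');
  console.log('PUBSUB_____', payload);
  res.status(200).send();
});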
These are the GAE logs; it seems Pub/Sub does send requests to my endpoint, but they never reach my handler.
I appreciate any help :)
I found an issue similar to mine, "Cannot receive push message in appengine flexible environment":
"It does not work for us. I created a push subscription to https://{project-id}.appspot-preview.com/push/db/istock and posted a message to this topic. After that I can see a lot of nginx /push/db/istock POST requests in the GAE app log, all with a 307 code as before. Nothing changed. No requests passed to the GAE app."
https://code.google.com/p/cloud-pubsub/issues/detail?id=49
The latest answer from the dev team is:
"We have a fix that will be rolling out over the next couple of weeks. As an ETA this is subject to change and can be extended, but it should be rolled out by January 21."

Node.js Express block

My problem: I'm planning to use Express to hold all requests that I receive for a certain amount of time and then send all the responses at once.
But unfortunately I can't receive a second request until I've responded to the first one, so I guess Node/Express is somehow blocking the processing of further requests.
I built a minimal working example so you can see better what I'm talking about.
var express = require('express');
var app = express();
var ca = [];

app.get('/hello.txt', function (req, res) {
  ca.push(res);
  console.log("Push");
});

setInterval(function () {
  while (ca.length) {
    var res = ca.shift();
    res.send('Hello World');
    console.log("Send");
  }
}, 9000);

var server = app.listen(3000, function () {
  console.log('Listening on port %d', server.address().port);
});
When I send just one request to localhost:3000 and wait for 9 seconds, I'm able to send a second one. But when I send both without waiting for the interval callback, the second one is blocked until the first interval has fired.
Long story short: why is this blocking happening, and what ways are there to avoid it?
PS: It seems that the plain http module behaves differently: http://blog.nemikor.com/2010/05/21/long-polling-in-nodejs/
Try it with Firefox and Chrome (one request from each) to prevent the requests from being serialized...
OK, I've got the solution.
The issue wasn't in my code; it was caused by Chrome. It seems that Chrome serializes all requests that target the same URL. It still sends both requests, though, and does not serve the second request with the response of the first.
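To convince myself that the server itself was not blocking, I also fired both requests from a small Node test client (a sketch, not part of the original setup), which applies no per-URL serialization:
var http = require('http');
for (let i = 1; i <= 2; i++) {
  // both requests go out immediately; both "Push" logs appear on the server at once
  http.get('http://localhost:3000/hello.txt', function (res) {
    res.on('data', function (chunk) {
      console.log('response ' + i + ': ' + chunk);
    });
  });
}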
Anyway, thanks for your help!
