Restify not sending complete xml response - node.js

I have a small restify API that talks to SQL Server and returns XML. The XML response can be quite large and is consumed by Adobe InDesign. The API call works in the browser, but when called from InDesign I get an incomplete XML response.
InDesign uses a proprietary scripting language called ExtendScript, which uses raw sockets to communicate. I'm not sure if streams are an option.

I have a very similar problem. After a lot of playing around I added a pause of a few seconds between the connection's write and read, and this resolved the problem for now. I am not a fan of this fix; does anyone have a better idea? I suspect that as the files get bigger you will have to keep increasing the timeout.
var conn = new Socket;
if (conn.open(host + ":7000", "UTF-8")) {
    conn.write("GET /report/" + ReportID + " HTTP/1.0\n" +
               "Content-Type: application/xml; charset=UTF-8\n\n");
    $.sleep(2000); // give the server time to respond before reading
    reply = conn.read(999999);
    conn.close();
}
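A less fragile variant (a rough sketch only; it assumes the ExtendScript Socket object's connected property reflects the server closing the connection, which is worth verifying) is to accumulate reads in a loop until the connection drops, instead of sleeping once and hoping the whole response has arrived:
var conn = new Socket;
var reply = "";
if (conn.open(host + ":7000", "UTF-8")) {
    conn.write("GET /report/" + ReportID + " HTTP/1.0\n" +
               "Content-Type: application/xml; charset=UTF-8\n\n");
    // keep reading until the server closes the connection
    while (conn.connected) {
        reply += conn.read(65536);
        $.sleep(100); // brief pause so we don't spin on an empty buffer
    }
    reply += conn.read(999999); // drain anything still buffered after the close
    conn.close();
}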

You really need to use callbacks and the net library (see the net library documentation).
var net = require('net');
var client = net.connect({port: 8124}, function() {
    // 'connect' listener
    console.log('connected to server!');
    client.write('world!\r\n');
});
client.on('data', function(data) {
    console.log(data.toString());
    client.end();
});
client.on('end', function() {
    console.log('disconnected from server');
});
I copied that example from the official docs. As you can see, you really need to use callbacks on the events, in this case the "data" event; only when you have received the complete response do you proceed to send your own response.
Your XML is incomplete because you read the response before the socket has finished receiving the data.
Remember that JavaScript is non-blocking, and these things can happen; that's why you need to learn how to use events and callbacks.
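A minimal sketch of that idea, buffering every chunk and only acting once the server signals the end of the stream (the host, port and request line are placeholders):
var net = require('net');

var client = net.connect({host: 'localhost', port: 7000}, function() {
    // placeholder request; adjust to whatever the server expects
    client.write('GET /report/123 HTTP/1.0\r\n\r\n');
});

var chunks = [];
client.on('data', function(chunk) {
    chunks.push(chunk); // accumulate; do not act on the first chunk
});
client.on('end', function() {
    // the server closed its side, so the response is now complete
    var body = Buffer.concat(chunks).toString('utf8');
    console.log('received ' + body.length + ' characters');
});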
PLEASE use the net library.
Hope it helps.
PS: I'm sorry about my poor English, but I'm trying to be helpful.

Related

How to return real-time JSON without HTML code

I am trying to make an API that sends back real-time JSON. I am using Node.js with Express and Socket.io, and the problem is that res.send cannot be called more than once, and I don't know how to send my real-time data without forcing a refresh of the page.
Basically, I made a timer that changes the value every second.
I also tried sending a file, but I can't use that method because my iOS app expects JSON data without any HTML.
setInterval(function() {
    var msg = Math.random().toString();
    io.emit('message', msg);
    console.log(msg);
    res.send(msg);
}, 1000);
Maybe there is another framework than Express that I could use and that would refresh my data automatically? The console.log line works well and my data is updated every 1000 ms.
Thanks in advance.
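For what it's worth, the io.emit call in the snippet above already pushes each value to every connected Socket.io client; a minimal sketch of a client consuming those events (assuming the standard socket.io-client package and a made-up server URL):
var io = require('socket.io-client');
var socket = io('http://localhost:3000'); // placeholder server address

socket.on('message', function(msg) {
    // each emitted value arrives here, with no page refresh and no res.send
    console.log('received: ' + msg);
});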

Connecting to socket.io 1.x manually using websockets, capacity testing

I am working with a Node.js Express server which uses socket.io to communicate with an iOS client, and I am having a little trouble trying to test how many clients can connect and exchange data at any one time.
My goal is to be able to run a script which connects to socket.io with thousands of different sessions, and sends and receives data, so I can understand our system's scale. Currently we are using a single dyno on Heroku but will likely be considering other options on AWS soon.
I have found code which should do what I am trying to do for earlier versions of socket.io, such as this, but have had issues since it seems v1.x has a very different handshake protocol. I tried the socket.io-client package, but connecting multiple times only simulates one session; I need to simulate many independent users.
I have been picking apart the socket.io-client code, but have only gotten as far as creating a connection - I am stuck on the sending data part. If anyone has any knowledge or could point to some written resources on how data is sent between a client and a socket.io server, it would help me out a lot.
Here's what I have so far:
var needle = require('needle'),
    WebSocket = require('ws'),
    BASE_URL = 'url-to-socket-host:5002';

var connectionNo = 0;

needle.get('http://' + BASE_URL + '/socket.io/?EIO=3&transport=polling&t=1416506501335-0', function (err, resp) {
    // parse the sid out of the engine.io handshake (strip the packet prefix)
    var handshake = JSON.parse(resp.body.toString().substring(5));
    // use the sid to connect using websockets
    var url = 'ws://' + BASE_URL + '/socket.io/?EIO=3&transport=websocket&sid=' + handshake.sid;
    console.log(connectionNo + ' with sid: ' + handshake.sid);
    var socket = new WebSocket(url, void(0), {
        agent: false
    });
    socket.on('open', function () {
        console.log('Websocket connected: ' + connectionNo);
        // I don't understand how to send data to the server here,
        // from looking at the source code it should use some kind
        // of binary encoding, any ideas?
        socket.on('message', function (msg) {
            console.log(msg);
        });
    });
});
I will continue deconstructing the socket.io-client code, but if anyone has any clues or resources that may help, let me know. Thanks.
I ended up settling on the socket.io-client npm package, which can open a new session on every connection. I found an example benchmark in this issue.
There is no longer much need for me to connect to socket.io manually using raw websockets and HTTP, but thanks to Yannik for pointing out the parser in use. The spec of the inner workings of v1.x can be found here.
Thanks!
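A rough sketch of that benchmarking approach (the URL, client count, and event name are made up; forceNew makes each call open its own session rather than reusing the first connection):
var io = require('socket.io-client');

var TARGET = 'http://localhost:5002'; // placeholder target
var CLIENTS = 1000;

for (var i = 0; i < CLIENTS; i++) {
    (function(id) {
        var socket = io(TARGET, {forceNew: true}); // independent session per client
        socket.on('connect', function() {
            socket.emit('hello', {client: id}); // any event your server handles
        });
        socket.on('disconnect', function() {
            console.log('client ' + id + ' disconnected');
        });
    })(i);
}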
The problem may reside in the fact that you are not using socket.io in your client code. You have imported 'ws', which is a different module; its docs are here: https://www.npmjs.org/package/ws.
You probably want ws.send('something');. When you receive a message in ws, it also comes with an object with a flag indicating whether it is binary data or not. If it is, you will need to concatenate the chunks incrementally. There is a canonical way to do this which you can find via Google, but it looks a little like this:
var message = ''; // start with an empty string, otherwise the first chunk is prefixed with "undefined"
socketConnection.on('data', function(chunk) { message += chunk; });
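As for what to actually write on the raw websocket in the original question: socket.io 1.x prefixes every frame with the engine.io packet type, and socket.io events are engine.io message packets (type 4) wrapping a socket.io EVENT packet (type 2) followed by a JSON array of event name and arguments. A rough sketch, continuing the ws-based code above (the event name and payload are made up):
socket.on('open', function() {
    // complete the engine.io transport upgrade from polling to websocket
    socket.send('2probe'); // engine.io ping probe
    socket.on('message', function(msg) {
        if (msg === '3probe') {
            socket.send('5'); // engine.io "upgrade" packet
            // engine.io message (4) + socket.io event (2) + JSON payload
            socket.send('42["myEvent",{"hello":"world"}]');
        } else {
            console.log('server said: ' + msg);
        }
    });
});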

Node.js outgoing HTTP request connection limit (cannot make more than five connections)

I'm building a data transfer proxy server using Node.js.
It pipes the client's request to a Swift object storage server using the HTTP(S) REST API.
It works fine for individual requests, but when the number of outgoing ESTABLISHED TCP connections to the same destination and port (443) reaches five, it cannot create any new connections.
It does not seem to be an OS problem, because I've created more than 10 connections from a Java servlet and that works fine.
I've tried to set the maximum sockets for globalAgent as below, but it does not change anything.
http.globalAgent.maxSockets = 500;
https.globalAgent.maxSockets = 500;
Here is a part of my source code.
app.post('/download*', function(req, res) {
    /***********************************************************
     * Some codes here to create new request option
     ***********************************************************/
    var client = https.request(reqOptions, function(swiftRes) {
        var buffers = [];
        res.header('Content-Length', swiftRes.headers['content-length']);
        res.header('Content-Type', swiftRes.headers['content-type']);
        res.header('range', swiftRes.headers['range']);
        res.header('connection', swiftRes.headers['connection']);
        swiftRes.pipe(res);
        swiftRes.on('end', function(err) {
            res.end();
        });
    });
    client.on('error', function(err) {
        callback && callback(err);
        client.end(err);
        clog.error('######### Swift Client Error event occurred. Process EXIT ');
    });
    client.end();
});
I hope I can find a solution to this problem.
Thanks in advance.
Usually, changing maxSockets should solve your problem; try it with a somewhat lower value.
https.globalAgent.maxSockets = 20;
If that does not solve your problem, try turning off pooling for the connections: add the key agent with the value false to the request options. Keep in mind that Node.js uses pooling to support keep-alive connections.
// your existing option code
reqOptions.agent = false;
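If you want to keep connection reuse but raise the ceiling for just these requests, another sketch (not from the original answer) is to give them their own Agent:
var https = require('https');

// dedicated agent for the Swift requests; maxSockets raises the per-host
// connection limit, which defaults to 5 in older Node versions
var swiftAgent = new https.Agent({maxSockets: 100});

// then, wherever reqOptions is built:
reqOptions.agent = swiftAgent;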

Sending an http response outside of the route function?

So, I have a route function like the following:
var http = require('http').createServer(start);

function start(req, res) {
    // routing stuff
}
and below that, I have a socket.io event listener:
io.sockets.on('connection', function(socket) {
    socket.on('event', function(data) {
        // perform an http response
    });
});
When the socket event 'event' is called, I would like to perform an http response like the following:
res.writeHead(200, {
    'Content-Type': 'application/zip',
    'Content-Disposition': 'attachment; filename=file.zip'
});
var filestream = fs.createReadStream('file.zip');
filestream.on('data', function(chunk) {
    res.write(chunk);
});
filestream.on('end', function() {
    res.end();
});
This last part works just fine when performed within the routing function, but when it is called from the socket event it of course does not work, because it has no reference to the 'req' or 'res' objects. How would I go about doing this? Thanks.
Hmmm... interesting problem.
It's not impossible to do what you're trying to do; the flow would be something like this:
1. Receive the HTTP request, don't respond, and keep the res object saved somewhere.
2. Receive the websocket request, do your auth, and "link" it to the res object saved earlier.
3. Respond with the file via res.
BUT it's not very pretty, for a few reasons:
- You need to keep the res objects saved; if your server restarts, a whole bunch of response objects are lost.
- You need to figure out how to link websocket clients to HTTP request clients. You could do something with cookies/localStorage for this, I think.
- Scaling to another server becomes a lot harder: will you proxy clients so they are always served by the same server somehow? Otherwise the linking gets even harder.
I would propose a different solution: you want to do some client/server steps over websockets before you let someone download a file?
This question has a solution for doing downloads via websocket: receive file via websocket and initiate download dialog.
It sounds like it won't work on older browsers / IE, but it's a nice option.
It also mentions downloading via a hidden iframe.
Check here whether this solution is cross-browser enough for you: http://caniuse.com/#feat=datauri
Another option would be to generate a unique URL for the download and only append it to the browser's window (either as a hidden iframe download or as a simple download button) once you've done your logic via websocket; a sketch of this follows below. This option is more widely supported across browsers and easier to code.
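A minimal sketch of that last idea (the route names, the in-memory token store, and the file path are made up; it is written with Express for brevity, but the same works with a plain http route):
var express = require('express');
var crypto = require('crypto');
var fs = require('fs');

var app = express();
var pendingDownloads = {}; // token -> file path, kept in memory for this sketch

io.sockets.on('connection', function(socket) {
    socket.on('event', function(data) {
        // after the websocket-side logic, hand the client a one-off URL
        var token = crypto.randomBytes(16).toString('hex');
        pendingDownloads[token] = 'file.zip';
        socket.emit('downloadReady', {url: '/download/' + token});
    });
});

app.get('/download/:token', function(req, res) {
    var file = pendingDownloads[req.params.token];
    if (!file) {
        res.statusCode = 404;
        return res.end();
    }
    delete pendingDownloads[req.params.token]; // single use
    res.writeHead(200, {
        'Content-Type': 'application/zip',
        'Content-Disposition': 'attachment; filename=file.zip'
    });
    fs.createReadStream(file).pipe(res);
});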

socket.io - works first time, not second time onwards

When I start my node.js server and a client connects, I am able to send a request from the client (socket.emit) and get a response (socket.on('rentsAround', ...)). But from the second connection onwards, the client can still send, but the server cannot send or emit, so I have to restart the server. I assume it is behaving as designed and my understanding is wrong somewhere... would someone please point it out?
client side:
========
var socket = new io.Socket();
socket = io.connect();
socket.on('rentsAround', function(data) {
    registration.handleRentsAround(data);
});
socket.on('locationDetailsRes', function(data) {
    registration.handleRentsAround(data);
});
socket.on('connect', function(data) {
    alert('inside connect on client side');
});
socket.on('disconnect', function() {
    // do something, if you want to.
});
.............
socket.emit("searchRent", {"lat": lat, "lng": lng});
server side:
========
socket.sockets.on('connection', function(client) {
    client.on('searchRent', function(msg) {
        console.log('inside on connection');
        // do something and reply back
        client.emit('rentsAround', {"totalRents": docs.length, "rents": docs});
    });
    client.on('disconnect', function() {
        sys.puts("client disconnect");
        mongoose.disconnect();
    });
});
Socket.io 0.7 onwards will try to reuse connections to the same host. In my experience this behaviour can be a little flaky.
I can't tell from the small code sample you provided, but I suspect the problem is that the second call to connect() is trying to reuse the first (closed) connection.
The workaround is to pass the 'force new connection' option when you call connect(), e.g.:
io.connect("http://localhost", {'force new connection': true});
Your second line discards the object created in the first line. Simply doing this should work:
var socket = io.connect();
The problem with the first send working and the second failing could be due to the browser/protocol. I have seen such behaviour with Internet Explorer using the XHR transport, and also with Opera using JSONP.
If you are using IE, try switching to JSONP and it should work properly. This can be done on the server side by supplying the transport list in the configuration; just make sure JSONP comes before XHR. Something like this:
sio.set('transports', [
    'websocket',
    'flashsocket',
    'jsonp-polling',
    'xhr-polling',
    'htmlfile'
]);
As of socket.io 1.0, two settings control this behaviour:
- "force new connection": true, on the client connect() call.
- "cookie": false, on the server Server() constructor.
Apparently, both produce exactly the same behaviour.
The second method is, as of today, undocumented. However, looking at the source code you can see that all options passed to the socket.io Server() are passed on to the internal Engine.io Server() constructor, which lets you change any of the options there. Those options are documented here:
https://github.com/LearnBoost/engine.io
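A small sketch showing both settings together (the http server variable, port and URL are placeholders):
// server side: disable the io cookie so repeat connections are not matched to an old session
var io = require('socket.io')(httpServer, {cookie: false});

// client side: force a brand new connection instead of reusing the first one
var socket = io.connect('http://localhost:3000', {'force new connection': true});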
