Sending text to the browser - node.js

I have managed to get file uploading working in Node.js with Express, and in the code I'm checking whether the file the user is trying to upload is an image or not.
If the file was successfully uploaded, I want to show a message to the user directly on the HTML page with the upload form. The same should happen if the file the user tried to upload wasn't an image, or if something else went wrong during the upload.
The code below works (res.send...), but it opens a new page containing only the message.
My question is: how can I change my code so that the message is sent directly to the HTML page instead? If it's of any use, I'm using Jade.
Thanks in advance!
app.post('/file-upload', function(req, res, next) {
  var fileType = req.files.thumbnail.type;
  var divided = fileType.split("/");
  var theType = divided[0];
  if (theType === "image") {
    var tmp_path = req.files.thumbnail.path;
    var target_path = './public/images/' + req.files.thumbnail.name;
    fs.rename(tmp_path, target_path, function(err) {
      if (err) throw err;
      fs.unlink(tmp_path, function(err) {
        if (err) {
          return res.send('Something happened while trying to upload, try again!');
        }
        res.send('File uploaded to: ' + target_path + ' - ' + req.files.thumbnail.size + ' bytes');
      });
    });
  } else {
    res.send('No image!');
  }
});

From what I understand, you are trying to send a message to an already open browser window? A few things you can do:
1. Ajax it: send the POST via Ajax and process the returned info on the client.
2. Submit it as you are doing now, but set a flash message (look at http://github.com/visionmedia/express-messages) and either res.render the form page or res.redirect to the form route.
3. now.js or a similar solution. This lets client-side code call server-side functions and server-side code trigger client-side functions. On submit, you would pass the post values to a server-side function, which processes them and triggers a client-side function (displaying a message).
For my money, option #2 is probably the safest bet, as clients without JavaScript enabled will still be able to use it. As for usability, #1 or #3 would give a more streamlined appearance to the end user.
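A minimal sketch of the flash-message idea in option #2, without any library (the setFlash/getFlash helpers and the session-id keying are my own assumptions, not express-messages' actual API):

```javascript
// One-time ("flash") messages: stored on the server, shown once on the
// next page render, then discarded.
const flashes = new Map();

function setFlash(sessionId, message) {
  flashes.set(sessionId, message);
}

function getFlash(sessionId) {
  const message = flashes.get(sessionId);
  flashes.delete(sessionId); // a flash message is consumed on first read
  return message;
}

// In the upload route you would call setFlash(req.sessionID, 'File uploaded!')
// and then res.redirect back to the form, whose template renders getFlash().
```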

You can use WebSockets. I recommend Socket.IO; it's very easy to work with. On the client side you would have an event handler that uses JavaScript to append the new information to the page.
You could then have the server for example say:
socket.emit('error', "Something happened while trying to upload, try again!");
and the client would use:
socket.on('error', function(data) {
  //alert?
  alert(data);
});
http://socket.io/#how-to-use


how to pass data and redirect in express

My code is
main.post('/userlist', function(req, res, next) {
  // where did you get this?
  // var db = req.db;
  var query = connection.query("SELECT name,zaman,giriscikis FROM giriscikis where date_format(zaman,'%Y-%m-%d') between ? and ?", [req.body.bas, req.body.bitis], function(err, result) {
    if (err) throw err;
    console.log(result);
    res.send(httpResponse(result));
    return res.redirect('/zamansorgu');
    //console.log(req.body.bas)
    //console.log(req.body.bitis)
  });
});
I want to fetch data from the database and redirect to the same page in the code (zamansorgu.html). But I get an error:
Cannot set headers after they are sent to the client
How can I solve this problem? Thank you for your help.
You are attempting to send back JSON data and redirect to a different page. That's not possible. Each endpoint request can have one response, not more. You can either send back the data, or redirect. That's because redirecting really does send back data too (the html of the new target page).
Think about it from the caller's point of view. If it did allow this how would it work? If someone uses this link from a browser should the browser show the JSON data you returned, or should it take the user to the new page?
The error is saying "hey, I already sent back data. I can't redirect now because we are already down the path of returning some JSON".
If you want to use the data to format the output that can be done, or if you want to redirect to a new location and pass the data in the url, that's also possible. Consider code like this:
main.post('/userlist', function(req, res, next) {
  // var db = req.db;
  var query = connection.query("SELECT name,zaman,giriscikis FROM giriscikis where date_format(zaman,'%Y-%m-%d') between ? and ?", [req.body.bas, req.body.bitis], function(err, result) {
    if (err) return next(err);
    if (result.urlparam) {
      // this builds a new url using the query value
      const nextUrl = `/zamansorgu?param=${result.urlparam}`;
      return res.redirect(nextUrl);
    } else {
      // this builds html here
      const html = `<html><body><h1>${result.title}</h1></body></html>`;
      return res.send(html);
    }
  });
});
I also ran into this; in my case it was quite a deceptive little bug, and a node-inspector session helped me pinpoint the problem quickly. The problem in my case was pretty bone-headed: the res.end call in the sample below is the offending line.
res.writeHead(200, {"Content-Type": "application/json"});
res.end(JSON.stringify(someObject));
someObject did not exist after a refactor, and that was causing a ReferenceError to be thrown. There is a try/catch in Router.prototype._dispatch that catches the ReferenceError and passes it into next.
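The failure mode can be reproduced in isolation; this sketch (the buildBody name is made up) shows how referencing a missing identifier throws a ReferenceError that a framework's dispatch try/catch would then hand to next(err):

```javascript
// someObject was deleted in a refactor, so stringifying it throws.
function buildBody() {
  return JSON.stringify(someObject); // ReferenceError: someObject is not defined
}

let caught = null;
try {
  buildBody();
} catch (err) {
  caught = err; // Express's dispatcher does the same and calls next(err)
}
```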
res.status(301).redirect(`/zamansorgu?name=${"John"}&email=${"john#email.com"}`)
So, this is something I explored, but it will depend on the structure of your application. You could always pull the data out using query params and hydrate your application.
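If you go the redirect-with-query-params route, the next request can read the values back out; a small sketch using Node's built-in URLSearchParams (the parameter names are just examples):

```javascript
// The redirect target receives the data in the URL's query string.
const query = new URLSearchParams('name=John&email=john%40email.com');

const name = query.get('name');   // 'John'
const email = query.get('email'); // 'john@email.com'
```

In an Express handler the same values would show up as req.query.name and req.query.email.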

How to finish res without changing the page?

With res.end() or res.send() the result is a blank page, so how can I finish the response without changing the page? My code is the following:
router.post('/subirArchivo/:idProducto', function (req, res) {
  var idProducto = req.params.idProducto;
  var form = new formidable.IncomingForm();
  var dir = '../../../../uploads/' + idProducto + '/';
  form.parse(req);
  form.on('fileBegin', function (name, file) {
    checkDirectorySync(path.join(__dirname, dir));
    file.path = path.join(__dirname, dir, file.name);
  });
  form.on('file', function (name, file) {
    console.log('Uploaded ' + file.name);
  });
  res.end();
});
This is a client-side issue, not a server issue because it is the client that determines how this works.
If you let the browser submit a form on its own (normal form post submission), then it will be expecting a response back from the POST that it will show in the browser. You cannot change that if it is an automatic form submission.
If you use an Ajax call to post the data to the server and you prevent the default form post, then the response comes back to your Javascript and you can do anything you want with the returned response (including nothing). The page contents will not change on their own.
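A client-side sketch of the Ajax approach (the form id, the status element, and the renderMessage helper are all assumptions): post the form with fetch, prevent the default submission, and drop the server's reply into the page.

```javascript
// Wrap the server's plain-text reply for display.
function renderMessage(text) {
  return '<p class="upload-status">' + text + '</p>';
}

// Guarded so the snippet is inert outside a browser.
if (typeof document !== 'undefined') {
  document.getElementById('upload-form').addEventListener('submit', async (e) => {
    e.preventDefault(); // stop the normal navigation-causing form post
    const res = await fetch(e.target.action, {
      method: 'POST',
      body: new FormData(e.target),
    });
    // Show the response in place; the page itself never changes.
    document.getElementById('status').innerHTML = renderMessage(await res.text());
  });
}
```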

Node spawn process

I'm unable to find the issue in the following script. What I want to achieve is a node log server that listens for POST requests with a log title and log detail as query parameters, writes them to a file, and then returns the log as JSON on a GET request.
Problem:
It constantly shows the loader sometimes, and gives the required log other times.
Note:
The process spawning is done to update the browser during logging; if someone has a better solution, please suggest it.
Post Call:
http://127.0.0.1:8081/log?title="test"&detail="test detail"
Code:
var express = require("express");
var spawn = require('child_process').spawn;
var fs = require("fs");
var srv = express();
var outputFilename = '/tmp/my.json';

function getParamsObject(context) {
  var params = {};
  for (var propt_params in context.params) {
    params[propt_params] = context.params[propt_params];
    //define(params, propt_params, context.params[propt_params]);
  }
  for (var propt_body in context.body) {
    params[propt_body] = context.body[propt_body];
    //define(params, propt_body, context.body[propt_body]);
  }
  for (var propt_query in context.query) {
    params[propt_query] = context.query[propt_query];
    //define(params, propt_query, context.query[propt_query]);
  }
  return params;
}

srv.get("/", function(req, res) {
  res.send("Hello World From Index\n");
});

srv.get("/Main", function(req, res) {
  res.send("Hello World From Main\n");
});

srv.get("/ReadFile", function(req, res) {
  fs.readFile("example_one.txt", function(err, data) {
    if (err) throw err;
    res.send(data.toString());
  });
});

srv.get("/ReadFileJSON", function(req, res) {
  fs.readFile("example_one.txt", function(err, data) {
    if (err) throw err;
    res.setHeader("content-type", "application/json");
    res.send(new Parser().parse(data.toString()));
  });
});

srv.post("/log", function(req, res) {
  var input = getParamsObject(req);
  if (input.detail) {
    var myData = {
      Date: (new Date()).toString(),
      Title: input.title,
      Detail: input.detail
    };
    fs.writeFile(outputFilename, JSON.stringify(myData, null, 4), function(err) {
      if (err) {
        console.log(err);
      }
    });
  }
  res.setHeader("content-type", "application/json");
  res.send({message: "Saved"});
});

srv.get("/log", function(req, res) {
  var child = spawn('tail', ['-f', outputFilename]);
  child.stdout.pipe(res);
  res.on('end', function() {
    child.kill();
  });
});

srv.listen(8081);
console.log('Server running on port 8081.');
To clarify the question...
You want some requests to write to a log file.
You want to effectively do a log tail over HTTP, and are currently doing that by spawning tail in a child process.
This isn't all that effective.
Problem: It constantly shows loader sometime and gives the required log sometime.
Web browsers buffer data. You're sending the data, sure, but the browser isn't going to display it until a minimum buffer size is reached. And then, there are rules for what will display when all the markup (or just text in this case) hasn't loaded yet. Basically, you can't stream a response to the client and reliably expect the client to do anything with it until it is done streaming. Since you're tailing a log, that puts you in a bad predicament.
What you must do is find a different way to send that data to the client. This is a good candidate for web sockets. You can create a persistent connection between the client and the server and then handle the data immediately rather than worrying about a client buffer. Since you are using Node.js already, I suggest looking into Socket.IO as it provides a quick way to get up and running with web sockets, and long-polling JSON (among others) as a fallback in case web sockets aren't available on the current browser.
Next, there is no need to spawn another process to read a file the way tail does. As Trott has pointed out, there is an NPM package for doing exactly what you need: https://github.com/lucagrulla/node-tail. Just set up an event handler for the line event, and then fire a line event on the web socket so that your JavaScript client receives it and displays it to the user immediately.
There are a couple of things that seem to stand out as unnecessary complications that may be the source of your problem.
First, the spawn seems unnecessary. It appears you want to open a file for reading and get updated any time something gets added to the file. You can do this in Node with fs.watch(), fs.watchFile(), or the node-tail module. This may be more robust than using spawn() to create a child process.
Second (and less likely to be the source of the problem, I think), you seem to be using query string parameters on a POST request. While not invalid, this is unusual. Usually, if you are using the POST method, you send the data via post, as part of the body of the request. If using the GET method, data is sent as a query string. If you are not using the body to send data, switch to GET.

how to publish a page using node.js

I have just begun to learn Node.js. Over the last two days, I've been working on a project that accepts user input and publishes an ICS file. I have all of that working. Now consider when I have to show this data. I use router.get to see if I am at the /cal page and...
router.get('/cal', function(req, res, next) {
  var db = req.db;
  var ical = new icalendar.iCalendar();
  db.find({
    evauthor: 'mykey'
  }, function(err, docs) {
    docs.forEach(function(obj) {
      var event2 = ical.addComponent('VEVENT');
      event2.setSummary(obj.evics.evtitle);
      event2.setDate(new Date(obj.evics.evdatestart), new Date(obj.evics.evdateend));
      event2.setLocation(obj.evics.evlocation);
      //console.log(ical.toString());
    });
  });
  res.send(ical.toString());
  // res.render('index', {
  //   title: 'Cal View'
  // })
});
So when /cal is requested, it loops through my db and creates an ICS calendar, ical. If I do console.log(ical.toString()) within the loop, it gives me a properly formatted calendar following the protocol.
However, I'd like to END the response with this. At the end I do a res.send just to see what gets published on the page. This is what gets published:
BEGIN:VCALENDAR VERSION:2.0
PRODID:calendar//EN
END:VCALENDAR
Now the reason is pretty obvious: it's the nature of node.js. The response gets sent to the browser before the callback function finishes adding each individual VEVENT to the calendar object.
I have two related questions:
1) What's the proper way to "wait" till the callback is done?
2) How do I use res to send out a .ics dynamic link with ical.toString() as the content? Do I need to create a new view for this?
edit: I guess for number 2 I'd have to set the HTTP headers like so
//set correct content-type-header
header('Content-type: text/calendar; charset=utf-8');
header('Content-Disposition: inline; filename=calendar.ics');
but how do I do this when using views?
Simply send the response once you've got the necessary data! You are not required to end or send directly in your route; you can do it in a nested callback as well:
router.get('/cal', function(req, res, next) {
  var db = req.db;
  var ical = new icalendar.iCalendar();
  db.find({
    evauthor: 'mykey'
  }, function(err, docs) {
    docs.forEach(function(obj) {
      var event2 = ical.addComponent('VEVENT');
      event2.setSummary(obj.evics.evtitle);
      event2.setDate(new Date(obj.evics.evdatestart), new Date(obj.evics.evdateend));
      event2.setLocation(obj.evics.evlocation);
    });
    res.type('ics');
    res.send(ical.toString());
  });
});
I also included sending the proper Content-Type by using res.type.
Also: Don't forget to add proper error handling. You can, for example, use res.sendStatus(500) if an error occurred while retrieving the documents.
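On question 2 from the post above: the PHP-style header() calls translate to Express as a header object passed to res.set(); no extra view is needed, since the body is just the generated text. A minimal sketch (the icsHeaders helper and the filename are made up for illustration):

```javascript
// Headers for serving generated iCalendar text inline.
function icsHeaders(filename) {
  return {
    'Content-Type': 'text/calendar; charset=utf-8',
    'Content-Disposition': 'inline; filename=' + filename,
  };
}

// In the route (res.set returns res, so the calls chain):
// res.set(icsHeaders('calendar.ics')).send(ical.toString());
```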

Valums file-uploader on nodejs - multiple file upload

I'm using valums ajax file-uploader
My nodejs server side looks like this:
fileStream = fs.createWriteStream(__dirname + '/../public/images/houses/' + rndname);
req.pipe(fileStream);
req.on('end', function() {
  body = '{"success":"true", "name": "' + rndname + '"}';
  res.writeHead(200, {
    'Content-Type': 'text/plain',
    'Content-Length': body.length
  });
  res.end(body);
});
client side:
function createUploader() {
  var uploader = new qq.FileUploader({
    element: document.getElementById('homepic'),
    action: '/server/upload',
    allowedExtensions: ['jpg', 'png', 'gif'],
    multiple: true,
    onComplete: function(id, fileName, responseJSON) {
      $("#homepic").append("<img src='/images/houses/" + responseJSON.name + "' class='mediumpic' /> ");
    }
  });
}
window.onload = createUploader;
This all works great for a single file upload.
So imagine: I press the upload button, choose a pic, it uploads really fast and shows up on screen.
Now I want to upload another one. I choose a pic, it uploads to the server fast (I see it on the server), I get back the response with the new filename and success, and I put the picture on screen with my append. But the picture does not show up. I try opening just the pic in a new tab and still nothing, even though I can see it on the server sitting in the right dir. After like 3-5 minutes of waiting it just shows up, without even a page refresh needed. What's causing this behavior? Is it the piping, and do I need to call Mario to fix it, or something else? :)
I changed my req.pipe to req.on('data') and it started to work. Somehow it seems like my req.pipe didn't close the connection and was "expecting" more data to come even though the whole file was uploaded. That is why I could not GET the file and it was stuck in "pending" status.
This fixed my problems:
req.on('data', function(data) {
  ws.write(data);
});
req.on('end', function(data) {
  var body = '{"success":"true", "name": "' + rndname + '"}';
  res.writeHead(200, {
    'Content-Type': 'text/plain',
    'Content-Length': body.length
  });
  res.end(body);
});
Even though I found a solution to my problem, if anyone knows why req.pipe didn't close the connection and was stuck hanging for like 2-3 minutes till the pic appeared, let me know.
I've created an Express 3.x middleware component that allows you to access valums/fineuploader uploaded files through the req.files collection.
All you need to do is add the middleware during your Express configuration, like so:
var fineuploaderExpressMiddleware = require('fineuploader-express-middleware');

app.use(express.cookieParser());
app.use(express.bodyParser());
app.use(fineuploaderExpressMiddleware({ uploadDir: '/tmp' }));
The component is available here: https://github.com/suprememoocow/fineuploader-express-middleware
