So, bit of an odd problem. I have a bunch of media files saved as base64 strings in mongo, some are images, some are videos.
I made an API for getting the media files:
app.get('/api/media/:media_id', function (req, res) {
  media.findById(req.params.media_id)
    .exec(function (err, media) {
      if (err) {
        return res.send(err); // return, so we don't fall through and write headers twice
      }
      var file = new Buffer(media.file, 'base64');
      res.writeHead(200, {
        'Content-Type': media.type,
        'Content-Transfer-Encoding': 'BASE64',
        'Content-Length': file.length
      });
      res.end(file);
    });
});
Now, images have no problems. They load just fine, both directly from the API, and when I call the API from a front-end (for example <img src="/api/media/23498423">)
THE PROBLEM
If I fetch a video from a front-end, like the images - but with a video- or object-tag:
<video src="/api/media/3424525" controls></video>
there's no problem, but if I load the video in a browser directly from the API:
http://localhost:8080/api/media/3424525
the server process hangs with no errors. It simply freezes up. And we're not talking about huge video files: it's a 1.5 MB video.
The media type in the header for all the videos I'm testing with is video/mp4. Oh, and just to be clear: if I do the same with images, everything works perfectly.
EDIT:
Okay, so as suggested by #idbehold and #zeeshan I took a look at gridfs and gridfs-stream, and for the purpose of my app, this certainly is what I should have used in the first place. However, after implementing gridfs in my app, the problem still persists.
app.get('/api/media/:media_id', function (req, res) {
  gfs.findOne({ _id: req.params.media_id }, function (err, file) {
    if (err) {
      return res.status(400).send(err);
    }
    if (!file) {
      return res.status(404).send('');
    }
    res.set('Content-Type', file.contentType);
    res.set('Content-Disposition', 'inline; filename="' + file.filename + '"');
    var readstream = gfs.createReadStream({
      _id: file._id
    });
    readstream.on("error", function (err) {
      console.log("Got an error while processing stream: ", err.message);
      res.end();
    });
    readstream.pipe(res);
  });
});
When I call the media file (be it image or video) from a front-end, within an HTML tag, everything works out fine. But if I load a video (again, smallish videos from 1.5 MB to at most 6 MB total size) directly in the browser, the server process freezes. To be a bit more clear: I am testing on Windows, and the server app (server.js) is run in a console. The console and the process it is running are what freeze. I cannot load any more pages/views in the node app, and I cannot even stop/kill/shut down the node app or the console.
You can stream videos directly to/from GridFS using gridfs-stream, with either a mongodb-native db instance or mongoose.
var mongo = require('mongodb'),
    Grid = require('gridfs-stream'),
    db = new mongo.Db('yourDatabaseName', new mongo.Server("127.0.0.1", 27017)),
    gfs = Grid(db, mongo);

//store
app.post('/video', function (req, res) {
  var writeStream = gfs.createWriteStream({
    filename: 'file_name_here'
  });
  // respond only once the upload has actually been written to GridFS
  writeStream.on('close', function () {
    res.send("Success!");
  });
  req.pipe(writeStream);
});
//get
app.get('/video/:vid', function (req, res) {
  gfs.createReadStream({
    _id: req.params.vid // or provide filename: 'file_name_here'
  }).pipe(res);
});
for complete files and running project:
Clone node-cheat direct_upload_gridfs, run npm install express mongodb gridfs-stream followed by node app.
Truly an odd problem...
I could be way off, but it's worth a shot:
One of the differences when opening a url directly from the browser is that the browser will also try to fetch http://localhost:8080/favicon.ico (while trying to find the tab icon). Maybe the problem is not related to your video code, but rather to some other route, trying to handle the /favicon.ico request?
Have you tried using wget or curl?
I don't know the answer, and maybe this is a dumb suggestion, but which browser are you using? Maybe something from Microsoft is causing the problem...
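Another difference when a browser plays a video URL directly is that it usually sends a Range header and expects a 206 Partial Content reply; a handler that ignores Range can leave some players (and the connection) hanging. Below is a minimal sketch of parsing that header; the helper name and the commented-out route wiring are illustrative, not part of the original code.

```javascript
// Parse an HTTP Range header ("bytes=start-end") against a known total size.
// Returns {start, end} (inclusive byte offsets) or null when unsatisfiable.
function parseRange(header, totalSize) {
  var match = /^bytes=(\d*)-(\d*)$/.exec(header || '');
  if (!match) return null;
  var start = match[1] === '' ? null : parseInt(match[1], 10);
  var end = match[2] === '' ? null : parseInt(match[2], 10);
  if (start === null && end === null) return null;
  if (start === null) {            // "bytes=-500" means the last 500 bytes
    start = totalSize - end;
    end = totalSize - 1;
  } else if (end === null || end >= totalSize) {
    end = totalSize - 1;
  }
  if (start < 0 || start > end) return null;
  return { start: start, end: end };
}

// In the route you would then answer 206 Partial Content, e.g.:
// var range = parseRange(req.headers.range, file.length);
// if (range) {
//   res.writeHead(206, {
//     'Content-Range': 'bytes ' + range.start + '-' + range.end + '/' + file.length,
//     'Accept-Ranges': 'bytes',
//     'Content-Length': range.end - range.start + 1,
//     'Content-Type': media.type
//   });
//   res.end(file.slice(range.start, range.end + 1));
// }
```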
Related
I am fairly new to Node.js, and I am using Express and Busboy-Connect to create a simple file upload form, for wav files only.
Here is what I am trying to do :
- start the upload
- if the mimetype is not wav, redirect to an error page
- else : write the file on the server and redirect back.
If the mimetype is valid, everything works fine, but if it isn't, I cannot redirect and the browser just hangs and eventually times out.
My understanding is that the browser doesn't want to redirect because it is waiting for the upload to finish, but how can I cancel the upload from within my js code?
I could work around the issue by writing the file and then deleting it if it's not the right mimetype, but I think that's a bit silly; I'd rather find a way to trigger an event that stops it and redirects immediately.
Here is (a snippet of) my app code :
app.get('/', function (req, res) {
res.render(__dirname + '/public/index.ejs', {error: 0});
});
app.get('/error', function (req, res) {
res.render(__dirname + '/public/index.ejs', {error: 1});
});
app.post('/upload', function (req, res) {
  var timestamp = new Date().getTime().toString();
  //console.log(timestamp);
  var fstream;
  req.pipe(req.busboy);
  req.busboy.on('file', function (fieldname, file, filename, encoding, mimetype) {
    if ("audio/wav" != mimetype) {
      console.log("invalid mimetype"); // that prints ok
      // req.busboy.end(); // I tried that but it doesn't work
      res.redirect('/error');
    } else {
      console.log("Uploading: " + mimetype);
      fstream = fs.createWriteStream(__dirname + '/tmp/' + timestamp + filename);
      file.pipe(fstream);
      fstream.on('close', function () {
        res.redirect('back');
      });
    }
  });
});
Can anyone point me in the right direction?
Thank you for your help !
Alright, I found it in the npm docs. If you think anyone could find this answer from a Google search you can leave it resolved; otherwise feel free to close/remove this post.
Basically there is a function on the file stream that needs to be called to unblock busboy, so all I had to do to make it work was add
file.resume();
before redirecting to the error page.
I am learning node.js at the moment and I am creating a little application where I can upload images and display them in a gallery.
Currently I have a form which uploads the image to the server via POST
extends ../layout
block content
.col-sm-6.col-sm-offset-3
h1 control panel
form(action="/upload", method="POST", enctype="multipart/form-data")
input(type="file", name='image')
input(type="submit", value="Upload Image")
The file is then inserted to a mongodb using mongoose
exports.upload = function (req, res) {
  fs.readFile(req.files.image.path, function (err, data) {
    var imageName = req.files.image.name;
    if (!imageName) {
      console.log("seems to be an error: this file has not got a name");
      res.redirect("/");
      res.end();
    } else {
      var newimage = new Image();
      newimage.img.data = data; // already read asynchronously above, no need for readFileSync
      newimage.img.name = imageName;
      newimage.save(function (err, a) {
        if (err) {
          console.log("There was an error saving the image");
          res.redirect("/");
          return res.end(); // return, so we don't also redirect to /gallery below
        }
        res.redirect("/gallery");
      });
    }
  });
}
In my gallery controller I query the database for all the images and pass them to front-end.
exports.gallery = function (req, res) {
  Image.find({}, function (err, image) {
    if (err)
      res.send(err);
    else
      res.render("site/gallery", { images: image });
  });
}
And then in my gallery I try create a new image tag for each of the images
extends ../layout
block content
h1 Gallery
each image in images
img(src='#{image.img.data}')
My problem is that I keep getting a 404 error because the browser cannot find the image.
But I have a feeling that I might be going about this the wrong way. I have seen GridFS but I feel that it is not suitable for this app as the amount of images in the gallery will be less than 20 max. Am I going about the right way to do this or should I be storing the images on the server and retrieving them that way?
You would typically upload the images to your server's machine filesystem or to a static assets cloud hosting service like AWS S3, and store only the URLs of the images in your database.
You could also use a solution like Cloudinary.
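If you do decide to keep small images in MongoDB anyway, the immediate cause of the 404 is that `image.img.data` is a raw Buffer, not a URL, so the img tag has nothing to fetch. For a gallery of under 20 images you can inline each one as a base64 data URI. A minimal sketch; the `contentType` field and helper name are assumptions, not from the question's schema:

```javascript
// Turn a stored image sub-document into something an <img src> can use.
// Assumes img.data is a Buffer and img.contentType holds e.g. 'image/png'
// (a field you would have to add when saving the upload).
function toDataUri(img) {
  return 'data:' + img.contentType + ';base64,' + img.data.toString('base64');
}

// In the gallery route you would map each document before rendering:
// res.render("site/gallery", { images: images.map(function (i) {
//   return { src: toDataUri(i.img), name: i.img.name };
// }) });
// and in the template: img(src=image.src)
```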
I wrote a node.js app that displays pictures from Instagram with a specific hashtag in real time. The app works very well and is very quick. What I would like to add is the possibility to save all displayed images into a specific folder (on my computer) for printing (then adding a script to do that).
Do you know if there is a library that would allow me to save those images to a folder? Thank you
The fact that it is an Instagram photo and you are getting it in real time doesn't really matter for saving the photo to your local hard drive.
Regardless of whether you use an npm module or your own code, somewhere at the end you will have a URL to the photo like this:
"standard_resolution": {
"url": "http://distillery.s3.amazonaws.com/media/2011/02/01/34d027f155204a1f98dde38649a752ad_7.jpg",
"width": 612,
"height": 612
}
So you can simply use:
var http = require('http-get');
var options = {url: 'http://distillery.s3.amazonaws.com/media/2011/02/01/34d027f155204a1f98dde38649a752ad_7.jpg'};
http.get(options, '/path/to/where/you/want/myPic1.jpg', function (error, result) {
if (error) {
console.error(error);
} else {
console.log('Photo successfully downloaded and saved at: ' + result.file);
}
});
or I have seen this too (I can't remember where but I have it as snippet):
var http = require('http'),
    fs = require('fs');

var options = {
  host: 'www.google.com',
  port: 80,
  path: '/images/logos/ps_logo2.png'
};

var request = http.get(options, function (res) {
  var imagedata = '';
  res.setEncoding('binary');
  res.on('data', function (chunk) {
    imagedata += chunk;
  });
  res.on('end', function () {
    fs.writeFile('logo.png', imagedata, 'binary', function (err) {
      if (err) throw err;
      console.log('File saved.');
    });
  });
});
Since it is an http request the respond time depends on so many factors (from the speed of your internet, Instagram server respond time, all the way to your hard drive speed).
So if you can make a queue with Redis or other message queue tools you can make sure all the photos eventually will be saved locally if your app is running 24x7 with lots of subscriptions.
Good luck!
Update: I found the URL to my code snippets: Writing image to local server
Credit to those authors for their code; I'm just pointing to it.
I think this should be a straightforward thing, but I can't find a solution :s
I'm trying to figure out the best way to show images stored on amazon S3 on a website.
Currently I'm trying to get this to work (unsuccessfully):
//app.js
app.get('/test', function (req, res) {
var file = fs.createWriteStream('slash-s3.jpg');
client.getFile('guitarists/cAtiPkr.jpg', function(err, res) {
res.on('data', function(data) { file.write(data); });
res.on('end', function(chunk) { file.end(); });
});
});
//index.html
<img src="/test" />
Wouldn't it be possible to show the images directly from Amazon?
I mean, the solution that lightens the load on my server the most would be best.
This is a typical use case for streams. What you want to do is: request a file from Amazon S3 and redirect the answer of this request (ie. the image) directly to the client, without storing a temporary file. This can be done by using the .pipe() function of a stream.
I assume the library you use to query Amazon S3 returns a stream, as you already use .on('data') and .on('end'), which are standard events for a stream object.
Here is how you can do it:
app.get('/test', function (req, res) {
  client.getFile('guitarists/cAtiPkr.jpg', function (err, imageStream) {
    if (err) {
      return res.status(500).send(err);
    }
    imageStream.pipe(res);
  });
});
By using pipe, we redirect the output of the request to S3 directly to the client. When the request to S3 closes, this will automatically end the res of express.
For more information about streams, refer to substack's excellent Stream Handbook.
PS: Be careful, in your code snippet you have two variables named res: the inner variable will mask the outer variable which could lead to hard to find bugs.
If you set the access controls correctly (on a per-key basis, not on the whole bucket), then you can just use <img src="https://s3.amazonaws.com/myBucket/myKey/moreKey.jpg"> (or another appropriate endpoint if you're using something other than us-east-1) wherever you want to display the image in your HTML. Remember to set the MIME type if you want browsers to display the image instead of downloading it as an attachment when someone opens it.
aws-sdk docs
s3: PUT Object docs
var fs = require('fs');
var AWS = require('aws-sdk');

AWS.config.update({
  accessKeyId: "something",
  secretAccessKey: "something else",
  region: 'us-east-1',
  sslEnabled: true,
});

var s3 = new AWS.S3();
var fileStream = fs.createReadStream(filename);
s3.putObject({
  Bucket: 'myBucket',
  Key: 'myKey/moreKey.jpg',
  Body: fileStream,
  ContentType: 'image/jpeg',
  ACL: 'public-read',
}, function (err, data) {
  if (err) { return callback(err); }
  console.log('Uploaded myKey/moreKey.jpg');
  callback(null);
});
I'm trying to build a web GUI for a Canon EOS 7D using node.js (0.10.20), libgphoto2 (2.5.2) and the gphoto2 module for node on a Raspberry Pi (latest Raspbian).
Everything seems to work fine except for saving the files in node.
I'm using the following code snippet:
app.get('/shoot', function(req, res){
camera.takePicture({download:true}, function(er, data){
res.header('Content-Type', 'image/jpeg');
res.send(data);
fs.writeFile("public/images/sampleImg.jpg", data);
});
});
The file created is unreadable/not a valid jpg image.
Using the cli tool for libgphoto2 creates a valid image:
pi@raspi /srv/node/eos $ gphoto2 --capture-image-and-download
so I assume the error is somewhere in the node code for saving the data.
How would I properly save the data to a .jpg file in node?
I am working on something very similar.
I seem to remember needing to specify that the data from the camera was binary, something like:
app.get('/shoot', function (req, res) {
  camera.takePicture({download: true}, function (er, data) {
    res.header('Content-Type', 'image/jpeg');
    res.send(new Buffer(data, 'binary'));
    fs.writeFile(
      "public/images/sampleImg.jpg",
      new Buffer(data, 'binary'),
      function (err) {
        if (err) console.log(err); // at least log write failures
      }
    );
  });
});
Shoot me an email if you want to collaborate.
http://tonyspiro.com/uploading-and-resizing-an-image-using-node-js/
http://lists.basho.com/pipermail/riak-users_lists.basho.com/2011-May/004270.html