How to save the ID of a saved file using GridFS? - node.js

I am trying to save the ID of the file that I send via GridFS to my MongoDB (working with Mongoose). However, I can't seem to find out how to get the ID created in fs.files from code.
var writestream = gfs.createWriteStream({
  filename: req.file.originalname
});
fs.createReadStream(req.file.path).pipe(writestream);
writestream.on('close', function (file) {
  // do something with `file`
  console.log(file.filename + ' written to DB');
});
I can't seem to save the ID of the file that I wrote via the write stream.
The file is created in the collections, but how do I get hold of its ID so that I can store it in one of my other MongoDB documents?

This is old and I think you have already solved your problem.
If I understood correctly, you were trying to get the ID of the saved file.
You can simply do this:
console.log(writestream.id);
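To persist that ID on another document, a minimal sketch, assuming gridfs-stream (`gfs`) as in the question plus a hypothetical mongoose model `Upload` with a `fileId` field:

```javascript
// Assumes `gfs` is a gridfs-stream instance, `req.file` comes from
// multer, and `Upload` is a hypothetical mongoose model.
var fs = require('fs');

var writestream = gfs.createWriteStream({
  filename: req.file.originalname
});
fs.createReadStream(req.file.path).pipe(writestream);

writestream.on('close', function (file) {
  // `file._id` (and `writestream.id`) hold the fs.files _id
  Upload.create({ fileId: file._id, name: file.filename }, function (err) {
    if (err) { console.error(err); }
  });
});
```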

Related

GridFS pipe readable stream and return _id of streamed document

I am trying to get back the _id of a document that has been streamed to MongoDB via GridFS. Is there a way to do this?
The code is very simple. I am running the following to insert the document into MongoDB:
readableStream.pipe(bucket.openUploadStream('myFile.pdf'));
Looking for a way to get the _id back from this stream - if anyone knows a way to do this please advise.
You can listen to the finish event of the bucket upload stream:
const uploadStream = bucket.openUploadStream('myFile.pdf');
uploadStream.on("finish", (file) => {
  console.log(file); // the fs.files document, including _id
});
readableStream.pipe(uploadStream);
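Note that the upload stream's `id` is assigned when the stream is created, so you do not have to wait for `finish` to learn it (and in recent versions of the official `mongodb` driver the `finish` event no longer passes the file document). A sketch assuming the driver's `GridFSBucket` API and a local server; the database, collection, and file names are illustrative:

```javascript
const { MongoClient, GridFSBucket } = require('mongodb');
const fs = require('fs');

async function upload() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const db = client.db('test');
  const bucket = new GridFSBucket(db);

  const uploadStream = bucket.openUploadStream('myFile.pdf');
  const fileId = uploadStream.id; // an ObjectId, available immediately

  fs.createReadStream('myFile.pdf')
    .pipe(uploadStream)
    .on('finish', async () => {
      // store the GridFS _id on a related document, then clean up
      await db.collection('docs').insertOne({ pdfId: fileId });
      await client.close();
    });
}
```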

Discord Bot (node.js) : read data from external file

I set up my Discord bot using node.js. For convenience, I need to store some data in an external file, but I don't seem to be able to access it from my index.js file (the main bot file).
I've tried having a static array in external js/json files, but I can only retrieve undefined/empty values. Additionally, when I tried a .txt file, once I retrieved the content I found I was unable to call functions such as string.split().
Did I miss something in the package content perhaps?
Assuming the data you are storing is in UTF-8 encoding:
var fs = require('fs');
fs.readFile('path/to/file', 'utf8', function (err, contents) {
  // code using file data
});
Assuming no errors, contents will be a string of the data inside that file.
https://code-maven.com/reading-a-file-with-nodejs

How can I display files that have been saved as a Buffer?

I am saving files as Buffers in my mongo database (using mongoose, nodejs, electron). For now, I'm keeping it simple with text-only files. I read a file in using
fs.readFile(filePath, function (err, data) {
  if (err) { console.log(err); }
  typeof callback == "function" && callback(data);
});
Then I create a new file in my database using the data variable. And, now I have something that looks like BinData(0,"SGVsbG8gV29ybGQK") stored in my mongodb. All is fine so far.
Now, what if I wanted to display that file in the UI? In this case, in Electron? I think there are two steps.
Step 1 The first is bringing this variable out of the DB and into the front-end. FYI: The model is called File and the variable that stores the file contents is called content.
So, I've tried File.content.toString(), which gives me Object {type: "Buffer", data: Array[7]}, not the string I'm expecting. What is happening here? I read here that this should work.
Step 2 Display this file. Now, since I'm only using text files right now, I can just display the string I get once Step 1 is working. But, is there a good way to do this for more complex files? Images and GIFs and such?
You should save the file's MIME type along with it.
Then set the response's Content-Type header before writing the binary:
response.setHeader("Content-Type", "image/jpeg");
response.write(binary);
response.end();
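On the front-end side, a Buffer that has been JSON-serialized (for example when passed between Electron's main and renderer processes) arrives as a plain object, which explains the Object {type: "Buffer", data: ...} result above. A sketch of rebuilding it; the sample bytes and the image/png MIME type are illustrative:

```javascript
// A JSON-serialized Buffer looks like { type: 'Buffer', data: [...] }
const raw = { type: 'Buffer', data: [72, 101, 108, 108, 111] };

// Rebuild a real Buffer from the byte array, then decode as text
const buf = Buffer.from(raw.data);
console.log(buf.toString('utf8')); // 'Hello'

// For binary files (images, GIFs), a base64 data URL can be dropped
// straight into an <img> tag's src attribute
const dataUrl = 'data:image/png;base64,' + buf.toString('base64');
```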

Extra data in Skipper stream

Currently writing code to make a skipper gridfs adapter. When you call upload, files are passed in the callback and I would like to add an extra field to the files that contain the metadata of a gridfs store and the ID of a gridfs store.
I'm looking through the Upstream code in skipper and see something called stream.extra. I'm guessing it's for passing extra data; how would I go about using this?
Thanks for working on this! You can add extra metadata to your stream by putting it on the __newFile object in the receiver's _write method. For example, in the bundled s3Receiver, you can see this on line 61:
__newFile.extra.fsName = fsName;
which is adding the newly generated filename as metadata on the uploaded file object. In your controller's upload callback, you can retrieve the extra data from the returned file objects:
req.file('myFile').upload(function (err, files) {
  var newFileName = files[0].extra.fsName;
});
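For a GridFS adapter, the same pattern might look like the sketch below. The receiver shape is an assumption based on skipper's object-mode receiver convention, and `gfs` stands in for a gridfs-stream instance:

```javascript
// Sketch of a skipper receiver that stashes GridFS metadata on the
// incoming file object via `extra`, so the upload callback can see it.
var Writable = require('stream').Writable;

function buildGridFSReceiver(gfs) {
  var receiver = new Writable({ objectMode: true });
  receiver._write = function (__newFile, encoding, done) {
    var ws = gfs.createWriteStream({ filename: __newFile.filename });

    // Record the GridFS id as extra metadata on the uploaded file
    __newFile.extra = __newFile.extra || {};
    __newFile.extra.fileId = ws.id;

    __newFile.pipe(ws);
    ws.on('close', function () { done(); });
    ws.on('error', done);
  };
  return receiver;
}
```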

Synchronous issue to save a file into GridFS - Node.JS/Express

I'm trying to save a profile photo from a social network such as Facebook into my "fs.files" and "fs.chunks" collections. I can achieve this, but the way I found to do it is not a proper one.
My steps are:
Log the user in to Facebook using Passport;
Save the picture file to disk (an internal application folder);
Open and read the picture file and store it in the proper collections (fs.files and fs.chunks in Mongo).
The problem happens between steps 2 and 3: saving the file to disk is not the best idea in this case, and a synchronization issue arises from the latency of writing it before storing it in the DB.
I used the setTimeout JavaScript function to make it work, but that is bad practice, I know. I would like to know a way to get some kind of file stream and store it directly in GridFS, or to make the actual process more efficient:
The code:
Get the picture from a URL and save it (disk step)
// processing image 'http://graph.facebook.com/' + user.uid + '/picture';
// facebook -> disk
var userProfilePhoto = user.provider + '_' + user.uid + '.png';
request(user.photoURL)
  .pipe(fs.createWriteStream(configApp.temp_files + userProfilePhoto));
Get the picture saved on disk and store it in GridFS
setTimeout(function () {
  mongoFile.saveFile(user, true, userProfilePhoto, mimeType.mime_png,
    function (err) {
      if (!err) {
        user.save();
      }
    }
  );
}, 5000);
Unfortunately I had to use setTimeout to make it work; without it Mongo just inserts into "fs.files" and skips "fs.chunks" because the file is not ready yet, i.e. it seems not to have been fully written.
I did it!
I replaced the request plugin with http-get, so that when the file is ready I can store it in MongoDB.
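For reference, a more direct approach is to skip the disk step entirely and pipe the HTTP response straight into GridFS, waiting for the write stream's close event instead of a timeout. A sketch assuming request and a gridfs-stream instance `gfs`; the helper name is hypothetical:

```javascript
// Sketch: Facebook -> GridFS directly, no temp file, no setTimeout.
var request = require('request');

function savePhotoToGridFS(gfs, user, done) {
  var ws = gfs.createWriteStream({
    filename: user.provider + '_' + user.uid + '.png',
    content_type: 'image/png'
  });

  request(user.photoURL).pipe(ws);

  // 'close' fires only after both fs.files and fs.chunks are written
  ws.on('close', function (file) {
    done(null, file);
  });
  ws.on('error', done);
}
```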
