unwanted caching in nodejs fs.readFile - node.js

I'm having a strange issue in my NodeJS app. I'm using
function getFile(filename, cb) {
    fs.readFile(filename, 'utf8', (err, data) => {
        if (err) {
            throw err;
        }
        return cb(data);
    });
}
to open an HTML file, get its contents and send them to the client for different modal screens. However, the first HTML file it opens seems to be getting cached somehow. I know there's no real caching in the Node.js fs module, so I'm unsure where the problem could be stemming from. I've ensured that the correct values are in fact being passed into the function, but the output is always the first file opened.
How can I stop this from happening?

Related

NodeJS FTP not throwing error when trying to delete non-existent file

I am working with ftp-npm and I am currently facing a weird bug...
I have a method called refreshStore that contains this piece of code:
c.delete('/pathToMyFileOnFTPServer', function(err) {
    console.log('117');
    if (err) throw err;
});
This is not throwing an error and is not even logging the '117' string, even though the file does not exist on the server... Why is that?
Thanks all!

Node.js fs.writeFile returning unexpected token syntax error

The following node.js test code fragment works perfectly when run for the first time (creating the file), but fails to overwrite the file once it's already created and instead generates a syntax error when the code is run a second time: 'SyntaxError: Unexpected token'. The Node docs say that fs.writeFile "Asynchronously writes data to a file, replacing the file if it already exists. data can be a string or a buffer." Not sure what I'm doing wrong or missing regarding this, thanks! I'm on Node 4.2.2
fs.writeFile('message.txt', 'Hello Node.js', 'utf8', function (err) {
    if (err) {
        throw err;
    } else {
        console.log('It\'s saved!');
    }
});
Based on the stack trace you've supplied in your comment, the error you're seeing is a problem with another part of your code, not the snippet you've posted.
Furthermore, it appears that you (or some other function behind the scenes) are attempting to JSON.parse() some string, but the string is not actually (valid) JSON (perhaps it is HTML or some other type). If you are getting this data from an HTTP response, you may want to check the value of res.headers['content-type'] first before attempting to use JSON.parse().
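For example, a minimal sketch of that check; the URL and variable names are illustrative and not taken from the original code:
var http = require('http');

http.get('http://example.com/data', function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
        var type = res.headers['content-type'] || '';
        if (type.indexOf('application/json') !== -1) {
            // Only parse when the server actually claims to send JSON
            var parsed = JSON.parse(body);
            console.log(parsed);
        } else {
            console.error('Expected JSON, got:', type);
        }
    });
});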

fs.readFileSync - Error: EMFILE, too many open files 'D:\Workspace\some.json'

I've searched here for a long time but haven't found an answer. I simply want to read 4000 JSON files in a loop and do something with them later:
try {
    data = JSON.parse(fs.readFileSync(file));
} catch (err) {
    console.error(err);
    next();
    return;
}
This seems like such a simple problem; why can't I find an answer?
I tried graceful-fs, still got the same problem.
Any suggestion?
thanks very much!
I gave up on readFileSync and use readFile instead.
fs.readFile(file, function read(err, data) { /* ... */ });
I had this same problem when I was traversing folders and uploading files. I was only able to solve it by queueing up the files and reading them from the queue. I eventually went with the async library; a sketch is below.
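A minimal sketch of that approach, assuming the async package is installed; the directory path and the concurrency limit of 10 are illustrative:
var fs = require('fs');
var path = require('path');
var async = require('async');

var dir = '/path/to/files';

fs.readdir(dir, function (err, files) {
    if (err) throw err;

    // Read at most 10 files at a time instead of opening all of them at once
    async.eachLimit(files, 10, function (file, done) {
        fs.readFile(path.join(dir, file), 'utf8', function (err, data) {
            if (err) return done(err);
            var json = JSON.parse(data);
            // ... do something with json ...
            done();
        });
    }, function (err) {
        if (err) console.error(err);
        else console.log('All files processed');
    });
});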
You can use the following approach to avoid this problem:
var Filequeue = require('filequeue');
var fq = new Filequeue(200); // max number of files to open at once

fq.readdir('/path/to/files/', function(err, files) {
    if (err) {
        throw err;
    }
    files.forEach(function(file) {
        fq.readFile('/path/to/files/' + file, function(err, data) {
            // do something besides crash
        });
    });
});
Make sure you have installed the filequeue npm package.
Sync functions are not covered by graceful-fs. EMFILE, which means the current process is out of file descriptors, is impossible to deal with during a Sync function. So graceful-fs does not make a difference.
It's weird though: readFileSync is supposed to open the file, read it, then close it. You have probably encountered an fd leak in your version of Node. It has probably been fixed between 2015 and now (2022), but as there is no version information nor the code for the actual looping part it is difficult to tell.

node.js File upload configurations

No matter how much I've searched, changed, and toyed over the last 24 hours, I simply cannot find the right combination of settings that allows Node.js to upload multiple files.
My setup is quite simple: I have a form interface that posts multipart content (files) to an endpoint... let's call it /theendpoint.
This endpoint is supposed to parse the multiple files. However, during the parsing, there are various events that need to be called once each file is uploaded.
I'm currently using the express.bodyParser({ uploadDir:'/tmp', keepextensions:true, defer:true}); in the app configuration.
Using the following method, I am trying to parse the files, but there are two problems:
Only 2 files will begin uploading, and they do not complete (i.e. the progress bar hangs near the end without ever finishing).
The other files to be uploaded by the form (item 3 onwards) do not even begin to upload to the server.
It seems to be some sort of asynchronous holdup, but I can't properly interpret the problem. Some of the code used at the upload endpoint is as follows:
// This applies to /theendpoint route. Using Express.
exports.theendpoint = function(req, res) {
    console.log(req.files);
    fs.readFile(uploadPath, function(err, data) {
        if (err) throw err;
        fs.writeFile(newFilePath, data, function(err) {
            // Series of checks and definitions
            // Database connection
            // Conditional executions
            fs.unlink(req.files.file.path, function(err) {
                if (err) throw err;
                console.log('Deleted');
            });
        });
    });
};
Obviously I've left out some of the code here. If anyone can help - is this structure workable?
You should know that the items in the commented section, i.e. the DB connection etc., are asynchronous tasks.
after
fs.unlink(req.files.file.path, function(err) {
    if (err) throw err;
    console.log('Deleted');
});
add
res.redirect("back");
and it works!
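A minimal sketch of the handler with the redirect in place; uploadPath and newFilePath remain the asker's placeholders, and the original checks are elided:
var fs = require('fs');

exports.theendpoint = function(req, res) {
    fs.readFile(uploadPath, function(err, data) {
        if (err) throw err;
        fs.writeFile(newFilePath, data, function(err) {
            if (err) throw err;
            fs.unlink(req.files.file.path, function(err) {
                if (err) throw err;
                console.log('Deleted');
                // Respond once the last asynchronous step has finished,
                // so the browser's upload request does not hang.
                res.redirect('back');
            });
        });
    });
};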

Write file directly from the request

I have the following logic:
// Defense mechanism code is before the fs operations...
fs.readFile(req.files.image.path, function (err, data) {
    if (err) {
        // ...
    } else {
        fs.writeFile(pathLocation, data, function (err) {
            if (err) {
                return res.send(err, 500);
            }
        });
    }
});
As far as I can tell, I am doing fs.readFile and then fs.writeFile... the question is: can I avoid the first fs.readFile? In other words, can I read directly from the stream (req.files.image.path)?
I am trying to optimize the code as much as possible.
req.files.image is not a stream. It has already been buffered and written to disk via middleware (presumably the connect bodyParser). You can just rename it to its final FS location via fs.rename. The readFile/writeFile is unnecessary.
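A minimal sketch of that rename, keeping the asker's pathLocation placeholder:
var fs = require('fs');

// The middleware has already written the upload to a temp path,
// so just move it into place instead of reading and rewriting it.
fs.rename(req.files.image.path, pathLocation, function (err) {
    if (err) {
        return res.send(err, 500);
    }
    res.send(200);
});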
You could avoid the write and the rename entirely by truly streaming the upload to disk. Remove the bodyParser middleware and directly do req.pipe(fs.createWriteStream(pathLocation)) in your route handler.
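A minimal sketch of that, under the answer's assumption that the request body is the raw file and no body-parsing middleware runs for this route; the route path and pathLocation are illustrative:
var fs = require('fs');

app.post('/upload', function (req, res) {
    var out = fs.createWriteStream(pathLocation);
    req.pipe(out);
    out.on('finish', function () {
        res.send(200);
    });
    out.on('error', function (err) {
        res.send(err, 500);
    });
});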
Note since you mentioned it's an image going to S3, you could actually stream straight from the browser, through your app server without hitting the filesystem, up to S3. This is technically possible, but it's brittle, so most production deployments do use a temporary file on the app server to increase reliability.
You can also upload straight from the browser to S3 if you like.
