I'm using fs.writeFile to write a number of files to a directory. I'd like to use the name of the file being written in the callback (e.g. printing "file x has been written").
How can I get the name of the file being written (x)?
Using the filename variable I passed to writeFile will not work due to writeFile being asynchronous.
Thanks!
So I think you're doing something like this (just guessing, since you're not posting code):
var files = ['file1.txt', 'file2.txt', 'file3.txt'];

for (var i = 0; i < files.length; i++) {
    var filename = files[i];
    fs.writeFile(filename, CONTENTS, function(err) {
        console.log('File written:', filename);
    });
}
That's not going to work, because filename is declared with var and is therefore shared across all iterations: by the time the callbacks fire, it holds whatever value the last iteration assigned.
The easiest solution would be to use forEach:
files.forEach(function(filename) {
    fs.writeFile(filename, CONTENTS, function(err) {
        console.log('File written:', filename);
    });
});
That will create a newly scoped filename variable for each iteration, which won't be overwritten.
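For what it's worth, if your Node version supports ES6, declaring the loop variable with let (or const) gives you the same per-iteration scoping without changing the loop structure. A minimal sketch:

for (let i = 0; i < files.length; i++) {
    // let/const are block-scoped, so each iteration gets its own binding
    const filename = files[i];
    fs.writeFile(filename, CONTENTS, function(err) {
        if (err) {
            console.error('Write failed:', filename, err);
            return;
        }
        console.log('File written:', filename);
    });
}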
I'm still new to JSON. I've searched the net and written a function that creates an init file if none exists, but I'm coming up blank on how to search and retrieve the data from the existing file, and on how to add new entries or update existing ones.
So far I can read the file and print the results to the console, so I know the assignment works. It's a global variable, so the data should persist outside the readFile callback, but when I try to access it later to build the local array I'll pull data from and use for updating, it reads as undefined.
fs.readFile(path, 'utf8', (error, data) => {
    if (error) {
        console.log(error);
        return;
    }
    //console.log(JSON.parse(data));
    JSONData = JSON.parse(data);
    for (let i = 0; i < JSONData.length; i++) {
        console.log(i + ": [" + JSONData[i].unique + "] " + JSONData[i].name);
    }
}); //fs.readFile
var playerKey = "KuroTO";
playerKey = playerKey.toLowerCase();

for (let i = 0; i < JSONData.length; i++) {
    if (JSONData[i].unique.toLowerCase() == playerKey) {
        console.log("[" + i + "] " + JSONData[i].unique.toLowerCase() + ": " + playerKey);
        PlayerCard1.push(JSONData[i].userid);       //0
        PlayerCard1.push(JSONData[i].username);     //1
        PlayerCard1.push(JSONData[i].unique);       //2
        PlayerCard1.push(JSONData[i].name);         //3
        PlayerCard1.push(JSONData[i].avatarurl);    //4
        PlayerCard1.push(JSONData[i].level);        //5
        PlayerCard1.push(JSONData[i].Rank);         //6
        PlayerCard1.push(JSONData[i].henshined);    //7
        PlayerCard1.push(JSONData[i].Strength);     //8
        PlayerCard1.push(JSONData[i].Perception);   //9
        PlayerCard1.push(JSONData[i].Endurance);    //10
        PlayerCard1.push(JSONData[i].Wisdom);       //11
        PlayerCard1.push(JSONData[i].Intelligence); //12
        PlayerCard1.push(JSONData[i].Luck);         //13
        PlayerCard1.push(JSONData[i].Agility);      //14
        PlayerCard1.push(JSONData[i].Flexability);  //15
        PlayerCard1.push(JSONData[i].RatedSpeed);   //16
    } //if unique matches
} //for
This is the pseudocode for the concept I'm trying to implement:
if (JSONData.stringify.unique == {SearchUID}){toonname = JSONData.stringify.name;}
As I understand it, you can't really append to the file; you just rewrite it again with the new data. I think I can figure that part out on my own once I can figure out how to read the file into an array I can search, like the above.
To read JSON, simply require the file.
JSON:
{
    "key": "H"
}
JS:
let jsonFile = require("./path/to/json");
console.log(jsonFile.key); // H
Editing is just as simple.
let jsonFile = require("./path/to/json");
jsonFile.key = "A"
console.log(jsonFile.key) // A
Saving edits requires the fs module:
const fs = require("fs");

let jsonFile = require("./path/to/json");
jsonFile.key = "A";

// first argument is the file path
// second argument is the data to write - writeFile overwrites the file,
// so just JSON.stringify() the modified object
// third argument is a callback that receives any error
fs.writeFile("./path/to/jsonFile", JSON.stringify(jsonFile), (err) => {
    if (err) throw err;
});
This can also be used to slightly clean up your current init function if you wanted, but that's up to you of course.
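As for the lookup in your pseudocode: once the JSON is parsed (or required) into an array, Array.prototype.find does the search. A minimal sketch, assuming the file holds an array of player objects with unique and name fields as in your code:

const players = require("./path/to/json"); // parsed array of player objects
const playerKey = "KuroTO".toLowerCase();

// find the first entry whose unique id matches, case-insensitively
const match = players.find(p => p.unique.toLowerCase() === playerKey);
if (match) {
    const toonname = match.name;
    console.log("[" + match.unique + "] " + toonname);
}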
I'm trying to upload multiple files with an HTTP post, and then NodeJS handles:
save files' info to database
move files from tmp folder to permanent folder
if any file move fails, delete the file from tmp folder
My two issues are described in the comments within code snippet below:
path.resolve isn't working
iterator isn't working within fs.rename
for (i = 0; i < req.files.length; i++) {
    const file = new fileSchema({
        _userId: req.body._userId,
        _companyId: req.body._companyId,
        _meetingId: response._id,
        originalFilename: req.files[i].originalname,
        savedFilename: req.files[i].filename,
    });
    file.save().then((response) => { }).catch(error => { console.log(error) });
    const currentPath = path.resolve(temp_folder, req.files[i].filename);
    const newPath = upload_folder + "/" + req.body._userId + "/" + req.body._companyId + "/" + response._id + "/" + req.files[i].filename;
    // 1. why doesn't path.resolve work with the inputs on the line above? I have to concat a string as in line above?
    fs.rename(currentPath, newPath, function(err) {
        if (err) {
            console.log("Error moving files");
            try { removeTempFiles(temp_folder, req.files[i]); } // helper function which works, written elsewhere
            // 2. req.files[i] is undefined (even though req.files works) so the line above fails - i.e. the iterator isn't captured within rename?
            catch (err) { console.log(err); }
        } else {
            console.log("Successfully moved the file!");
        }
    });
}
Any help appreciated, thanks.
Change this
for (i = 0; i < req.files.length; i++) {
to this:
for (let i = 0; i < req.files.length; i++) {
The addition of let will create a separate i for each iteration of the for loop, so it will stay valid inside your fs.rename() callback.
And path.join() is probably a better choice than path.resolve() for combining path segments.
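For example, a sketch of the newPath construction using path.join() with the segments from your code:

const newPath = path.join(
    upload_folder,
    req.body._userId,
    req.body._companyId,
    String(response._id), // String() guards against a non-string id, since path.join() only accepts strings
    req.files[i].filename
);

path.join() normalizes the separators for the platform, so you don't have to concatenate "/" by hand.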
Let's say we have:
a file data.json or data.txt with this content: {"data":[]}
and an array of paths: ["C:\\path1", "C:\\path2", "C:\\path3"]
Question
How would we append the array of paths into this file with node.js data stream (or whatnot) so that we get this in the end:
{"data":["C:\\path1", "C:\\path2", "C:\\path3"]}
Code
let filePath = 'C:\\test\\data.json' // backslashes must be escaped in string literals
let paths = ["C:\\path1", "C:\\path2", "C:\\path3"]

for (let index = 0; index < paths.length; index++) {
    // ... streaming paths to the file one by one
}
I can't put the paths in the file without a loop: in my project I have walkdir(drive, options, (path) => {}) instead of the for loop. It also returns paths one by one, just like the loop above, which is only here for demonstration.
Because it is JSON, you can't actually append to the file. You have to read out the entire document, parse the JSON to a POJO, make your changes, stringify the JSON, and write it back.
import { readFile, writeFile } from 'fs';

readFile(filePath, (err, data) => {
    if (err) throw err;
    const json = JSON.parse(data);
    paths.forEach(path => json.data.push(path));
    writeFile(filePath, JSON.stringify(json), err => { /* handle err */ });
});
If it were a plaintext file, you could append by writing to it with the flag option set to a (for append); note the data comes before the options:
writeFile(filePath, 'text to append', { flag: 'a' }, err => { /* handle err */ });
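Node's fs module also provides appendFile, which sets the append flag for you; a small sketch in the same import style as above:

import { appendFile } from 'fs';

// appends the data, creating the file first if it doesn't exist
appendFile(filePath, 'text to append', err => { /* handle err */ });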
I'm learning a lot about Node.js by rewriting some utility tools I had in C# for the fun of it. I have either found something that is not a good idea to write in Node.js or I'm completely missing a concept that will make it work.
The goal of the program: Search a directory of files for a file with data that matches some criteria. The files are gzipped XML, and for the time being I'm just looking for one tag. Here's what I tried (files is an array of file names):
while (files.length > 0) {
    var currentPath = rootDir + "\\" + files.pop();
    var fileContents = fs.readFileSync(currentPath);
    zlib.gunzip(fileContents, function(err, buff) {
        if (buff.toString().indexOf("position") !== -1) {
            console.log("The file '%s' has an odometer reading.", currentPath);
            return;
        }
    });
    if (files.length % 1000 === 0) {
        console.log("%d files remain...", files.length);
    }
}
I was nervous about this when I wrote it. It's clear from the console output that all of the gunzip operations are asynchronous and wait until the while loop is complete. That means by the time I finally get some output, currentPath no longer has the value it had when the file was read, so the program is useless. I don't see a synchronous way to decompress the data with the zlib module, and I don't see a way to store the context (currentPath would do) so the callback has the right value. I originally tried streams, piping a file stream to a gunzip stream, but I had a similar problem: all of my callbacks fired after the loop had completed, and I'd lost the useful context.
It's been a long day and I'm out of ideas for how to structure this. The loop is a synchronous thing, and my asynchronous stuff depends on its state. That is bad. What am I missing? If the files weren't gzipped, this would be easy because of readFileSync().
Wow. I didn't really expect no answers at all. I got in a time crunch but I spent the last couple of days looking over Node.js, hypothesizing why certain things were working like they did, and learning about control flow.
So the code as-is doesn't work because I need a closure to capture the value of currentPath. Boy does Node.js like closures and callbacks. So a better structure for the application would look like this:
function checkFile(currentPath) {
    var fileContents = fs.readFileSync(currentPath);
    zlib.gunzip(fileContents, function(err, buff) {
        if (buff.toString().indexOf("position") !== -1) {
            console.log("The file '%s' has an odometer reading.", currentPath);
            return;
        }
    });
}
while (files.length > 0) {
    var currentPath = rootDir + "\\" + files.shift();
    checkFile(currentPath);
}
But it turns out that's not very Node, since there's so much synchronous code. To do it asynchronously, I need to lean on more callbacks. The program turned out longer than I expected so I'll only post part of it for brevity, but the first bits of it look like this:
function checkForOdometer(currentPath, callback) {
    fs.readFile(currentPath, function(err, data) {
        unzipFile(data, function(hasReading) {
            callback(currentPath, hasReading);
        });
    });
}

function scheduleCheck(filePath, callback) {
    process.nextTick(function() {
        checkForOdometer(filePath, callback);
    });
}

var withReading = 0;
var totalFiles = 0;

function series(nextPath) {
    if (nextPath) {
        var fullPath = rootDir + nextPath;
        totalFiles++;
        scheduleCheck(fullPath, function(currentPath, hasReading) {
            if (hasReading) {
                withReading++;
                console.log("%s has a reading.", currentPath);
            }
            series(files.shift());
        });
    } else {
        console.log("%d files searched.", totalFiles);
        console.log("%d had a reading.", withReading);
    }
}

series(files.shift());
The reason for the series control flow is that when I set up the obvious parallel search, I end up running out of process memory, probably from having 60,000+ buffers' worth of data in memory at once:
while (files.length > 0) {
    var currentPath = rootDir + files.shift();
    checkForOdometer(currentPath, function(callbackPath, hasReading) {
        //...
    });
}
I could probably set it up to schedule batches of, say, 50 files in parallel and wait to schedule 50 more when those are done. Setting up the series control flow seemed just as easy.
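A batch version could look something like this (just a sketch, reusing checkForOdometer from above and assuming files is still the array of remaining names):

var BATCH_SIZE = 50;

function runBatch() {
    var batch = files.splice(0, BATCH_SIZE);
    if (batch.length === 0) {
        return; // all done
    }
    var pending = batch.length;
    batch.forEach(function(name) {
        checkForOdometer(rootDir + name, function(currentPath, hasReading) {
            if (hasReading) {
                console.log("%s has a reading.", currentPath);
            }
            // start the next batch only after every file in this one reports back
            if (--pending === 0) {
                runBatch();
            }
        });
    });
}

runBatch();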
Trying to run a script that opens a bunch of files asynchronously and reads their content. I am getting an error where fs.readFile's callback comes in with no data, yet the file is there, and is not currently being opened by anything else. Totally confused.
The error is:
Error: OK, open
'D:\Workspace\fasttrack\public\cloudmade\images\998\256\6\31\61.png'
More info:
The program runs through a loop that has a bunch of objects in it that look like this:
newObject = {
    filePath: filePath,
    scanPixels: function(parent) {
        ....
    }
}
The loop calls each object's scanPixels function, which then does an fs.readFile on the parent.filePath
Here is the for loop
for (var index = 0; index < objects.length; index++) {
    objects[index].scanPixels(objects[index]);
}
The scanPixels function is essentially this:
scanPixels: function(parent) {
    png_js.decode(parent.filePath, function(pixels) {
        ...more stuff
And in the png_js file:
PNG.decode = function(path, fn) {
    return fs.readFile(path, function(err, file) {
        var png;
        png = new PNG(file);
        return png.decode(function(pixels) {
            return fn(pixels);
        });
    });
};
The problem is that fs.readFile does not return a value; you would want fs.readFileSync for that:
var buffer1 = fs.readFileSync('./hello1.txt'); // synchronous: returns the data directly

var buffer2;
fs.readFile('./hello2.txt', function (err, file) {
    // asynchronous: the data only arrives here, in the callback
    buffer2 = file;
});
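In the asynchronous version, anything that depends on buffer2 has to run inside (or be called from) the callback, since the assignment only happens once the read completes. A minimal sketch (usePixels is a hypothetical consumer):

fs.readFile('./hello2.txt', function (err, file) {
    if (err) throw err;
    usePixels(file); // hypothetical function that consumes the file data
});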