Why am I getting an ENOENT using Node core module 'fs' - node.js

This is a repeat question (not yet answered), but I have revised and tightened up the code and included a specific example. I am sorry to keep beating this drum, but I need help.
This is a Node API. I need to read and write JSON data. I am using the Node core module 'fs', not the npm package by the same name (or fs-extra). I have extracted the particular area of concern into a standalone module, shown here:
'use strict';
/*==================================================
This service GETs the list of ids to the json data files
to be processed, from a json file with the id 'ids.json'.
It returns and exports idsList (an array holding the ids of the json data files).
It also calls putIdsCleared to clear the 'ids.json' file for the next batch of processing.
==================================================*/
// node modules
const fs = require('fs');
const config = require('config');

const scheme = config.get('json.scheme');
const jsonPath = config.get('json.path');
const url = `${scheme}${jsonPath}/`;
const idsID = 'ids.json';
const uri = `${url}${idsID}`;

let idsList = [];

const getList = async (uri) => {
    await fs.readFile(uri, 'utf8', (err, data) => {
        if (err) {
            return console.log(new Error(err.message));
        }
        return jsonData = JSON.parse(data);
    });
};

// The idea is to get the empty array written back to 'ids.json' before returning to 'process.js'
const clearList = async (uri) => {
    let data = JSON.stringify({ 'ids': [] });
    await fs.writeFile(uri, data, (err) => {
        if (err) {
            return console.log(new Error(err.message));
        }
        return;
    });
};

getList(uri);
clearList(uri);
console.log('end of idsList', idsList);

module.exports = idsList;
Here is the console output from the execution of the module:
Error: ENOENT: no such file or directory, open 'File:///Users/doug5solas/sandbox/libertyMutual/server/api/ids.json'
    at ReadFileContext.fs.readFile [as callback] (/Users/doug5solas/sandbox/libertyMutual/server/.playground/ids.js:24:33)
    at FSReqWrap.readFileAfterOpen [as oncomplete] (fs.js:235:13)
Error: ENOENT: no such file or directory, open 'File:///Users/doug5solas/sandbox/libertyMutual/server/api/ids.json'
    at fs.writeFile (/Users/doug5solas/sandbox/libertyMutual/server/.playground/ids.js:36:34)
    at fs.js:1167:7
    at FSReqWrap.oncomplete (fs.js:141:20)
I am being told there is no such file or directory. However, I can copy the URI shown in the error message,
File:///Users/doug5solas/sandbox/libertyMutual/server/api/ids.json
into the address bar of my browser, and this is what is returned to me:
{
    "ids": [
        "5sM5YLnnNMN_1540338527220.json",
        "5sM5YLnnNMN_1540389571029.json",
        "6tN6ZMooONO_1540389269289.json"
    ]
}
This is the expected result. I do not "get" why I can fetch the data manually but not programmatically, using the same URI. What am I missing? Help appreciated.

Your file URI is in the wrong format.
It shouldn't contain the File:// protocol (that's a browser-specific thing).
I'd imagine you want the plain filesystem path /Users/doug5solas/sandbox/libertyMutual/server/api/ids.json, with no scheme prefix at all.
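For example, a minimal sketch of the read, assuming the directory from the error message is where the file actually lives:

const fs = require('fs');
const path = require('path');

// A plain filesystem path; no File:// prefix.
const idsPath = path.join('/Users/doug5solas/sandbox/libertyMutual/server/api', 'ids.json');

fs.readFile(idsPath, 'utf8', (err, data) => {
    if (err) {
        return console.log(new Error(err.message));
    }
    console.log(JSON.parse(data).ids);
});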

I solved the problem by switching to readFileSync. I don't like it, but it works, and it is only one read.
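For reference, a sketch of that approach (same path assumption as above):

const fs = require('fs');

// One synchronous read at module load; acceptable since it happens exactly once.
const idsPath = '/Users/doug5solas/sandbox/libertyMutual/server/api/ids.json';
const idsList = JSON.parse(fs.readFileSync(idsPath, 'utf8')).ids;

module.exports = idsList;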

Related

Adding files in directory to an array

I am really new to node.js. I need to read .json files from a directory, add them to an array, and return it. I am able to read each file separately by passing its path:
const fs = require("fs");
fs.readFile("./fashion/customer.json", "utf8", (err, jsonString) => {
if (err) {
console.log("Error reading file from disk:", err);
return;
}
try {
const customer = JSON.parse(jsonString);
console.log("Customer address is:", customer.address); // => "Customer address is: Infinity Loop Drive"
} catch (err) {
console.log("Error parsing JSON string:", err);
}
});
But the same fashion folder has multiple json files. I want to add these files to an array and then return it. I tried using readdirSync but that just returned the file names. Is it possible to add json files to an array and return it?
Basically I require an array of this format:
Array[{contents of json file1}, {contents of json file2}, .....]
Any help is appreciated!
Here is a simple solution to your question:
const fs = require("fs");
const jsonFolder = './fashion'
var customerDataArray = []
fs.readdirSync(jsonFolder).forEach(file => {
let fileData = JSON.parse(fs.readFileSync(jsonFolder+'/'+file))
customerDataArray.push(fileData)
});
console.log(customerDataArray)
readdirSync returns an array of all the file names in the directory (or Dirent objects, if you pass the withFileTypes option). You can use forEach to iterate through every item in the array, which in this scenario will be the file names. To read the contents of each file, use readFileSync and build the path to the file from the directory name plus the file name. The data is returned as a Buffer and needs to be parsed with JSON.parse(), and then it is pushed to customerDataArray.
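If you'd rather not block the event loop, here is a sketch of the same idea with the promise-based API (Node 10+; the folder name is kept from above):

const fs = require("fs").promises;
const path = require("path");

async function readAllJson(jsonFolder) {
    const files = await fs.readdir(jsonFolder);
    // Read and parse every file in the folder in parallel.
    return Promise.all(
        files.map(async file =>
            JSON.parse(await fs.readFile(path.join(jsonFolder, file), "utf8"))
        )
    );
}

readAllJson('./fashion').then(customerDataArray => console.log(customerDataArray));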
I hope this answers your question!

fs.createReadStream getting a different path than what's being passed in

I'm using NodeJS on a VM. One part of it serves up pages, and another part is an API. I've run into a problem where fs.createReadStream attempts to access a different path than what is being passed into the function. I made a small test server to see if something else in the server was affecting path usage, but it's happening on my test server as well. First, here's the code:
const fs = require('fs');
const path = require('path');
const csv = require('csv-parser');

const readCSV = (filename) => {
    console.log('READ CSV GOT ' + filename); // show me what you got
    return new Promise((resolve, reject) => {
        const arr = [];
        fs.createReadStream(filename)
            .pipe(csv())
            .on('data', row => {
                arr.push(row);
            })
            .on('error', err => {
                console.log(err);
            })
            .on('end', () => {
                resolve(arr);
            });
    });
};

// tried this:
// const dir = path.relative(
//     path.join('path', 'to', 'this', 'file'),
//     path.join('path', 'to', 'CONTENT.csv')
// );
// tried a literal relative path:
// const dir = '../data/CONTENT.csv';
// tried a literal absolute path:
// const dir = '/repo/directory/server/data/CONTENT.csv';
// tried an absolute path:
const dir = path.join(__dirname, 'data', 'CONTENT.csv');

const content = readCSV(dir)
    .then(result => { console.log(result[0]); })
    .catch(err => { console.log(err); });
...but any way I slice it, I get the following output:
READCSV GOT /repo/directory/server/data/CONTENT.csv
throw er; // Unhandled 'error' event
^
Error: ENOENT: no such file or directory, open '/repo/directory/data/CONTENT.csv'
i.e., is fs.createReadStream somehow stripping out the directory of the server, for some reason? I suppose I could hard code the directory into the call to createReadStream, maybe? I just want to know why this is happening.
One more detail: I'm stuck on Node v8.11 and can't go any higher. On the server itself I believe I'm using the older function(param) {...} style instead of arrow functions -- but the behavior is exactly the same.
Please help!!
The code itself works.
I think your file CONTENT.csv should be in the data folder, i.e. at /repo/directory/data/CONTENT.csv.
I'm answering my own question because I found an answer; I'm not entirely sure why it works, but at least it's interesting. To the best of my estimation, it has something to do with the call stack and what Node identifies as the origin of the function call. I've got my server set up in an MVC pattern, so my main app.js is in the root dir, the function being called is in the /controllers folder, and I've been trying to use relative paths from that folder -- I'm still not sure why absolute paths didn't work.
The call stack goes:
app.js:
app.use('/somepath', endpointRouter);
...then in endpointRouter.js:
router.get('/request/file', endpointController.getFile);
...then finally in endpointController.js:
const readCSV = filename => {
    // the code I shared
};

exports.getFile = (req, res, next) => {
    // code that calls readCSV(filename)
};
...and I believe that because Node resolves relative paths against the current working directory (process.cwd(), i.e. wherever node was launched -- here, the root folder containing app.js), it treats all relative paths as relative to app.js rather than to the file making the call. Basically, when I switched to the unintuitive single-dot relative path './data/CONTENT.csv', it worked with no issue.
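A minimal sketch of the distinction, using a hypothetical layout like the one above:

// controllers/endpointController.js (hypothetical layout)
const path = require('path');

// Resolved against wherever `node app.js` was launched, usually the repo root:
const cwdRelative = path.resolve('./data/CONTENT.csv');

// Resolved against the directory containing *this* file:
const fileRelative = path.join(__dirname, '..', 'data', 'CONTENT.csv');

console.log(process.cwd());   // e.g. /repo/directory
console.log(cwdRelative);     // /repo/directory/data/CONTENT.csv
console.log(fileRelative);    // /repo/directory/data/CONTENT.csv
// The two coincide only when node is launched from the repo root;
// the __dirname version is stable no matter where node is launched from.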

Generating a JSON for an icon cheatsheet

I'm trying to generate a json file containing the filenames of all the files in a certain directory. I need this to create a cheatsheet for icons.
Currently I'm trying to run a script locally via the terminal to generate the JSON. That JSON will be the input for a React component that displays icons. The component works; the script that creates the JSON doesn't.
Code for generating the json
const fs = require('fs');
const path = require('path');

/**
 * Create JSON file
 */
const CreateJson = () => {
    const files = [];
    const dir = '../icons';

    fs.readdirSync(dir).forEach(filename => {
        const name = path.parse(filename);
        const filepath = path.resolve(dir, filename);
        const stat = fs.statSync(filepath);
        const isFile = stat.isFile();

        if (isFile) files.push({ name });
    });

    const data = JSON.stringify(files, null, 2);
    fs.writeFileSync('../Icons.json', data);
};

module.exports = CreateJson;
I run it in terminal using
"create:json": "NODE_ENV=build node ./scripts/CreateJson.js"
I expect a JSON file to be created/overwritten. But the terminal returns:
$ NODE_ENV=build node ./scripts/CreateJson.js
✨ Done in 0.16s.
Any pointers?
You are creating a function CreateJson and exporting it, but you are never actually calling it.
You can get rid of the module.exports and replace it with CreateJson().
When you execute the file with node, it will see the function declaration and a call to it, whereas with your current code there is no call.
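A sketch of the corrected tail of the script, keeping everything else as-is:

const CreateJson = () => {
    // ... same body as above ...
};

// Call the function so `node ./scripts/CreateJson.js` actually does the work.
CreateJson();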

Scan a Google document line by line

So basically, I'm trying to use node.js to scan a Google document; if a ROBLOX id is on there, it tracks it. When it tracks it, if that account joins one of the groups in the id list, it auto-exiles it.
Any help?
I'm a little stuck on scanning a Google document line by line.
I am not sure how to do it from a Google doc, but if you are willing to switch to text files (.txt), I think I can be of assistance.
Using Node's fs module, we can read lines with the readline interface:
import * as fs from 'fs'
import { createReadStream } from 'fs'
import { createInterface } from 'readline'

const lineReader = createInterface({
    input: createReadStream('data/input.txt')
})

const output: string[] = []

lineReader.on('line', (item: string) => {
    // push whatever you want written to the output file
    output.push(item)
})

// Down here is the output of what you put in the output.push
lineReader.on('close', () => {
    fs.writeFile('data/output.txt', output.join('\n'), err => {
        if (err) throw err
        console.log('The file has been saved!')
    })
})
The above code is in TypeScript, but TypeScript compiles down to JavaScript. If this code doesn't work for you, I hope it at least gave you some knowledge that helps you find your answer.
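For reference, a sketch of the same thing in plain JavaScript (same assumed file paths):

const fs = require('fs');
const readline = require('readline');

const lineReader = readline.createInterface({
    input: fs.createReadStream('data/input.txt')
});

const output = [];

lineReader.on('line', item => {
    output.push(item); // push whatever should end up in the output file
});

lineReader.on('close', () => {
    fs.writeFile('data/output.txt', output.join('\n'), err => {
        if (err) throw err;
        console.log('The file has been saved!');
    });
});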

nodejs/fs: writing a tar to memory buffer

I need to be able to tar a directory, and send this to a remote endpoint via HTTP PUT.
I could of course create the tar, save it to disk, then read it again and send it.
But I'd rather create the tar, then pipe it to some buffer and send it immediately. I haven't been able to achieve this.
Code so far:
var tar = require('tar');
var fs = require("fs");

var path = "/home/me/uploaddir";

function getTar(path, cb) {
    var buf = new Buffer('');
    var wbuf = fs.createWriteStream(buf);
    wbuf.on("finish", function() {
        cb(buf);
    });
    tar.c({ file: "" }, [path]).pipe(wbuf);
}

getTar(path, function(tar) {
    // send the tar over http
});
This code results in:
fs.js:575
    binding.open(pathModule._makeLong(path),
    ^
TypeError: path must be a string
    at TypeError (native)
    at Object.fs.open (fs.js:575:11)
I've also tried using an array as buffer, no joy.
The following solution creates the tar, then pipes it to some buffer and sends it immediately, with great speed, thanks to the tar-fs library:
First install the libraries: request, for simplified requests, and tar-fs, which provides filesystem bindings for tar-stream: npm i -S tar-fs request
var tar = require('tar-fs')
var request = require('request')
var fs = require('fs')

// pack specific files in the directory
function packTar (folderName, pathsArr) {
    return tar.pack(folderName, {
        entries: pathsArr
    })
}

// return put stream
function makePutReq (url) {
    return request.put(url)
}

packTar('./testFolder', ['test.txt', 'test1.txt'])
    .pipe(makePutReq('https://www.example.com/put'))
I have renamed the functions to be deliberately verbose.
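If you genuinely need the whole archive in memory as a Buffer rather than a stream, here is a sketch that collects the stream chunks into one Buffer (assuming the same tar-fs pack as above; tarToBuffer is a hypothetical helper name):

const tar = require('tar-fs');

function tarToBuffer(folderName, cb) {
    const chunks = [];
    tar.pack(folderName)
        .on('data', chunk => chunks.push(chunk))   // accumulate raw chunks
        .on('error', err => cb(err))
        .on('end', () => cb(null, Buffer.concat(chunks)));
}

tarToBuffer('./testFolder', (err, buf) => {
    if (err) throw err;
    // buf now holds the entire tar archive; send it over HTTP PUT.
});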
