What is wrong in this section - node.js

app.get('/dir/:dirname', (req, res) => {
    const isFile = fileName => {
        return fs.lstatSync(fileName).isFile();
    }
    var retString = '';
    var dir = `d:\\${req.params.dirname}`;
    console.log(dir);
    retString += '<table>';
    fs.readdirSync(dir).map(fileName => {
        console.log(fileName);
        //retString += `<tr><td>${dir}</td><td><a href='${path.join(dir, fileName)}'>${fileName}</a></td></tr>`;
        retString += `<tr><td>${dir}</td><td>${fileName}</td></tr>`;
    }).filter(isFile);
    retString += '</table>';
    res.send(retString);
    res.end();
});
It delivers the file names, but runs into an error after the end of the list.
What did I miss?

Your .map() callback is not returning anything. That means you pass an array of undefined values to .filter(), which then passes undefined to fs.lstatSync(), and that causes your error.
You shouldn't call both res.send() and res.end(). res.send() already ends the response, so calling res.end() afterwards can cause an error.
res.end() is for when you are using res.write(), which can be called multiple times and does not end the response.
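To illustrate the difference, a minimal sketch:

// res.send() writes the body and ends the response in one call:
res.send('<table>...</table>');

// res.write() can be called repeatedly; the response stays open
// until you call res.end():
res.write('<table>');
res.write('<tr><td>one row</td></tr>');
res.end('</table>');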
Also, your .filter(isFile) is not doing anything useful. You're building the HTML before you filter, and then not saving or using the result of the filter. You need to filter before you map, as in:
fs.readdirSync(dir).filter(isFile).map(...)
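One caveat if you go that route: your isFile helper receives bare file names, so fs.lstatSync() needs the full path joined in. A sketch of the working chain (assuming path has been required):

const rows = fs.readdirSync(dir)
    .filter(fileName => fs.lstatSync(path.join(dir, fileName)).isFile())
    .map(fileName => `<tr><td>${dir}</td><td>${fileName}</td></tr>`)
    .join('');
retString += rows;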
Here's how your code looks using asynchronous file I/O, the withFileTypes option, and some error handling added:
app.get('/dir/:dirname', async (req, res) => {
    try {
        let retString = '';
        // this is potentially dangerous as ANY user of this server
        // can browse anywhere on your d: drive
        const dir = `d:\\${req.params.dirname}`;
        console.log(dir);
        retString += '<table>';
        let entries = await fs.promises.readdir(dir, { withFileTypes: true });
        for (let entry of entries) {
            if (entry.isFile()) {
                console.log(entry.name);
                retString += `<tr><td>${dir}</td><td>${entry.name}</td></tr>`;
            }
        }
        retString += '</table>';
        res.send(retString);
    } catch (e) {
        console.log(e);
        res.sendStatus(500);
    }
});
Other Issues:
This is potentially dangerous code, as it allows anyone with access to your server to browse anywhere they want on your d: drive with no restrictions.
This presumably should return a fully formed web page, not just an HTML table.
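If the route ever does face untrusted users, a minimal sketch of one common guard: resolve the requested path against a fixed root and reject anything that escapes it (ROOT and safeJoin here are assumed names, not part of the original code):

const path = require('path');

const ROOT = 'd:\\shared'; // the one directory you actually intend to expose

function safeJoin(root, requested) {
    // resolve the requested name against the root...
    const resolved = path.resolve(root, requested);
    // ...and reject it if it escaped the root (e.g. "..\\secrets")
    if (!resolved.startsWith(path.resolve(root) + path.sep)) {
        return null;
    }
    return resolved;
}

// in the route:
// const dir = safeJoin(ROOT, req.params.dirname);
// if (!dir) return res.sendStatus(403);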

Thank you jfriend - this did it.
I am beginning to understand node.js.
Don't worry about security - this is running on a local network only and will be limited before getting used elsewhere.
I am aware of this leak.
But thank you for this hint as well, because others might need this reminder.

Related

Get return value of module in calling file

My root node file requires a module called q1 (not including all the required libraries, as they're not relevant):
const analyzeSentiment = function(message) {
    sentiment.getSentiment(message).then(result => {
        return (result.vote === 'positive') ? handlePositive() : handleNegative();
    });
}

const handlePositive = function() {
    return `That's great, we have an opening next Friday at 3pm. Would that work for you?`;
}

const handleNegative = function() {
    return `That's okay. Thanks for your time. If you change your mind, give us a call at (xxx) yyy-zzzz.`;
}

exports.analyzeSentiment = analyzeSentiment;
I call it like this: const message = require('q1').analyzeSentiment('text string');
With console logging I can see that it makes it down into the proper handlePositive or handleNegative method, but nothing comes back. I've tried a few different ways but can't get it to work. Does anyone have any suggestions, or see something blatantly wrong that I'm doing? This is my first time working with node.
Your function analyzeSentiment is not returning anything (see the explanation further down).
Try this:
const analyzeSentiment = function(message) {
    return sentiment.getSentiment(message).then(result => {
        return (result.vote === 'positive') ? handlePositive() : handleNegative();
    });
}
And in your caller:
require('q1').analyzeSentiment('text string').then(message => {
    // Do your thing with the message here
});
Alternatively, if you are in an async context you can use await on the caller:
const message = await require('q1').analyzeSentiment('text string');
You might be wondering why the return (result.vote === ... isn't returning from your analyzeSentiment function. The reason is that you are creating an anonymous function with the arrow expression result => ... in the then block; the return there returns from that inner function (resolving the promise), not from analyzeSentiment itself.
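The same fix reads a little more directly with async/await; this is just a sketch equivalent to the .then() version above:

const analyzeSentiment = async function(message) {
    const result = await sentiment.getSentiment(message);
    return (result.vote === 'positive') ? handlePositive() : handleNegative();
};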

Various axios calls, each relative to the previous one

I have an array with a variable number of URLs, and I must merge the data fetched with axios.
The problem is that every axios call is relative to the data of the previous one.
If I had a fixed number of URLs I could nest the axios calls and live with that.
I'm thinking of using something like this:
var urls = ["xx", "xx", "xx"];
mergeData(urls);

function mergeData(myarray, myid = 0, mydata = "none") {
    var myurl = "";
    if (Array.isArray(mydata)) {
        myurl = myarray[myid];
        // do my stuff with data and modify the url
    } else {
        myurl = myarray[myid];
    }
    axios.get(myurl)
        .then(response => {
            // do my stuff and get the data i need and put on an array
            if (myarray.length < myid) {
                mergeData(myarray, myid + 1, data);
            } else {
                // show result on ui
            }
        })
        .catch(error => {
            console.log(error);
        });
}
but I don't like it.
Is there another solution?
(be kind, I'm still learning ^^)
Just to be clear, I need to obtain:
an HTTP request to the first URL, parse the JSON, save some data (some of it needed for the output);
another HTTP request to the second URL with one or more parameters from the previous data, parse the JSON, save some data (some of it needed for the output);
... and so on, for 5 to 10 times.
If your goal is to make subsequent HTTP calls based on information you get from previous calls, I'd utilize async/await and for...of to accomplish this instead of relying on a recursive solution.
async function mergeData(urls) {
    const data = [];
    for (const url of urls) {
        const result = await axios.get(url).then(res => res.data);
        console.log(`[${result.id}] ${result.title}`);
        // here, do whatever you want to do with
        // `result` to make your next call...
        // for now, I am just going to append each
        // item to `data` and return it at the end
        data.push(result);
    }
    return data;
}

const items = [
    "https://jsonplaceholder.typicode.com/posts/1",
    "https://jsonplaceholder.typicode.com/posts/2",
    "https://jsonplaceholder.typicode.com/posts/3"
];

console.log("fetching...");
mergeData(items)
    .then(function(result) {
        console.log("done!");
        console.log("final result", result);
    })
    .catch(function(error) {
        console.error(error);
    });
Using async/await allows you to utilize for...of which will wait for each call to resolve or reject before moving onto the next one.
To learn more about async/await and for...of, have a look here:
developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function
developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for...of
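Since your next URL depends on the previous response, the commented spot in the loop is where you'd build it. A sketch of that shape - the ?ref= query parameter and the id field are assumed placeholders, not part of any real API:

async function mergeData(urls) {
    const data = [];
    let previous = null;
    for (const url of urls) {
        // derive the real URL from the previous response, if there is one
        const target = previous ? `${url}?ref=${previous.id}` : url;
        const result = await axios.get(target).then(res => res.data);
        data.push(result);
        previous = result;
    }
    return data;
}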
Hope this helps.

How to handle sync browser emulations in node.js

I'm writing a script that is intended to load some stuff from .txt files and then perform multiple requests (in a loop) to a website with Nightmare, Node.js's browser emulator.
I have no problem with reading from the txt files and so on, but I can't manage to make it run synchronously and without exceptions.
function visitPage(url, code) {
    new Promise((resolve, reject) => {
        Nightmare
            .goto(url)
            .click('.vote')
            .insert('input[name=username]', 'testadmin')
            .insert('.test-code-verify', code)
            .click('.button.vote.submit')
            .wait('.tag.vote.disabled,.validation-error')
            .evaluate(() => document.querySelector('.validation-error').innerHTML)
            .end()
            .then(text => {
                return text;
            })
    });
}
async function myBackEndLogic() {
    try {
        var br = 0, user, proxy, current, agent;
        while (br < loops) {
            current = Math.floor(Math.random() * (maxLoops - br - 1));
            /* ...getting user and so on... */
            const response = await visitPage('https://example.com/admin/login', "code");
            br++;
        }
    } catch (error) {
        console.error('ERROR:');
        console.error(error);
    }
}

myBackEndLogic();
The error that occurs is:
UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'webContents' of undefined
So the questions are a few:
1) How to fix the exception?
2) How to make it actually work synchronously and emulate the address every time (in a previous attempt, which I didn't save, I fixed the exception, but the browser wasn't actually opening and was basically skipped).
3) (Not so important) Is it possible to select a few objects with
.wait('.class1,.class2,.validation-error')
and save each value in a different variable, or just get the text from the first one that occurred? (If none of these occurred, then return 0, for example.)
I see a few issues with the code above.
In the visitPage function, you are wrapping everything in a new Promise. That's fine, except you don't have to create the wrapping promise - Nightmare already returns a promise for you. As written, you're dropping any errors that promise produces by wrapping it (and the wrapper is never resolved or returned). Instead - just use an async function!
async function visitPage(url, code) {
    return Nightmare
        .goto(url)
        .click('.vote')
        .insert('input[name=username]', 'testadmin')
        .insert('.test-code-verify', code)
        .click('.button.vote.submit')
        .wait('.tag.vote.disabled,.validation-error')
        .evaluate(() => document.querySelector('.validation-error').innerHTML)
        .end();
}
You probably don't want to wrap the content of this method in a 'try/catch'. Just let the promises flow :)
async function myBackEndLogic() {
    var br = 0, user, proxy, current, agent;
    while (br < loops) {
        current = Math.floor(Math.random() * (maxLoops - br - 1));
        const response = await visitPage('https://example.com/admin/login', "code");
        br++;
    }
}
When you run your method - make sure to include a catch! Or a then! Otherwise, your app may exit early.
myBackEndLogic()
    .then(() => console.log('donesies!'))
    .catch(console.error);
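As for question 3: .wait() only resolves when one of the selectors shows up; it doesn't tell you which one. One way (a sketch, reusing the selectors from your code) is to follow it with .evaluate() and check from inside the page, returning 0 if neither matched:

async function visitPage(url, code) {
    return Nightmare
        .goto(url)
        // ... clicks and inserts as before ...
        .wait('.tag.vote.disabled,.validation-error')
        .evaluate(() => {
            // report which element actually appeared
            const el = document.querySelector('.tag.vote.disabled') ||
                       document.querySelector('.validation-error');
            return el ? el.innerHTML : 0;
        })
        .end();
}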
I'm not sure if any of this will help with your specific issue, but hopefully it gets you on the right path :)

Stop function from being invoked multiple times

I'm in the process of building a file upload component that allows you to pause/resume file uploads.
The standard way to achieve this seems to be to break the file into chunks on the client machine, then send the chunks along with book-keeping information up to the server which can store the chunks into a staging directory, then merge them together when it has received all of the chunks. So, this is what I am doing.
I am using node/express and I'm able to get the files fine, but I'm running into an issue because my merge_chunks function is being invoked multiple times.
Here's my call stack:
router.post('/api/videos',
    upload.single('file'),
    validate_params,
    rename_uploaded_chunk,
    check_completion_status,
    merge_chunks,
    record_upload_date,
    videos.update,
    send_completion_notice
);
The check_completion_status function is implemented as follows:
/* Recursively check to see if we have every chunk of a file */
var check_completion_status = function (req, res, next) {
    var current_chunk = 1;
    var see_if_chunks_exist = function () {
        fs.exists(get_chunk_file_name(current_chunk, req.file_id), function (exists) {
            if (current_chunk > req.total_chunks) {
                next();
            } else if (exists) {
                current_chunk++;
                see_if_chunks_exist();
            } else {
                res.sendStatus(202);
            }
        });
    };
    see_if_chunks_exist();
};
The file names in the staging directory have the chunk numbers embedded in them, so the idea is to see if we have a file for every chunk number. The function should only next() one time for a given (complete) file.
However, my merge_chunks function is being invoked multiple times (usually between 1 and 4). Logging does reveal that it's only invoked after I've received all of the chunks.
With this in mind, my assumption here is that it's the async nature of the fs.exists function that's causing the issue.
Even though the n'th invocation of check_completion_status may occur before I have all of the chunks, by the time we get to the nth call to fs.exists(), x more chunks may have arrived and been processed concurrently, so the function can keep going and in some cases get to the end and next(). However those chunks that arrived concurrently are also going to correspond to invocations of check_completion_status, which are also going to next() because we obviously have all of the files at this point.
This is causing issues because I didn't account for this when I wrote merge_chunks.
For completeness, here's the merge_chunks function:
var merge_chunks = (function () {
    var pipe_chunks = function (args) {
        args.chunk_number = args.chunk_number || 1;
        if (args.chunk_number > args.total_chunks) {
            args.write_stream.end();
            args.next();
        } else {
            var file_name = get_chunk_file_name(args.chunk_number, args.file_id);
            var read_stream = fs.createReadStream(file_name);
            read_stream.pipe(args.write_stream, {end: false});
            read_stream.on('end', function () {
                // once we're done with the chunk we can delete it and move on to the next one.
                fs.unlink(file_name);
                args.chunk_number += 1;
                pipe_chunks(args);
            });
        }
    };

    return function (req, res, next) {
        var out = path.resolve('videos', req.video_id);
        var write_stream = fs.createWriteStream(out);
        pipe_chunks({
            write_stream: write_stream,
            file_id: req.file_id,
            total_chunks: req.total_chunks,
            next: next
        });
    };
}());
Currently, I'm receiving an error because the second invocation of the function is trying to read the chunks that have already been deleted by the first invocation.
What is the typical pattern for handling this type of situation? I'd like to avoid a stateful architecture if possible. Is it possible to cancel pending handlers right before calling next() in check_completion_status?
If you just want to make it work ASAP, I would use a lock (much like a DB lock) to lock the resource so that only one of the requests processes the chunks. Simply create a unique id on the client and send it along with the chunks. Then store that unique id in some sort of data structure, and look the id up prior to processing. The example below is by far not optimal (in fact, this map will keep growing, which is bad), but it should demonstrate the concept:
// Create a map (an array would work too) and keep track of the video ids
// that were processed. This map will persist through each request.
var processedVideos = {};

var check_completion_status = function (req, res, next) {
    var current_chunk = 1;
    var see_if_chunks_exist = function () {
        fs.exists(get_chunk_file_name(current_chunk, req.file_id), function (exists) {
            if (processedVideos[req.query.uniqueVideoId]) {
                res.sendStatus(202);
            } else if (current_chunk > req.total_chunks) {
                processedVideos[req.query.uniqueVideoId] = true;
                next();
            } else if (exists) {
                current_chunk++;
                see_if_chunks_exist();
            } else {
                res.sendStatus(202);
            }
        });
    };
    see_if_chunks_exist();
};
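To keep that map from growing forever, one option (a sketch, not production-ready) is to clear the entry once the pipeline has finished; release_lock here is a hypothetical final middleware placed after send_completion_notice:

var release_lock = function (req, res, next) {
    // keep the entry around briefly so straggler duplicate requests
    // still hit the processedVideos check, then let it go
    setTimeout(function () {
        delete processedVideos[req.query.uniqueVideoId];
    }, 60 * 1000);
    next();
};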

How to use filesystem's createReadStream with Meteor router (NodeJS)

I need to allow the user of my app to download a file with Meteor. Currently, when the user requests to download a file, I insert into a "fileRequests" collection in Mongo a document with the file location and a timestamp of the request, and return the ID of the newly created request. When the client gets the new ID it immediately goes to mydomain.com/uploads/:id. I then use something like this to intercept the request before Meteor does:
var connect = Npm.require("connect");
var Fiber = Npm.require("fibers");
var path = Npm.require('path');
var fs = Npm.require("fs");
var mime = Npm.require("mime");

__meteor_bootstrap__.app
    .use(connect.query())
    .use(connect.bodyParser()) // I add this for file-uploading
    .use(function (req, res, next) {
        Fiber(function() {
            if (req.method == "GET") {
                // get the id here, and stream the file using fs.createReadStream();
            }
            next();
        }).run();
    });
I check to make sure the file request was made less than 5 seconds ago, and I immediately delete the request document after I've queried it.
This works and is secure (enough), I think. No one can make a request without being logged in, and 5 seconds is a pretty small window for someone to hijack the created request URL, but I just don't feel right with my solution. It feels dirty!
So I attempted to use Meteor-Router to accomplish the same thing. That way I can check if they're logged in correctly without doing the 5-second open-to-the-world trickery.
So here's the code I wrote for that:
Meteor.Router.add('/uploads/:id', function(id) {
    var path = Npm.require('path');
    var fs = Npm.require("fs");
    var mime = Npm.require("mime");
    var res = this.response;

    var file = FileSystem.findOne({ _id: id });
    if (typeof file !== "undefined") {
        var filename = path.basename(file.filePath);
        var filePath = '/var/MeteorDMS/uploads/' + filename;
        var stat = fs.statSync(filePath);

        res.setHeader('Content-Disposition', 'attachment; filename=' + filename);
        res.setHeader('Content-Type', mime.lookup(filePath));
        res.setHeader('Content-Length', stat.size);

        var filestream = fs.createReadStream(filePath);
        filestream.pipe(res);
        return;
    }
});
This looks great, fits right in with the rest of the code and is easy to read, no hacking involved, BUT! It doesn't work! The browser spins and spins and never quite knows what to do. I have ZERO error messages coming up. I can keep using the app on other tabs. I don't know what it's doing, it never stops "loading". If I restart the server, I get a 0 byte file with all the correct headers, but I don't get the data.
Any help is greatly appreciated!!
EDIT:
After digging around a bit more, I noticed that trying to turn the response object into a JSON object results in a circular structure error.
Now the interesting thing about this is that when I listen to the filestream for the "data" event and attempt to stringify the response object, I don't get that error. But if I attempt to do the same thing in my first solution (listen to "data" and stringify the response), I get the error again.
So using the Meteor-Router solution something is happening to the response object. I also noticed that on the "data" event response.finished is flagged as true.
filestream.on('data', function(data) {
    fs.writeFile('/var/MeteorDMS/afterData', JSON.stringify(res));
});
The Meteor router installs a middleware to do the routing. All Connect middleware either MUST call next() (exactly once) to indicate that the response is not yet settled or MUST settle the response by calling res.end() or by piping to the response. It is not allowed to do both.
I studied the source code of the middleware (see below). We see that we can return false to tell the middleware to call next(). This means we declare that this route did not settle the response and we would like to let other middleware do their work.
Or we can return a template name, a text, an array [status, text] or an array [status, headers, text], and the middleware will settle the response on our behalf by calling res.end() using the data we returned.
However, by piping to the response, we already settled the response. The Meteor router should not call next() nor res.end().
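That contract is easiest to see in miniature. A sketch of a Connect middleware obeying it (wantToHandle is a hypothetical predicate standing in for route matching):

app.use(function (req, res, next) {
    if (!wantToHandle(req)) {
        return next(); // not ours: defer exactly once, and never touch res again
    }
    res.setHeader('Content-Type', 'text/plain');
    fs.createReadStream('/some/file').pipe(res); // piping settles the response
    // settled: do NOT also call next() or res.end()
});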
We solved the problem by forking the Meteor router and making a small change. We replaced the else in line 87 (after if (output === false)) with:
else if (typeof(output)!="undefined") {
See the commit with sha 8d8fc23d9c in my fork.
This way return; in the route method will tell the router to do nothing. Of course you already settled the response by piping to it.
Source code of the middleware as in the commit with sha f910a090ae:
// hook up the serving
__meteor_bootstrap__.app
    .use(connect.query()) // <- XXX: we can probably assume accounts did this
    .use(this._config.requestParser(this._config.bodyParser))
    .use(function(req, res, next) {
        // need to wrap in a fiber in case they do something async
        // (e.g. in the database)
        if (typeof(Fiber) == "undefined") Fiber = Npm.require('fibers');

        Fiber(function() {
            var output = Meteor.Router.match(req, res);

            if (output === false) {
                return next();
            } else {
                // parse out the various type of response we can have

                // array can be
                // [content], [status, content], [status, headers, content]
                if (_.isArray(output)) {
                    // copy the array so we aren't actually modifying it!
                    output = output.slice(0);

                    if (output.length === 3) {
                        var headers = output.splice(1, 1)[0];
                        _.each(headers, function(value, key) {
                            res.setHeader(key, value);
                        });
                    }

                    if (output.length === 2) {
                        res.statusCode = output.shift();
                    }

                    output = output[0];
                }

                if (_.isNumber(output)) {
                    res.statusCode = output;
                    output = '';
                }

                return res.end(output);
            }
        }).run();
    });
