node.js and ncp module - fails to copy single file

I am using Node.js v6.3.1 and ncp v2.0.0
I can only get ncp to copy the contents of a directory, but not a single file within that directory.
Here is the working code that recursively copies the contents of a directory:
var ncp = require("ncp").ncp;
ncp("source/directory/", "destination/directory/", callback);
...and here is the same code but with a file as the source:
var ncp = require("ncp").ncp;
ncp("source/directory/file.txt", "destination/directory/", callback);
From this, all I can think is that maybe ncp was specifically designed to copy directories recursively, not single files?
I had thought about using something like the fs module's read/write stream functions as described here, but for consistency I was hoping to stick with ncp.
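For reference, here is a minimal sketch of what that stream-based fallback would look like (the copyFile helper is my own illustration, not part of any package); the event wiring is exactly the extra verbosity I was hoping to avoid:
var fs = require("fs");

// Copy a single file by piping a read stream into a write stream.
function copyFile(source, destination, callback) {
    var readStream = fs.createReadStream(source);
    var writeStream = fs.createWriteStream(destination);
    readStream.on("error", callback);
    writeStream.on("error", callback);
    writeStream.on("close", function () { callback(null); });
    readStream.pipe(writeStream);
}

copyFile("source/directory/file.txt", "destination/directory/file.txt", callback);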
Update:
I have found another package called node-fs-extra, which does what I want without requiring me to add event handlers to the operations, as I would have to do with the fs read/write stream solution.
Here is the code that is working:
var fsExtra = require("fs-extra");
fsExtra.copy("source/directory/file.txt", "destination/directory/file.txt", callback);
Obviously this is still inconsistent, but at least it is a little less verbose.

OK, I have figured out what I was doing wrong.
I was trying to copy a file into a directory, whereas I needed to copy and name the file inside the directory.
So here is my original code that does not work:
var ncp = require("ncp").ncp;
ncp("source/directory/file.txt", "destination/directory/", callback);
...and here is the fixed, working code; notice the inclusion of a file name in the destination path:
var ncp = require("ncp").ncp;
ncp("source/directory/file.txt", "destination/directory/file.txt", callback);
So it looks like ncp won't just take the file as-is; it needs you to specify the file name at the destination end to copy successfully. I had assumed it would just copy the file under the same name into the destination directory.
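If you would rather not repeat the file name, one option (a small sketch using Node's built-in path module, not something ncp does for you) is to derive the destination from the source:
var path = require("path");
var ncp = require("ncp").ncp;

var source = "source/directory/file.txt";
// Reuse the source file's base name inside the destination directory.
var destination = path.join("destination/directory", path.basename(source));
ncp(source, destination, callback);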

Related

Node: What is the right way to delete all the files from a directory?

So I was trying to delete all the files inside a folder using Node.
I came across two methods.
Method 1
Delete the folder using rmdir. If I then plan on adding images to the same folder, I use mkdir to create the same folder again and append the files to it.
Example: I have an Add Files and a Delete All button. When I click Delete All, the folder gets deleted, and when I click Add, the folder gets created and the file gets added to that folder.
Method 2
Using readdir, I loop through the files, store them in an array, and then delete only the files instead of the folder.
Which is the best way to do it? If it's not among these, please advise a better solution.
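For reference, Method 2 needs nothing beyond the built-in fs module; here is a minimal synchronous sketch (the uploads folder name is illustrative):
const fs = require('fs');
const path = require('path');

const dir = 'uploads'; // illustrative folder name

// Delete every file inside the folder while keeping the folder itself.
for (const name of fs.readdirSync(dir)) {
    fs.unlinkSync(path.join(dir, name));
}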
The rm function of ShellJS will do the trick. It works as a one-liner, works cross-platform, and is well tested and documented. It even supports recursive deletes.
Basically, something such as:
const { rm } = require('shelljs');
rm('-rf', '/tmp/*');
(Sample code taken from ShellJS' documentation.)
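Note that the /tmp/* glob deletes the contents of /tmp while leaving the directory itself in place, which is effectively Method 2 without the manual readdir loop; passing the directory path itself to rm('-rf', ...) would remove the folder as well, as in Method 1.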

Alternative to fs.readdirSync for a large directory in Node

I have a single directory with a few million json files in it. I ultimately want to iterate over each file in the directory, read it, do something with the information and then write something into a database.
My script works perfectly when I use a test directory with a few hundred files. However, it stalls when I use the real directory. I strongly believe that I have pinpointed the problem to the use of:
fs.readdirSync('my dir path')
Converting this to the async version would not help, since I need the file names before anything else can happen anyway. However, my belief is that this operation hangs because it simply "takes too long" to read the entire directory.
For reference here is a broader portion of the function:
function traverseFS(){
    var path = 'my dir name and path';
    var files = fs.readdirSync(path);
    for (var i in files) {
        // Build the full path of the current file.
        var currentFile = path + '/' + files[i];
        var fileText = fs.readFileSync(currentFile, 'utf8');
        var json = JSON.parse(fileText);
        if (json) {
            // do something
        }
    }
}
My question is either:
Is there something I can do to get this to work using readdirSync?
Is there another operation I should be using?
You would need to either use a child process (easiest) that creates a directory listing and parse that, or write your own streamable binding to scandir() (on *nix) and/or whatever the equivalent is on Windows, and use that. For the latter, you may want to use the libuv code (*nix, Windows) as a guide.
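A minimal sketch of the child-process route, assuming a *nix system (ls -f skips sorting, which matters on huge directories; the listHugeDir helper name is illustrative):
const { spawn } = require('child_process');
const readline = require('readline');

function listHugeDir(dirPath, onFile) {
    // Stream the directory listing line by line instead of
    // materializing millions of names in memory at once.
    const ls = spawn('ls', ['-f', dirPath]);
    const rl = readline.createInterface({ input: ls.stdout });
    rl.on('line', (name) => {
        if (name === '.' || name === '..') return;
        onFile(dirPath + '/' + name);
    });
    ls.on('error', (err) => console.error('ls failed:', err));
}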

Node.js: Writing a directory to an archive, without the path to this directory

This task is not hard, but I have run into some trouble.
Maybe someone has already had a similar problem.
My task is to write a folder, with files and other folders in it, to a zip archive using Node.js.
I am trying to use the AdmZip package.
Project folder structure:
Archive
---filesFolder
var AdmZip = require('adm-zip');
let Archive = new AdmZip();
Archive.addLocalFolder('Archive/filesFolder', '');
Archive.writeZip('Archive/newArchive.zip');
I should get an archive containing 'filesFolder', but instead I get an archive containing an 'Archive' folder with 'filesFolder' inside it.
Does anybody know how to record only the target folder, and not the chain of parent folders?
What happens is that you are providing Archive/filesFolder as the value to writeZip, and that means: include the Archive folder in the zip and, inside that, include filesFolder.
For testing purposes, change the value passed to writeZip() to just filesFolder.zip, and it should zip the content of Archive as filesFolder.zip in the current working directory. See below (you can copy/paste this bit of code and run it, and it should work).
var zip = new AdmZip();
// I don't know why you provided a second argument to below, I removed it
// There was nothing about it in the documentation.
zip.addLocalFolder('./Archive');
//rename the value of the following to just filesFolder.zip
zip.writeZip('filesFolder.zip');
The above should output the content of Archive to the current working directory as filesFolder.zip.
I did mention this in my comment, and your comment seems to indicate that you have a path issue, so try the above script.
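For what it's worth, newer adm-zip documentation does describe an optional second argument to addLocalFolder (the path the folder should have inside the archive), which would keep the folder name without its parents; a small sketch, assuming the same layout as above:
var AdmZip = require('adm-zip');

var zip = new AdmZip();
// The second argument is the in-archive path, so the parent 'Archive'
// directory never appears inside the zip, only 'filesFolder' does.
zip.addLocalFolder('Archive/filesFolder', 'filesFolder');
zip.writeZip('Archive/newArchive.zip');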

Best ways to merge folders using nodejs, grunt or gulp

I have two folders and I just want to merge them (overwriting BaseFolder files with ExtendedFolder ones when present)
Example:
BaseFolder
---main.js
---test.js
ExtendedFolder
---main.js
Result expected:
ResultFolder
---main.js (from ExtendedFolder)
---test.js (from BaseFolder)
A similar question has been asked but without a satisfying answer: Grunt. Programmatically merge corresponding files in parallel folders with concat
EDIT: I just realized my previous answer wouldn't work. I am posting new code.
If you just have two specified folders and want to merge their content, it would be pretty straightforward with gulp (this assumes that the folder names are known before and don't change):
var gulp = require('gulp');
gulp.task('moveBase',function(){
return gulp.src('BaseFolder/*')
.pipe(gulp.dest('ResultFolder/'));
});
gulp.task('moveExtended',['moveBase'],function(){
return gulp.src('ExtendedFolder/*')
.pipe(gulp.dest('ResultFolder/'));
});
gulp.task('mergeFolders',['moveBase','moveExtended']);
The files in BaseFolder having the same names as files in ExtendedFolder get overwritten.
The key here is the order of copying. First, copy the folder whose files should be overwritten in case of a conflict. You can split the copying into two tasks and take advantage of the dependency system - task moveExtended depends on moveBase which ensures that it will be copied later.
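For anyone on gulp 4, where the task dependency arrays were removed, a rough equivalent (a sketch, not tested against your layout) uses gulp.series to get the same ordering guarantee:
const gulp = require('gulp');

function moveBase() {
    return gulp.src('BaseFolder/*').pipe(gulp.dest('ResultFolder/'));
}

function moveExtended() {
    return gulp.src('ExtendedFolder/*').pipe(gulp.dest('ResultFolder/'));
}

// series() runs moveExtended after moveBase, so Extended files win.
exports.mergeFolders = gulp.series(moveBase, moveExtended);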
I was able to do exactly what I wanted using https://www.npmjs.org/package/event-stream
var gulp = require('gulp')
, es = require('event-stream');
gulp.task('web_dev', function () {
    // Return the merged stream so gulp knows when the task completes.
    return es.merge(gulp.src('./BaseFolder/**/*')
        , gulp.src('./ExtendedFolder/**/*'))
        .pipe(gulp.dest('out'));
});

How to copy file in node.js (including modified time)?

I am able to copy a file in node.js using the following:
var fs = require('fs');
var readStream = fs.createReadStream(fromFilePath);
readStream.pipe(fs.createWriteStream(toFilePath));
The question is how to also copy/keep the modified time (mtime) like in a regular file copy command.
There are methods in the fs module to access mtime:
var stat = fs.statSync(fromFilePath);
fs.utimesSync(toFilePath, stat.atime, stat.mtime);
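Putting the two together, a minimal sketch of a copy that preserves timestamps (the copyWithTimes helper name is illustrative; fs.copyFileSync requires Node 8.5+):
var fs = require('fs');

function copyWithTimes(fromFilePath, toFilePath) {
    // Copy the bytes, then carry over the source's access/modified times.
    fs.copyFileSync(fromFilePath, toFilePath);
    var stat = fs.statSync(fromFilePath);
    fs.utimesSync(toFilePath, stat.atime, stat.mtime);
}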
Use https://nodejs.org/api/fs.html#fs_fs_copyfile_src_dest_flags_callback.
The documentation does not say it, but based on my tests it does keep/set the modified time (mtime) to be the same as in the source file, at least on Windows 10.
It does set the created time to the time the copy was made. But your question is about the modified time, so this is probably the simplest way to get what you want.
BTW, I find it curious that it now seems like the file was modified before it was created. How could that be! But so it seems, at least on Windows 10. I guess that's a good hint for us that the file was copied from somewhere else.
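For completeness, a minimal call of that API in callback form (paths are illustrative):
const fs = require('fs');

// Copies source.txt to destination.txt; in my tests on Windows 10 the
// copy kept the source's mtime, though the docs don't promise this.
fs.copyFile('source.txt', 'destination.txt', (err) => {
    if (err) throw err;
});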
