Unable to delete a file with nodejs in protractor - node.js

I'm trying to delete a file with Node.js fs, and I notice that the file is generated and then the delete is attempted (and fails) before the file has even been uploaded in the browser with Protractor. The generate and delete file functions are written using Node.js fs.
How can I order these calls so that the script waits until the file is uploaded and only then deletes it?
helper.generateFile(filePath);
helper.uploadFile(UploadButtonElement, filePath);
uploadButtonElm.click();
helper.deleteFile(filePath);
Is there a way to execute deleteFile only after the two actions below have completed?
helper.uploadFile(UploadButtonElement, filePath);
uploadButtonElm.click();
Thanks.

Protractor operations only schedule promises to do things; they do not do them immediately. As a result, your helper functions end up running well before any of the Protractor code has actually accomplished what you asked. Use then to chain your dependencies explicitly, like so:
helper.generateFile(filePath);
helper.uploadFile(UploadButtonElement, filePath);
uploadButtonElm.click().then(function() {
    helper.deleteFile(filePath);
});
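Since helper.deleteFile presumably wraps fs.unlink (a guess on my part, as the helper's code is not shown), a minimal sketch is to have it return a promise, so the deletion itself also participates in the chain:

const fs = require('fs');

// Hypothetical helper implementation: resolve only once the file is actually gone.
helper.deleteFile = function (filePath) {
    return new Promise(function (resolve, reject) {
        fs.unlink(filePath, function (err) {
            if (err) return reject(err);
            resolve();
        });
    });
};

// In the test: the deletion only starts after the click has resolved,
// and returning the promise keeps the chain waiting for it.
uploadButtonElm.click().then(function () {
    return helper.deleteFile(filePath);
});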
Please read https://github.com/angular/protractor/blob/master/docs/control-flow.md and https://code.google.com/p/selenium/wiki/WebDriverJs#Understanding_the_API

For reference, a helper along the same lines for cleaning up files that have already landed in the user's Downloads folder:
function deleteAlreadyDownloadedFiles(extension, username) {
    const fs = require('fs');
    const os = require('os');
    const path = require('path');

    // Fall back to the current OS user if no username was passed in.
    const user = username || os.userInfo().username;
    console.log('Username here is: ' + user);
    const directory = '/Users/' + user + '/Downloads/';

    fs.readdir(directory, (err, files) => {
        if (err) throw err;
        for (const file of files) {
            // Only delete files with the requested extension, e.g. '.pdf'.
            if (extension && path.extname(file) !== extension) continue;
            fs.unlink(path.join(directory, file), err => {
                if (err && (err.code === "EACCES" || err.code === "EPERM")) {
                    console.log("Could not delete (locked or permission denied): " + file);
                }
            });
        }
    });
}

Related

Best way to copy a directory from an external drive to a local folder with electronjs?

Just wondering if anyone has ever attempted to copy a directory from an external drive (connected via USB) to a local folder.
I am using ElectronJS so I can use my JavaScript and HTML/CSS skills to create a desktop application without using a C-family language (i.e. C# or C++). With ElectronJS there's a lot less to worry about.
Here is the list of things I've tried so far:
basic fs.copyFile (using copyFile initially, with the plan to then loop round the directory to copy all files)
var fs = require('fs');

window.test = () => {
    fs.moveSync("targetFile", "destDir", function(err) {
        if (err) {
            console.log(err);
        } else {
            console.log("copy complete")
        }
    });
}
This fails with "fs.moveSync is not a function", even though Visual Studio Code brought up moveSync as a suggestion when I typed fs. (Ctrl + Space)
using child_process functions to copy files using the command line.
Code is:
var process = require('child_process')

window.test = function() {
    process.exec('ipconfig', function(err, stdout, stderr) {
        if (err) {
            console.log(err);
        } else {
            console.log(stdout)
        }
    })
}
This was then bundled with browserify. Bundle.js is imported into the HTML file and the test function is called on the click of a button. I'm aware the command is ipconfig for now; this was merely used to see if a command could be executed. It appears it could not, because I was getting "process.exec is not defined".
use the node-hid node module to read and transfer data from the external drive.
The exposed functions within this module were also reported as being undefined. As I thought about the use case longer, I decided a simple copy process should suffice, because the external drive can be accessed like any other folder in the file explorer.
Unfortunately, all of the above have failed and I've spent most of the day looking for alternative modules and/or solutions.
Thanks in advance because any help to achieve this would be much appreciated.
Thanks
Patrick
The npm package fs-extra should solve your problem.
It has the move function, which "moves a file or directory, even across devices".
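For example, a short sketch using fs-extra (the source and destination paths here are placeholders, not from the question):

const fse = require('fs-extra');

// Move the directory; use fse.copy instead if the original should stay on the external drive.
fse.move('/Volumes/MyUSB/someDir', '/Users/patrick/Documents/someDir')
    .then(() => console.log('copy complete'))
    .catch(err => console.log(err));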
I ended up adding this to my preload.js:
window.require = require;
It will work for now but is due to be deprecated.
I'll use this for now and make other updates when I have to.

How to download a directory's content via ftp using nodejs?

So, I am trying to download the contents of a directory via SFTP using Node.js, and so far I am getting stuck with an error.
I am using the ssh2-sftp-client npm package and for the most part it works pretty well, as I am able to connect to the server and list the files in a particular remote directory.
Using the fastGet method to download a file also works without any hassle, and since all the methods are promise based, I assumed I could download all the files in the directory simply enough by doing something like:
let main = async () => {
    await sftp.connect(config.sftp);
    let data = await sftp.list(config.remote_dir);
    if (data.length) data.map(async x => {
        await sftp.fastGet(`${config.remote_dir}/${x.name}`, config.base_path + x.name);
    });
}
So it turns out the code above successfully downloads the first file, but then crashes with the following error message:
Error: Failed to get sandbox/demo2.txt: The requested operation cannot be performed because there is a file transfer in progress.
This seems to indicate that the promise from fastGet is resolving too early as the file transfer is supposed to be over when the next element of the file list is processed.
I tried to use the more traditional get() instead, but it uses streams and fails with a different error. After researching, it seems there has been a breaking change regarding streams in Node 10.x. In my case calling get simply fails (it does not even download the first file).
Does anyone know a workaround to this? or else, another package that can download several files by sftp?
Thanks!
I figured out that, since the issue was concurrent download attempts over one client connection, I could try managing one client per file download. I ended up with the following recursive function.
// Client is the constructor exported by ssh2-sftp-client; conns is a shared array of clients.
let getFromFtp = async (arr) => {
    if (arr.length == 0) return (processFiles());
    let x = arr.shift();
    conns.push(new Client());
    let idx = conns.length - 1;
    await conns[idx].connect(config.sftp.auth);
    await conns[idx]
        .fastGet(`${config.sftp.remote_dir}/${x.name}`, `${config.dl_dir}${x.name}`);
    await conns[idx].end();
    getFromFtp(arr);
};
Notes about this function:
The array parameter is a list of files to download, presumably fetched using list() beforehand
conns was declared as an empty array and is used to contain our clients.
Using Array.prototype.shift() gradually depletes the array as we go through the file list.
The processFiles() method is fired once all the files have been downloaded.
This is just the POC version; of course error handling still needs to be added.
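For comparison, an alternative sketch (not from the original answer) that avoids the concurrency problem with a single client by downloading the files one at a time, reusing the config names from the question:

const Client = require('ssh2-sftp-client');

let downloadSequentially = async (files) => {
    const sftp = new Client();
    await sftp.connect(config.sftp);
    for (const x of files) {
        // Each fastGet is awaited, so only one transfer is ever in progress.
        await sftp.fastGet(`${config.remote_dir}/${x.name}`, config.base_path + x.name);
    }
    await sftp.end();
};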

How to add async callbacks in node to a function call?

Question is too broad / unclear. Anyone interested in this answer would be better served by visiting: Creating Callbacks for required modules in node.js
Basically I have included a CLI package in my Node application. I need the CLI to spin up a new project (this entails creating a folder for the project). After the project folder is created, I need to create some files in the folder (using fs writeFile). The problem is that right now my writeFile function executes BEFORE the folder is created by the CLI package (this is detected by my console.log). This brings me to my main question.
Can I add an async callback function to the CLI.new without modifying the package I included?
FoundationCLI.new(null, {
    framework: 'sites',   // 'apps' or 'emails' also
    template: 'basic',    // 'advanced' also
    name: projectName,
    directory: $scope.settings.path.join("")
});
try {
    if (!fs.existsSync(path)) {
        console.log("DIRECTORY NOT THERE!!!!!");
    }
    fs.writeFileSync(correctedPath, JSON.stringify(project), 'utf-8');
} catch (err) {
    throw err;
}
It uses foundation-cli. The new command executes the following async series. I'd love to add a callback to the package - still not quite sure how.
async.series(tasks, finish);
Anyone interested in this can probably get mileage out of:
Creating Callbacks for required modules in node.js
The code for the new command seems to be available at https://github.com/zurb/foundation-cli/blob/master/lib/commands/new.js
This code was not written to allow programmatic usage of the new command (it uses console.log everywhere) and does not call any callback when the work is finished.
So no, there is no way to use this package to do what you are looking for. Either patch the package or find another way to achieve what you want.
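If patching the package is not an option, one possible workaround (just a sketch, reusing the path, correctedPath and project variables from the question) is to poll for the folder the CLI creates before writing the file:

const fs = require('fs');

// Poll until dirPath exists, then invoke the callback (or fail after timeoutMs).
function waitForDirectory(dirPath, timeoutMs, callback) {
    const started = Date.now();
    (function check() {
        if (fs.existsSync(dirPath)) return callback(null);
        if (Date.now() - started > timeoutMs) {
            return callback(new Error('Timed out waiting for ' + dirPath));
        }
        setTimeout(check, 200);
    })();
}

// After calling FoundationCLI.new(...):
waitForDirectory(path, 10000, function (err) {
    if (err) throw err;
    fs.writeFileSync(correctedPath, JSON.stringify(project), 'utf-8');
});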

Including concatenated scripts asynchronously with gulp.js

I'm trying to do the following:
gulp.src('js/*.js')
.pipe(concat('_temp.js'))
.pipe(gulp.dest('js/'));
gulp.src('build/js/app.js')
.pipe(replace('// includes', fs.readFileSync('js/_temp.js')))
.pipe(uglify())
.pipe(rename('app.min.js'))
.pipe(gulp.dest('build/js'));
So, I'm concatenating all .js files in js/. The concatenated file is named _temp.js.
After that, I'm reading app.js and trying to replace the // includes string with the contents of the concatenated _temp.js file. This doesn't work, as Node says the file doesn't exist. It does exist though, so the second time I run this task, it works.
This most probably has to do with asynchronicity, but I'm unable to see how I would do this differently.
You can split your task into 2 and make one of the tasks run after another by using Gulp's dependency resolution system.
gulp.task('task1', function () { ... });
gulp.task('task2', ['task1'], function () { ... });
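Applied to the gulpfile from the question, a rough sketch (gulp 3.x task-dependency syntax, assuming gulp-concat, gulp-replace, gulp-uglify and gulp-rename are the plugins in use, since the question does not name them) could look like this:

var gulp = require('gulp');
var concat = require('gulp-concat');
var replace = require('gulp-replace');
var uglify = require('gulp-uglify');
var rename = require('gulp-rename');
var fs = require('fs');

gulp.task('concat-temp', function () {
    // Returning the stream lets gulp know when this task has finished.
    return gulp.src('js/*.js')
        .pipe(concat('_temp.js'))
        .pipe(gulp.dest('js/'));
});

gulp.task('build-app', ['concat-temp'], function () {
    // Runs only after concat-temp is done, so _temp.js exists by now.
    return gulp.src('build/js/app.js')
        .pipe(replace('// includes', fs.readFileSync('js/_temp.js', 'utf8')))
        .pipe(uglify())
        .pipe(rename('app.min.js'))
        .pipe(gulp.dest('build/js'));
});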

nodejs require() json from file garbage collection

I'm using a file to store JSON data. My module makes CRUD actions on the file, and I'm using require() to load the JSON instead of fs.readFile(). The issue is that if the file is deleted using fs.unlink(), calling require on the file again still loads it... even though it has just been deleted. I'm a bit lost on how to get around this; is it possibly related to garbage collection?
Example:
fs.writeFile('foo.json', JSON.stringify({foo: "bar"}), function() {
    var j = require('./foo.json')
    fs.unlink('./foo.json', function() {
        console.log('File deleted')
        var j = require('./foo.json')
        console.log(j)
    })
})
When loading a module using require, Node.js caches the loaded module internally so that subsequent calls to require do not need to access the drive again. The same is true for .json files, when loaded using require.
That's why your file still is "loaded", although you deleted it.
The solution is to use the appropriate function for loading the file, which you already mentioned: fs.readFile(). Once you use that, everything will work as expected.
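For illustration, a minimal sketch of the question's example rewritten with fs.readFile (same hypothetical foo.json path):

const fs = require('fs');

fs.writeFile('foo.json', JSON.stringify({foo: "bar"}), function() {
    fs.readFile('./foo.json', 'utf8', function(err, data) {
        if (err) throw err;
        var j = JSON.parse(data);
        fs.unlink('./foo.json', function() {
            console.log('File deleted');
            fs.readFile('./foo.json', 'utf8', function(err) {
                // The read now really hits the disk, so it fails once the file is gone.
                if (err) console.log(err.code); // 'ENOENT'
            });
        });
    });
});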
