gulp plugin looping through all files and folders - node.js

OK, so I am creating a gulp plugin to create a table of contents and imports for my Sass setup.
I have the gulp task set up like this:
var sgc = require('./tasks/sass-generate-contents');

gulp.task('sass-generate-contents', function(){
    gulp.src(config.tests + '/**/*.scss')
        .pipe(sgc(config.tests + '/_main.scss'));
});
and then my plugin code:
function sassGenerateContents(){
    return through.obj(function(file, enc, cb){
        if (file.isNull()) {
            cb(null, file);
            return;
        }
        if (file.isStream()) {
            cb(new gutil.PluginError(PLUGIN_NAME, 'Streaming not supported'));
            return;
        }
        console.log(file);
    });
}
module.exports = sassGenerateContents;
In my file system (the test directory) there are three *.scss files,
but the console.log in the plugin only ever returns the first.
I'm probably missing something really obvious here, but I thought the idea of **/* was to loop through the specified folder structure and pull out the files.
Can someone explain to me why it's only returning the first file?
Thanks

I was missing the callback, which hands control back to the stream so it can process the next file. At the end of the function I needed to add:
return cb();
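Putting it together, a minimal sketch of the working transform (same through2/gulp-util setup as in the question; note that a bare cb() drops the file from the stream, while cb(null, file) also passes it downstream):

var through = require('through2');
var gutil = require('gulp-util');
var PLUGIN_NAME = 'sass-generate-contents';

function sassGenerateContents(){
    return through.obj(function(file, enc, cb){
        if (file.isNull()) {
            cb(null, file);
            return;
        }
        if (file.isStream()) {
            cb(new gutil.PluginError(PLUGIN_NAME, 'Streaming not supported'));
            return;
        }
        console.log(file);
        // hand control back so the stream delivers the next file
        return cb(null, file);
    });
}

module.exports = sassGenerateContents;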

Related

reading a packaged file in aws lambda package

I have a very simple node lambda function which reads the contents of a packaged file in it. I upload the code as a zip file. The directory structure is as follows:
index.js
readme.txt
Then I have this in my index.js file:
fs.readFile('/var/task/readme.txt', function (err, data) {
    if (err) throw err;
});
I keep getting the following error: ENOENT: no such file or directory, open '/var/task/readme.txt'.
I tried ./readme.txt as well.
What am I missing?
Try this, it works for me:
'use strict';

let fs = require("fs");
let path = require("path");

exports.handler = (event, context, callback) => {
    // To debug your problem
    console.log(path.resolve("./readme.txt"));

    // Solution is to use an absolute path via `__dirname`
    fs.readFile(__dirname + '/readme.txt', function (err, data) {
        if (err) throw err;
    });
};
To debug why your code is not working, add the line below to your handler:
console.log(path.resolve("./readme.txt"));
On AWS Lambda the node process might be running from some other folder, and since the path you provided is relative, it looks for readme.txt from that folder. The solution is to use an absolute path.
What worked for me was the comment by Vadorrequest to use process.env.LAMBDA_TASK_ROOT. I wrote a function that gets a template file from a /templates directory, using __dirname when I'm running locally on my machine, or the process.env.LAMBDA_TASK_ROOT variable when running on Lambda:
const fs = require('fs')
const path = require('path')

function loadTemplateFile(templateName) {
    const fileName = `./templates/${templateName}`
    let resolved
    if (process.env.LAMBDA_TASK_ROOT) {
        resolved = path.resolve(process.env.LAMBDA_TASK_ROOT, fileName)
    } else {
        resolved = path.resolve(__dirname, fileName)
    }
    console.log(`Loading template at: ${resolved}`)
    try {
        const data = fs.readFileSync(resolved, 'utf8')
        return data
    } catch (error) {
        const message = `Could not load template at: ${resolved}, error: ${JSON.stringify(error, null, 2)}`
        console.error(message)
        throw new Error(message)
    }
}
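Usage is then identical locally and on Lambda (the template name below is just a hypothetical example):

const template = loadTemplateFile('email.html'); // hypothetical file in /templates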
This is an oldish question, but it comes up first when trying to sort out what's going on with file paths on Lambda.
Additional Steps for Serverless Framework
For anyone using the Serverless framework to deploy (which probably uses webpack to build), you will also need to add the following to your webpack config file (just after target: 'node'):
// assume target: 'node', is here
node: {
    __dirname: false,
},
Without this piece, using __dirname with Serverless will still not get you the desired absolute directory path.
I went through this using the Serverless framework, and it really was that the file was not included in the package. Just add the following lines to serverless.yml:
package:
  individually: false
  include:
    - src/**
// join the Lambda task root with the bundled file's relative path
const filepath = path.join(process.env.LAMBDA_TASK_ROOT, 'filename.text');
const fileData2 = fs.readFileSync(filepath, 'utf-8');
I was using fs.promises.readFile() and couldn't get it to error out at all. The file was there, and LAMBDA_TASK_ROOT seemed right to me as well. After I changed to fs.readFileSync(), it worked.
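A likely explanation, though not confirmed in the original: fs.promises.readFile() reports failure through a rejected promise, so without await plus try/catch (or a .catch()) the ENOENT never surfaces. A minimal sketch with the rejection made visible:

const fs = require('fs');
const path = require('path');

exports.handler = async () => {
    const file = path.join(process.env.LAMBDA_TASK_ROOT, 'readme.txt');
    try {
        return await fs.promises.readFile(file, 'utf-8');
    } catch (err) {
        // without the await/catch, this rejection is silently lost
        console.error('readFile failed:', err);
        throw err;
    }
};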
I had the same problem and tried applying all these wonderful solutions above, which didn't work.
The problem was that I had set up one of the folder names with a letter in upper case that was really lowercase.
So when I tried to fetch the content of /src/SOmething/some_file.txt
while the folder was really /src/Something/, I got this error.
Windows (my local environment) is case insensitive, while AWS is not!

writeFile no such file or directory

I have a file (data.file, an image) that I would like to save. An image with the same name could already exist; I would like to overwrite it if so, or create it if it does not exist. I read that the flag "w" should do this.
Code:
fs.writeFile('/avatar/myFile.png', data.file, {
    flag: "w"
}, function(err) {
    if (err) {
        return console.log(err);
    }
    console.log("The file was saved!");
});
Error:
[Error: ENOENT: no such file or directory, open '/avatar/myFile.png']
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: '/avatar/myFile.png'
This is probably because you are trying to write to the root of the file system instead of your app directory: '/avatar/myFile.png' -> __dirname + '/avatar/myFile.png' should do the trick. Also check that the folder exists; node.js won't create the parent folder for you.
Many of us are getting this error because the parent path does not exist, e.g. you have a /tmp directory available but there is no folder "foo" and you are writing to /tmp/foo/bar.txt.
To solve this, you can use mkdirp (adapted from How to write file if parent folder doesn't exist?).
Option A) Using Callbacks
const mkdirp = require('mkdirp');
const fs = require('fs');
const getDirName = require('path').dirname;

function writeFile(path, contents, cb) {
    mkdirp(getDirName(path), function (err) {
        if (err) return cb(err);
        fs.writeFile(path, contents, cb);
    });
}
Option B) Using Async/Await
Or if you have an environment where you can use async/await (note that the directory to create is the file's parent, getDirName(path), not the file path itself):

const mkdirp = require('mkdirp');
const fs = require('fs');
const getDirName = require('path').dirname;

const writeFile = async (path, content) => {
    await mkdirp(getDirName(path));
    fs.writeFileSync(path, content);
};
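On Node 10+ you can drop the mkdirp dependency entirely; a sketch using the built-in recursive option of fs.promises.mkdir:

const fs = require('fs');
const getDirName = require('path').dirname;

const writeFile = async (path, content) => {
    // creates any missing parent directories, like mkdirp
    await fs.promises.mkdir(getDirName(path), { recursive: true });
    await fs.promises.writeFile(path, content);
};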
I solved a similar problem where I was trying to create a file with a name that contained characters that are not allowed. Watch out for that as well because it gives the same error message.
I ran into this error when creating some nested folders asynchronously right before creating the files. The destination folders wouldn't always be created before promises to write the files started. I solved this by using mkdirSync instead of 'mkdir' in order to create the folders synchronously.
try {
    fs.mkdirSync(DestinationFolder, { recursive: true });
} catch (e) {
    console.log('Cannot create folder', e);
}

fs.writeFile(path.join(DestinationFolder, fileName), 'File Content Here', (err) => {
    if (err) throw err;
});
Actually, the error message for file names that are not allowed on Linux/Unix systems is the same, which is extremely confusing. Check whether the file name contains any reserved characters: /, >, <, |, : and & are reserved on Linux/Unix.
It tells you that the avatar folder does not exist.
Before writing a file into this folder, you need to check that a directory called "avatar" exists and if it doesn't, create it:
if (!fs.existsSync('/avatar')) {
    fs.mkdirSync('/avatar', { recursive: true });
}
You can use './' as a prefix for your path.
In your example, you would write:
fs.writeFile('./avatar/myFile.png', data.file, (err) => {
    if (err) {
        return console.log(err);
    }
    console.log("The file was saved!");
});
I had this error because I tried to run:
fs.writeFile(file)
fs.unlink(file)
...lots of code... probably not an async issue...
fs.writeFile(file)
in the same script. The exception occurred on the second writeFile call. Removing the first two calls solved the problem.
In my case, I used the async fs.mkdir() and then, without waiting for that task to complete, tried to create a file with fs.writeFile()...
As SergeS mentioned, using / attempts to write to your system's root folder. Instead of __dirname, which points to the directory of the file where writeFile is invoked, you can also use process.cwd() to point to the project's directory. Example:
writeFile(`${process.cwd()}/pictures/myFile.png`, data, (err) => {...});
If you want to avoid string concatenation/interpolation, you may also use path.join(process.cwd(), 'pictures', 'myFile.png') (more details, including directory creation, in this digitalocean article).
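If you are unsure which base directory applies in your setup, a throwaway sketch makes the difference visible:

const path = require('path');

console.log('__dirname:', __dirname);          // directory of this module file
console.log('process.cwd():', process.cwd()); // directory node was started from
console.log(path.join(process.cwd(), 'avatar', 'myFile.png'));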

fs.open fails with file extension - node.js

I've got the following code:
fs.open("uploads/test.txt", "a", "0755", function(err, fd){
if(err) { console.log(err); }
else {
file.handler = fd; //We store the file handler so we can write to it later
...
}
});
The file is created and written to perfectly when I simply have "uploads/test", but when I try to do "uploads/test.txt" it breaks. Any ideas?
I think you should try using:
var path = './uploads/test.txt';
Or:
var path = __dirname + '/your_path';
fs.open(path, "a", "0755", function(err, fd){
    if (err) { console.log(err); }
    else {
        file.handler = fd; // We store the file handler so we can write to it later
        ...
    }
});
This is really silly, but I found what was causing my code to break:
fs.open works as intended. The bug was with my file-watching setup using nodemon.
Every time my app loaded, it would run the code above, which would write a new file to my app's /uploads directory. Nodemon would then detect the new file and restart the app, creating a vicious circle.
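For anyone hitting the same loop: nodemon can be told not to watch the output directory, for example with a nodemon.json next to your app (a minimal example; adjust the glob to your layout):

{
  "ignore": ["uploads/*"]
}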

Gulp copying empty directories

In my gulp build I've made a task that runs after all compiling, uglifying and minification has occurred. This task simply copies everything from the src into the dest directory that hasn't been touched/processed by earlier tasks. The little issue I'm having is that this results in empty directories in the dest directory.
Is there a way to tell the gulp.src glob to only include files in the pattern matching (like providing the 'is_file' flag)?
Thanks.
Fixed it by adding a filter to the pipeline:
var es = require('event-stream');

// keeps regular files and drops directories
var onlyFiles = function(es) {
    return es.map(function(file, cb) {
        if (file.stat.isFile()) {
            return cb(null, file);
        } else {
            return cb();
        }
    });
};
// ...
var s = gulp.src(globs)
    .pipe(onlyFiles(es))
    .pipe(gulp.dest(folders.dest + '/' + module.folder));
// ...
I know I'm late to the party on this one, but for anyone else stumbling upon this question, there is another way to do this that seems pretty elegant in my eyes. I found it in this question
To exclude the empty folders I added { nodir: true } after the glob pattern.
Your general pattern could be such (using the variables from Nick's answer):
gulp.src(globs, { nodir: true })
    .pipe(gulp.dest(folders.dest + '/' + module.folder));
Mine was as follows:
gulp.src(['src/**/*', '!src/scss/**/*.scss', '!src/js/**/*.js'], { nodir: true })
    .pipe(gulp.dest('dev/'));
This selects all the files from the src directory that are not scss or js files, and does not copy any empty folders from those two directories either.

NodeJs Method Chaining of fs.mkdir and fs.rename

I'm working on a file uploader script which creates a new folder (based on the timestamp) and moves the uploaded file to the created folder.
Sometimes it works, and sometimes I get an ENOENT rename error (file/folder does not exist).
The following code is in my post route:
var form = new multiparty.Form({
    uploadDir: "C:" + path.sep + "files"
});

form.parse(req, function(err, fields, files) {
    var dirPath = "C:" + path.sep + "files" + path.sep + +new Date;
    fs.mkdir(dirPath, 0777, function(err, dirPath){
        if (err) console.error(err);
        console.log("Created new folder");
        fs.rename(files.file[i].path, dirPath + path.sep + files.file[i].originalFilename, function(err){
            if (err) console.error(err);
            console.log("Moved file");
        });
    }(err, dirPath));
    next();
});
I'm using express(4) and the multiparty module.
As you can see I'm using async functions.
So the question is: What is wrong with my code?
Edit
The error I'm talking about: "Error: ENOENT, rename 'C:\files\7384-1r41cry.png'"
It has something to do with a race condition: the callback is invoked immediately via }(err, dirPath));, so the rename runs before mkdir has finished, and fs.mkdir receives undefined as its callback. With fs.mkdirSync everything works fine.
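A sketch of the fixed asynchronous version, with the immediate invocation removed so fs.mkdir gets a real callback and the rename waits for it (files.file[i] kept as in the question):

fs.mkdir(dirPath, 0777, function(err){
    if (err) return console.error(err);
    console.log("Created new folder");
    // the rename only starts once the folder exists
    fs.rename(files.file[i].path, dirPath + path.sep + files.file[i].originalFilename, function(err){
        if (err) return console.error(err);
        console.log("Moved file");
    });
});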
My guess would be some sort of race condition happening here.
This kind of stuff is easy to get wrong and hard to get right.
I normally use gulp for this kind of stuff, and maybe you should too :)
Copying a whole directory tree into some other directory couldn't be easier:
gulp.src('./inputDir/**/*').pipe(gulp.dest('./outputDir'));
And all files from inputDir would be copied into outputDir.
But maybe copying is not an option. The files could be too large, right?
Let's hack it a bit to make it work the way we want:
var fs = require('fs')
  , gulp = require('gulp')
  , thr = require('through2').obj
  , SRC = './test/**/*.{css,html,js}' // it could be 'my/file/located/here'
  , OUT = './output' // it could be 'my/newer/file/located/there'
  ;

gulp.src(SRC)
    .pipe(thr(function(chunk, enc, next){
        chunk.contents = Buffer.from('') // emptying the contents of the file
        chunk._originalPath = chunk.path
        next(null, chunk)
    }))
    .pipe(gulp.dest(OUT))
    .pipe(thr(function(chunk, enc, next){
        // now a placeholder file exists at our destination,
        // so we can write to it, confident that it exists
        console.log('moving file from', chunk._originalPath, 'to', chunk.path)
        fs.rename(chunk._originalPath, chunk.path, function(err){
            if (err) return next(err)
            next()
        })
    }))
This moves all css, html and js files from input to output, regardless of how many nested directories there are.
gulp is awesome :)
OK, a few things...
You really should start using promises; they make the code easier to read and the error handling is way superior. I usually use when.js, but there are other alternatives.
You should throw or return on errors; otherwise you will keep running the function even when the previous operations failed. So
if (err) console.error(err);
should be
if (err) throw err;
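As a concrete illustration with the built-in fs.promises (when.js or any promise library would work the same way; the multiparty fields are assumed to match the question):

const fs = require('fs');
const path = require('path');

async function moveUpload(file) {
    const dirPath = path.join('C:', 'files', String(Date.now()));
    // each await only continues after the previous step succeeded,
    // so the rename can no longer race the mkdir
    await fs.promises.mkdir(dirPath, { recursive: true });
    await fs.promises.rename(file.path, path.join(dirPath, file.originalFilename));
    console.log('Moved file');
}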
