Lambda ENOENT: no such file or directory - node.js

I am trying to read a .yml file inside an AWS Lambda function (Node 6.10.0).
console.log(__dirname + '/gameOptions.yml');
console.log(path.resolve('./gameOptions.yml'));
console.log(path.resolve('/gameOptions.yml'));
console.log('./gameOptions.yml');
console.log(process.cwd() + '/api/lib/gameOptions.yml');
let doc = yaml.safeLoad(fs.readFileSync(path.resolve('./gameOptions.yml'), 'utf8'));
I have tried all possible ways to do it, but I always get ENOENT: no such file or directory.
The file is in the same folder, and since it is a .yml file, require() doesn't work either.
The results for the above code are:
/Users\marcus\Documents\Workspace\proak-api\proak-api\api\lib/gameOptions.yml
/var/task/gameOptions.yml
/gameOptions.yml
./gameOptions.yml
/var/task/api/lib/gameOptions.yml
And it works locally.

To solve this you need to take care of two things:
Optimize
If you are using something that minifies or bundles the code, e.g. serverless-plugin-optimize, include the file so it is not stripped out:
myLambda:
  handler: mySubFolder/myLambda.handler
  optimize:
    includePaths: ['mySubFolder/myFile.json']
Then resolve the path at runtime:
path.resolve(process.env.LAMBDA_TASK_ROOT, '_optimize', process.env.AWS_LAMBDA_FUNCTION_NAME, 'mySubFolder/myFile.json')
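For example, reading that file inside the handler could look like this (a sketch; '_optimize' is the folder where serverless-plugin-optimize places included files):
const fs = require('fs');
const path = require('path');

// Sketch: read the included file from the plugin's output folder
const filePath = path.resolve(
  process.env.LAMBDA_TASK_ROOT,
  '_optimize',
  process.env.AWS_LAMBDA_FUNCTION_NAME,
  'mySubFolder/myFile.json'
);
const myFile = JSON.parse(fs.readFileSync(filePath, 'utf8'));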
Require
If you don't use a minifier, you also need to require the .yml file in your Lambda code so it is bundled into the function. A plain require('file.yml') throws an error, so register a loader for the extension:
var fs = require('fs'),
    yaml = require('js-yaml');

require.extensions['.yaml'] =
require.extensions['.yml'] = function (module, filename) {
  var content = fs.readFileSync(filename, 'utf8');
  // Parse the file content and give to module.exports
  content = yaml.load(content);
  module.exports = content;
};
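With that loader registered, the YAML file from the question can be required directly; a minimal sketch, assuming gameOptions.yml is packaged next to the module:
// Works because of the require.extensions hook registered above
const gameOptions = require('./gameOptions.yml');
console.log(gameOptions);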

Related

JSON file not found

I have a JSON file named email_templates.json placed in the same folder as my JS file bootstrap.js. When I try to read the file I get an error:
no such file or directory, open './email_templates.json'
bootstrap.js
"use strict";
const fs = require('fs');
module.exports = async () => {
  const { config } = JSON.parse(fs.readFileSync('./email_templates.json'));
  console.log(config);
};
email_templates.json
[
  {
    "name": "vla",
    "subject": "test template",
    "path": ""
  }
]
I am using VS Code; for some reason VS Code doesn't autocomplete the path either, which is confusing to me. Does anyone know why this happens?
Node version: 14.x
A possible solution is to use the full absolute path (starting right from C:\, for example, if you are on Windows).
To do this, you first need to import path in your code.
const path = require("path");
Next, we need to join the directory the JavaScript file is in with the JSON filename. To do this, we use the code below.
const jsonPath = path.resolve(__dirname, "email_templates.json");
The resolve() function combines the two segments into one complete, absolute path.
Finally, you can use this path to pass into readFileSync().
fs.readFileSync(jsonPath);
If the issue was that Node didn't like the relative path, the absolute path should let it find the file.
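Applied to the bootstrap.js above, a minimal sketch could look like this (note that email_templates.json is an array, so it is logged as a whole here rather than destructured):
"use strict";
const fs = require("fs");
const path = require("path");

module.exports = async () => {
  // Resolve relative to this file instead of the process working directory
  const jsonPath = path.resolve(__dirname, "email_templates.json");
  const templates = JSON.parse(fs.readFileSync(jsonPath, "utf8"));
  console.log(templates);
};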

How to generate zip and put files into zip using stream in Node js

I am trying to make a zip file (or any other compressed format) containing a few files.
I thought it would work with the adm-zip module,
but I found out that adm-zip buffers the files it adds to the zip.
This takes a lot of memory when the files are very large,
and as a result my server stopped working.
Below is what I did:
var zip = new AdmZip();
zip.addLocalFile('../largeFile', 'dir1'); //put largeFile into /dir1 of zip
zip.addLocalFile('../largeFile2', 'dir1');
zip.addLocalFile('../largeFile3', 'dir1/dir2');
zip.writeZip(/*target file name*/ `./${threadId}.zip`);
Is there any way to solve this situation?
To avoid the memory issue, the best practice is to use streams and not load whole files into memory. For example:
import {
  createReadStream,
  createWriteStream
} from 'fs'
import { createGzip } from 'zlib'

const [, , src, dest] = process.argv

const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream
  .pipe(gzipStream)
  .pipe(destStream)
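Note that gzip on its own only compresses a single stream. If you need an actual .zip archive containing several files, a streaming zip library such as archiver (not mentioned in the original answer, so treat this as a sketch) can write entries straight from disk without buffering them:
import { createWriteStream } from 'fs'
import archiver from 'archiver'

const output = createWriteStream('./output.zip') // placeholder target name
const archive = archiver('zip', { zlib: { level: 9 } })

archive.on('error', err => { throw err })
archive.pipe(output)

// Entries are streamed from disk, not loaded fully into memory
archive.file('../largeFile', { name: 'dir1/largeFile' })
archive.file('../largeFile2', { name: 'dir1/largeFile2' })
archive.file('../largeFile3', { name: 'dir1/dir2/largeFile3' })

archive.finalize()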

How to determine the root directory of a project for config/rc files

I'm writing a CLI tool in Node and I'd like it to be configurable via a config file in the consumer's project, similar to how ESLint's .eslintrc or Babel's .babelrc works.
consumer-app
  node_modules
    my-cli-tool
      index.js        ← my tool
  .configfile         ← configuration file for the cli tool
  package.json
These files are usually placed at the root of the project and sometimes you can have multiple config files at different levels of the file tree.
consumer-app
  sub-directory
    .configfile       ← another configuration file for this sub-dir
  node_modules
    my-cli-tool
      index.js        ← my tool
  .configfile         ← configuration file
  package.json
What's the overall architecture for building something similar? I can have my module look for its config file, but I'm having a hard time locating those config files or the root directory of the project, which is most likely where they're going to be.
I was able to solve this by looking for the config file up the directory tree starting from __dirname.
The following method takes a filename and scans each directory above __dirname until it finds the given file. This also makes it possible for each directory to have its own config file.
const fs = require('fs');
const path = require('path');

function getRootFile(filename) {
  return new Promise((resolve, reject) => {
    let lastFound = null;
    let lastScanned = __dirname;
    // Walk upward one directory at a time, remembering the last place
    // the file was found.
    __dirname.split('/').slice(1).reverse().forEach(dir => {
      const parentPath = path.resolve(lastScanned, '../');
      if (fs.existsSync(path.join(parentPath, filename))) {
        lastFound = path.join(parentPath, filename);
      }
      lastScanned = parentPath;
    });
    resolve(lastFound);
  });
}

async function main() {
  const configPath = await getRootFile('.myapprc');
}
This is only a proof-of-concept so it isn't perfect, but something to demonstrate what I'm trying to achieve.
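A variation on the same idea (again just a sketch) is to walk upward from the current working directory and stop at the first match, so the config file closest to where the tool is invoked wins:
const fs = require('fs');
const path = require('path');

// Walk from startDir up to the filesystem root and return the first
// path that contains `filename`, or null if none does.
function findNearestConfig(filename, startDir = process.cwd()) {
  let dir = startDir;
  while (true) {
    const candidate = path.join(dir, filename);
    if (fs.existsSync(candidate)) return candidate;
    const parent = path.dirname(dir);
    if (parent === dir) return null; // reached the root
    dir = parent;
  }
}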

reading a packaged file in aws lambda package

I have a very simple Node Lambda function which reads the contents of a file packaged with it. I upload the code as a zip file. The directory structure is as follows:
index.js
readme.txt
Then, in my index.js file, I have:
fs.readFile('/var/task/readme.txt', function (err, data) {
  if (err) throw err;
});
I keep getting the following error: ENOENT: no such file or directory, open '/var/task/readme.txt'.
I tried ./readme.txt also.
What am I missing?
Try this, it works for me:
'use strict'
let fs = require("fs");
let path = require("path");

exports.handler = (event, context, callback) => {
  // To debug your problem
  console.log(path.resolve("./readme.txt"));

  // Solution is to use absolute path using `__dirname`
  fs.readFile(__dirname + '/readme.txt', function (err, data) {
    if (err) throw err;
  });
};
To debug why your code is not working, add the line below to your handler:
console.log(path.resolve("./readme.txt"));
On AWS Lambda the Node process may be running from some other folder, and because you provided a relative path it looks for readme.txt relative to that folder. The solution is to use an absolute path.
What worked for me was the comment by Vadorrequest to use process.env.LAMBDA_TASK_ROOT. I wrote a function to get a template file in a /templates directory when I'm running it locally on my machine with __dirname or with the process.env.LAMBDA_TASK_ROOT variable when running on Lambda:
function loadTemplateFile(templateName) {
  const fileName = `./templates/${templateName}`
  let resolved
  if (process.env.LAMBDA_TASK_ROOT) {
    resolved = path.resolve(process.env.LAMBDA_TASK_ROOT, fileName)
  } else {
    resolved = path.resolve(__dirname, fileName)
  }
  console.log(`Loading template at: ${resolved}`)
  try {
    const data = fs.readFileSync(resolved, 'utf8')
    return data
  } catch (error) {
    const message = `Could not load template at: ${resolved}, error: ${JSON.stringify(error, null, 2)}`
    console.error(message)
    throw new Error(message)
  }
}
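Usage is then simply (welcome.html is a hypothetical template name; any file under templates/ that is packaged with the function works):
const template = loadTemplateFile('welcome.html')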
This is an oldish question but it comes up first when attempting to sort out what's going on with file paths on Lambda.
Additional Steps for Serverless Framework
For anyone using the Serverless Framework to deploy (which probably builds with webpack), you will also need to add the following to your webpack config file (just after target: 'node'):
// assume target: 'node', is here
node: {
  __dirname: false,
},
Without this piece, using __dirname with Serverless will still not get you the desired absolute directory path.
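In context, a minimal webpack.config.js might look something like this (only the node block is what this fix adds; the rest is placeholder):
module.exports = {
  target: 'node',
  // Keep Node's real __dirname at runtime instead of webpack's mocked value
  node: {
    __dirname: false,
  },
  // ...entry, output, externals, etc. as usual
};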
I went through this using the Serverless Framework, and in my case the file really was not being included in the package. Just add the following to serverless.yml:
package:
  individually: false
  include:
    - src/**
const filepath = path.resolve('../../filename.text');
const fileData2 = fs.readFileSync(process.env.LAMBDA_TASK_ROOT + filepath, 'utf-8');
I was using fs.promises.readFile() and couldn't get it to error out at all. The file was there, and LAMBDA_TASK_ROOT seemed right to me as well. After I changed to fs.readFileSync(), it worked.
I had the same problem and I tried applying all these wonderful solutions above, which didn't work.
The problem was that one of the folder names in my path had a letter typed in uppercase when the real folder name was lowercase.
So when I tried to fetch the content of /src/SOmething/some_file.txt
while the folder was really /src/Something/, I got this error.
Windows (my local environment) is case-insensitive, while AWS Lambda is not!

gulp task that dynamically create folder with name based on file name

I have the following gulp task that is currently not working.
gulp.task('emails', function() {
  gulp.src('views/emails/src/**/*.html')
    .pipe(inky())
    .pipe(gulp.dest('views/emails/dist/' + debug() + '/html.ejs'));
});
I would like to iterate over the /views/emails/src/ directory, find all html files, then use inky to convert them to html, and then copy the resulting html file to...
'views/emails/dist/' + folderName + '/html.ejs'
where folderName is the name of the .html file that was processed.
I need this in order to get the file structure in the format that the npm email-templates package requires.
That's a job for gulp-rename:
var rename = require('gulp-rename');
var path = require('path');

gulp.task('emails', function() {
  gulp.src('views/emails/src/**/*.html')
    .pipe(inky())
    .pipe(rename(function(file) {
      file.dirname = path.join(file.dirname, file.basename);
      file.basename = 'html';
      file.extname = '.ejs';
    }))
    .pipe(gulp.dest('views/emails/dist/'));
});
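With this rename, a source file such as views/emails/src/welcome.html (the name is just an example) ends up at views/emails/dist/welcome/html.ejs, which matches the folder-per-template layout that email-templates expects.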
