I'm developing a node.js package that generates some boilerplate code for me.
I was able to create the package, and if I run each module individually it works fine and generates the code as expected.
My idea now is to import this package into another project and have it generate the code there.
I have no idea how to achieve this, or even whether it's possible.
The ideal solution would be that every time a file changes in a set of folders, the package runs and regenerates the files; but if that isn't possible, it would also be fine to expose a command to generate these files manually.
I have created a script to run the generator, but it only works inside the package itself, not when I import the package into another project.
Thanks
I think you want the fs.watch() function. It invokes a callback when a file (or directory) changes.
import { watch } from 'fs';

function watchHandler (eventType, filename) {
  console.log(`event type is: ${eventType}`); /* 'change' or 'rename' */
  if (filename) {
    console.log(`filename provided: ${filename}`);
  } else {
    console.log('filename not provided');
  }
}

const options = { persistent: true };
const watcher = watch('the/path/you/want', options, watchHandler);
...
watcher.close();
You can use this from within your Node.js app to invoke watchHandler() for each file involved in your code-generation problem.
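As a rough sketch of how this could plug into the boilerplate-generation question above (the package name my-generator and its exported generate() function are hypothetical placeholders for whatever entry point your package exposes), the consuming project could watch a set of folders and re-run the generator on every change:

import { watch } from 'fs';
// Hypothetical import: replace with your package's actual name and entry point.
import { generate } from 'my-generator';

const foldersToWatch = ['./src/models', './src/services'];

for (const folder of foldersToWatch) {
  watch(folder, { persistent: true }, (eventType, filename) => {
    console.log(`${eventType} in ${folder}/${filename ?? '(unknown file)'}`);
    generate(); // re-run the boilerplate generation whenever something changes
  });
}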
Is there a way I can save a timestamp outside of my application / object, so that when I restart the Node server I can read that value back?
I need this for my cron job: I need to keep the time of the last sync even though I restart the server.
There are all sorts of ways to save this sort of information so you can load it when you restart your node process. One is to write it to a file in your file system, then read it when you start your program.
To write the current timestamp to a file, do this:
const fs = require('fs')
...
fs.writeFile('timestamp.txt', Date.now().toString(), err => { if (err) console.error(err) })
To read it back, do this:
const fs = require('fs')
...
const timestamp = Number(fs.readFileSync('timestamp.txt', 'utf8'))
Obviously there's more programming to do to put the file in the correct directory, to handle errors, and to cope with the case where you attempt to read the file before writing it. But that's the idea.
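For example, a minimal sketch of the read side that copes with the file not existing yet (say, on the very first run):

const fs = require('fs')

let lastSync = 0
try {
  lastSync = Number(fs.readFileSync('timestamp.txt', 'utf8'))
} catch (err) {
  if (err.code !== 'ENOENT') throw err // only ignore "file not found"
}
console.log('last sync:', lastSync ? new Date(lastSync) : 'never')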
You can also store it in some kind of database, but this should do for now, unless you're running on a system like Heroku where files don't reliably persist from one run to the next.
When a process dies, all data stored in its working memory (such as variables and functions) die with it.
I recently wrote an npm package cashola that makes it easier to store this data across process restarts.
You can run this example script twice and see how the print statements differ each time.
import { rememberSync } from 'cashola';
const myState = rememberSync('timestamp-example');
console.log('Before:', myState);
// First run: {}
// Second run: { <timeString1>: 'hi!' }
myState[new Date().getTime().toString()] = 'hi!';
console.log('After:', myState);
// First run: { <timeString1>: 'hi!' }
// Second run: { <timeString1>: 'hi!', <timeString2>: 'hi!' }
I need to pull in the contents of a program source file for display in a page generated by Gatsby. I've got everything wired up to the point where I should be able to call
// my-fancy-template.tsx
import { readFileSync } from "fs";
// ...
const fileContents = readFileSync("./my/relative/file/path.cs");
However, on running either gatsby develop or gatsby build, I'm getting the following error:
This dependency was not found:
* fs in ./src/templates/my-fancy-template.tsx
To install it, you can run: npm install --save fs
However, all the documentation would suggest that this module is native to Node unless it is being run on the browser. I'm not overly familiar with Node yet, but given that gatsby build also fails (this command does not even start a local server), I'd be a little surprised if this was the problem.
I even tried this from a new test site (gatsby new test) to the same effect.
I found this in the sidebar and gave that a shot, but it appears it just declared that fs was available; it didn't actually provide fs.
It then struck me that while Gatsby creates the pages at build-time, it may not render those pages until they're needed. This may be a faulty assessment, but it ultimately led to the solution I needed:
You'll need to add the file contents to a field on File (assuming you're using gatsby-source-filesystem) during exports.onCreateNode in gatsby-node.js. You can do this via the usual means:
const fs = require('fs');

exports.onCreateNode = ({ node, actions: { createNodeField } }) => {
  if (node.internal.type === `File`) {
    fs.readFile(node.absolutePath, undefined, (_err, buf) => {
      createNodeField({ node, name: `contents`, value: buf.toString() });
    });
  }
};
You can then access this field in your query inside my-fancy-template.tsx:
{
  allFile {
    nodes {
      fields { contents }
    }
  }
}
From there, you're free to use fields.contents inside each element of allFile.nodes. (This of course also applies to other File query methods.)
Naturally, I'd be ecstatic if someone has a more elegant solution :-)
I'm trying to build a Node.js tool to help me analyze another AngularJS codebase.
The idea is to:
read some of the Angular project's JavaScript files
for each file, grab the content
eval the content from the file
do some stuff
The problem I'm facing is that the Angular source code uses ES6 features like import, export, arrow functions, etc., and the Node.js version I'm using does not support these features yet.
So I tried to use @babel/core's transform() from my Node app code, but it doesn't work. I keep getting errors like Unexpected identifier, which means it doesn't understand the import {stuff} from 'here'; syntax.
srcFiles.forEach(content => {
  try {
    (function() {
      eval(require("@babel/core").transform(content.text).code);
    }.call(window, angular));
  } catch (e) {
    console.log(e);
  }
});
A sample test file:
import _ from 'lodash';
console.log("I'm a file with import and export");
export const answer = 42;
Any idea how I can get this working? Or maybe another approach?
You can pass options as the second parameter of the transform method. See examples here.
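For instance, here is a sketch assuming @babel/preset-env is installed next to @babel/core; the preset is what actually tells Babel to transpile import/export (and other ES6+ syntax) down to CommonJS so that eval can run it:

const babel = require("@babel/core");

srcFiles.forEach(content => {
  try {
    // transformSync is the synchronous form in Babel 7
    const { code } = babel.transformSync(content.text, {
      presets: ["@babel/preset-env"],
    });
    eval(code);
  } catch (e) {
    console.log(e);
  }
});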
Question is too broad / unclear. Anyone interested in this answer would be better served by visiting: Creating Callbacks for required modules in node.js
Basically I have included a CLI package in my Node application. I need the CLI to spin up a new project (this entails creating a folder for the project). After the project folder is created, I need to create some files in it (using fs writeFile). The problem right now is that my writeFile call executes BEFORE the folder is created by the CLI package (as my console.log detects). This brings me to my main question:
Can I add an async callback to CLI.new without modifying the package I included?
FoundationCLI.new(null, {
  framework: 'sites', // 'apps' or 'emails' also
  template: 'basic', // 'advanced' also
  name: projectName,
  directory: $scope.settings.path.join("")
});

try {
  if (!fs.existsSync(path)) {
    console.log("DIRECTORY NOT THERE!!!!!");
  }
  fs.writeFileSync(correctedPath, JSON.stringify(project), 'utf-8');
} catch (err) {
  throw err;
}
It uses foundation-cli. The new command executes the following async series. I'd love to add a callback to the package, but I'm still not quite sure how.
async.series(tasks, finish);
Anyone interested in this can probably get mileage out of:
Creating Callbacks for required modules in node.js
The code for the new command seems to be available at https://github.com/zurb/foundation-cli/blob/master/lib/commands/new.js
This code was not written to allow programmatic usage of the new command (it uses console.log everywhere) and does not call any callback when the work is finished.
So no, there is no way to use this package to do what you are looking for. Either patch the package or find another way to achieve what you want.
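One possible "other way", sketched under the assumption that you keep the FoundationCLI.new call from the question and that path, correctedPath and project are the same variables used there: poll until the project folder appears, then write the file.

const fs = require("fs");

// Poll until the directory the CLI is supposed to create shows up, then run
// the callback. Gives up after `retries` attempts (here roughly 10 seconds).
function waitForDir(dir, onReady, retries = 50) {
  if (fs.existsSync(dir)) return onReady();
  if (retries <= 0) throw new Error("Timed out waiting for " + dir);
  setTimeout(() => waitForDir(dir, onReady, retries - 1), 200);
}

waitForDir(path, () => {
  fs.writeFileSync(correctedPath, JSON.stringify(project), "utf-8");
});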
I'm trying to use mkdirp for a project, but when I feed it a variable containing the directory path I want created, it only creates the first half of it. I've installed the module locally with npm. I'm using Node v0.10.20 on a Raspberry Pi.
This is how it looks:
var filePath = "upload/home/pi/app/temp";

mkdirp(filePath, function(error) {
  if (error) {
    console.log(error);
  } else {
    ...
  }
});
I don't get an error creating the path, but it only creates "upload/home/pi"; however, if I run my script again, it creates the rest of the directory structure. upload is a directory in the current working directory, which is the user's home.
I emailed the author of the module, who suggested it could be because I'm using a flash drive as my medium, which lies about when IO operations are complete; I guess this confuses Node.js into thinking it has successfully written the path to disk. How should I tackle this problem? I could check whether the directory was created and loop until it has been, but that feels like the wrong thing to do. Any suggestions welcome.
Thanks.
Try doing this synchronously:
var mkdirp = require("mkdirp");

var filePath = "upload/home/pi/app/temp";
mkdirp.sync(filePath);
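Note that, unlike the callback form, mkdirp.sync signals failure by throwing, so if you want to keep the error handling from the original snippet you can wrap it:

try {
  mkdirp.sync(filePath);
} catch (error) {
  console.log(error);
}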