Is there a way to change a Node.js app while it's running? - node.js

Is there a way to change a Node.js app while it's running?
Like editing the JavaScript files while it's running and having it pick up the changes accordingly.

Yes.
You'll have to load your file through another file that watches the script for changes. You will probably need some setup/teardown code that runs in the script whenever it is restarted.
var fs = require('fs');
var path = require('path');
// resolve to an absolute path so it matches the key used in require.cache
var file = path.resolve('./file.js');
var script;
function loadScript() {
    if (script) {
        if (typeof script.teardown === 'function') {
            script.teardown();
        }
        delete require.cache[require.resolve(file)];
    }
    script = require(file);
}
fs.watch(file, function (event, filename) {
    if (event !== 'change') return;
    loadScript();
});
loadScript();
The fs.watch API is not 100% consistent across platforms, and is unavailable in some situations.
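For completeness, here is a minimal sketch of what the watched file.js could look like; the interval and the teardown export are only assumptions made to match the loader above:
// file.js -- hypothetical module that the watcher above reloads
var timer = setInterval(function () {
    console.log('tick from the currently loaded version of file.js');
}, 1000);
// lets the loader release anything the old version holds before reloading
exports.teardown = function () {
    clearInterval(timer);
};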

Have you checked out node-supervisor?

No, it is not possible. When you start up a Node server / app it will load the current versions of the files. If you make a change after startup it will be unaware of it. You will have to kill the app and restart it for these changes to take effect.
There are some utilities like node-dev which do this for you. They monitor the filesystem for changes and restart the app as needed (along with some other features like growl notification).
I prefer restarting the app manually. That way you know exactly what version it's running, and can save changes to a file multiple times before deciding to try it out again.

You can use nodemon.
nodemon will watch the files in the directory in which nodemon was started, and if any files change, nodemon will automatically restart your node application.
A perfect solution for working with Node during development.
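Typical usage, assuming your entry point is app.js (the file name here is just an example):
> npm install -g nodemon
> nodemon app.js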

This is totally possible.
Just don't require your modules in the header; instead, load them by calling at the time of need.
In addition, you can delete require.cache[require.resolve('./myModule')] to ditch the cached version before requiring the module again. You can scan the file system for any new module names if you feel like dropping in new behavior at runtime.
I have implemented the plugin pattern numerous times with node, such that a submodule update will include new plugins that will be picked up at runtime.
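As an illustration, here is a hedged sketch of that pattern; the plugins directory and the init() convention are assumptions for this example, not part of any particular library:
var fs = require('fs');
var path = require('path');
var pluginDir = path.join(__dirname, 'plugins');      // hypothetical plugin folder
function loadPlugins() {
    return fs.readdirSync(pluginDir)
        .filter(function (name) { return name.slice(-3) === '.js'; })
        .map(function (name) {
            var file = path.join(pluginDir, name);
            delete require.cache[require.resolve(file)];  // drop any stale cached copy
            var plugin = require(file);                   // load at the time of need
            if (typeof plugin.init === 'function') plugin.init();
            return plugin;
        });
}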
Enjoy.

Related

AppImage from electron-builder with file system not working

I’m using Electron Builder to compile my Electron app to an .AppImage file, and I’m using the fs module to write to a .json file, but it’s not working in the AppImage format (it works fine in the normal version not made with Electron Builder). I can still read from the file.
The code (preload):
setSettings: (value) => {fs.writeFileSync(path.join(__dirname, "settings.json"), JSON.stringify(value), "utf8")}
The code (on the website):
api.setSettings(settings);
The project: https://github.com/Nils75owo/crazyshit
That's not a problem with AppImage or Electron Builder but with the way you're packaging your app. Since you didn't post your package.json*, I can only guess what's wrong, but probably you haven't changed Electron Builder's default behaviour regarding packing your application.
By default, Electron Builder compiles your application, including all resources, into a single archive file in the ASAR format (think of it like the TAR format). Electron includes a patched version of the fs module to be able to read from the ASAR file, but writing to it is obviously not supported.
You have two options to mitigate this problem: Either you store your settings somewhere in the user's directory (which is the way I'd go, see below) or you refrain from packing your application to an ASAR file, but that will leave all your JavaScript code outside the executable in a simple folder. (Note that ASAR is not capable of keeping your code confidential, because there are applications which can extract such archives, but it makes it at least a little harder for attackers or curious eyes to get a copy of your code.)
To disable packing to ASAR, simply tell Electron Builder that you don't want it to compile an archive. Thus, in your package.json, include the following:
{
    // ... other options
    "build": {
        // ... other build options
        "asar": false
    }
}
However, as I mentioned above, it's probably wiser to store settings in a common place where advanced users can actually find (and probably edit, mostly for troubleshooting) them. On Linux, one such folder would be ~/.config, where you could create a subdirectory for your application.
To get the specific application data path on a cross-platform basis, you can query Electron's app module from within the main process. You can do so like this:
const { app } = require("electron"),
    path = require("path");
var configPath;
try {
    configPath = path.join(app.getPath("appData"), "your-app-name");
} catch (error) {
    console.error(error);
    app.quit();
}
However, if you have correctly set your application's name (by using app.setName ("...");), you can instead simply use app.getPath ("userData"); and omit the path joining. Take a look at the documentation!
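For instance, a minimal sketch of that variant (assuming the name has been set via app.setName() or the "name" field in package.json):
const { app } = require("electron");
const path = require("path");
// userData already includes the application name, so no extra join is needed for the directory itself
const settingsFile = path.join(app.getPath("userData"), "settings.json");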
For my Electron Applications, I typically choose to store settings in a common hidden directory (one example for a name could be the brand under which you plan to market the application) in the user's home directory and I then have separate directories for each application. But how you organise this is up to you completely.
* For the future, please refrain from directing us to a GitHub repository and instead include all information (and code is information too) needed to replicate/understand your problem in your question. That'd save us a lot of time and could potentially get you answers faster. Thanks!

Electron Node.js node localstorage osx mkdir permission denied

I am working with Electron and Node.js. We have developed an application that works fine on Windows, and as a requirement we had to package it for macOS. I packaged the application using electron-packager; the packaging process completes and the package is generated. Double clicking it throws a "permission denied" error for mkdir, as I am using node-localstorage to maintain some settings on the user's local machine. Somehow macOS doesn't allow localstorage to create a folder in the root of the application. Any help in this matter will be great. Thanks
First off, is the code in question in the main process or in a renderer process? If it is the latter, you don't need to use 'node-localstorage', because you can use the renderer's native LocalStorage. If you are in the main process, then you need to provide your own storage strategy so using 'node-localstorage' is a viable option.
In any case, you need to carefully consider where to store the data; for starters, let's look at where Electron's renderer processes would store its LocalStorage data: this differs based on the OS, but you can get and set the paths using the app module -- the path in question is userData, which on OS X would default to ~/Library/Application Support/<App Name>. Electron uses that folder to persist cookies, caches, LocalStorage etc. so I would suggest using that folder as well. (Otherwise, refer to XDG defaults for good defaults)
What your example above was trying to do is store your 'errorLogDb' in the current working directory, which might depend on your OS, where your App is installed, how you executed it, etc.
Finally, it's a good idea to differentiate between your 'production' app and your app during development and testing, because you might not want to use the same storage folders for every environment. In any case, just writing to './errorLogDb' is likely to cause lots of headaches so I'd be thankful for the permission denied error.
This strategy worked for me:
const { LocalStorage } = require('node-localstorage');
let ls;
mb.on('ready', () => {
    let prefsPath = mb.app.getPath('userData') + '/prefs';
    ls = new LocalStorage(prefsPath);
    loadPrefs();
});
mb.on('after-create-window', () => { /* ls... */ });
exports.togglePref = () => { /* ls... */ };

Where should I store cache of a custom CLI npm module?

I am developing an npm module which the user can interact with through a terminal by executing commands:
> mymodule init
> mymodule do stuff
When executing certain commands the user is asked for some data, which will be used by the module. Since this data won't really change while using the module, and since these commands can be executed pretty frequently, it is not the best option to ask the user for the data every time he runs a command. So I've decided to cache this data, and since it should live through multiple module calls, the easiest way I see to store it is a file (the data structure allows storing it as simple JSON). But I am not quite sure where this file should go on a user's machine.
Where in the file system should I store a cache, in the form of a file or multiple files, for a custom npm module, considering that the module itself can be installed globally, on multiple operating systems, and can be used in multiple projects at the same time?
I was thinking about storing it in the module's folder, but that might be tricky in the case of a global installation plus multi-project use. The second idea was to store it in OS-specific tmp storage, but I am not quite sure about that either. I am also wondering if there are alternatives to file storage in this case.
I would create a hidden folder in the user's home directory (starting with a dot). For instance, /home/user/.mymodule/config.cfg. The user's home directory isn't going anywhere, and the dot will make sure it's out of the user's way unless they go looking for it.
This is the standard way that most software stores user configs, including SSH, Bash, Nano, Wine, Ruby, Gimp, and even NPM itself.
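For illustration, a minimal sketch of resolving such a path (.mymodule is just a placeholder for your module's name):
var path = require('path');
var os = require('os');
// hidden directory in the user's home; the dot prefix keeps it out of the user's way
var configDir = path.join(os.homedir(), '.mymodule');
var configFile = path.join(configDir, 'config.cfg');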
On some systems you can cache to ~/.cache by creating a sub-directory to store your cache data, though it's much more common for applications to create a hidden directory in the user's home directory. On modern Windows machines you can create a directory in C:/Users/someUser/AppData. On Windows, a . prefix will not hide a file. I'd recommend you do something platform-agnostic like so:
var fs = require('fs');
var path = require('path');
function getAppDir(appName, cb) {
    var plat = process.platform;
    var homeDir = process.env[(plat == 'win32') ? 'USERPROFILE' : 'HOME'];
    var appDir;
    if (plat == 'win32') {
        appDir = path.join(homeDir, 'AppData', appName);
    } else {
        appDir = path.join(homeDir, '.' + appName);
    }
    fs.mkdir(appDir, function (err) {
        // an already existing directory is fine
        if (err && err.code !== 'EEXIST') return cb(err);
        cb(null, appDir);
    });
}
Just declare a function to get the app directory. This should handle most systems, but if you run into a case where it does not, it should be easy to fix because you can just add some kind of alternate logic here. Let's say you later want to allow a user to specify a custom location for app data in a config file; you could easily add that logic. For now this should suit most of your cases on almost all Unix/Linux systems and Windows Vista and up.
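Usage could then look something like this (cache.json and the cached object are just hypothetical):
getAppDir('mymodule', function (err, dir) {
    if (err) throw err;
    var cacheFile = path.join(dir, 'cache.json');
    fs.writeFileSync(cacheFile, JSON.stringify({ answers: [] }));  // persist the cached data
});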
If you store in the system temp folder, then depending on the system your cache could be lost on an interval (cron) or on reboot. Using the global install path would lead to some issues. If you need this data to be unique per project, then you can extend that functionality to allow storing the data in the project root, but not the module root. It would be best not to store it in the module root, even if it's just installed as a local/project module, because then the user doesn't have the ability to include this folder in their repositories without including the entire module.
So in the event that you need to store this cached data relevant to a project, you should do so in the project root, not in node_modules. Otherwise store it in the user's home directory in a system-agnostic way.
First you need to know what kind of OS you are running on:
Your original idea is not bad, because global modules are not really global in every OS and in every virtual environment.
Using /home/user may not work on Windows. On Windows you have to check process.env.HOMEPATH.
I recommend a chain of checks to determine the best place; a sketch follows the list below.
Let the user take control. Choose your own env variable, say MYMOD_HOME. First check if process.env.MYMOD_HOME exists, and use it.
Check the Windows standard process.env.LOCALAPPDATA.
Check the Windows standard process.env.HOMEPATH.
Check if '/home/user' or '~' exists.
Otherwise use __dirname.
In all cases create a ./mymodule directory.
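A minimal sketch of that chain of checks, with MYMOD_HOME and the mymodule directory name as placeholders:
var path = require('path');
function resolveConfigBase() {
    var env = process.env;
    var base = env.MYMOD_HOME        // 1. let the user take control
        || env.LOCALAPPDATA          // 2. Windows standard
        || env.HOMEPATH              // 3. Windows fallback
        || env.HOME                  // 4. Unix-like home directory
        || __dirname;                // 5. last resort: the module's own directory
    return path.join(base, 'mymodule');
}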

Grunt - Is it possible to have a task that only runs once ever?

There is a particular task that I want to run only once and then guarantee that it is never run again. Has anyone done this? I was looking at using grunt.event.once(...), or trying to detect folders or files with a shell script on postinstall, but both ways leave a task in the gruntfile.js that could potentially be invoked at any time, overwriting files.
At a very simple level it would do something like this:
grunt.registerTask('setup', [
    'mkdir' // run some setup tasks
]);
grunt.event.once('setup', function() {
    // somehow do what's below here so it can't be done again,
    // i.e. not available in config for reuse and possibly
    // overwriting modified files
    grunt.task.run([
        'bowercopy:src_codeigniter'
    ]);
});
Is this even possible in Grunt? I know it's just a task runner; in this case I just want it to run once.
There are several libraries that let you access a Gruntfile's content via an API so you could use one of these to alter your setup task's configuration after it is run the first time.
There's Gruntfile Editor and there's Gruntfile API.
While neither of them supports complete task removal, you can always modify your task's configuration this way.
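If editing the Gruntfile programmatically feels too heavy, an alternative not covered by the answer above is to guard the task with a marker file so a second invocation becomes a no-op; here is a rough sketch using only core Grunt APIs (the marker file name is made up):
grunt.registerTask('setup-once', function () {
    var marker = '.setup-done';                 // hypothetical marker file
    if (grunt.file.exists(marker)) {
        grunt.log.writeln('setup already ran, skipping');
        return;
    }
    grunt.task.run(['mkdir', 'bowercopy:src_codeigniter']);
    grunt.file.write(marker, new Date().toISOString());
});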

Temporary File Download

Is there a service that creates basically a one-time download of a file, preferably something I can use from NodeJS?
I've done some research on FilePicker, and haven't found anything about regenerating the link it gives you for a file. There may be a way to do this with NodeJS, but I'm using Meteor at the same time so many Node things probably will conflict.
You could build it with Meteor, using meteor-router (installed with Meteorite) and server-side routing to deliver the files.
You need a collection to keep track of downloaded files:
Server JS
var downloads = new Meteor.Collection("downloads");
// create a link
downloads.insert({ url: "/mydownload.zip", downloaded: false });
Meteor.Router.add('/file/:id', 'GET', function (id) {
    var download = downloads.findOne(id);
    if (download) {
        if (download.downloaded) {
            this.response.send("You've already downloaded me");
        } else {
            // mark it as used so the link only works once
            downloads.update(download._id, { $set: { downloaded: true } });
            // you could also stream the file here for an extra layer of surety
            this.response.redirect(download.url);
        }
    }
});
On the client you can use /file/{{_id}}, with _id being the _id of the file's entry in downloads, as the link.
My recommendation would also be to add custom server-side logic to count the number of downloads (or just flag a file as downloaded/not downloaded) and respond accordingly. The closest you could do with Filepicker.io would be using the security policies to restrict downloading the file to a specific time interval.
In addition to using the router package, in Meteor.startup you can add:
var require = __meteor_bootstrap__.require;
fs = require( 'fs' );
The fs variable should be declared on the server only. The fs package is used by Meteor and does not need to be added separately.
Once you have done this, you can create files with Meteor.uuid() as their name, which makes them unique and very difficult to guess. It is also possible to delete the file after a certain amount of time by using Meteor.setTimeout.
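As a hedged sketch of that idea (someBuffer and the ten-minute lifetime are made-up placeholders), using the fs handle obtained above:
var fileName = Meteor.uuid() + '.zip';
var filePath = '/tmp/' + fileName;
fs.writeFileSync(filePath, someBuffer);   // someBuffer stands in for the file's contents
Meteor.setTimeout(function () {
    fs.unlinkSync(filePath);              // remove the file once its window has passed
}, 10 * 60 * 1000);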
The question is: where do the files to be downloaded come from?
Solution using Heroku Cloud and NodeJS Meteor Hooks
Heroku in particular is actually great for temporary file download links: they offer a "temporary scratchpad" filesystem that is reset every time the program restarts, and each running Node server cannot see the files other instances have created.
Each dyno gets its own ephemeral filesystem, with a fresh copy of the most recently deployed code. During the dyno’s lifetime its running processes can use the filesystem as a temporary scratchpad, but no files that are written are visible to processes in any other dyno, and any files written will be discarded the moment the dyno is stopped or restarted.
Taken from the Heroku documentation: https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem
Thus, any files written to the "filesystem" will be temporary.
This allows for a very easy solution to this problem: you can simply use NodeJS filesystem manipulation to create temporary files on the server, serve them once (or for a limited time), and then remove them so they cannot be downloaded again.
This, in combination with something like $.download(), will make for a seamless experience which in turn prevents unauthorized downloads.
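A minimal, framework-agnostic sketch of the serve-once-then-delete idea, using only plain Node http/fs (the token and file names are made up for illustration):
var http = require('http');
var fs = require('fs');
var path = require('path');
// one-time tokens mapped to temporary files (hypothetical entries)
var oneTimeFiles = { abc123: path.join('/tmp', 'report.zip') };
http.createServer(function (req, res) {
    var token = req.url.slice(1);             // e.g. GET /abc123
    var file = oneTimeFiles[token];
    if (!file || !fs.existsSync(file)) {
        res.statusCode = 404;
        return res.end('Gone');
    }
    delete oneTimeFiles[token];               // invalidate the link immediately
    res.setHeader('Content-Type', 'application/octet-stream');
    fs.createReadStream(file)
        .on('end', function () {
            fs.unlink(file, function () {});  // remove the file after serving it once
        })
        .pipe(res);
}).listen(3000);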
