Node-cron in multiuser environment - node.js

I am trying to create a Node.js app in which users can schedule cron jobs. The settings for these jobs are fetched from the users and saved to MongoDB. I need the app to let users start and stop these jobs whenever they want. I have a class that looks something like this:
class Croncreator {
  constructor() {
    // creates the cron job
  }
  startCron() { /* ... */ }
  stopCron() { /* ... */ }
}
All of this is working, but I cannot wrap my head around how to manage it in a multi-user environment.
Do I create an instance of this class in my Express route for "/api/savecronjob"? If so, how do I manage the start/stop feature, considering that one user may create multiple jobs at a time and switch them on and off whenever he wants?

I solved the problem by using an npm package called cron-job-manager.
Read more about it: NPM LINK
The documentation is pretty self-explanatory.
I basically created a function that creates the cron jobs with an identifier, and added two routes to start and stop those jobs.
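For illustration, here is a minimal sketch of that pattern using node-cron (which the question already uses) and an in-memory Map keyed by a job identifier; cron-job-manager essentially wraps the same bookkeeping. The route paths, request fields and the Map itself are illustrative assumptions, not taken from the original answer.

const cron = require('node-cron');
const express = require('express');

const app = express();
app.use(express.json());

// jobId -> scheduled task (in-memory; restarting the server loses these,
// so jobs would need to be re-created from the settings stored in MongoDB)
const jobs = new Map();

// create a job from the settings the user saved
app.post('/api/savecronjob', (req, res) => {
  const { userId, jobId, expression } = req.body;
  const key = `${userId}:${jobId}`;
  const task = cron.schedule(expression, () => {
    // whatever this user's job is supposed to do
  }, { scheduled: false }); // created stopped; the user starts it explicitly
  jobs.set(key, task);
  res.json({ key });
});

// start / stop an existing job by its identifier
app.post('/api/startcronjob', (req, res) => {
  const task = jobs.get(`${req.body.userId}:${req.body.jobId}`);
  if (!task) return res.status(404).json({ error: 'no such job' });
  task.start();
  res.json({ started: true });
});

app.post('/api/stopcronjob', (req, res) => {
  const task = jobs.get(`${req.body.userId}:${req.body.jobId}`);
  if (!task) return res.status(404).json({ error: 'no such job' });
  task.stop();
  res.json({ stopped: true });
});

Because each user's jobs are looked up under their own key, one user starting or stopping a job never touches another user's tasks.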

Related

MongoDB and Mongoose databases not connecting across directories

I have a web server running that uses MongoDB to store posts created on the website.
I would like to use a separate script to manage some things on the site, but for some reason I can't seem to get the code working across directories.
The website is running in /home/username/project/
I want my utility script to reside in /home/myname/utils/
This is currently the script I have:
#!/usr/bin/nodejs
var mongoose = require('mongoose');
var db = mongoose.connect('mongodb://localhost/db_name',{useNewUrlParser:true});
var chat = require('/home/username/project/lib/models/chat');
chat.findOne(function (err, doc) {
  console.log(err, doc);
});
This code works and gets data, but only if the file it's written in resides in /home/username/project/lib/.
If the file is in /home/myname/utils/ then it doesn't get any data at all. Why is this?
That is expected: because you don't run the script from inside /home/username/project, it doesn't resolve the project's node_modules, so your script and the model file end up using different instances of mongoose. If you want code to work across folders, consider using workspaces.
Tools like yarn and pnpm make it easy to set up workspaces; with them both folders run against the same node_modules and share the same mongoose instance, so you can use your models across folders.
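For illustration, a minimal yarn-style workspaces layout, assuming both folders are moved under one common root with a package.json like this at that root; the folder and package names are illustrative assumptions:

{
  "name": "root",
  "private": true,
  "workspaces": ["project", "utils"]
}

With dependencies installed from the root, both packages resolve the same copy of mongoose from the hoisted node_modules, so the connection your utility script opens is the same one the project's model files register against.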

How to automigrate when needed in loopback 3?

I created an automigrate script under /bin in my loopback app and added its path in the package.json file so that I can run this script to automigrate whenever I want from the terminal.
I also have a boot script "createUsers.js" which creates some default users in a model. The problem is that whenever I run the automigrate script it also triggers the boot scripts, and createUsers.js tries to create the users while automigration has not yet finished, resulting in a failed automigration. I don't understand why the boot scripts are called when I specifically run only the automigrate script. I could call automigrate in the boot scripts and wrap the createUsers.js code in its callback (as shown here), but that would automigrate every time the app starts, which is undesirable since the data is lost on automigration. Where should I call automigrate() so that it can be invoked whenever required? Any help is greatly appreciated.
What I normally do is create a script called util.js inside boot:
util.js
class util {
  static _automigrate() {
    // Whatever you need to do
  }
}

module.exports = function (server) {
  global.util = util;
};
This way your script is available across the entire application, and you can call it whenever you need to.
You can call it with:
util._automigrate();
I normally use this pattern to store all my input validations etc since I might need those across different models.
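To round this out, a rough sketch of an on-demand script that uses the boot pattern above; the ../server/server path, the booted event and the datasource name db are assumptions based on a standard LoopBack 3 layout, not part of the original answer.

// bin/automigrate.js (sketch)
var app = require('../server/server'); // booting the app also runs boot/util.js

app.on('booted', function () {
  // util is global thanks to boot/util.js. Inside _automigrate you would
  // typically call something like app.dataSources.db.automigrate([...], cb),
  // which drops and recreates the tables for the listed models.
  util._automigrate();
});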

How do I invoke a Sails.js controller function from a file in the project root?

I am building a Sails.js application that runs on Heroku. I need to use Heroku Scheduler to run a "CRON" job every few hours. The scheduler only allows me to run a single command, so I have it set up to run $ node sendEmails.js every hour.
The issue is that sendEmails.js is not part of the core Sails.js project, and I need it to invoke a function inside my ReportsController.js file. How exactly do I go about doing this? I don't want to copy the controller logic into sendEmails.js because it has a lot of dependencies on the database and other services which I can't duplicate. For context:
/**
 * ReportsController
 *
 * @description Server-side logic for managing reports
 * @help See http://sailsjs.org/#!/documentation/concepts/Controllers
 */
module.exports = {

  // I need to call this function from sendEmails.js which is in my project root
  generate: function (req, res) {
    // Logic for generating reports
  }

};
You can do this in several ways:
(Better) Create a service and invoke it by name, like MyService.myFunction() (or via the sails.services registry). A service, as the name says, is available to every controller and can be used to centralize code that is used globally. Take a look at Sails Services. You can then invoke your service inside a controller, and your service can (or need not) do option 2 if that suits you. See the sketch after this list.
(Not very good) Inside a controller or service, do a manual require of your file's path, like let myfunctions = require('../folder/myfile.js'), and then invoke its functions like myfunctions.myfunction(args). Don't forget to use module.exports = {...} in that file.
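As a rough sketch of option 1: move the report logic into a service, then have sendEmails.js lift the Sails app programmatically and call that service. The file and service names, and the exact hooks to disable, are illustrative assumptions:

// api/services/ReportsService.js – logic pulled out of ReportsController (sketch)
module.exports = {
  generate: function (cb) {
    // logic for generating reports, shared by the controller and sendEmails.js
    return cb(null, 'report');
  }
};

// sendEmails.js in the project root – lift Sails without the HTTP server, then call the service
var sails = require('sails');

sails.load({ hooks: { http: false } }, function (err) {
  if (err) { console.error(err); return process.exit(1); }
  ReportsService.generate(function (err, report) {
    // send the emails here, then shut the app down
    sails.lower(function () { process.exit(err ? 1 : 0); });
  });
});

ReportsController.generate can then call ReportsService.generate as well, so the logic lives in one place.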

Grunt - Is it possible to have a task that only runs once ever?

There is a particular task that I want to run only once and then guarantee that it is never run again. Has anyone done this? I was looking at using grunt.event.once(...), or trying to detect folders or files using a shell script on postinstall, but both approaches leave a task in the Gruntfile.js that could potentially be invoked at any time, overwriting files.
At a very simple level it would do something like this:
grunt.registerTask('setup', [
  'mkdir' // run some setup tasks
]);

grunt.event.once('setup', function () {
  // somehow do what's below here so it can't be done again,
  // i.e. so it's not available in the config for reuse and
  // possibly overwriting modified files
  grunt.task.run([
    'bowercopy:src_codeigniter'
  ]);
});
Is this even possible in Grunt? I know it's just a task runner; in this case I just want it to run the task once.
There are several libraries that let you access a Gruntfile's content via an API, so you could use one of these to alter your setup task's configuration after it has run the first time.
There's Gruntfile Editor and Gruntfile API.
While neither of them supports complete task removal, you can always modify your task's configuration this way.
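A rough sketch of that idea with gruntfile-editor, assuming the insertConfig()/toString() API shown in its README; the config name being blanked out is illustrative:

var fs = require('fs');
var GruntfileEditor = require('gruntfile-editor');

var editor = new GruntfileEditor(fs.readFileSync('Gruntfile.js', 'utf8'));

// After the one-off task has run successfully, overwrite the config it
// depends on so accidentally re-running it can no longer touch your files.
editor.insertConfig('bowercopy', '{}');

fs.writeFileSync('Gruntfile.js', editor.toString());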

Update deployment via linux script in weblogic

What is the script to update a deployment on Linux (from the GUI, we can do this update via unlock & save changes)? Is it possible to do this? If not, what is the script to redeploy?
As Kevin pointed out, WLST is the way to go. You should probably craft a script (named wlDeploy.py, for instance) with content like the following (import clauses were omitted for the sake of simplicity):
current_app_name = '[your current deployed app name]'
new_app_name = '[your new app name]'
target_name = '[WL managed server name (or AdminServer)]'
connect([username],[pwd],'t3://[admin server hostname/IP address]:[PORT]')
stopApplication(current_app_name)
undeploy(current_app_name, timeout=60000);
war_path = '[path to war file]'
deploy(appName=new_app_name, path=war_path, targets=target_name);
And call it via something like:
./wlst.sh wlDeploy.py
Of course you can add parameters to your script, and a lot of logic that is relevant to your deployment. This is entirely up to you. The example above, though, should help you get started.
In WebLogic you can use WLST to perform administrative tasks like managing deployments. If you google "weblogic wlst", you will find tons of information. WLST scripts are written in Python (Jython).
Assuming you are using WebLogic 10, you can also "Record" your actions in the admin console. This will save the actions into a Python script which you can "replay" (execute) later.
