Is there an integration between crontab and Rundeck?

I'm trying to find a tool that lets you easily monitor cron jobs for the company I work at. Rundeck seems like the perfect tool for this, but I can't figure out if it's possible to integrate the existing cron jobs into Rundeck. It's too much work to do this manually because there are hundreds of them.
If you know another tool that can do this, feel free to recommend it!
I'd prefer it to be open source, but if it's paid and works correctly I'm open to it.

Rundeck works as a "very enhanced cron replacement", so maybe the best approach in your case is to migrate your cron jobs to Rundeck and use its notifications for monitoring. This looks like a good starting point for you; don't forget to visit the official documentation.

What Mega mentions is correct: Rundeck can be used as a replacement for cron.
Even better, there is no need to configure all of those jobs manually: Rundeck allows you to import job definitions via an API call.
Steps:
1. Get Rundeck installed.
2. Set up a job that runs one of your cron jobs manually.
3. Export that job.
4. Use a script to make many copies of that exported file, each containing a different entry from cron.*
5. Import all those files via an API call (see the sketch below).
*You'll need to change at least the job name and the workflow (the sequence of commands in the job definition file; you'll be able to spot it based on the workflow step you configured).
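For the import step, here is a minimal sketch in TypeScript (Node 18+, so the built-in fetch is available) that bulk-imports generated YAML job definitions through Rundeck's jobs-import endpoint. The host, project name, API version and directory below are placeholder assumptions; verify the endpoint path and parameters against the API documentation for your Rundeck version.

```typescript
// Sketch only: bulk-import generated Rundeck job definitions via the API.
// Assumes one YAML job definition per crontab entry already exists in ./generated-jobs/.
import { readdir, readFile } from "node:fs/promises";
import { join } from "node:path";

const RUNDECK_URL = "https://rundeck.example.com"; // hypothetical host
const RUNDECK_TOKEN = process.env.RUNDECK_TOKEN ?? "";
const PROJECT = "cron-migration"; // hypothetical project name

async function importJob(yamlFile: string): Promise<void> {
  const body = await readFile(yamlFile, "utf8");
  // Endpoint/params per the Rundeck jobs-import API; adjust the API version to yours.
  const url =
    `${RUNDECK_URL}/api/41/project/${PROJECT}/jobs/import` +
    `?format=yaml&dupeOption=update`;
  const res = await fetch(url, {
    method: "POST",
    headers: {
      "X-Rundeck-Auth-Token": RUNDECK_TOKEN,
      "Content-Type": "application/yaml",
    },
    body,
  });
  if (!res.ok) {
    throw new Error(`Import of ${yamlFile} failed: ${res.status} ${await res.text()}`);
  }
  console.log(`Imported ${yamlFile}`);
}

async function main(): Promise<void> {
  const dir = "generated-jobs";
  for (const f of await readdir(dir)) {
    if (f.endsWith(".yaml")) await importJob(join(dir, f));
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

Each generated YAML file would be a copy of your exported job definition with at least the job name and the workflow command swapped in for one crontab entry; if you also carry the schedule over, note that Rundeck's crontab field uses Quartz-style expressions, which differ slightly from classic crontab syntax.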

Related

How to start/stop/edit cron job from Node.js?

I have done some searching on this question, but nothing useful came up. So I decided to create a new thread.
Problem Description
I am making a CLI for the installation of our server, and one of the prerequisites is that a cron job should be running. So, to accomplish this, I want to add some cron jobs and restart the service. Is this possible via Node.js?
Check out this module; will it help you?
https://github.com/ncb000gt/node-cron
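For reference, a minimal sketch (TypeScript, run under Node) of what scheduling with that package looks like; the cron expression and timeout are purely illustrative, and constructor details can vary between versions of the package, so check its README:

```typescript
// Minimal sketch of the "cron" npm package (ncb000gt/node-cron).
// Install first: npm install cron
import { CronJob } from "cron";

// Six-field expression (seconds included): run every 30 seconds.
const job = new CronJob("*/30 * * * * *", () => {
  console.log("tick:", new Date().toISOString());
});

job.start(); // begin scheduling inside this Node.js process

// Stop the job later, e.g. when the CLI shuts down.
setTimeout(() => {
  job.stop();
  console.log("job stopped");
}, 2 * 60 * 1000);
```

Note that this module schedules work inside the Node.js process itself; it does not edit the system crontab or restart the cron service, so if you specifically need to modify system cron entries, your CLI would still have to write to the crontab.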

Extract a SAS Enterprise Guide project into a batch job runnable on a Unix server?

We have built a project in Enterprise Guide with the goal of creating easily understandable and maintainable code. The project contains a set of process flows which should be run in a specific order. We need to run this project on a Linux server machine where the SAS Metadata Server is running.
The basic idea is to extract this project into SAS code that we would be able to run from the command line in Linux as a batch job.
Question 1:
Is there any other way to schedule a batch job on a Linux-hosted SAS Server? I have read about VBS scripting for scheduling/running batch jobs, but for this to be done on a Linux server, an installation of WINE is required, which, on a production machine that already runs a number of other important applications, is almost completely out of the question.
Is there a way to export the complete project into SAS code, provided that I specify the order in which the process flows should run? I have tried the ordered list, which lets you build a list of tasks to run in order (although there is no way to choose a whole process flow as a single task), but unfortunately this ordered list itself cannot later be exported as SAS code.
Our current solution is the following:
We export each single process flow of the SAS EG project into SAS code, and then create another SAS program with %include lines to run all the extracted code files in the order we want. This is of course a possible solution, but definitely not the most elegant one.
Question 2:
Since I don't know exactly how the code ends up being exported, are there any dangers I should bear in mind with the solution I chose?
Is there any other, more elegant way?
You have a couple of options from what I'm familiar with, plus I suspect if Dom happens by he'll know more. These answers are based on EG 6.1, which is the current version (ships with 9.4); it's possible some of these things may not be true in earlier versions.
First, if you're running Enterprise Guide from Windows, you can schedule the job locally (on any Windows machine with Enterprise Guide). You're not scheduling the server directly, you schedule Windows to launch an EG process that connects to the server and does its magic. That's how I largely interact with scheduling (because I have a fairly 'light' scheduling need).
Second, from the blog post "Four Ways to Schedule SAS Tasks", options 3 and 4 may be helpful for you: the SAS Platform Suite is designed in part for scheduling, and using SAS Management Console to schedule via operating system tools is also very helpful.
Third, you may want to look into SAS Stored Processes, which should be schedulable. A process flow can be converted into a stored process.
For your specific questions:
Question 1: When you export a process flow or a project, at least in 6.1 you have the option to change the order in which the programs are exported. It's manual, so it's probably not perfect, but it does give you that option. (The code seems to be in creation order by default, which is sub-optimal.) The project export does group process flows together, but you don't have the option of rearranging whole process flows - you have to move each program around individually, which would be tedious. It also of course gives you less flexibility if you need to run programs multiple times.
Question 2: As Stig Eide points out in the comments, make sure your system option LRECL is greater than 256 (the default), or you run some risk of code being cut off. In 9.2+ this is modifiable; just place LRECL=32767 in your config.sas file.

Is it possible to use Jenkins server to run custom tasks one by one?

Is it possible to use Jenkins server to run custom tasks one by one?
By task I mean executing an external Groovy program designed as an independent performance and integration test for a specific deployment.
If it is possible, then how do I:
Define tasks in Jenkins and group them so they can be started by starting the group.
See the output of each task (output log).
Stop execution of the whole group if there is a specific outcome like "-1".
And all of this should start automatically after the software has been built and deployed.
I feel there has to be a way to do this with Jenkins utilising its out-of-the-box functionality, I'm just not sure how. Or am I wrong, and are we looking at a custom plugin as a solution?
Thanks a lot!
P.S. I am not asking for a detailed answer; just a general direction would be OK. Also, Jenkins is not a requirement; it can be another similar CI server.
It sounds like this could work as a simple Jenkins job with an "Execute shell" build step.
The Console Output for the job will contain the output from the processes that you run externally, and the exit status of the script can mark the task as failed (any non-zero exit code will do this by default).
On Unix systems, a #! at the beginning of the first line denotes the interpreter to use for the script.
To chain this together with your other Jenkins jobs, you can use the "Build after other projects are built" build trigger and use your deployment job as the starting point.
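As an illustration of the exit-status point above, a small wrapper (TypeScript here, run with Node; the Groovy script name is hypothetical) only needs to propagate the external program's exit code for Jenkins to mark the build as failed and, with the usual trigger settings, stop dependent downstream jobs:

```typescript
// Hypothetical wrapper invoked by a Jenkins "Execute shell" step.
// It runs the external test program and exits with the same status,
// so any non-zero result fails the Jenkins build.
import { spawnSync } from "node:child_process";

const result = spawnSync("groovy", ["perf-and-integration-test.groovy"], {
  stdio: "inherit", // stream the test output into the Jenkins console log
});

// Propagate failure: a non-zero exit code marks the build as failed.
process.exit(result.status ?? 1);
```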
It is possible, but be careful. Normally Jenkins is used to run build jobs and to deploy software to a QA or staging server. It does not touch Production. But when you start doing this in Jenkins you increase the risk that someone will accidentally run a production job that should not have been run. So if you do decide to use Jenkins for this, set up an entirely separate instance of Jenkins that does nothing other than run these jobs. Then go to Manage Jenkins->Configure Global Security and set up login users. At the least, use "logged in users can do anything" but it would be better to set up "matrix-based security". Then run any jobs that you need by using an Execute Shell step. You can schedule jobs by using a Build Trigger, and you can connect jobs sequentially by setting up Build Other Projects in the post build section. If you want to do more complex job chaining, look into the Join Plugin.
Just keep this Jenkins entirely separate from the Jenkins which you use for CI.

Front End for Running Talend Jobs

I am looking for a front end for our operator to run our Talend jobs. We do not want him to have the ability to delete or modify jobs, only to run them and monitor their results. Any suggestions for tools for doing this?
Thanks
The subscription version of Talend (called Talend Integration Suite, or TIS) has precisely that. It's a web-based console called Talend Administration Center (TAC) and it allows an operator to run jobs and monitor their results -- among many other things. The permissions can be set in the way you described, so that the operator is not able to delete or modify the jobs.
In my company we use SOS Job Scheduler, which you can use with TOS out of the box as a general DIY scheduling solution. Just these days I'm working on a Talend custom component to integrate the two more deeply (log propagation, context parameter sharing and so on; basically it lets you use the SOS API from a TOS job). I can speed it up and put it on GitHub in a few days if you need it :)
You can export the job as a shell script (.sh); anyone can execute that and will not be able to delete the jobs.
To see execution details, put some tLog components into your Talend jobs.
This is just a workaround, not a full solution, if you only have Open Studio (the free version).
I'm using Rundeck for manual and scheduled execution of Talend jobs. I find the job setup and scheduling to be far easier in Rundeck than in SpagoBI (which I frequently use for reports and BI).
You could also use something like the Automic (UC4) automation engine. It has fine-grained security and is a commercial product. I've used it in the past for similar work.

How do I run test scripts from TestTrack Test Case Manager in Linux?

I use Seapine's TestTrack Test Case Manager (TCM) under Linux and thus far have been unable to figure out how to use its ability to kick off our Perl test scripts and save the resulting data into a test run. Could someone provide me with a config or an example?
Sean,
Take a look at the Script Agent mechanism on our Labs site. That will allow you to kick off the Perl scripts.
If you have any questions about that, or want some help, shoot me an email (mharp#) and we can talk.
