Control-M specific job status to an Excel file - Linux

I have configured Control-M batches to execute complex jobs/workflows.
Is there a way to export the status of a specific job/workflow to an Excel or CSV file?

You could make use of the ctmlog listmsg Control-M utility.
This utility needs to be run on the machine that hosts the Control-M server.
I run the following PowerShell daily to extract all failures to a text file.
You could do the same for ended-OK jobs - just change the message number from 5134 to 5133.
#Get Today's and Yesterday's date
$date = Get-Date -Format 'yyyyMMdd'
$m1Date = (get-date).AddDays(-1).ToString('yyyyMMdd')
#Generate and execute Control-M Utility command
$cmd = "ctmlog listmsg 5134 $m1Date 0000 $date 0000 C:\Temp\Todays_Failures.txt"
iex $cmd
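If your Control-M server is on Linux (as the title suggests), a rough bash equivalent of the same approach might look like the sketch below. The message numbers and argument order come from the PowerShell example above; the output paths, the date handling, and the crude CSV conversion at the end are assumptions to adapt to your environment.
#!/bin/bash
# Sketch: run as the Control-M server account, with ctmlog on its PATH.
# Message 5134 = jobs that ended NOTOK, 5133 = jobs that ended OK (as above).
today=$(date +%Y%m%d)
yesterday=$(date -d "yesterday" +%Y%m%d)

ctmlog listmsg 5134 "$yesterday" 0000 "$today" 0000 /tmp/todays_failures.txt
ctmlog listmsg 5133 "$yesterday" 0000 "$today" 0000 /tmp/todays_ended_ok.txt

# The report is plain text; a crude space-to-comma conversion gives something
# Excel can open, but fields containing spaces will need smarter parsing.
tr -s ' ' ',' < /tmp/todays_failures.txt > /tmp/todays_failures.csv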

Related

Unable to output script results with column/table formatting

Answered - previously titled 'Cron job for shell script not running'
I recently downloaded Speedtest onto my Raspberry Pi, and wrote a script to output the results to a CSV file.
I'm trying to do this regularly via a cron job, but for some reason it won't execute the shell script as intended.
Here's the script below. I've commented/cut out a lot to try and find the issue.
#!/bin/bash
# Commented out if statement detects presence of data file and creates one if it doesn't exist. Was going to adjust later to include variables/input options if I wanted to use the script on alternate systems, but commented out while working on the main issue.
file='/home/User/Documents/speedtestdata.csv'
# have tried this with and without quotes, does not seem to make a difference either way
#HEADERS='/usr/bin/speedtest-cli --csv-header'
SPEEDTEST='/usr/bin/speedtest-cli --csv'
# Used absolute path for the executable
#LOG=/home/User/scripts/testreclog.txt
#DATE=$( date )
# Was using the above to log steps of script running successfully with timestamp, commented out
#if [ ! -f $file ]
#then
# echo "Creating results file">>$LOG
# touch $file
# $HEADERS > $file
#fi
#echo "Running speedtest">>$LOG
$SPEEDTEST >> $file
#echo "Formatting results">>$LOG
#column -s, -t < $file
# this step was used to format the log file neatly
#echo "Time completed ",$DATE>>$LOG
And here's how the crontab currently looks
# Edit this file to introduce tasks to be run by cron.
#
# Each task to run has to be defined through a single line
# indicating with different fields when the task will be run
# and what command to run for the task
#
# To define the time you can provide concrete values for
# minute (m), hour (h), day of month (dom), month (mon),
# and day of week (dow) or use '*' in these fields (for 'any').
#
# Notice that tasks will be started based on the cron's system
# daemon's notion of time and timezones.
#
# Output of the crontab jobs (including errors) is sent through
# email to the user the crontab file belongs to (unless redirected).
#
# For example, you can run a backup of all your user accounts
# at 5 a.m every week with:
# 0 5 * * 1 tar -zcf /var/backups/home.tgz /home/
#
# For more information see the manual pages of crontab(5) and cron(8)
#
# m h dom mon dow command
*/5 * * * * /bin/bash /home/User/scripts/testandrec.sh
# 2> /home/User/scripts/testrecerror.txt
# Was attempting to log errors to this file, nothing seen so commented out on a newline.
#* * * * * /home/User/scripts/testscript.sh test to verify cron works (it does)
I've added my scripts folder to the end of my PATH, but for some reason this only shows up when I'm using the Pi directly; when I ssh in, the scripts folder is missing from the end.
However, given that I've used absolute paths for everything, I'm not sure why this would be an issue.
First I tested whether a simple cron job would work, so I created testscript.sh, which simply wrote 'Test' and a timestamp to a specific file; it used the same shebang and absolute paths, and functioned as intended.
I have checked systemctl for cron, restarted cron with sudo service cron restart and made sure a newline is in place at the end of the crontab.
I have tried with and without /bin/bash in the crontab entry; it seemingly hasn't made a difference.
I tried cd /home/User/scripts && ./testandrec.sh but no luck.
I changed the run time to every 5 then every 10 minutes, which has not worked.
I have noticed that when I ran the script manually with column -s, -t < $file left in, the results file was formatted as intended when I cat it.
However, the next time the cron job should run, this reverts to CSV with a comma as the delimiter, so clearly something is running.
To confuse matters further, I think the script may be firing once after restarting cron, and then not working when it should run subsequently. When I leave the column line in, this appears to just revert the formatting, but if I comment it out it appears to run a speed test and append the results, but only once. However, I may be wrong about this, as I haven't been able to reproduce it consistently.
If I instead try 0 * * * * /usr/bin/speedtest-cli --csv >> /home/User/Documents/speedtestdata.csv && column -s, -t < /home/User/Documents/speedtestdata.csv, it appears to perform/append the speedtest but does not action the column command.
However, I would much rather tie the process up neatly in a shell script than use the above, which isn't very DRY.
I've looked extensively, but none of the solutions I've found on this site or others have fixed the issue.
Any troubleshooting suggestions/help would be greatly appreciated.
Here you go - the solution is simple:
#!/bin/bash
# Commented out if statement detects presence of data file and creates one if it doesn't exist. Was going to adjust later to include variables/input options if I wanted to use the script on alternate systems, but commented out while working on the main issue.
file='/home/User/Documents/speedtestdata.csv'
# have tried this with and without quotes, does not seem to make a difference either way
#HEADERS='/usr/bin/speedtest-cli --csv-header'
SPEEDTEST='/usr/bin/speedtest-cli --csv'
# Used absolute path for the executable
#LOG=/home/User/scripts/testreclog.txt
#DATE=$( date )
# Was using the above to log steps of script running successfully with timestamp, commented out
#if [ ! -f $file ]
#then
# echo "Creating results file">>$LOG
# touch $file
# $HEADERS > $file
#fi
#echo "Running speedtest">>$LOG
$SPEEDTEST | column -s, -t >> $file
Just check the last line ;)
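One variation, if you want speedtestdata.csv to stay valid CSV (so it can still be parsed or imported elsewhere), is to append the raw output as before and write the column-formatted copy to a separate file. A minimal sketch using the question's paths; the pretty-file name is an assumption.
#!/bin/bash
# Append the raw CSV result, then regenerate a human-readable copy elsewhere.
file='/home/User/Documents/speedtestdata.csv'
pretty='/home/User/Documents/speedtestdata_pretty.txt'   # assumed name

/usr/bin/speedtest-cli --csv >> "$file"        # keep the CSV file untouched
column -s, -t < "$file" > "$pretty"            # formatted view for reading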

Comparing a file created yesterday with a file created today

I have files in a directory in the format shown below.
$today and $yesterday are two variables holding today's date and yesterday's date; both hold the date in the structure shown below.
today=$(date +"%Y-%m-%d")
yesterday=$(date -d "yesterday 13:00" '+%Y-%m-%d')
example-$today.txt
polar-$today.txt
example-$yesterday.txt
polar-$yesterday.txt
Example yesterday : example-2020-09-24.txt
Example today: example-2020-09-25.txt
Files are created on a daily basis using a cron job, so there will also be files in the structure below with tomorrow's date.
example-$tomorrow.txt
polar-$tomorrow.txt
I want to compare files with the same base name on different dates and, if there is a difference, execute a Python script. The Python script takes today's file as its first argument when there is a difference.
if diff example-$today.txt example-$yesterday.txt
then
    echo "No difference"
else
    python script.py example-$today.txt
fi
If I have only 2 or 3 files I can use an if/else block with diff for each file as above, but the list will be populated with more unique names in future, and writing an if block for each one is tedious.
Requirement:
Compare all the txt files in the directory with the same-named files from yesterday and today, and if there is a difference execute the Python script.
Seems like it's probably not too terrible to do:
for base in example polar; do
    if ! diff ${base}-$today.txt ${base}-$yesterday.txt; then
        python script.py ${base}-$today.txt
    fi
done
That should be fairly maintainable, and you can write list='example polar ...' ... for base in $list, or list=$( cmd to dynamically generate names), or use an array (a sketch of the array variant follows the example below). There's a lot of flexibility. For example, if you don't want to maintain the list of files, you could do:
for file in *-${today}.txt; do
    base="${file%-${today}.txt}"
    if ! diff "${base}-$today.txt" "${base}-$yesterday.txt"; then
        python script.py "${base}-$today.txt"
    fi
done
Note that I've removed the excess verbosity. Succeed quietly, fail loudly.
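The array variant mentioned above might look like this sketch; it keeps the same diff-then-run logic and reuses the $today/$yesterday definitions from the question.
#!/bin/bash
today=$(date +"%Y-%m-%d")
yesterday=$(date -d "yesterday 13:00" '+%Y-%m-%d')

# Keep the base names in one place; add new names here as they appear.
bases=(example polar)

for base in "${bases[@]}"; do
    if ! diff "${base}-${today}.txt" "${base}-${yesterday}.txt"; then
        python script.py "${base}-${today}.txt"
    fi
done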

PowerShell date format not behaving

I'm stumped. :)
My computer has PowerShell 5.1 installed. On another computer (same language) with 5.0 it works as expected. (Checked using Get-Culture; my locale is nb-NO, Norwegian.)
Consider this:
Get-Date
returns
tirsdag 23. mai 2017 13.13.18
So I do this
Get-Date -Format "H-m-s"
as expected it returns
13-13-18
But then I do this
Get-Date -Format "H:m:s"
You'd think it returns
13:13:18
right? (it does on PS5.0!) No! I get this:
13.13.18
Only if I do this, is the output what I want:
Get-Date -Format "H\:m\:s"
13:13:18
Can someone please explain why this is? I discovered it "by accident" when I wanted to format a datetime-compatible string for use in SQL Server.
That's because the underlying DateTime formatting function sees : and treats it as a culture-dependent "time separator".
In Norwegian (nb-NO), the default time separator is a period (.). You can inspect this with (assuming that nb-NO is the current culture):
PS C:\> [CultureInfo]::CurrentCulture.DateTimeFormat.TimeSeparator
.
You can also override this, either by inserting a literal :, with the escape sequence \: as you've already found, or you can override it globally (for the lifetime of the current process/appdomain):
PS C:\> [CultureInfo]::CurrentCulture.DateTimeFormat.TimeSeparator = ":"
PS C:\> Get-Date -Format "H:m:s"
13:13:18

Need help to make my program automated

I just wanted to get some ideas on how I should approach this. I am trying to automate reporting back to a database with a bunch of commands like java -jar snet_client.jar -mode report -id 13528 -props /int2/contact/client0.properties &. Let's say I have hundreds of these commands, each with a unique number like the 13528 in the example. I need to put them in a loop so that I do not have to write/copy-and-paste those hundreds of commands over and over to execute them. Any suggestion would be helpful. It has to be on Unix.
This first bash script would iterate over each line in the file textfile, assuming that each of the id values is on its own line, start the java process, and wait for it to complete before starting the next one.
# Queueing
# This one will only start the next process when the previous one completes.
OLD_IFS=$IFS
while IFS=$'\n' read -r line_data; do
    java -jar snet_client.jar -mode report -id ${line_data} -props /int2/contact/client0.properties &
    wait
done < /path/to/textfile
IFS=$OLD_IFS
Alternatively, this script does the same as far as getting id values from a text file, but doesn't wait for the first to complete before the next is started. This will likely cause problems if the snet_client.jar program is very resource-intensive:
# Non-queueing
# This starts and runs all the processes
OLD_IFS=$IFS
while IFS=$'\n' read -r line_data; do
    java -jar snet_client.jar -mode report -id ${line_data} -props /int2/contact/client0.properties &
done < /path/to/textfile
IFS=$OLD_IFS
On both, we store the current IFS value before we begin so we can reset it after the process runs, just in case we need it set back for something later in the script file.
I have not tested these (since I don't have the dependencies available), so you might have to make adjustments for your own environment.
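If you want something in between the two approaches above, running several reports at a time but not all at once, one option is GNU xargs with a bounded number of parallel processes. This is only a sketch, under the same assumption of one id per line in /path/to/textfile; the limit of 4 is arbitrary.
# Run at most 4 reports concurrently; {} is replaced with one id per line.
xargs -P 4 -I{} \
    java -jar snet_client.jar -mode report -id {} -props /int2/contact/client0.properties \
    < /path/to/textfile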

How to get last modification time and timezone of a remote server via FTP (perl)

I am writing a Perl script that can sync a local directory with a remote directory incrementally using FTP. It checks the last modified date of each local file and compares it with the remote file's. I use the following code to get the last modified time via FTP.
my $f = Net::FTP->new($config->{'host'}, Passive => 1, Debug => 0) || die "Couldn't ftp to $config->{'host'}: $@";
$f->mdtm($file);
It works great if the local machine and the remote machine have the same time and timezone; I need a workaround for when they don't.
Thanks in advance
As far as I know, there is no way to determine the timezone configured on an FTP server using only FTP commands ... Anyway, if you have write permission you could try creating a file on the remote FTP server and then
doing an "ls" on it to see its date/time stamp; with the timestamp of the newly created file you can calculate the difference between your local timezone and the server's.
open my $tmp, '>', 'test.time' or die $!;    # Net::FTP has no touch method,
close $tmp;                                  # so create an empty local file
$f->put('test.time') or die $f->message;     # and upload it as a zero-byte file
my $remote_mtime = $f->mdtm('test.time');    # Now you should have the local time of the FTP server
# Now you can compare $remote_mtime with your local time and find the difference ...
Make sure that the test file doesn't already exist on the server before creating it.
Doesn't the modification time obtained from (stat $file)[9] come out independent of the timezone the server is configured with?
I touched a sample file almost simultaneously on two servers in different timezones, and the modification times obtained were very similar; only the last 4 digits differed.
Server 1
--------
root@- [~/ftp_kasi]# touch file
root@- [~/ftp_kasi]# perl mdtime.pl
1380005862
root@- [~/ftp_kasi]# date
Tue Sep 24 02:59:19 EDT 2013
root@- [~/ftp_kasi]#
Server 2
--------
[root@ffVM32 kasi]# touch file
[root@ffVM32 kasi]# perl mdtime.pl
1380006066
[root@ffVM32 kasi]# date
Tue Sep 24 12:38:45 IST 2013
[root@ffVM32 kasi]#
[root@ffVM32 kasi]# cat mdtime.pl
#!/usr/local/cpanel/3rdparty/bin/perl
$file = "file";
#open my $fh,'<',$file or die "Could not open file : $!\n";
#my $mdtime = (stat($fh))[9];
open (FILE, "file");
my $mdtime = (stat(FILE))[9];
print $mdtime."\n";
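The same check can be done from the shell; a quick sketch assuming GNU coreutils (the -c and -d flags are GNU-specific).
# Epoch seconds don't depend on the configured timezone; only the rendering does.
touch file
stat -c %Y file                   # modification time as seconds since the epoch (GNU stat)
date -d @"$(stat -c %Y file)"     # the same instant rendered in the local timezone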
