I am trying to grab the date, CL and user of a list of changelists submitted within a given timeframe.
p4 changes -s submitted //depot/mainline/...@2020/03/09,@2020/03/14
gives me the changes with a date, but with too much other information. So you can use -F to strip the output down to the fields you want:
p4 -F %change%-%user%-%date% changes -s submitted //depot/mainline/...@2020/03/09,@2020/03/14
But, irritatingly, -F %date% does not mean "what date was this submitted?" - it means "what date is it today?" This is despite the output of the -e flag telling me that %date% is the submitted date.
So any ideas on how to get the submitted date from the -F flag?
Many thanks!
I assume you're on Windows and %date% is getting expanded by the shell so that p4 never sees it:
C:\Perforce\test>echo %date%
Thu 03/19/2020
Escaping the % will prevent that and let p4 see the command you actually wanted to run. In the cmd shell you can escape % as ^%:
C:\Perforce\test>p4 -F ^%date^% changes -m1
2020/03/16
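If escaping becomes a hassle, the same three fields can also be pulled out of the default `p4 changes` output on a POSIX shell, where `%date%` is never expanded. A hedged sketch, using a canned sample line in place of real `p4 changes` output (the change number, user, and date are made up):

```shell
# Sample of the default "p4 changes" line format, standing in for the
# real output of: p4 changes -s submitted //depot/mainline/...
sample="Change 12345 on 2020/03/12 by alice@ws1 'Fix the build'"

# Field 2 is the change number, field 4 the submit date, and field 6 is
# user@client; split off the client name to get change-user-date.
echo "$sample" | awk '{split($6, u, "@"); print $2 "-" u[1] "-" $4}'
# -> 12345-alice-2020/03/12
```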
Is it possible to do a "git status" and output the result into an echo, or send the output as email?
I guess the email part is no problem, but I'm stuck on the echo.
What I got
clear
output="$(git status)"
echo output
But ... yeah, it won't work. I searched for examples, but they always lead to using git status in an if statement, and I need the output :( Is there a simple way to get this done?
And how do I handle it if this should be called on a remote system:
ssh ${SSHUSER}@${LIVE_HOST} << EOF
...
EOF
The echo is useless; all you need is
git status
If you genuinely need to store the output in a variable as well as print it, try
output=$(git status)
echo "$output"
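The quotes around "$output" are not decorative: without them the shell word-splits the captured text, and the newlines in git's report collapse into spaces. A quick demonstration, with printf standing in for git status:

```shell
# two lines of captured output, as git status would produce
output=$(printf 'line one\nline two')

echo $output     # unquoted: word-split, the newline collapses to a space
echo "$output"   # quoted: the newline survives
```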
To run it remotely, you don't need a here document at all:
ssh "${SSHUSER}@${LIVE_HOST}" git status
and again, if you need to store that in a variable, that would be
output=$(ssh "${SSHUSER}@${LIVE_HOST}" git status)
echo "$output"
If you really want to store the command in a here document, that's possible too:
ssh "${SSHUSER}@${LIVE_HOST}" <<\:
git status
:
or in Bash you could use a "here string":
ssh "${SSHUSER}@${LIVE_HOST}" <<<'git status'
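Both forms simply feed the command text to ssh's standard input, where the remote shell reads it. You can see the mechanism locally by substituting `sh -s` (a shell reading commands from stdin) for ssh - a sketch of the plumbing, not the real remote call:

```shell
# sh -s reads its commands from stdin, just like the shell that
# sshd starts on the remote end
sh -s <<\:
echo "ran from a here document"
:
```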
If you want to send the result as email, a common arrangement is
git status | mail -s "Status report from xyz" you@example.com
but the mail command is poorly standardized, so you may have to consult a manual page or Google for a bit.
An echo "$output" would work better;
echo output would just print the word "output".
I just tried:
$ o=$(git status 2>&1); echo "$o"
On branch master
Your branch is up to date with 'origin/master'.
nothing to commit, working tree clean
It does not work because you got the variable syntax wrong.
Rewrite your code as follows:
clear
output="$(git status)"
echo "$output"
I want to check on my Linux system when each command was fired - at which date and time.
I fired commands like this:
history 50
It shows me the last 50 commands in the history, but not the date and time at which each was fired. Does anyone know how to do it?
Following this link, you can make the first solution (provided by krzyk) permanent by executing:
echo 'export HISTTIMEFORMAT="%d/%m/%y %T "' >> ~/.bash_profile
source ~/.bash_profile
Try this:
> HISTTIMEFORMAT="%d/%m/%y %T "
> history
You can adjust the format to your liking, of course.
In case you are using zsh you can use for example the -E or -i switch:
history -E
If you do a man zshoptions or man zshbuiltins you can find out more information about these switches as well as other info related to history:
Also when listing,
-d prints timestamps for each event
-f prints full time-date stamps in the US `MM/DD/YY hh:mm' format
-E prints full time-date stamps in the European `dd.mm.yyyy hh:mm' format
-i prints full time-date stamps in ISO8601 `yyyy-mm-dd hh:mm' format
-t fmt prints time and date stamps in the given format; fmt is formatted with the strftime function with the zsh extensions described for the %D{string} prompt format in the section EXPANSION OF PROMPT SEQUENCES in zshmisc(1). The resulting formatted string must be no more than 256 characters or will not be printed
-D prints elapsed times; may be combined with one of the options above
It depends on the shell (and its configuration). In standard bash, only the command is stored, without the date and time (check .bash_history to see if there is any timestamp there).
To have bash store the timestamp you need to set HISTTIMEFORMAT before executing the commands, e.g. in .bashrc or .bash_profile. This will cause bash to store the timestamps in .bash_history (see the entries starting with #).
HISTTIMEFORMAT="%d/%m/%y %H:%M "
For any commands typed prior to this, it will not help since they will just get a default time of when you turned history on, but it will log the time of any further commands after this.
If you want it to log the history timestamps permanently, you should put the following line in your ~/.bashrc:
export HISTTIMEFORMAT="%d/%m/%y %H:%M "
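The "entries starting with #" that bash writes are plain epoch timestamps; HISTTIMEFORMAT only controls how the history builtin renders them when listing. A sketch of the on-disk format, using a throwaway file rather than the real ~/.bash_history (the timestamps and commands are made up):

```shell
# what ~/.bash_history looks like once HISTTIMEFORMAT is set:
# a '#<epoch seconds>' comment line precedes each stored command
histfile=$(mktemp)
printf '#1583755200\nls -l\n#1583755260\ngit status\n' > "$histfile"
cat "$histfile"
rm -f "$histfile"
```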
It depends on which shell you are using. For GNU Bash, changing the HISTTIMEFORMAT variable will help. This nixCraft article explains how to set that variable permanently, but uses an ambiguous date format. For ISO 8601, use:
HISTTIMEFORMAT="%Y-%m-%dT%T "
Result:
$ history
[...]
13 2022-11-07T13:32:01 pwd
14 2022-11-07T13:32:05 cd
15 2022-11-07T13:32:10 ls -l
On macOS without editing any rc files:
history -t"%F %T"
I've got about a dozen servers that each have crontabs with anywhere from 20-50 crontab entries. My single most common cause of a process failure is someone commenting out jobs in cron during a fix or patch and then forgetting to uncomment the jobs.
I'd like to do two things to solve this:
Start using our schedule suppression process that allows users to suppress schedules without actually touching crontab. Nothing magical - just touch a file in a directory dedicated to the process. The process checks that directory on start-up.
Implement a process that will send out alerts if crontab doesn't match its backup or current version in svn.
Can anyone recommend an existing solution for #2 (alert when crontab changes)?
In this case I would suggest comparing hash values of the file you want to have and the actual file.
Just write a little bash script that sends out an email notification (or creates a notification file, or whatever you want) and have it run automatically every x seconds/minutes/hours.
A possible script could be
if [[ $(md5sum path/to/crontab.backup | cut -d' ' -f1) != $(md5sum /etc/crontab | cut -d' ' -f1) ]]
then
# send your notification
fi
This is a very simple solution to check if a file was changed since the last backup was made.
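For comparing against the svn copy, diff works as well as hashing and also tells you what changed. A hedged sketch with placeholder paths and canned contents; a real deployment would diff `crontab -l` output against the checked-in copy and pipe the diff into a mail command:

```shell
ref=$(mktemp)    # stands in for the svn/backup copy of the crontab
live=$(mktemp)   # stands in for the live crontab
printf '0 2 * * * /usr/local/bin/backup\n' > "$ref"
printf '#0 2 * * * /usr/local/bin/backup\n' > "$live"   # job commented out

# diff exits non-zero when the files differ, which is the alert condition
if ! diff -u "$ref" "$live" > /dev/null; then
    echo "ALERT: live crontab no longer matches the reference copy"
fi
rm -f "$ref" "$live"
```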
I was wondering, out of curiosity, if it is possible to code a bash script that logs all the commands run in a Bash/SSH session. I know history is supposed to log all the commands run, but it seems to be very unreliable!
I have been messing about this morning and came up with the following bash script which does log what the user runs in the terminal but does not run all the commands correctly.
prompt_read() {
  echo -n "$(whoami)@$(hostname):$(pwd)~$ "
  read -r userinput
}
prompt_read
while :; do
if [[ $userinput != exit ]]; then
logger "logit $userinput"
bash -c "$userinput"
prompt_read
else
kill -1 $PPID
fi
done
Is anyone aware of anything that logs commands better and more reliably than history?
Cheers
The reason history seems unreliable to you is that it only writes to the history file at the end of a bash session, so you can lose commands.
I have a few things in my bash profile:
HISTFILESIZE=10000 # how many lines of history to store in the history file
HISTSIZE=10000 # how many lines of history to store in a session ( I think )
HISTCONTROL=ignoredups # ignore duplicate commands
shopt -s histappend # append history, rather than having sessions obliterate existing history
PROMPT_COMMAND="history -a;$PROMPT_COMMAND"
The last few are the important ones, setting your PROMPT_COMMAND with history -a will make history append immediately, rather than post-session. And setting shopt -s histappend will make bash sessions append to the history file, rather than overwrite existing histories.
Some more info: http://linuxcommando.blogspot.com/2007/11/keeping-command-history-across-multiple.html
Additionally, if this is useful to you, you can change the name of the history file you use for a particular bash session with the HISTFILE environment variable.
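Putting those pieces together, a profile fragment for a session that should keep its own immediately-flushed, timestamped history might look like this (the file name is illustrative):

```shell
# per-session audit history (illustrative path)
export HISTFILE="$HOME/.bash_history_audit"
export HISTTIMEFORMAT="%F %T "
shopt -s histappend                               # append, don't overwrite
PROMPT_COMMAND="history -a${PROMPT_COMMAND:+;$PROMPT_COMMAND}"   # flush after every command
```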
Check out the script command: http://bashshell.net/commands/using-the-script-command/
It records everything that appears on the terminal to a file, and IIRC it can play back the recorded session.
You can set this as the user's shell to make it record everything upon login.
You can find a script here to log all 'bash' commands/builtins into a text-file or a 'syslog' server without using a patch or a special executable tool.
You can also write directly without syslog to a logfile.
It is very easy to deploy, as it is a simple shell script that needs to be called once at the initialization of bash.
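The usual trick such scripts rely on is bash's DEBUG trap, which fires before every command, so each command line can be appended to a file or handed to logger/syslog. A minimal, bash-specific sketch, logging to a throwaway file instead of syslog:

```shell
log=$(mktemp)

# the DEBUG trap runs before each command; $BASH_COMMAND holds the
# command text (a real script would use logger instead of a file)
trap 'printf "%s\n" "$BASH_COMMAND" >> "$log"' DEBUG

echo "doing some work" > /dev/null
ls / > /dev/null

trap - DEBUG
cat "$log"
rm -f "$log"
```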
I made a shell script that gets a remote file over FTP; if the file downloads successfully, it launches another PHP script with curl. It's kind of working right now, but I have a few questions to improve it:
Does the script wait for the end of the download before executing the rest of the script, or does it run the following instructions while the download is in progress?
I receive the first mail at the beginning, but never the last ones (the one with the result of the curl, and the one at the end of the script) - how come?
I would like a good way to prevent the script from running more than once (if the archive has already been downloaded), even though it's launched every hour with crontab.
What is the difference between quit/bye/by at the end of the FTP connection?
This is the shell script:
echo start of the script | mail -s "beginning of the script" krifur@krifur.com
cd /my_rep
HOST='domaine.com'
PORT='21'
USER='admin'
PASSWD='pass'
jour=$(date "+%Y%m%d")
FILE="file_"$jour".txt";
ftp -i -n $HOST $PORT <<EOF
quote USER $USER
quote PASS $PASSWD
cd firstlevel
cd end
get $FILE
quit
EOF
if test -f "$FILE"
then
CurlRes="$(curl "http://doma.com/myfile.php")"
echo debug CURL : $CurlRes | mail -s "debug" krifur@krifur.com
else
echo no file : $FILE | mail -s "no file" krifur@krifur.com
fi
echo this is the end of the script download | mail -s "end of script download" krifur@krifur.com
Does the script wait for the end of the download before executing the rest of the script, or does it run the following instructions while the download is in progress?
If you mean "Will the FTP command block until finished?" , the answer is yes.
I receive the first mail at the beginning, but never the last ones (the one with the result of the curl, and the one at the end of the script) - how come?
Take a look at your code:
then
CurlRes="$(curl "http://doma.com/myfile.php")"
echo debug CURL : $CurlRes | mail -s "debug" krifur@krifur.com
else
echo no file : $FILE | mail -s "no file" krifur@krifur.com
fi
What are the contents of $CurlRes and $FILE respectively? Try ${CurlRes} and ${FILE}. I'd also suggest quoting strings when using echo.
There is also a good chance that spam filters don't like the message, have you checked on that?
I would like a good way to prevent the script from running more than once (if the archive has already been downloaded), even though it's launched every hour with crontab.
That could be done in a number of ways. Perhaps, upon success, echo the file name to the bottom of something like successfully_downloaded.txt, then use grep to see if the file name is in the list. Depending on use, that file might get rather large, so I'd also implement some means of rotating it.
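That idea can be sketched in a few lines; the paths and file name below are placeholders, and the "download here" comment stands for the ftp block from the question:

```shell
done_list=$(mktemp)               # real script: a fixed path, e.g. /my_rep/.downloaded
FILE="file_20200312.txt"          # placeholder for file_$jour.txt

# -x matches whole lines, -F takes the name literally
if grep -qxF "$FILE" "$done_list"; then
    echo "already fetched $FILE, exiting"
else
    # ... ftp download here ...
    printf '%s\n' "$FILE" >> "$done_list"
    echo "fetched and recorded $FILE"
fi
rm -f "$done_list"
```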
From man (1) ftp:
bye Terminate the FTP session with the remote server and exit ftp.
An end of file will also terminate the session and exit.
quit A synonym for bye.
by is also a synonym for bye, as far as I know.
This should be avoided at all costs:
USER='admin'
PASSWD='pass'
Use ssh's scp with keys (no password prompt required):
Linux Journal article howto