How to grep lines in a cron job whose crontab appears to be deleted? - linux

I'm trying to search for a specific word in my cron job, which we'll call word for now.
The machine is an AWS server, and I've only ever used one username, so I do not think it is an issue of cron jobs being under a different user.
So, in the root directory, I run egrep -r ".*word.*" and nothing comes up. I would assume the original crontab was deleted at some point, even though the process is still running.
However, when I run crontab -l and comb through the entire output (which takes a while), I can see that there definitely is a cron job with this word.
What is the best way to grep lines in a cronjob? Or, does egrep only work on certain types of files? Thanks.

Wait. If you're doing this:
egrep -r ".*word.*"
It won't return anything because you haven't pointed egrep at a file or directory to search; in that case it typically just sits waiting for input on stdin. Generally, you use grep like this:
egrep "word" filename
egrep -r "word" directory/
egrep -r "word" *
You can grep the output of crontab -l like this:
crontab -l | egrep "word"
Other notes:
The .*word.* is unnecessary; plain word will match the same lines, and it should be faster without the greedy .* wildcards.
egrep -r word * in your home directory wouldn't match the crontab, as that isn't stored in your home directory (it's likely at /var/spool/cron/crontabs/$USER, but that's not terribly relevant).
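If you ever do need to search other users' crontabs as well, you can grep the spool directory directly. A sketch, assuming the Debian/Ubuntu layout (on Red Hat-style systems the directory is /var/spool/cron instead):
sudo grep -rn "word" /var/spool/cron/crontabs/
sudo grep -n "word" /etc/crontab /etc/cron.d/*
The second command covers the system-wide cron entries, which crontab -l does not show.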

You simply have to do:
crontab -l | grep 'word'
That's all.

Related

How can I delete the oldest n group of files with the same prefix?

In Linux I use InfluxDB which can make a backup of the database for archival purposes. Each backup comprises a series of files with the same prefix "/tank/Backups/var/Influxdb/20191225T235655Z." and different extensions.
I wanted to write a bash script which first deletes the oldest existing backups, then creates a new one (here I paste only the removal):
ls -tp /tank/Backups/var/Influxdb/* | grep -v '/$' | sed -E 's/\..+//' | \
sort -ru | sed 's/$/.*/' | tail -n +4 | xargs -d '\n' -r rm --
However, when I run the script as "sudo", I get
rm: cannot remove '/tank/Backups/var/Influxdb/20191225T235655Z.*': No such file or directory
When I run the quoted script without the last part, I get:
/tank/Backups/var/Influxdb/20190930T215357Z.*
/tank/Backups/var/Influxdb/20190930T215352Z.*
which is correct. Also, if I manually write
sudo rm /tank/Backups/var/Influxdb/20190930T215357Z.*
the command succeeds.
Why is the script reporting an error?
I'm using Ubuntu 18.04 and the folder "/tank" is a ZFS volume.
Better to do:
find /tank/Backups/var/Influxdb/ -type f -mtime +5 -delete
to remove files older than 5 days. Then you can run the command that creates the new backup.
Explaining the Error
This answer is only here to explain the error and give a deeper understanding of what is happening. If you are simply looking for an elegant solution, see the other answers.
When I run the quoted script without the last part, I get:
/tank/Backups/var/Influxdb/20190930T215357Z.*
/tank/Backups/var/Influxdb/20190930T215352Z.*
which is correct
The listed strings are not what you want. When you pass these paths to rm it sees them just as literal strings, that is, two files whose names end with a literal *. Since you don't have such files you get an error.
When you type rm * manually into your console, bash (not rm!) does the globbing: bash searches for matching files and replaces the * with the list of files it found. Only after that does bash execute rm foundFile1 foundFile2 .... rm never sees the *.
Strings inside a pipeline are not processed by bash, but by the commands in the pipeline, in your case rm. rm does not glob.
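You can see the difference with a throwaway test (the file name here is hypothetical):
cd "$(mktemp -d)" && touch a.txt
echo 'a.*' | xargs -r rm --   # fails: rm receives the literal string a.*
rm a.*                        # works: bash expands the glob to a.txt first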
You could run bash inside your pipeline and let it expand the * you inserted earlier. To this end, replace the last command in your pipeline with xargs -r bash -c 'rm -- $*' --. However, note that your paths are not quoted here: if there are spaces or literal * characters in your filenames, the command will break. The lack of quoting is necessary for globbing, since a quoted "$*" is not expanded by bash.
To quote your files you have to insert the * glob inside the bash command:
ls -tp /tank/Backups/var/Influxdb/* | grep -v '/$' | sed -E 's/\..+//' | \
sort -ru | tail -n +4 | xargs -d '\n' -L1 -r bash -c 'rm -- "$0."*'
The above command is only a simple fix for your pipeline. It is neither elegant nor very robust; using a tool like find instead is strongly recommended.
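For comparison, here is a sketch of a find-based version. It assumes GNU find and the timestamped naming shown in the question (so lexicographic order matches chronological order); the keep-count of 3 is illustrative:
find /tank/Backups/var/Influxdb -maxdepth 1 -type f -printf '%f\n' | \
sed -E 's/\..+$//' | sort -ru | tail -n +4 | \
while IFS= read -r prefix; do
    rm -- "/tank/Backups/var/Influxdb/$prefix".*
done
Here the * is expanded by bash inside the loop, after the prefix has been safely quoted.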

How to understand this shell script?

cat urls.txt | xargs -P 10 -n 1 wget -nH -nc -x]
This command is very confusing to a new user. Is there any reference document I can consult?
There is nothing much confusing about it.
If you want to know what the commands do, use the manual:
man cat
man xargs
The pipe sends the output of one command to the next, in this case cat urls.txt to xargs.
cat urls.txt will write the contents of the file urls.txt to stdout, which is then used as the input for xargs.
xargs -P 10 -n 1 will execute a command with the input (the contents of urls.txt) as arguments, running up to 10 invocations in parallel (-P 10) and passing one argument per invocation (-n 1). The command in this case is wget -nH -nc -x]. I don't know what the ] is supposed to do there; that's probably a typo.
All in all, without caring much about the options, you can understand that this will download the list of files in urls.txt into your current directory. Of course, it's always wise to check the option flags: in this case -nc (--no-clobber), for example, causes wget to skip the download when the file already exists in the directory.
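Assuming the trailing ] really is a typo, a cleaned-up version of the command would be (the cat is unnecessary, since xargs can read the file via redirection):
xargs -P 10 -n 1 wget -nH -nc -x < urls.txt
This runs up to 10 wget processes in parallel, each handed a single URL.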
All three man pages can also be found online:
cat
xargs
wget
You can follow this book: https://www.iiitd.edu.in/~amarjeet/Files/SM2012/Linux%20Dummies%209th.pdf
The best way to learn Linux commands is to use the man command. For example, type man xargs in a terminal and you will get all the details; there is a man page for every Linux command.
Another good approach is to follow this link: https://explainshell.com. It breaks a command line into its parts and explains each one.

GNU grep on FreeBSD not working properly

I have a weird problem on FreeBSD 8.4-STABLE with grep (GNU grep) 2.5.1-FreeBSD.
If I try grep -Hnr searchstring, I don't get any output, but ps aux says grep is running, and it keeps running until I kill the process.
If I copy a test file into an empty directory and do
cat testfile | grep searchstring
it works. But if I try
grep -Hnr searchstring
in that directory, I again get no output; grep keeps running and running but doesn't produce any matches.
Anybody knows how to solve this?
Even though you gave -r, you still have to give grep a file or directory argument. Otherwise, as you've discovered, it just sits there waiting for input on stdin.
You want
grep -Hnr searchstring .
# ....................^^
That will recursively find files under the current directory.
Though it doesn't seem to be documented, if you invoke grep with the -r option and no file or directory name arguments, it defaults to the current directory, almost as if you had typed grep -R pattern . except that ./ does not appear in the output.
Apparently this is a fairly new feature.
If you do a recursive grep in a directory with a lot of contents, it could simply take a long time -- perhaps forever if there are device files such as /dev/zero that can produce infinite output.
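If special files are the culprit, GNU grep can be told to skip them. A sketch, assuming a reasonably recent GNU grep that supports -D:
grep -Hnr -D skip searchstring .
Note the explicit . at the end, so grep is never left waiting on stdin.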

Can you mass edit all files returned in a grep?

I want to mass-edit a ton of files that are returned in a grep. (I know, I should get better at sed).
So if I do:
grep -rnI 'xg_icon-*'
How do I pipe all of those files into vi?
The easiest way is to have grep return just the filenames (-l instead of -n) that match the pattern. Run that in a subshell and feed the results to Vim.
vim $(grep -rIl 'xg_icon-*' *)
A nice general solution to this is to use xargs to convert a stdout from a process like grep to an argument list.
A la:
grep -rIl 'xg_icon-*' | xargs vi
If you use Vim with the -p option, it will open each file in a tab; you can switch between them using gt or gT, or even the mouse if you have mouse support in the terminal.
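For example, combined with the -l form shown above:
vim -p $(grep -rIl 'xg_icon-*' *)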
You can do it without any processing of the grep output! This will even enable you to jump to the right line (using quickfix commands, e.g. :cn or :cw; see :help quickfix). So, if you are using bash or zsh:
vim -q <(grep -n foo *.c)
(The -n matters: it gives grep's output the file:line:text shape the quickfix list expects.)
If what you want to edit is similar across all files, then there is no point in using vi to do it manually (although vi can be scripted as well). Hypothetically, since you never mentioned what you want to edit, it looks something like this:
grep -rIl 'xg_icon-*' . | while read -r FILE
do
    sed -i.bak 's/old/new/g' "$FILE"   # (or other editing commands, e.g. awk...)
done
vi `grep -l -i findthisword *`

How to execute a command with one parameter at a time in the *nix shell?

Some commands, svn log for example, will only act on one input from the command line, so I can't say grep 'pattern' | svn log; it will only return the information for the first file, so I need to execute svn log against each one independently.
I can do this with find using its -exec option: find -name '*.jsp' -exec svn log {} \;. However, grep and find provide different functionality, and the -exec option isn't available for grep or a lot of other tools.
So is there a generalized way to take output from a unix command line tool and have it execute an arbitrary command against each individual output independent of each other like find does?
The answer is xargs -n 1.
echo moo cow boo | xargs -n 1 echo
outputs
moo
cow
boo
try xargs:
grep -l 'pattern' * | xargs -n 1 svn log
A little one-off shell script (though using xargs is much better for a one-off; that's why it exists):
#!/bin/sh
# Run svn log on each file named on the command line
for file in "$@"
do
    svn log "$file"
done
You could name it 'multilog' or something like that. Call it like this:
./multilog.sh foo.c abc.php bar.h Makefile
It allows for a little more sanity when being called by automated build scripts, i.e. test the existence of each before talking to SVN, or redirect each output to a separate file, insert it into a sqlite database, etc.
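For instance, a minimal sketch of the existence check and per-file output mentioned above (the .log naming is illustrative):
#!/bin/sh
for file in "$@"
do
    [ -e "$file" ] || { echo "skipping missing file: $file" >&2; continue; }
    svn log "$file" > "$file.log"
done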
That may or may not be what you are looking for.
