Linux Bash auto command, source text file - linux

Thank you for your concern.
I'm a noob trying to do a bulk IP lookup with the [geoiplookup -f GeoLiteCity.dat] command.
I have more than 700 IPs to look up, saved as c.txt (in the same folder).
How can I make a bash shell script for this? I've already made one, and all I got was:
sudo: unable to execute ./ok.sh: No such file or directory
Here is my script
It would also be OK to use another language.
To make it clearer, what I want is:
[geoiplookup -f GeoLiteCity.dat IP1]
[geoiplookup -f GeoLiteCity.dat IP2]
...
[geoiplookup -f GeoLiteCity.dat IP700]
and save the results as one text file (which would be 700 rows).
I'm Korean, so sorry for my poor English, but I couldn't find anything in my language on how to do this. I'd really appreciate any help; otherwise I have to look the IPs up one by one until Friday... (the internet is extremely slow at my company)
Please help me. I will pray for you every Sunday morning. Thank you.

found a very simple answer with a duckduckgo search for 'iterate through each line of file bash'
stackoverflow.com/questions/1521462/looping-through-the-content-of-a-file-in-bash
#!/usr/bin/bash
printf "\n\n"
while read -r ip; do
    echo "LOOKING UP IP $ip"
    geoiplookup "$ip"
    printf "\n\n"
done < ipaddresses.txt
Save it as iplookup.sh and run it, without 'sudo':
bash iplookup.sh
Tested and working. Be sure to rename your file 'c.txt' to 'ipaddresses.txt'! The 'ipaddresses.txt' file must also be in the same directory.
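If you also want everything collected into one text file, with the city database from your command and c.txt read directly (so no renaming needed), a slightly adjusted version of the same loop should do it. This is only a sketch; results.txt is just an example output name, and it assumes GeoLiteCity.dat and c.txt sit in the current directory, as in your question:
#!/bin/bash
# Sketch: one geoiplookup result per IP, all collected into results.txt
while read -r ip; do
    geoiplookup -f GeoLiteCity.dat "$ip"
done < c.txt > results.txt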

Related

linux create a file with the titles of the files inside a csv

So I have a .csv file that contains a list of file names:
filename1
filename2
filename3...
and so on (there are hundreds of these).
What I want to do is turn each name in this list into its own file, in the same spot as the list, and make each one a .pdf file,
so basically the folder will have:
Filenamelist.csv
filename1.pdf
filename2.pdf
filename3.pdf
Any insight or help with this is greatly appreciated!
Edit:
Here's my snippet of code:
#!/bin/bash
if [[-f "/mnt/c/users/jesse/desktop/test/list.csv"]]
then
while IFS='|' read -r pdfid
do
touch "/mnt/c/users/jesse/desktop/test/${pdfid}.pdf"
done
fi
I'm getting an error when I try to run this with bash:
syntax error near unexpected token `fi'
Can anyone help me with this error?
I'm using Ubuntu on Windows.
You can use the read command in an iterative fashion, as described here: https://www.cyberciti.biz/faq/unix-howto-read-line-by-line-from-file/
Would this work? Note the spaces inside [[ ]] and the ${pdfid} expansion:
if [[ -f "/mnt/c/users/jesse/desktop/test/list.csv" ]]
then
    while IFS='|' read -r pdfid
    do
        touch "/mnt/c/users/jesse/desktop/test/${pdfid}.pdf"
    done < "/mnt/c/users/jesse/desktop/test/list.csv"
fi
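One more thing worth checking, since this is Ubuntu on Windows: if list.csv was saved with Windows line endings, each name will carry a trailing carriage return and the .pdf files will get odd names. Stripping it inside the loop is a cheap guard (same paths as above; just a sketch):
while IFS='|' read -r pdfid
do
    pdfid=${pdfid%$'\r'}   # drop a trailing carriage return left by CRLF line endings
    touch "/mnt/c/users/jesse/desktop/test/${pdfid}.pdf"
done < "/mnt/c/users/jesse/desktop/test/list.csv"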

bin/bash nested loop does not work

I'm currently working as an intern at a hosting firm. They asked me to write a bash script to help automate a process that checks the users' domains and .pointers for them, and validates with a "whois" command whether the domains/pointers are on our servers.
I'm new to bash scripting, but I was told I should check out nested loops. So, to test my script, I made paths similar to how they would look on the server: /usr/local/directadmin/data/users/#USER#/domains.list and users/#USER#/domains/#DOMAIN NAME OF USER#.pointers
#part 1
for i in $(cat /home/MrC/Desktop/Users) #<the list of users i need to check)
do
if [ -f "/usr/local/directadmin/data/users/$i/domainlist.txt" ]
then
echo "/usr/local/directadmin/data/users/$i" >> /home/MrC/Desktop/output.tx$
cat "/usr/local/directadmin/data/users/$i/domainlist.txt" >> /home/carlos/Des$
fi
#part 2
for s in $(cat /home/mrC/Desktop/output.txt)
do
if [ -f "/usr/local/directadmin/data/users/$i/domains/$s.pointers" ]
then
echo "/usr/local/directadmin/data/users/$i" >> /home/MrC/Desktop/pointers.$
cat "usr/local/directadmin/data/users/$i/domains/$s.pointers" >> /home/MrC$
fi
done
done
So part 1 works; this is the output.txt below:
/usr/local/directadmin/data/users/testuser
lolla.nl
blaat2.nl
blaat3.nl
google2.nl
/usr/local/directadmin/data/users/testusers
blaat.nl
google.com
test.nl
pietje.nl
But I can't seem to get part two to work (no pointer file). My goal with part two of the script is to read the output (the domain names) and use them in $i/domains/$s.pointers.
I'm new on the forum and I hope I asked my question in a proper fashion. If someone could give me hints/tips on which direction I should look, that would be highly appreciated.
for
do
    if
    then
        for
        do
            COMMAND A
            COMMAND B
            COMMAND C
        done
    fi
done
while read -r i; do #stuff; done < /home/MrC/Desktop/Users (adjust IFS or specify the delimiter with the -d option to read).
– David C. Rankin
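Putting David C. Rankin's comment together with the skeleton above, a sketch of the whole check could look like the following. This is only an illustration: the paths are copied from the question and may need adjusting, and part 2 reads each user's own domain list rather than the mixed output.txt, which keeps the user paths out of the inner loop:
#!/bin/bash
# Sketch only: file names and paths are taken from the question.
USERS_FILE=/home/MrC/Desktop/Users
OUT=/home/MrC/Desktop/output.txt
POINTERS_OUT=/home/MrC/Desktop/pointers.txt

while read -r user; do
    list="/usr/local/directadmin/data/users/$user/domains.list"
    [ -f "$list" ] || continue

    # part 1: record the user path and their domains
    echo "/usr/local/directadmin/data/users/$user" >> "$OUT"
    cat "$list" >> "$OUT"

    # part 2: for every domain of this user, look for a .pointers file
    while read -r domain; do
        ptr="/usr/local/directadmin/data/users/$user/domains/$domain.pointers"
        if [ -f "$ptr" ]; then
            echo "$ptr" >> "$POINTERS_OUT"
            cat "$ptr" >> "$POINTERS_OUT"
        fi
    done < "$list"
done < "$USERS_FILE"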

Find and Replace in bash Shell

Please advise on replacing a variable with the latest date & time.
Here is my requirement.
FN='basename$0'
TS=`date '+%m/%d/%Y %T'`
QD='08/27/2014 16:25:45'
Then I have a query to run. After it has run, I need to take $TS (current system date & time) and assign it as a value to the $QD variable. This is a loop process and gets updated every time the script runs.
I've tried using sed but was not successful.
Please help.
Programmatically modifying your script to have a different timestamp constant is absolutely and emphatically the wrong way to handle this problem.
Instead, when you want to mark that the query has been done, simply touch a file:
touch lastQueryCompletion
...and when you want to know when the query was last done, check that file's timestamp:
# with GNU date
QD=$(date -r lastQueryCompletion '+%m/%d/%Y %T')
# or, with Mac OS X stat
QD=$(stat -t '%Y/%m/%d %H:%M:%S' -f '%Sm' lastQueryCompletion)
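Tied together, the marker-file pattern might look like the sketch below; run_query is a hypothetical stand-in for whatever actually runs your query, and the GNU date form is assumed:
#!/bin/bash
# Read the time of the previous run from the marker file, if there is one
if [ -e lastQueryCompletion ]; then
    QD=$(date -r lastQueryCompletion '+%m/%d/%Y %T')
else
    QD='08/27/2014 16:25:45'   # fallback for the very first run
fi

run_query "$QD"                # hypothetical placeholder for the real query

# Record this run as the new "last completion" time
touch lastQueryCompletion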
Although you haven't mentioned the overall goal that you wish to accomplish, I have a feeling something like this would be more robust than using sed to update an existing script file.
FN=$(basename "$0")
TS=$(date '+%m/%d/%Y %T')
QD='08/27/2014 16:25:45'   # default for the very first run
# Load the latest QD (from the last run), overriding the default
[ -e ~/.QD.saved ] && QD="$(cat ~/.QD.saved)"
...Later in that file...
# Save the new QD value for the next run
echo "$TS" > ~/.QD.saved
Although I'm not sure sed is the tool you're looking for, I believe your command would have to go something like this:
sed -i -r "s|^QD=.*|QD='$TS'|" "$FN"
Note the double quotes around the expression (so $TS actually expands) and the | delimiter, since the timestamp itself contains slashes. I'm assuming you're using GNU sed, whose -i option does the substitution in place rather than writing the result to standard output.
Well, hope it helps.

Rollover shell script

Assume a shell script (commands.sh) with a few commands.
I need to write a script which sends the output of the commands executed by commands.sh to a file f1.csv.
If the file size exceeds 1 MB, the output should start going to file f2.csv.
If that file's size exceeds 1 MB as well, the output should go to file f3.csv.
If f3.csv exceeds 1 MB, then the old f1 should be deleted, a new f1 created,
and the output written to f1 again. This process should go on.
I can write the crontab entry; it's just the shell script that is a bit tricky.
I have been experimenting:
#!/usr/bin/env bash
PREFIX="f"
# Maximum size after which you want a new file, in bytes
MAX_SIZE=1048576
LAST_FILE=$(ls "$PREFIX"*.csv 2>/dev/null | tail -1)
# Check if a file exists and, if it does not, create it.
if [[ -z "$LAST_FILE" ]]
then
    LAST_FILE="${PREFIX}1.csv"
    touch "$LAST_FILE"
fi
LAST_FILE_NO=$(echo "$LAST_FILE" | sed "s/^$PREFIX//" | sed 's/\.csv$//')
LAST_FILE_SIZE=$(stat -c %s "$LAST_FILE")
if [ "$LAST_FILE_SIZE" -lt "$MAX_SIZE" ]
then
    /bin/sh ./sam.sh >> "$LAST_FILE"
else
    UPCOMING_FILE_NO=$((LAST_FILE_NO+1))
    /bin/sh ./sam.sh >> "$PREFIX$UPCOMING_FILE_NO.csv"
fi
Help is appreciated, guys.
EDIT: I have got the secondary shell script to work too...
Now, if anyone could help me with resetting after 3 files are done and starting again from f1, that would be great.
Thanks.
It sounds like you'd be better off using logrotate, depending on how your script is running. If you are running 'commands.sh' on a cron, you can have logrotate rotate out the logs. There is a good guide on logrotate here:
http://linuxers.org/howto/howto-use-logrotate-manage-log-files
If your commands.sh isn't going to be on a cron, meaning it's not a regular time interval that triggers it, you could manually set up a log rotation at the beginning of your script. I once had to do something similar. I found this guide really useful:
http://wazem.blogspot.com/2013/11/simple-bash-log-rotate-function.html
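If logrotate doesn't fit, the three-file rollover described in the question can also be written directly in bash. The following is only a sketch built around the names from the question (f1.csv through f3.csv, a 1 MB limit, commands.sh as the producer, everything run from the same directory):
#!/usr/bin/env bash
PREFIX=f
MAX_SIZE=1048576   # 1 MB
MAX_FILES=3

# Find the most recently written f*.csv; start with f1.csv if none exists yet
current=$(ls -t "$PREFIX"[0-9]*.csv 2>/dev/null | head -1)
if [ -z "$current" ]; then
    current="${PREFIX}1.csv"
    touch "$current"
fi

# If the current file is full, move to the next slot, wrapping 3 -> 1,
# and truncate it (same effect as deleting the old file and recreating it)
if [ "$(stat -c %s "$current")" -ge "$MAX_SIZE" ]; then
    n=${current#"$PREFIX"}
    n=${n%.csv}
    current="$PREFIX$(( n % MAX_FILES + 1 )).csv"
    : > "$current"
fi

./commands.sh >> "$current"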

how to print the output/error to a text file?

I'm trying to redirect(?) my standard error/output to a text file.
I did my research, but for some reason the online answers are not working for me.
What am I doing wrong?
cd /home/user1/lists/
for dir in $(ls)
do
(
echo | $dir > /root/user1/$dir" "log.txt
) > /root/Desktop/Logs/Update.log
done
I also tried
2> /root/Desktop/Logs/Update.log
1> /root/Desktop/Logs/Update.log
&> /root/Desktop/Logs/Update.log
None of these work for me :(
Help please!
Try this for the basics:
echo hello >> log.txt 2>&1
Could be read as: echo the word hello, redirecting and appending STDOUT to the file log.txt. STDERR (file descriptor 2) is redirected to wherever STDOUT is being pointed. Note that STDOUT is the default and thus there is no "1" in front of the ">>". Works on the current line only.
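One gotcha worth calling out: the order of the two redirections matters. As a quick illustration (some_command is just a placeholder):
some_command >>log.txt 2>&1   # stdout appended to log.txt, then stderr sent to the same place
some_command 2>&1 >>log.txt   # stderr duplicated to where stdout pointed BEFORE the redirect (usually the terminal)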
To redirect and append all output and error of all commands in a script, put this line near the top. It will be in effect for the length of the script instead of doing it on each line:
exec >>log.txt 2>&1
If you are trying to obtain a list of the files in /home/user1/lists, you do not need a loop at all:
ls /home/user1/lists/ >Update.log
If you are attempting to run every file in the directory as an executable with a newline as its input, and collect the output from all these programs in Update.log, try this:
for file in /home/user1/lists/*; do
echo | "$file"
done >Update.log
(Notice how we avoid the useless use of ls and how there is no redirection inside the loop.)
If you want to create an empty file called *.log.txt for each file in the directory, you would do
for file in /home/user1/lists/*; do
touch "$(basename "$file")"log.txt
done
(Using basename to obtain the file name without the directory part avoids the cd but you could do it the other way around. Generally, we tend to avoid changing the directory in scripts, so that the tool can be run from anywhere and generate output in the current directory.)
If you want to create a file containing a single newline, regardless of whether it already exists or not,
for file in /home/user1/lists/*; do
echo >"$(basename "$file")"log.txt
done
In your original program, you redirect the echo inside the loop, which means that the redirection after done will not receive any output at all, so the created file will be empty.
These are somewhat wild guesses at what you might actually be trying to accomplish, but should hopefully help nudge you slightly in the right direction. (This should properly be a comment, I suppose, but it's way too long and complex.)
