Nested for loops in a shell script - Linux

I need help with a shell script.
I have a for loop in my script which creates file names containing a variable, like file.$variable.
For example, I have a list of servers in a servers.txt file. From there I read each server name, connect to it, and collect some data from it. The filenames will be file.$server.
So I use a for loop to create one file per server:
for server in `cat servers.txt`; do
ssh $server ls | awk '{print $2}' | tee -a files.$server.txt
done
This works fine.
Now, from those generated files, I need a second for loop that reads each file and feeds its contents to another command. For example:
for file in `cat files.$servers.txt`; do
cat $file | awk '{print $2}' | tee -a column.$file.txt
done
But the second loop is not working for me. Please help.
In a nutshell, it's a nested loop. Excuse my English.
Thanks in advance.
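A sketch of a likely fix, using the question's own commands: the second loop reads files.$servers.txt, while the variable set by the first loop is $server (no trailing s), so the filename never matches. Nesting the second loop inside the first avoids the mismatch. This assumes the filenames listed in each files.$server.txt are accessible locally.
#!/bin/bash
for server in $(cat servers.txt); do
    # Collect the listing from each server, as in the question
    ssh "$server" ls | awk '{print $2}' | tee -a "files.$server.txt"
    # Inner loop: process every filename recorded for this server
    for file in $(cat "files.$server.txt"); do
        awk '{print $2}' "$file" | tee -a "column.$file.txt"
    done
done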

Related

How to check if a Linux user has .sh files?

I have to write a bash script that checks whether any logged-in users have .sh files.
Checking who is logged in is simple, just:
w | awk '{print $1}'
But I have no idea how to check whether they have any .sh files.
You need to read the output of the who command and use that in your find command.
Since the same user can be logged in multiple times, it's a good idea to remove duplicates before looping.
#!/bin/bash
# List logged-in users, drop duplicate logins, then search each home
who | awk '{print $1}' | sort -u | while read -r username; do
    find /home/"$username" -name "*.sh"
done
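If some accounts do not have their home directory under /home, a variant that looks the real home directory up in the passwd database avoids that assumption. A sketch, not part of the original answer:
#!/bin/bash
# Resolve each logged-in user's actual home directory via getent
who | awk '{print $1}' | sort -u | while read -r username; do
    home=$(getent passwd "$username" | cut -d: -f6)
    [ -d "$home" ] && find "$home" -name "*.sh"
done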

Linux CP using AWK output

I have been trying to learn more about Linux and have spent this morning focusing on the awk command. The command I have been trying to get to work is below.
ls -lRt lpftp.* | awk '{print $7, $9}' | mkdir -p $(awk '{print $1}') | ls -lRt lpftp.* | cp $(awk '{print $9, $7}')
Essentially, I am trying to move each file in a directory into a subdirectory based on the file's last-modified day. The command first prints only the files I want, then uses mkdir to create a folder named after the day of the month each file was last modified. What I want to do after that is move each file into its associated directory; however, as the command stands, it moves every file into the 01 folder and prints the following text
cp: 0653-436 12 is a directory.
Specify -r or -R to copy.
once for every directory.
Does anyone know how I can fix this issue, or if there is a better way to go about it?
ls -lRt lpftp.* | awk '{print $7, $9}' | while read day file ; do mkdir -p "$day"; cp "$file" "$day"; done
The commands between do and done will be executed for each line of output, with the first thing awk prints in the day variable and the second in file (per line). I used quotes here somewhat unnecessarily, as there will not be spaces in the variables given the method by which they are set.
The safest way to do something like this, and the fastest to execute, is to use awk on the data to output a shell script: in awk, print the mkdir and cp commands you expect to execute. Pipe the results into head(1) until you're satisfied; maybe look at the whole thing in less(1). Then execute as follows (a sketch of such a script appears after the list below):
ls -lRt lpftp.* | awk -f script.awk | sh -ex
That will echo the commands to standard error and stop on the first error. If you're absolutely sure it's right, drop the x option.
The advantages of this approach over a loop or a bunch of subprocesses in awk (with the system function) are:
you can see what's going to happen, and what's happening
speed of execution
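As a minimal sketch of what that script.awk (a hypothetical name) could look like, assuming the default ls -l layout from the question, where the day of the month is field 7 and the filename is field 9 (adjust the field numbers if your ls prints different columns):
# script.awk: emit one mkdir and one cp command per file line.
# NF > 2 skips blank lines, directory headers, and the "total" line.
NF > 2 && $9 != "" {
    printf "mkdir -p %s\n", $7
    printf "cp %s %s\n", $9, $7
}
Pipe ls -lRt lpftp.* | awk -f script.awk | head to inspect the generated commands before adding | sh -ex.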

Using an integer variable to process linux cut command fields

The following command does not succeed.
for i in {1..5} ; do cat /etc/fstab | egrep "(ext3|ext4|xfs)" | awk '{print $2}' | cut -d"/" -f1-$i ; done
It seems that $i is ignored completely; the command always returns the result of
cut -d"/" -f1-
Any idea why it fails?
Thanks in advance!
The command itself is part of a script that should help me auto-rearrange fstab lines to match the right mount order (e.g. /test/subfolder must come after /test is mounted, not before).
I tried it and it didn't work in zsh, BUT it does work in bash, so if you are using zsh just run the command with bash and it should work ;)
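If the shell really is the difference, the usual suspect is the {1..5} brace expansion: bash and zsh both expand it, but plain sh passes the literal string {1..5} to the loop, so cut sees -f1-{1..5}. A portable sketch using seq sidesteps brace expansion entirely (this is an assumption about the cause, not confirmed by the question):
#!/bin/sh
# seq prints 1 through 5, so the loop works even where {1..5} is not expanded
for i in $(seq 1 5); do
    egrep "(ext3|ext4|xfs)" /etc/fstab | awk '{print $2}' | cut -d"/" -f1-"$i"
done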

Issues passing AWK output to BASH Variable

I'm trying to parse lines from an error log in bash and assign a certain part to a bash variable to be used later in the script, and I'm having issues once I try to pass it to the variable.
What the log file looks like:
1446851818|1446851808.1795|12|NONE|DID|8001234
I need the number in the third field of the line (in this case, 12).
Here's an example of the command I'm running:
tail -n5 /var/log/asterisk/queue_log | grep 'CONNECT' | awk -F '[|]' '{print $3}'
The line of code is trying to accomplish this:
Grab the last lines of the log file
Search for a phrase (in this case CONNECT; I'm using the same command to trigger different items)
Extract the number in the third field of the line so it can be used elsewhere
If I run the above full command, it runs successfully like so:
tail -n5 /var/log/asterisk/queue_log | grep 'CONNECT' | awk -F '[|]' '{print $3}'
12
Now if I try to assign it to a variable in the same line/command, I'm unable to have it echo back the variable.
My command when assigning to a variable looks like:
tail -n5 /var/log/asterisk/queue_log | grep 'CONNECT' | brand=$(awk -F '[|]' '{print $3}')
(It is being run in the same script as the echo command, so the variable should be fine. The test script looks like:
#!/bin/bash
tail -n5 /var/log/asterisk/queue_log | grep 'CONNECT' | brand=$(awk -F '[|]' '{print $3}')
echo "$brand";
I'm aware this is most likely not the most efficient/elegant way to do this, so if there are other ideas/ways to accomplish it I'm open to them as well (my bash skills are basic but improving).
You need to capture the output of the entire pipeline, not just the final section of it:
brand=$(tail -n5 /var/log/asterisk/queue_log | grep 'CONNECT' | awk -F '|' '{print $3}')
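For what it's worth, the reason the original assignment vanishes is that each stage of a pipeline runs in its own subshell: brand is set inside the child process and is gone by the time echo runs in the parent. A two-line illustration (hypothetical data, just to show the effect):
echo 'a|b|12' | brand=$(awk -F '|' '{print $3}')   # assignment happens in a subshell
echo "$brand"                                      # prints an empty line in the parent shell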
You may also want to consider what will happen if there is more than one line containing CONNECT in the final five lines of the file (or indeed, if there are none). That's going to cause brand to have multiple (or no) values.
If your intent is to get the third field from the latest line in the file containing CONNECT, awk can pretty much handle the entire thing without needing tail or grep:
brand=$(awk -F '|' '/CONNECT/ {latest = $3} END {print latest}' /var/log/asterisk/queue_log)

500 internal error when uploading new files to server

I just switched to a new server host (VPS) and transferred all my files over. I noticed that nothing was working; everything was throwing a 500 internal error.
I then ran this via the command line and it worked fine:
for i in `cat /etc/trueuserdomains | awk '{print $2}'`; do chown $i.$i /home/$i/public_html -R; chown $i.nobody /home/$i/public_html; done
I'm not really sure what it does, but I think it changes the owner of the scripts. Anyway, I've noticed over the past week that any time I upload a new script that wasn't already on the server, it gives me the same 500 error and I have to run that command again. Is there some way I can prevent this from happening?
for i in `cat /etc/trueuserdomains | awk '{print $2}'`;
do
chown $i.$i /home/$i/public_html -R;
chown $i.nobody /home/$i/public_html;
done
Breaking down the above code:
cat /etc/trueuserdomains | awk '{print $2}'
This prints a list of users, taking the second word of each line of the file /etc/trueuserdomains (there is likely only one line in this file, whose second word is the user the files should be owned by).
If you want to see exactly what that list is then run the following from the command line.
cat /etc/trueuserdomains | awk '{print $2}'
Then the for i part executes the two chown commands, replacing $i with each word gathered from the cat /etc/trueuserdomains | awk '{print $2}' command.
The first chown command recursively changes the owner and group of every file and directory under public_html to that user.
The second chown command then sets the group on public_html itself to nobody, a group that likely has no user account assigned to it on the host machine.
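For reference, here is a slightly more defensive version of the same loop, sketched on the assumption that the file format is as described above; it is not from the original answer. It quotes the variable and uses the modern user:group separator instead of the deprecated dot:
#!/bin/bash
# Same effect as the original loop, with quoting and the ':' separator
awk '{print $2}' /etc/trueuserdomains | while read -r user; do
    chown -R "$user:$user" "/home/$user/public_html"
    chown "$user:nobody" "/home/$user/public_html"
done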
So that sorts out the permissions on your web server files but, as you say, does not quite address the root cause of your problem.
To fix the underlying problem let us know the following.
How do you upload files to the server? What is the name of the tool? When you upload files, can you give a sample of the owner and group permissions they have before running the above command?
