Script to look at files in a directory - linux

I am writing a script that shows all the files in a directory named "Trash". The script then prompts the user for which file he wants to "undelete" and sends it back to its original directory. Currently I am having a problem with the for statement, but I am also not sure how to have the user input which file, or how to move it back to its original directory. Here is what I have thus far:
PATH=/home/user/Trash
for files in $PATH
do
echo "$files deleted on $(date -r $files)"
done
echo "Enter the filename to undelete from the above list:"
Actual Output:
./undelete.sh: line 6: date: command not found
/home/user/Trash deleted on
Enter the filename to undelete from the above list:
Expected Output:
file1 deleted on Thu Jan 23 18:47:50 CST 2014
file2 deleted on Thu Jan 23 18:49:00 CST 2014
Enter the filename to undelete from the above list:
So I am having two problems currently. First, instead of reading out the files in the directory, the loop is giving $files the value of PATH itself; second, the echo command inside the loop is not working correctly. I have changed it around all kinds of different ways but can't get it to work properly.

You're making several mistakes in your script, but the biggest of all is assigning to the reserved variable PATH. That clobbers the standard executable search path, which is what causes errors like date: command not found.
In general, avoid all-caps variable names in your scripts.
To give you a start, you can use a script like this:
trash=/home/user/Trash
restore=$HOME/restored/
mkdir -p "$restore" 2>/dev/null
for file in "$trash"/*
do
    read -p "Do you want to keep $file (y/n): " yn
    [[ "$yn" == [yY] ]] && mv "$file" "$restore"
done
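Putting the pieces together, here is a sketch of the whole flow the question asks for: list the trashed files with their dates, prompt for a name, and move the file back. It is wrapped in a function so the trash and destination directories are parameters instead of hard-coded paths (the function name and paths are just for illustration; date -r assumes GNU date):

```shell
#!/bin/bash
# Sketch: list files in a trash directory, prompt for one, move it back.
# undelete_prompt is a hypothetical name; trash/destination are parameters.
undelete_prompt() {
    local trash="$1" dest="$2" file undeletefile
    for file in "$trash"/*; do
        [ -e "$file" ] || continue      # directory may be empty
        echo "$(basename "$file") deleted on $(date -r "$file")"
    done
    read -r -p "Enter the filename to undelete from the above list: " undeletefile
    if [ -f "$trash/$undeletefile" ]; then
        mv "$trash/$undeletefile" "$dest/"
    else
        echo "undelete: no file named '$undeletefile' in $trash" >&2
        return 1
    fi
}

# Example: undelete_prompt "$HOME/Trash" "$HOME"
```

Note that every expansion is double-quoted, so file names with spaces survive the round trip.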


Backup the first argument on bash script

I wrote a script to back up the directory that the user passes as the first argument:
#!/bin/bash
file=$1/$(date +"_%Y-%m-%d").tar.gz
if [ $1 -eq 0 ]
then
    echo "We need first argument to backup"
else
    if [ ! -e "$file" ]; then
        tar -zcvf $1/$(date +"_%Y-%m-%d").tar.gz $1
    else
        exit
    fi
fi
The result that I want from the script is to
back up the folder given as the first argument, and
save the backup file into that folder with a date in the file name.
But the script does not run when I pass in the argument. What's wrong with the script?
The backup part of your script seems to be working well, but not the part where you check that $1 is not empty.
Firstly, you would need quotes around $1 to prevent it from expanding to nothing. Without the quotes, the shell sees the test as
if [ -eq 0 ]
and throws an error.
Secondly, it would be better to use the -z operator to test whether the variable is empty:
if [ -z "$1" ]
Now your script should work as expected.
I see several problems:
As H. Gourlé pointed out, the test for whether an argument was passed is wrong. Use if [ -z "$1" ] to check for a missing/blank argument.
Also, it's almost always a good idea to wrap variable references in double-quotes, as in "$1" above. You do this in the test for whether $file exists, but not in the tar command. There are places where it's safe to leave the double-quotes off, but the rules are complicated; it's easier to just always double-quote.
In addition to checking whether $1 was passed, I'd recommend checking whether it corresponds to a directory (or possibly file) that actually exists. Use something like:
if [ -z "$1" ]; then
    echo "$0: We need first argument to backup" >&2
elif [ ! -d "$1" ]; then
    echo "$0: backup source $1 not found or is not a directory" >&2
BTW, note how the error messages start with $0 (the name the script was run as) and are directed to error output (the >&2 part)? These are both standard conventions for error messages.
This isn't serious, but it really bugs me: you calculate $1/$(date +"_%Y-%m-%d").tar.gz, store it in the file variable, test to see whether something by that name exists, and then calculate it again when creating the backup file. There's no reason to do that; just use the file variable again. The reason it bugs me is partly that it violates the DRY ("Don't Repeat Yourself") principle, partly that if you ever change the naming convention you have to change it consistently in two places or the script will not work, and partly because in principle it's possible that the script will run just at midnight, and the first calculation will get one day and the second will get a different day.
Speaking of naming conventions, there's a problem with how you store the backup file. If you put it in the directory that's being backed up, then the first day you'll get a .tar.gz file containing the previous contents of the directory. The second day you'll get a file containing the regular contents plus the first backup file. Thus, the second day's backup will be about twice as big. The third day's backup will contain the regular contents, plus the first two backup files, so it'll be four times as big. And the fourth day's will be eight times as big, then 16 times, then 32 times, etc.
You need to either store the backup file somewhere outside the directory being backed up, or add something like --exclude="*.tar.gz" to the arguments to tar. The disadvantage of the --exclude option is that it may exclude other .tar.gz files from the backup, so I'd really recommend the first option. And if you followed my advice about using "$file" everywhere instead of recalculating the name, you only need to make a change in one place to change where the backup goes.
One final note: run your scripts through shellcheck.net. It'll point out a lot of common errors and bad practices before you discover them the hard way.
Here's a corrected version of the script (storing the backup in the directory, and excluding .tar.gz files; again, I recommend the other option):
#!/bin/bash
file="$1/$(date +"_%Y-%m-%d").tar.gz"
if [ -z "$1" ]; then
    echo "$0: We need first argument to backup" >&2
elif [ ! -d "$1" ]; then
    echo "$0: backup source $1 not found or is not a directory" >&2
elif [ -e "$file" ]; then
    echo "$0: A backup already exists for today" >&2
else
    tar --exclude="*.tar.gz" -zcvf "$file" "$1"
fi
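And here is a sketch of the recommended variant: keep the archive outside the tree being backed up, so yesterday's backup never ends up inside today's and no --exclude is needed. It is wrapped in a function for reuse; the function name and the $HOME/backups default are assumptions, not fixed conventions:

```shell
#!/bin/bash
# Sketch: back up a directory into a separate backup directory.
# backup_dir is a hypothetical name; the default location is an assumption.
backup_dir() {
    local src="$1" backupdir="${2:-$HOME/backups}" file
    file="$backupdir/$(date +%Y-%m-%d).tar.gz"
    if [ -z "$src" ]; then
        echo "backup_dir: we need a directory to back up" >&2; return 1
    elif [ ! -d "$src" ]; then
        echo "backup_dir: source $src not found or is not a directory" >&2; return 1
    elif [ -e "$file" ]; then
        echo "backup_dir: a backup already exists for today" >&2; return 1
    fi
    mkdir -p "$backupdir"
    # -C makes the archive contain paths relative to the source's parent
    tar -zcf "$file" -C "$(dirname "$src")" "$(basename "$src")"
}
```

Because "$file" is computed once and reused everywhere, changing the naming convention only requires editing one line.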

why if expression is always true in bash script

I'm very new to shell scripting, and I wrote this code to copy an input file from directory new1 to directory new2 if the file doesn't exist in the second directory.
The problem is that the first if expression is always true, and the code always prints "file copied successfully" even if the file exists in the second directory.
Here is my code:
while true; do
    echo "enter a file name from directory new1 to copy it to directory new2 "
    echo "or enter ctrl+c to exit: "
    read input
    i=0
    cd ~/new2
    if [ -f ~/new1/$input ]; then
        i=1
    fi
    if [ $i -eq 0 ]; then
        cp ~/new1/$input ~/new2/
        echo "####" $input "copied successfully ####"
    else
        echo "#### this file exist ####"
    fi
done
I would appreciate it if anyone could tell me how to fix this problem.
You are comparing the wrong file. In addition, you probably want to refactor your logic. There is no need to keep a separate variable to remember what you just did.
while true; do
    echo "enter a file name from directory new1 to copy it to directory new2 "
    echo "or enter ctrl+c to exit: "
    read input
    #i=0 # no use
    #cd ~/new2 # definitely no use
    if [ -f ~/new2/"$input" ]; then # fix s/new1/new2/
        # diagnostics to stderr; prefix messages with script's name
        echo "$0: file ~/new2/$input already exists" >&2
    else
        cp ~/new1/"$input" ~/new2/
        echo "$0: ~/new1/$input copied to ~/new2 successfully" >&2
    fi
done
Take care to make your diagnostic messages specific enough to be useful. Too many beginner scripts tell you "file not found" 23 times but you don't know which of the 50 files you tried to access were not found. Similarly, including the name of the script or tool which produces a diagnostic in the diagnostic message helps identify the culprit and facilitate debugging as you start to build scripts which call scripts which call scripts ...
As you learn to use the command line, you will find that scripts which require interactive input are a dog to use because they don't offer command history, tab completion of file names, and other niceties which are trivially available to any tool which accepts a command-line file argument.
cp -u already does what this script attempts to implement, so the script isn't particularly useful per se.
Note also that ~ is expanded by the shell, not by cp. Tilde expansion is specified by POSIX sh (only very old Bourne shells lack it), so your script seems to be compatible with POSIX sh; it could actually benefit from some Bash extensions such as [[ if you are going to use Bash features anyway.
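To see the cp equivalence concretely, here is a small demo. GNU cp is assumed; -n ("no clobber") skips files that already exist in the destination, which matches the script's copy-only-if-absent behavior even more exactly than -u (which also overwrites when the source is newer):

```shell
#!/bin/bash
# Demo: cp -n reproduces the copy-if-absent logic in one command.
# (GNU cp assumed; on coreutils >= 9.2 a skipped copy also exits nonzero.)
tmp=$(mktemp -d)
mkdir "$tmp/new1" "$tmp/new2"
echo new > "$tmp/new1/file"
echo old > "$tmp/new2/file"

cp -n "$tmp/new1/file" "$tmp/new2/" 2>/dev/null   # skipped: target exists
cat "$tmp/new2/file"                              # still "old"

rm "$tmp/new2/file"
cp -n "$tmp/new1/file" "$tmp/new2/"               # copied: target was missing
cat "$tmp/new2/file"                              # now "new"
rm -rf "$tmp"
```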

I have series of file in OLD dir, I want to check if the file exist in NEW dir. If it does not exist, I would like to do few operations on it

For instance... I have some files in the format mmddyy.zip in the OLD directory...
041414.zip
041514.zip
041614.zip
041714.zip
041814.zip (today's file, Apr 18 2014)
and I have a NEW dir with 041414.zip 041514.zip in it...
I'm trying to copy all the files from OLD to NEW and do some other operations if that file doesn't exist in NEW dir...
I'm thinking of doing it with a 'while do' statement, but I'm not sure what to use in the condition...
Thanks,
Sam.
You probably want to use find, and execute a script that does both the check and the operation.
For instance, script.sh (fill in your own variables):
#!/bin/sh
FNAME="$(basename "$1")"
NEWDIR="/tmp"
NEWFNAME="$NEWDIR/$FNAME"
if [ ! -f "$NEWFNAME" ]; then
    : # do stuff to "$1"
    # optionally cp "$1" "$NEWFNAME"
fi
Then you run something like cd /old/dir; find . -name "*.zip" -exec /path/to/script.sh "{}" ";"
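For comparison, the same check can also be done without find, in a plain glob loop. This sketch (the function name is hypothetical, and both directories are parameters) copies each .zip present in OLD but missing from NEW, and is where you would hang the other operations:

```shell
#!/bin/bash
# Sketch: copy every .zip in OLD that does not yet exist in NEW.
sync_missing() {
    local old="$1" new="$2" f name
    for f in "$old"/*.zip; do
        [ -e "$f" ] || continue          # no .zip files at all
        name=$(basename "$f")
        if [ ! -f "$new/$name" ]; then
            cp "$f" "$new/$name"
            echo "copied $name"          # other operations go here
        fi
    done
}

# Example: sync_missing /path/to/OLD /path/to/NEW
```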

mv command giving me a "same file" error

So I am building this script to take a file from the "Trash" directory and move it to the home directory. I am getting the error mv: /home/user/Trash/ and /home/user/Trash are the same file. The problem is that I am moving the file to /home/user, so I can't figure out why it is giving me this error.
Script:
trash="/home/user/Trash"
homedirectory="/home/user/"
for files in "$trash"/*
do
    echo "$(basename $files) deleted on $(date -r $files)"
done
echo "Enter the filename to undelete from the above list:"
read $undeletefile
mv $trash/$undeletefile $homedirectory
Output:
myfile2 deleted on Thu Jan 23 18:47:50 CST 2014
trashfile deleted on Fri Feb 28 23:07:33 CST 2014
Enter the filename to undelete from the above list:
trashfile
mv: `/home/user/Trash/' and `/home/user/Trash' are the same file
I think your problem is in the read command. You are not supposed to add $ to it.
Try:
read undeletefile
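To see why the stray $ causes exactly that error: read takes a variable name, not a value. With the $, the empty variable expands to nothing, so read falls back to its default REPLY variable and undeletefile stays empty; the mv then becomes mv /home/user/Trash/ /home/user/, which tries to move Trash onto itself. A small bash demo (the function name is just for illustration):

```shell
#!/bin/bash
# Demo of read with and without the stray $.
read_demo() {
    local undeletefile=""
    read $undeletefile <<< "trashfile"   # wrong: $undeletefile expands to
                                         # nothing, so read stores in REPLY
    echo "undeletefile='$undeletefile' REPLY='$REPLY'"
    read undeletefile <<< "trashfile"    # right: read is given the *name*
    echo "undeletefile='$undeletefile'"
}

# read_demo prints:
#   undeletefile='' REPLY='trashfile'
#   undeletefile='trashfile'
```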

linux checking number of files subdirectory - providing wrong variable result when subdirectory searching for does not exist

I have created a script that goes through specific subdirectories and tells me how many files in each subdirectory start with s. My problem occurs when the subdirectory it is searching for has failed to be created: when that subdirectory does not exist, the output is replaced with the value of another, previously created variable.
I am writing this in bash for Linux.
I am looking at the following subdirectories...
participantdirectory/EmotMRI
participantdirectory/EmotMRI/firstfour/
participantdirectory/T1
So, this is the output I should get, when the subdirectory exists and everything is ok. It is the same for all files (if it is correct).
/home/orkney_01/jsiegel/ruth_data/participants/analysis2/1206681446/20090303/14693
16 in firstfour
776 in EmotMRI folder
2 files in T1 folder
For a directory which does not have a subdirectory created, I get this output...
bash: cd: /home/orkney_01/jsiegel/ruth_data/participants/analysis2/2102770508/20090210/14616/EmotMRI/firstfour/: No such file or directory
/home/orkney_01/jsiegel/ruth_data/participants/analysis2/2102770508/20090210/14616
776 in firstfour
114 in EmotMRI folder
2 files in T1 folder
I think that, because firstfour is a subdirectory of EmotMRI, when the firstfour folder hasn't been created, the script substitutes the scan count from EmotMRI for this answer (the number of scans in EmotMRI is, in this instance, correct). Here is my script below. If this is what is happening, how do I stop it?
for d in $(cat /home/orkney_01/jsiegel/ruth_data/lists/full_participant_list_location_may20)
do
    if [ -d "$d" ]
    then
        gr="failed"
        er="failed"
        fr="failed"
        cd $d/EmotMRI/firstfour/
        gr=$(ls s* | wc -l)
        echo " "
        echo "$d"
        echo "$gr in firstfour"
        cd $d/EmotMRI/
        er=$(ls s* | wc -l)
        echo "$er in EmotMRI folder"
        cd $d/T1/
        fr=$(ls s* | wc -l)
        echo "$fr files in T1 folder"
        cd $d/EmotMRI
    else
        echo "$d is currently not available in directory"
    fi
done
cd /home/orkney_01/jsiegel/ruth_data/
echo "Check complete"
I know you will probably have many improvements for this script; I am very new to Linux. Thanks for your help.
Currently, you set gr to the output of ls s* | wc -l regardless of whether you successfully change your working directory. When that cd fails, it leaves you in whatever directory you were in previously.
You can combine your cd command into your other commands to set gr:
gr=$(cd $d/EmotMRI/firstfour/ && ls s* | wc -l || echo failed)
This way, if you successfully cd into the subdirectory, gr will be set to the output of the commands after &&. Otherwise, gr will be set to the output of the command after the ||. You can do the same thing with er and fr.
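Applying that pattern to all three counts, the loop body could be sketched like this (count_s_files and report are hypothetical helper names; "failed" marks a missing subdirectory, as in the original script):

```shell
#!/bin/bash
# Count files starting with "s" in a directory, or print "failed" if the
# directory does not exist.  Call it inside $( ) so the cd stays in a
# subshell and never changes the caller's working directory.
count_s_files() {
    cd "$1" 2>/dev/null && ls s* 2>/dev/null | wc -l || echo failed
}

# Print the three counts for one participant directory.
report() {
    local d="$1"
    echo "$d"
    echo "$(count_s_files "$d/EmotMRI/firstfour") in firstfour"
    echo "$(count_s_files "$d/EmotMRI") in EmotMRI folder"
    echo "$(count_s_files "$d/T1") files in T1 folder"
}
```

A missing firstfour now reports "failed in firstfour" instead of silently reusing the previous directory's count.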
You are getting error messages that you should fix. cd is failing because you are not allowed to change into a non-existent directory; your shell will just stay in the directory it was in. It looks like you know how to test for directory existence, so you should simply do that more, to avoid trying to go into non-existent directories.
