Unable to remove files from directory - linux

I am using rm to delete some files from a directory through a Perl script, but it throws the error:
Can't exec "rm": No such file or directory.
The command goes like this, where $directory and $files together expand to /var/spool/mqueue/qf*:
system("rm $directory$files");

Perl has a builtin function for removing files, unlink. Its documentation shows how to use it in combination with glob to delete a list of files:
unlink glob "*.bak";
or in your case,
unlink glob($directory.$files);

Related

Tar command keeps bundling up entire directory path

I have a few sub-directories with files inside each of them in /home/user/archived/myFiles that I'm trying to bundle into a single tar file. The issue is, it keeps bundling a full directory path instead of just everything in the myFiles folder.
When I untar the file, I just want all the bundled sub-directories/files inside to appear in the directory I extracted the file rather than having to go through a series of folders that get created.
Instead, when I currently untar the file, I get a "home" folder and I have to go through /home/user/archived/myFiles to reach all the files.
I tried the -C flag, as suggested in "Tar a directory, but don't store full absolute paths in the archive", where you pass the full directory path minus the last folder to -C, and then the name of the last folder that contains all the stuff you want bundled. But my tar command doesn't work: I get a "no such file or directory" error.
#!/bin/bash
archivedDir="/home/user/archived/myFiles"
tar -czvf "archived-files.tar.gz" "${archivedDir}"/*
rm -vrf "${archivedDir}"/*
# Attempt with -C flag
#tar -cvf "${archivedDir}/archived-files.tar.gz" -C "${archivedDir}" "/*"
So for example, if I did an ls on /home/user/archived/myFiles, and it listed two directories called folderOne and folderTwo, and I ran this bash script and did an ls on /home/user/archived/myFiles again, that directory should only contain archived-files.tar.gz.
If I extracted the tar file, then folderOne and folderTwo would appear.
As I explained here, you should first change into the directory and then create the archive. So change your script to something like:
archivedDir="/home/user/archived/myFiles"
cd "$archivedDir"
tar -czvf "../archived-files.tar.gz" *
This creates the archive in the parent directory, so the rm that follows will not delete it. The extraction should be something like:
archivedDir="/home/user/archived/myFiles"
cd "$archivedDir"
tar -xzvf "../archived-files.tar.gz"
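The same layout can also be produced without cd by using tar's -C flag correctly: pass the directory to -C and then . as the thing to archive, rather than "/*", which tar takes literally. A minimal sketch, using the asker's paths:

```shell
#!/bin/bash
archivedDir="/home/user/archived/myFiles"
# -C makes tar change into the directory first; "." then means everything in it.
# Writing the archive one level up keeps it out of its own contents.
tar -czvf "$archivedDir/../archived-files.tar.gz" -C "$archivedDir" .
```

Entries in the archive are then stored relative to myFiles (./folderOne, ./folderTwo, ...), so extracting it dumps the folders directly into the current directory.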

How can I execute a command anywhere if certain required files are in a different directory?

Let's say the command is my_command, and it requires certain specific files (file1, file2, and file3) to be present in the current working directory.
Because I often use my_command in many different directories, I'd like to keep those files in one fixed directory and execute my_command without having the three files in the working directory.
In other words, I don't want to copy the three files into every working directory.
For example:
Directory containing the three files /home/chest
Working directory: /home/wd
If I execute my_command, it should automatically find the three files in /home/chest/
I imagine the solution is similar to adding a directory to $PATH, except for plain data files rather than executables.
Since the files need to be in the current working directory for the vasp_std command to work as expected, you could keep all the files in an include folder in your home directory and create a symbolic link to that folder from your script. At the end of the script, the symbolic link is deleted:
#!/bin/bash
# create a symbolic link to our resource folder
ln -s ~/include src
# execute other commands here
# finally remove the symbolic link from the current directory
unlink src
If the vasp_std command requires that the files be placed directly in the current working directory, you could instead create a symbolic link for each file:
#!/bin/bash
# create a link to each of the resource files
for file in ~/include/*
do
    ln -s "$file" "$(basename "$file")"
done
# execute other commands here
# finally remove the previously created links
for file in ~/include/*
do
    unlink "$(basename "$file")"
done

Shell script mv is throwing unhelpful error "No such file or directory" even though I see it

I need to use a shell script to move all files in a directory into another directory. I manually did this without a problem and now scripting it is giving me an error on the mv command.
Inside the directory I want to move files out of are 2 directories, php and php.tmp. The error I get is cd: /path/to/working/directory/php: No such file or directory. I'm confused because it is there to begin with and listed when I ls the working directory.
The relevant part of the script is:
ls $PWD #ensure the files are there
mv $PWD/* /company/home/directory
ls /company/home/directory #ensure the files are moved
When I use ls $PWD I see the directories I want to move but the error afterward says it doesn't exist. Then when I ssh to the machine this is running on I see the files were moved correctly.
If it matters the directory I am moving files from is owned by a different user but the shell is executing as root.
I don't understand why I would get this error so, any help would be great.
Add a trailing / to the destination path to make explicit that you are moving the files into a directory, not renaming it. The glob must also stay unescaped so the shell expands it:
mv "$PWD"/* /home/user/directory/
Are your variables properly quoted? You could try:
ls "$PWD" #ensure the files are there
mv "$PWD"/* "/company/home/directory"
ls "/company/home/directory" #ensure the files are moved
If any of your file or directory names contains characters such as spaces or tabs, your "mv" command may not be seeing the argument list you think it is seeing.
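To see concretely how an unquoted variable misleads mv, here is a small self-contained demonstration (hypothetical /tmp paths) using a directory name that contains a space:

```shell
#!/bin/bash
src="/tmp/my dir"              # note the space in the name
mkdir -p "$src" /tmp/dest
touch "$src/file.txt"
# Unquoted, $src is split into the words "/tmp/my" and "dir", so mv
# reports "No such file or directory" for paths that plainly exist:
# mv $src/* /tmp/dest
# Quoted, the glob still expands but the directory name stays whole:
mv "$src"/* /tmp/dest          # /tmp/dest now contains file.txt
```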

Error copying files in Linux shell bash script

I'm trying to copy files from a location (/home/ppaa/workspace/partial/medium) to another location (/home/ppaa/workspace/complete) using a bash shell script on Linux.
This is my code:
#!/bin/bash -u
MY_BASE_FOLDER='/home/ppaa/workspace/'
MY_TARGET_FOLDER='/home/ppaa/workspace/complete/'
cp $MY_BASE_FOLDER'partial/medium/*.*' $MY_TARGET_FOLDER
return=$?
echo "return: $return"
The folders exist and the files are copied, but the value of the return variable is 1. What's wrong?
The files are not copied. cp is most likely giving you an error like:
cp: cannot stat ‘/home/ppaa/workspace/partial/medium/*.*’: No such file or directory
This is because globs (like *.*) are not expanded in quotes. Instead, use:
cp "$MY_BASE_FOLDER/partial/medium"/*.* "$MY_TARGET_FOLDER"

using execv/execl to delete all files

I'm trying to delete all files in a folder from a C program using the following call:
execl("/bin/rm", "rm", "/media/sda1/*", (char *)NULL);
But I get the following failure:
rm: can't remove '/media/sda1/*': No such file or directory, though there are files in this folder.
How can we delete all files or copy all files (from one folder to another) using execv family ? Does anyone have any idea ?
Thanks,
Ran
The problem is caused by the glob pattern /media/sda1/* you are using: the asterisk is something a shell would expand into the list of all non-hidden files in that folder. If you pass it directly to rm, rm looks for a file literally named * in /media/sda1, which does not exist.
If you don't want to iterate over all the files inside the folder yourself, you'll need to run the command through a shell, which will expand the glob pattern for you.
You could use
execl("/bin/bash","-c 'rm -rf /media/sda1/*'",0,0,0,0,0,0,0,0,0);
... for that. A nice alternative would be to use system() which implicitly starts the command in a shell:
system("rm -rf /media/sda1/*");
More about:
glob(7)
system(3)
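If spawning a shell is undesirable, an alternative worth noting is find (GNU or BusyBox), which enumerates the directory entries itself and so needs no glob expansion:

```shell
# Delete everything inside /media/sda1 (but not the directory itself)
# without relying on any shell glob expansion.
find /media/sda1 -mindepth 1 -delete
```

Because every argument here is literal, this can also be invoked directly, e.g. execl("/usr/bin/find", "find", "/media/sda1", "-mindepth", "1", "-delete", (char *)NULL);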
