Nullify file through script not working - Linux

cp /dev/null log works manually, but from a shell script it's not working for some logs.
Any idea why, and how to fix this on AIX? Note: there is no ownership or permission issue.

There's important information missing. But what about simply using
: > log
to empty log? This doesn't need cp or /dev/null at all.
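When it works by hand but not from a script, a usual culprit is a relative path being resolved against a different working directory. A minimal sketch (the log paths below are placeholders) that uses absolute paths with that redirection:

#!/bin/sh
# Truncate each log in place; absolute paths avoid any dependence
# on the script's working directory. Paths are example placeholders.
for log in /var/log/app/one.log /var/log/app/two.log
do
    : > "$log"
done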

Related

How to create a shell script

I am trying to create a shell script to remove certain files from a directory. How would I be able to achieve this?
Can I write the standard commands in a script as follows:
#!/bin/sh
rm -f /directory/of/file/file1.txt
rm -f /directory/of/file/file2.txt
rm -f /directory/of/file/file3.txt
rm -f /directory/of/file/file4.txt
Or is there a specific way to delete files in a shell script?
This is my first question here, so please bear with me as I do not know all the rules.
Thanks in advance :)
Edit:
Thanks for all the answers in a short matter of time, I really appreciate it.
Forgot to mention this will be executed by root's cron (crontab -e) every Tuesday and Friday at 5 PM.
Do I still need to chmod +x the file if root is executing the file?
Your question can be split into a few points:
You can use those commands to delete the specific files (if you have the permissions).
Make sure you add execute permission to the shell script file (the one that performs the rm commands) with: chmod +x file_name.sh
In order to delete the folder contents and not the folder itself, the command should be: rm -r /path/to/dir/*
Yes, you can. However, if you don't have permission to delete the files, you may get an error on that statement. Handle that error and you are good to go.
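Putting those points together, here is a minimal sketch of such a script plus matching crontab entries; the script name and paths are placeholders:

#!/bin/sh
# remove_files.sh - delete the listed files, reporting any failures.
# rm -f stays quiet about missing files but still errors on permissions.
for f in /directory/of/file/file1.txt \
         /directory/of/file/file2.txt \
         /directory/of/file/file3.txt \
         /directory/of/file/file4.txt
do
    rm -f "$f" || echo "could not remove $f" >&2
done

And in root's crontab (crontab -e), for Tuesday and Friday at 5 PM:

# minute hour day-of-month month day-of-week (2=Tue, 5=Fri)
0 17 * * 2,5 /path/to/remove_files.sh

As for the follow-up question: yes, the execute bit is still needed even when root runs the file directly from cron; alternatively, invoke it as sh /path/to/remove_files.sh, which needs no execute bit.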

Delete file from Unix system

I want to delete a text file from a Unix machine. I am using the following command for the same.
rm /tmp/filename.txt
It is working fine. After execution it returns nothing.
I want to get some confirmation about the deletion, as follows:
filename.txt deleted successfully.
Please can anyone help me with this.
Thanks in advance.
You can use these commands:
rm /tmp/filename.txt && echo "File successfully deleted"
this will remove the file and then (only if the exit status of the command is successful) print the message.
Otherwise, as Venkat said, you can use rm -i that asks for confirmation before deleting the file.
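If you want a message for either outcome, a small sketch along these lines covers both success and failure:

if rm /tmp/filename.txt
then
    echo "filename.txt deleted successfully."
else
    echo "filename.txt could not be deleted." >&2
fi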
I have used the following to solve my issue.
rm -v /tmp/filename.txt
this will display a message as follows:
removed '/tmp/filename.txt'
You can use the -i option with the rm command.
The command below prompts you once whether you want to remove the file before deleting it:
rm -i /tmp/filename.txt
If the file is unavailable, it shows:
rm: cannot remove '/tmp/filename.txt': No such file or directory
Hope it helps.

Recycle bin Script in Linux

On my production server, somebody executed rm -rf and my important files were removed permanently. So I thought of having a recycle bin: if a user does rm, the file will move to a recycle bin rather than being deleted from the server. I've made the script below for it, but I'm getting an error when it executes.
alias rm='/root/remove.sh'
#rm test_file
Now the script below will trigger when you type the rm command:
#!/bin/bash
dir=$(pwd)
mv $dir/$1 /root/Recyclebin
When the above script is triggered, I get the following error:
mv: cannot move '/root/test_file' to '/root/Recyclebin': Not a directory
Now, please suggest whether there is any other way to implement a recycle bin other than this, or please help me resolve the error. Thanks in advance.
I'm using CentOS 5.6
Try this:
First create a folder named MyTrash under /root, i.e. /root/MyTrash.
Then open the .bashrc file and add the line below at the bottom of the file:
alias rm='mv -t /root/MyTrash/'
Here -t means:
-t, --target-directory=DIRECTORY
move all SOURCE arguments into DIRECTORY
Reload the .bashrc file by running: source .bashrc
Now if you delete any file using the rm command, that file will be moved to the /root/MyTrash directory.
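If you'd rather keep the original script approach, here is a slightly more defensive sketch; the trash directory matches the question's /root/Recyclebin, while the timestamp suffix is my own addition to avoid name clashes:

#!/bin/bash
# remove.sh - move each argument into the trash instead of deleting it.
trash=/root/Recyclebin
mkdir -p "$trash"    # make sure the target exists and is a directory
for f in "$@"
do
    # timestamp the name so deleting test_file twice doesn't collide
    mv -- "$f" "$trash/$(basename "$f").$(date +%Y%m%d%H%M%S)"
done

Unlike the $dir/$1 version, this handles several files at once and leaves path resolution to mv, so it also works for absolute paths.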

Adding timestamp to a filename with mv in BASH

Well, I'm a linux newbie, and I'm having an issue with a simple bash script.
I've got a program that adds to a log file while it's running. Over time that log file gets huge. I'd like to create a startup script which will rename and move the log file before each run, effectively creating separate log files for each run of the program. Here's what I've got so far:
DATE=$(date +"%Y%m%d%H%M")
mv server.log logs/$DATE.log
echo program
When run, I see this:
: command not found
program
When I cd to the logs directory and run dir, I see this:
201111211437\r.log\r
What's going on? I'm assuming there's some syntax issue I'm missing, but I can't seem to figure it out.
UPDATE: Thanks to shellter's comment below, I've found the problem: I was editing the .sh file in Notepad++ on Windows and then sending it via FTP to the server, where I run it via SSH. After running dos2unix on the file, it works.
New question: How can I save the file correctly in the first place, to avoid having to perform this fix every time I resend the file?
mv server.log logs/$(date -d "today" +"%Y%m%d%H%M").log
The few lines you posted from your script look okay to me. It's probably something a bit deeper.
You need to find which line is giving you this error. Add set -xv to the top of your script. This will print out the line number and the command that's being executed to STDERR. This will help you identify where in your script you're getting this particular error.
BTW, do you have a shebang at the top of your script? When I see something like this, I normally expect it to be an issue with the shebang. For example, if you had #!/bin/bash on top, but your bash interpreter is located at /usr/bin/bash, you'll see this error.
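For reference, a minimal sketch of both suggestions at the top of a script; the env form is one common way to avoid hard-coding the interpreter's location:

#!/usr/bin/env bash
# env looks bash up on the PATH, whatever directory it lives in
set -xv    # -v echoes each line as read, -x traces each expanded command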
EDIT
New question: How can I save the file correctly in the first place, to avoid having to perform this fix every time I resend the file?
Two ways:
Select the Edit->EOL Conversion->Unix Format menu item when you edit a file. Once it has the correct line endings, Notepad++ will keep them.
To make sure all new files have the correct line endings, go to the Settings->Preferences menu item, and pull up the Preferences dialog box. Select the New Document/Default Directory tab. Under New Document and Format, select the Unix radio button. Click the Close button.
A single-line method within bash works like this:
[some output] >$(date "+%Y.%m.%d-%H.%M.%S").ver
will create a file with a timestamped name and a .ver extension.
A working example that snapshots a file listing to a date-stamped file name shows it in action:
find . -type f -exec ls -la {} \; | cut -d ' ' -f 6- >$(date "+%Y.%m.%d-%H.%M.%S").ver
Of course
cat somefile.log > $(date "+%Y.%m.%d-%H.%M.%S").ver
or even simpler
ls > $(date "+%Y.%m.%d-%H.%M.%S").ver
I use this command to do a simple rotation of a file:
mv output.log `date +%F`-output.log
In the local folder I then have 2019-09-25-output.log.
Well, it's not a direct answer to your question, but there's a tool in GNU/Linux whose job is to rotate log files on a regular basis, keeping old ones zipped up to a certain limit. It's logrotate.
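A minimal sketch of a logrotate rule, written as a drop-in file under /etc/logrotate.d (the application name, log path, and settings are placeholders):

cat > /etc/logrotate.d/myapp <<'EOF'
# rotate the application log weekly, keep 4 compressed copies, and
# truncate in place so the running program keeps its file handle
/var/log/myapp/server.log {
    weekly
    rotate 4
    compress
    missingok
    copytruncate
}
EOF

On most distributions logrotate itself is already run daily from cron, so nothing else needs to be scheduled.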
You can write your scripts in Notepad, but just make sure you convert them using:
sed -i 's/\r$//' yourscripthere
I use it all the time when I'm working in Cygwin, and it works. Hope this helps.
First, thanks for the answers above! They led to my solution.
I added this alias to my .bashrc file:
alias now='date +%Y-%m-%d-%H.%M.%S'
Now when I want to put a time stamp on a file such as a build log I can do this:
mvn clean install | tee build-$(now).log
and I get a file name like:
build-2021-02-04-03.12.12.log

tar file not archiving

I am doing the following in a shell script:
tar cvzf mytar.tgz *
It works fine when I run the shell script from a terminal. When the shell script runs from a cron job set up with crontab, it looks like it has archived, because the tgz file is there, but the file size is nothing, and when I untar it there is nothing inside. However, when I run the shell script via the terminal, the tgz has a larger file size and I can untar it.
Anyone know why it won't work via the cronjob?
Try specifying the complete path to the files you want to archive:
tar cvzf mytar.tgz /path/to/your/files/*
Cron runs from a different working directory (typically your $HOME).
What's the working directory of the cronjob process? If there's nothing in it, then the command will archive all of the nothing.
First, no need to be verbose in a cron.
Second, it looks like you are using relative pathing there. Consider using absolute paths, even for the tar command itself.
Last, which user is running the cron? Is there a potential for a permissions issue or a quota issue?
The other answers so far give good advice. Cron has a lot of special rules about what is allowed in the command. I have the most success when I make a simple shell script, put it in $HOME/cron, chmod 755 it, and put the full path to it in cron, making sure to test the script and to cd as necessary. Be aware that cron not only won't necessarily run the command from your home directory, but it will also likely have a different PATH, and other environment settings will be missing.
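Putting that advice together, a sketch of a cron-friendly wrapper (the paths and the 2 AM schedule are placeholders):

#!/bin/bash
# backup.sh - run from cron, so set everything up explicitly
PATH=/usr/bin:/bin                     # cron's PATH is minimal
cd /path/to/your/files || exit 1       # never tar the wrong directory
tar czf /path/to/backups/mytar.tgz .   # no -v: any output gets mailed

and the crontab entry, using the script's full path:

0 2 * * * /home/you/cron/backup.sh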
