List of completely untouched directories - linux

I'm currently writing a cleanup script, and for that I need a list of directories without any changes to their files or subdirectories. (I'd like them to have been untouched for the last 14 days.) Right now I have this as a base:
find ./* -type f -mtime -14
That generates a list of untouched files and directories. The problem is that some directories show up anyway, because there have only been changes to a couple of files within some of their subdirectories.
Does anyone have any idea how to generate a list of completely untouched directories?
cheers!
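One possible sketch (assuming GNU find, for `-print -quit`): walk every directory and keep only those containing nothing modified in the last 14 days. Nested untouched directories are listed alongside their untouched parents.

```shell
# List directories containing nothing modified in the last 14 days.
# Assumes GNU find; directories with whitespace-free paths work best
# with this read loop.
find . -type d | while IFS= read -r dir; do
    # -print -quit stops at the first recent entry, so the inner
    # find is cheap when a directory has recent changes
    if [ -z "$(find "$dir" -mtime -14 -print -quit)" ]; then
        printf '%s\n' "$dir"
    fi
done
```

Note that a changed file also updates its parent directory's mtime, so the inner find catches both cases.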

Related

Finding files that are hard links in Solaris under a specific folder

I need to find hard-linked files under a specific folder in Solaris. I tried the command below, which lists files based on their link count.
find . -type f -links +1
The above command lists both the source and target files, but I need to list only the target file.
For example, under the Test folder there is source.txt:
Test
->source.txt
Created hardlink:
ln source.txt target.txt
The above find command returns both source.txt and target.txt, but I need a command that fetches only target.txt. Is it possible?
No. After the hard link is created, both names of the file are equal in all ways; there is no original or copy.
Since they share the underlying inode, both names have the same attributes -- change the file through one name and you change it for all of them.
Either switch to symbolic links, or find a heuristic to choose which name you don't want to see -- for example, that it has a particular extension, or that it sorts later.
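A sketch of the "sorts later" heuristic (assuming GNU find for `-printf`, and no whitespace in file names -- note Solaris find lacks `-printf`, so this fits GNU userland): group names by inode and print every name after the first.

```shell
# For each inode with more than one link, print all but the first
# name. Sort order decides which name counts as "first" -- there is
# no real original. Assumes GNU find and whitespace-free names.
find . -type f -links +1 -printf '%i %p\n' | sort |
awk '{ if ($1 in seen) print $2; else seen[$1] = 1 }'
```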

Delete only accessed files using find -amin -1 and not files created in the previous minute

I need to delete every file accessed in the previous minute, every minute.
That is, to delete the files using the output of the find . -amin -1 command, and repeat that deletion again the next minute. But when new files are added in that interval, the above find command also returns the newly created (not accessed) files in the output. I want to exclude those new files and prevent them from being deleted. How do I go about it?
Well .. the machine does exactly what you're asking.
What you WANT is something slightly different, I believe.
find . -amin -2 -a -amin +1
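A different sketch (my own assumption, not from the thread): a newly created file also gets a fresh inode change time, while a file that was merely read does not, so excluding recent ctime keeps creations out of the accessed set.

```shell
# Delete files accessed in the last minute whose inode status did
# NOT change in that minute -- freshly created files have a recent
# ctime and are therefore skipped. Intended to run from cron.
find . -type f -amin -1 ! -cmin -1 -exec rm -f {} +
```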

ubuntu and unix stuff about directories and hidden files

In no way do I think I'm an adequate Unix administrator, but I'm learning. I "cd" into a specific directory and it appears to be empty after I do an "ls". But when I run "ll" it says this:
/integration/import$ ll
total 184
What is this total 184? And how do I see these text files? I've never seen anything like this before. Super confusing.
My co-worker had originally said this: in the imports folder find the text file containing this order and move it out of the folder/queue.
The command below should list the hidden files as well. Usually, hidden files are the ones starting with a dot, e.g. .mail. As for the total 184: that line from ls -l is the number of disk blocks allocated to the files in the directory, so the directory isn't empty -- its files are just hidden.
ls -latr /integration/import/
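For the co-worker's actual task, a sketch (the order ID and the hold directory are placeholders): grep -r descends into the directory, hidden files included, and -l prints only the names of matching files, which can then be moved out of the queue.

```shell
# Find the text file(s) mentioning a given order and move them out
# of the import queue. 'ORDER-123' and /tmp/hold are placeholder
# names; assumes GNU xargs (-r) and GNU mv (-t).
grep -rl 'ORDER-123' /integration/import/ | xargs -r mv -t /tmp/hold/
```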

How do I find missing files from a backup?

So I want to back up all my music to an external hard drive. This worked well for the most part using Grsync, but some files didn't copy over because of encoding issues with their file names.
I would like to compare my two music directories (current and backup) to see what files were missed, so I can copy these over manually.
What is a good solution for this? Note that there are many, many files, so ideally I don't want a tool that wastes time comparing file contents. I just need to know whether a file that is in the source is missing from the backup.
Are there good command line or gui solutions that can do this in good time?
Go to the top level directory in each set.
find . -type f -print | sort > /tmp/listfile.txt
Set up a sorted list like this for each directory (using a different output file for each), and diff should help you spot the differences.
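Spelling that out as a sketch (the two music paths are placeholders): comm -23 prints the lines unique to the first sorted list, i.e. files present in the source but absent from the backup, without reading any file contents.

```shell
# Build sorted file lists relative to each root, then print files
# that exist in the source but not in the backup. The /music paths
# are example placeholders.
(cd /music/current && find . -type f | sort) > /tmp/src.txt
(cd /music/backup  && find . -type f | sort) > /tmp/bak.txt
comm -23 /tmp/src.txt /tmp/bak.txt
```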

How can I create a shell script to delete files that are 30 days old?

I am really horrible at programming and script writing, but I have a few Linux-based machines (Ubuntu, Debian, etc.). In certain paths I would like to run a script that removes files after 30 days. I have a pretty good idea of how to use crontab -e, but it's the scripting language that has me baffled.
Here is an example of what I am working with.
So far I have been manually removing the files like this:
rm -r ./filename.wav
http://server.lorentedford.com/41715/
I suppose the real question is: is this possible to do? The other half of the issue is: are the creation dates stored somewhere? ls -l does not show the creation date, only the last time the file was changed.
I noticed some negative feedback on the post, but I understand the frustration of teaching the language.
OK, so I realized I forgot to add the most important part of this equation: chmod 775 gets run in these folders every minute. Since that technically modifies things, wouldn't it throw off -mtime?
UNIX filesystems store only a last-modify time, a last-change time (updated on modification, but also on ownership and permission changes), and a last-access time; there is no creation date.
The command find . -type f -mtime -30 will give you the list of files which were modified in the last 30 days.
The post 'How to delete files older than X hours' mentioned above has the answer for you. Just use -mtime +<days number> in the find command. You can also try the -atime or -ctime depending on your actual needs.
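Putting the pieces together as a sketch (the queue path is a placeholder; also note that chmod updates only the change time, not the modify time, so a minutely chmod 775 does not disturb -mtime):

```shell
#!/bin/sh
# Delete regular files last modified more than 30 days ago.
# /path/to/queue is a placeholder -- point it at the real directory
# before wiring this script into crontab.
find /path/to/queue -type f -mtime +30 -exec rm -f {} +
```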
