Determine what file has changed or been created - linux

I'm able to connect our CentOS 6.7 server to an external LDAP successfully but the last step in the process requires something in the GUI:
System > Administration > Authentication, then checking "Authentication Method"
I'd love to either know the exact file that this change in the GUI modifies or creates, or learn a method that I can use to show me which files have been created or modified in the last five minutes, for example. Of course, I have no idea which directory or directories this file was created or modified in.
I'm primarily a Mac user, and there are tools in OS X that can be used for such a purpose, but I'm hoping there's an equivalent method in CentOS/RHEL.
Thanks in advance,
Dan

This isn't the exact answer I was looking for, but it does solve my immediate problem. The following find command shows recently created files, sorted by most recent:
find /etc -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort -r
Looks like the file changed by the GUI above is:
/etc/sysconfig/authconfig
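For the "modified in the last five minutes" part of the question, GNU find (which CentOS/RHEL ship) also has an -mmin test. A minimal sketch, assuming the change landed somewhere under /etc:

find /etc -type f -mmin -5                   # regular files under /etc modified in the last 5 minutes
find / -xdev -type f -mmin -5 2>/dev/null    # same idea for the whole root filesystem, errors suppressed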

Related

Locating files starting with a string for my hw

I have a conundrum, and I am not necessarily trying to cheat or anything; I am just simply stuck. I am trying to complete an assignment for my intro to Linux class and I was hoping someone would be able to help me find the right solution. I have to:
1. In the same directory (where the last file was found), list all files starting with "host"
2. Use the long listing format
3. Use a command to find the file that shows the name of your computer
Now, the directory in question is /etc, and I have tried several commands to no avail for both of these, but especially the first one. I have tried find and locate and even attempted a grep, and it just is not working as intended. I can't get files that start with "host"; at most I keep getting a list of permission-denied errors or files that end in .host, so I am not sure what I am doing wrong, but I really need help so I can turn in my assignment. You don't have to tell me what the exact command should be; I am just looking for some guidance. Again, I am not trying to cheat, I just need help to figure it out.
Welcome to Stack Overflow! Here are some pointers.
1. See globbing in Linux and the * symbol.
2. "Long listing" is an option for the ls command; see ls --help.
3. The name of your computer (or, more accurately, the name of your host) is a file in /etc/. You should see it when doing #1.

How do I find missing files from a backup?

So I want to back up all my music to an external hard drive. This worked well for the most part using Grsync, but some files didn't copy over because of encoding issues with their file names.
I would like to compare my two music directories (current and backup) to see what files were missed, so I can copy these over manually.
What is a good solution for this? Note there are many many files, so ideally I don't want a tool that wastes time comparing the file contents. I just need to know if a file is missing from the backup that is in the source.
Are there good command-line or GUI solutions that can do this in good time?
Go to the top level directory in each set.
find . -type f -print | sort > /tmp/listfile.txt
Set up a sorted list for each directory, and diff should help you spot the differences.
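Spelled out as a minimal sketch (the paths /home/me/Music and /mnt/backup/Music are placeholders; adjust them to your source and backup directories):

cd /home/me/Music    && find . -type f | sort > /tmp/source.txt
cd /mnt/backup/Music && find . -type f | sort > /tmp/backup.txt
# comm -23 prints lines that appear only in the first file,
# i.e. files present in the source but missing from the backup
comm -23 /tmp/source.txt /tmp/backup.txt

diff /tmp/source.txt /tmp/backup.txt works too; comm just gives a cleaner one-sided listing. Neither compares file contents, so it stays fast even with many files.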

How can I create a shell script to delete files that are 30 days old?

I am really horrible at programming and script writing, but I have a few Linux-based machines (Ubuntu, Debian, etc.). In certain path locations I would like to run a script that removes files after 30 days. I have a pretty good idea of how to use crontab -e, but it's the scripting language that has me baffled.
Here is an example of what I am working with.
So far I have been manually removing the files like this:
rm -r ./filename.wav
http://server.lorentedford.com/41715/
I suppose the real question is: is this possible to do? The other half of the issue is: are their creation dates stored somewhere? ls -l does not show a creation date, only the last time a file was changed.
I noticed some negative feedback on the post, but I understand the frustration of teaching the language.
OK, so I realized I forgot to add the most important part of this equation: chmod 775 gets run in these folders every minute. Since technically that modifies things, wouldn't this throw off -mtime?
UNIX filesystems store only a last-modify time (mtime), a last-change time (ctime, which is updated by content changes but also by ownership and permission changes), and a last-access time (atime); there is no creation date.
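You can see all three timestamps for a given file with stat (filename.wav here is just the example name from the question):

stat filename.wav     # prints Access (atime), Modify (mtime, what find -mtime tests) and Change (ctime)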
Command "find -type f -mtime -30 ." will give you the list of files, which were modified in last 30 days.
The post 'How to delete files older than X hours' mentioned above has the answer for you. Just use -mtime +<days number> in the find command. You can also try the -atime or -ctime depending on your actual needs.
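Putting the pieces together, a minimal sketch of such a cleanup script, assuming the .wav files live under a hypothetical /var/recordings (adjust the path and pattern to your setup):

#!/bin/sh
# delete regular .wav files whose contents were last modified more than 30 days ago
find /var/recordings -type f -name '*.wav' -mtime +30 -delete

Saved as, say, /usr/local/bin/cleanup-old-wavs.sh and made executable, it can then be scheduled from crontab -e, for example:

0 3 * * * /usr/local/bin/cleanup-old-wavs.sh     # run the cleanup daily at 03:00

As for the chmod 775 worry: chmod updates a file's ctime, not its mtime, so it should not throw off -mtime.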

Finding a File That Contains The Requested Text from Terminal Session

I don't know if this is a duplicate question or not, but I haven't been able to find it...
I'm an old-school UNIX/Linux/AIX programmer, so I'm used to the terminal command line, even on MacOS X.
I've been using the following command to locate the given text in all files under the current folder:
find . -type f 2>/dev/null -exec grep [text] {} \; -ls
where [text] is the text string that I'm searching for. I have two issues, one minor and one major:
Minor - this command displays the text before the path to the file that contains the text string
Major - this command is both CPU (minor) and memory (major) intensive, grabbing almost all available memory while it's running on folders containing large numbers of files.
What I would like to find is a solution that resolves both of these issues, but the resource issue is the more annoying one.
Thanks in advance for any help....
BTW, I checked How do I find all files containing specific text on Linux?, and it doesn't resolve the resource/time issue.
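One sketch that addresses both complaints is to let grep recurse on its own instead of having find start one grep process per file; -l prints just the names of the matching files (which also fixes the output-order issue):

grep -rl 'text to find' . 2>/dev/null     # paths of files under . that contain the text
grep -rn 'text to find' . 2>/dev/null     # same, plus line number and the matching line

Whether this also cures the memory pressure depends on the files involved, but a single recursive grep is the usual lower-overhead starting point on Linux and on macOS with a reasonably recent grep.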

How do you search for all the files that contain a particular string?

Let's say you're working on a big project with multiple files, directories, and subdirectories. In one of these directories/subdirectories/files, you've defined a method, but now you want to know exactly which files in your entire project have been calling your method. How do you do this?
You mentioned grep, so I'll throw this solution out there. A more robust solution would be to implement a version control system, as Fibbe suggested.
find . -exec grep 'method_name' {} \; -print 2> /dev/null
The idea is, for each file that is found in the current directory and sub-directories, a grep for 'method_name' is executed on that file. The 2> /dev/null is nice if you don't want to get warned about all of the directories and files you don't have access to.
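If your grep is GNU grep (standard on Linux), the same search can be done in a single process, and --include keeps it to your source files; a sketch, with *.c as a stand-in for whatever extension your project uses:

grep -rn 'method_name' . 2>/dev/null                   # every call site, with file name and line number
grep -rn --include='*.c' 'method_name' . 2>/dev/null   # limit the search to C source files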
The most common way to do this is by using your editor. For example, emacs can do this if you create a tag index with etags.
Source: http://www.gnu.org/software/emacs/emacs-lisp-intro/html_node/etags.html
Then you just type M-. and type the name of the function you want to visit, and emacs will take you there.
I don't know what system or which editor you are using, but most editors have a similar function.
If you don't use emacs, another good way to keep track of functions, and get lots of other good features, is to use a version control system like git; it provides really fast search.
If you don't use a version control system you may want to look at a program that is designed just for searching. Like OpenGrok.
