Check if a file has been changed, and which line changed - Linux

I am looking for a solution that will help me track any changes made to files. I am working on a Linux system where a lot of people have access to the same files. Sometimes someone changes something in a file and doesn't notify the other users. So I would like to write a script that checks whether a given file path (or several files) has changed; if so, it should write something like "File changed %date, line XXX" to a file, say "controlfile_File1.txt". I know that I can use an md5 checksum for this, but that only tells me that the file changed; I would like to know which line changed. I am also thinking about copying the file to some place and running a diff between the copied file and the current file.
Any ideas?
Thanks for the support.

Your question goes beyond the possibilities of Linux as a platform: Linux can show you the last modification date of a file, and the last time a file was accessed (even without modifying it), but that's it.
What you are looking for, as already mentioned in the comments, is a version control system. As mentioned, Git is one of them, but there are others (Subversion, SourceSafe, ClearCase, ...), each of them having their (dis)advantages.
One thing they all have in common is that modifying such a file no longer happens silently: every time somebody modifies a file under version control, (s)he will be asked why (s)he has done this, and this is recorded for later reference.
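That said, if a full version control system is not an option, the copy-and-diff idea from the question can be sketched as a small shell script run from cron. The snapshot directory, file path, and control-file name below are all illustrative:

```shell
#!/bin/sh
# Keep a snapshot of a watched file; on each run, log which lines changed
# since the last snapshot. The line numbers come from diff's hunk headers
# (e.g. "2c2" means line 2 changed).
SNAPDIR=/tmp/snapshots            # illustrative snapshot location

watch_file() {
    file=$1
    snap="$SNAPDIR/$(basename "$file").snap"
    log="controlfile_$(basename "$file")"

    mkdir -p "$SNAPDIR"
    if [ -e "$snap" ] && ! diff -q "$snap" "$file" >/dev/null; then
        # diff's change hunks start with line numbers, e.g. "2c2"
        changed=$(diff "$snap" "$file" | grep '^[0-9]' | tr '\n' ' ')
        echo "File changed $(date): lines $changed" >> "$log"
    fi
    cp "$file" "$snap"            # refresh the snapshot
}

# Hypothetical usage, e.g. from a cron job:
# watch_file /path/to/File1.txt
```

The control file accumulates one line per detected change; a real VCS still gives you far more (who changed the file, why, and the full history).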

Related

How do you make a specific folder that has subfolders to be uneditable in Sublime Text

How do you make a specific folder, together with its subfolders, uneditable in Sublime Text? I know it's possible, but how? I have some old projects that I want to use as a reference to study my old code, but I'm worried that I might mistakenly edit some part of a specific module or file while mindlessly touring around my code. How do I make a specific folder uneditable in Sublime Text, so that only deliberately changing it can modify it? I tried installing this package: https://packagecontrol.io/packages/Toggle%20Read-Only
But it still gives me a prompt whenever I want to change something in a file.
Your best bet is to make sure that your source code is covered by some sort of source control, such as git or Subversion; this is always a good idea in general, quite apart from your particular question. Having your code under source control means that when you edit a file (accidentally or on purpose) you can see exactly what you edited, or throw those edits away and go back to what you had. If you also push your code to an external server, such as GitHub (if you use git), then you also have a cheap and easy off-site backup of your code.
That said, if you want to make files uneditable, that's more the job of your file system than the tools that you're using to edit the files (in this case Sublime Text).
All file systems and operating systems should have the concept of a read-only file, which sounds like what you want. A file being marked as read-only stops you from accidentally modifying it; depending on the software that you use, edits are either impossible or will need to be confirmed.
In your case, you can do this in a couple of different ways. If you only want to protect a couple of files, then you would do a Right click and choose Properties; in the bottom of the General tab there is a check box you can check to make that file read-only:
If you have many files, you can do the same thing by editing the properties of the folder that contains the code instead:
When you do this to a folder, the property set works a little differently; since you're modifying the folder as a whole, you need to click the box twice to change it from a square (shown above) to a check mark. When you apply the change, you will be asked if you only want to make files inside of that directory read-only, or all files in that folder as well as all folders under it; choose as appropriate.
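On Linux, the equivalent of that Read-only checkbox is clearing the write permission bits with chmod. A minimal sketch; the file and directory names below are illustrative, and a throwaway directory is created so the commands are runnable as-is:

```shell
# Create a throwaway example tree so the commands below are runnable.
cd "$(mktemp -d)"
mkdir -p old-project
echo 'print("hi")' > old-project/main.py
echo notes > notes.txt

chmod a-w notes.txt           # single file: remove write permission for everyone
chmod -R a-w old-project/     # whole tree: every file and folder read-only

ls -l notes.txt               # no 'w' bits remain in the mode string
# Undo later with: chmod -R u+w old-project/
```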
Sublime will let you open read-only files and will also let you modify their contents, but when you try to save you will get a warning dialog telling you that the file is write-protected; you need to confirm whether you actually want to save the changes (other software may not prompt with such a dialog and may just fail):
If you choose to save, you will remove the read-only attribute and make the file normal again.
If you want to make a file completely un-editable so that you can't even accidentally change the file, you can achieve that with a simple plugin in combination with making the file read-only (see this video if you're not sure how to apply a plugin):
import sublime
import sublime_plugin
import os


class ReadOnlyListener(sublime_plugin.EventListener):
    def on_load(self, view):
        # Make the view read-only when the file exists on disk
        # but is not writable by the current user.
        if (os.path.exists(view.file_name())
                and not os.access(view.file_name(), os.W_OK)):
            view.set_read_only(True)
EDIT: The internal View Package Files command will open package resources from sublime-package files transparently and give them a filename that represents where they would exist on disk if they were not in the package file.
The plugin from the original answer would stop you from being able to use this command by noting that the file is not writable (because it does not exist on disk) and make the view read-only, which stops the file content from being displayed because the view can't be modified.
This is rectified in the edit above by only taking action if the file actually exists on disk (the View Package File command already makes files it loads in this manner read-only if they do not exist on disk).
This makes an event listener that checks every time you open a file to see if the file is writable, and if it's not it makes the view read-only. This distinction is Sublime specific; regardless of the underlying state of the file, you just can't make any changes to such a file at all. That doesn't stop you from saving the file even if you haven't made any changes, which would have the same effect as the above.

Linux bash to compare two files, where the second file must first be found

I have a batch job that integrates an XML file from time to time, though it can happen daily. After it integrates a file, it puts it in a folder like /archives/YYMMDD (the current day). The problem is that the same file can be integrated twice. So I need a script that verifies the file (with the diff command it's possible, but risky as it may create a bottleneck); the problem I can't resolve is how to determine the second file's location.
P.S. I can't install anything on the server.
Thanks in advance.

Using a hard link with the Kate editor

I have a problem working with the link command in Linux Mint.
I made file1 and added a new hard link to that file:
link file1 file2
I know that when I change the contents of file1, file2 should change too.
When I edit file1 with vim, or add text to it with redirection, it works well. But when I edit file1 with the Kate editor, it's as if the editor breaks the link to file2: after that, when I change the contents of file1 with Kate, vim, etc., file2 never changes any more.
What's the problem?
I'm one of the Kate developers. The issue is as follows: Whenever Kate saves, it does so by saving into a temporary file in the same folder, and on success just does a move to the desired location.
This move operation is exactly what destroys your hard link: first, the hard link is removed, then the temporary file is renamed.
While this avoids data loss, it also has its own problems, as you have experienced. We are tracking this bug here:
https://bugs.kde.org/show_bug.cgi?id=358457 - QSaveFile: Kate removes a hard link to file when opening a file with several hard links
And in addition, QSaveFile also has two further issues, tracked here:
https://bugs.kde.org/show_bug.cgi?id=333577 - QSaveFile: kate changes file owner
https://bugs.kde.org/show_bug.cgi?id=354405 - QSaveFile: Files are unlinked upon save
The solution would be to write directly to the file in all these corner cases; then we could avoid this trouble, at the expense of losing data in case of a system crash, so it's a tradeoff. To fix this, we need to patch Qt, which no one has done so far.
Different programs save files in different ways. At least two come to mind:
open existing file and overwrite its content
create a temporary file, write the new content there, then somehow replace the original file with the new one (remove the old file and rename the new one; or rename the old file, rename the new file, then remove the old file; or use a system function to swap the files' contents (in fact, swap the names of the files), then remove the old file; or ...)
Judging from its current source code, Kate is using the latter approach (using QSaveFile in the end, with direct write fallback though). Why? Usually, to make file saving more or less atomic, and also to make sure that saving won't result in errors caused by e.g. lack of free space (although this also means that it uses the space of sum of old and new file sizes when saving).
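The difference between the two saving strategies is easy to demonstrate in a shell (a throwaway directory on any Linux system with GNU coreutils):

```shell
cd "$(mktemp -d)"

echo "original" > file1
ln file1 file2                 # file1 and file2 now name the same inode

# Strategy 1: overwrite in place -- the hard link survives
echo "edited in place" > file1
stat -c '%h' file1             # link count: still 2

# Strategy 2: write a temp file, then rename over the original
# (what Kate/QSaveFile does) -- the hard link is broken
echo "edited via rename" > file1.tmp
mv file1.tmp file1
stat -c '%h' file1             # link count: now 1
cat file2                      # still "edited in place"; file2 is orphaned
```

After the rename, file1 names a brand-new inode, while file2 still points at the old one, which is exactly the behavior described in the question.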
I don't have Kate on Linux Mint but I have noticed issues which lead me to suspect you may have come across a "bug".
Here are two 'similar' issues with hard links that have been reported.
Bug 316240 - KSaveFile: Kate/kwrite makes a copy of a file while editing a hard link
"Hard link not supported" with NTFS on Linux #3241

History of users modifying a file in Linux

I am wondering whether it is possible to list all the users who modified a file over the course of time. I am aware that stat or ls -lrt will give the last user who modified the file. But I want to find out if it is possible to find the N-1th user who modified the file.
Note: I think the chances of finding such a user are very slim. I just want to confirm with experts before declaring it a dead end.
Example:
At 1:00 AM, ABC modified the file.
At 2:00 AM, XYZ modified the same file.
I know that XYZ modified the file. How do I find out who modified the file before XYZ (in this case, ABC)?
One hack you can use (this will only work for the most recent modification): check the last modified time of the file and cross-check it with the login times of the users. You might be able to narrow the list down.
Use the stat command to find the Modify time
Use the last command to see the login history
Compare the login/logout times with the file's Modify timestamp
This will not work all the time, but you can narrow the results down.
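The cross-check above can be sketched in two commands (the file name is illustrative, and last needs a readable /var/log/wtmp to show anything):

```shell
FILE=/etc/passwd                     # illustrative file of interest

stat -c 'Modified: %y' "$FILE"       # the file's last-modification timestamp
last -F 2>/dev/null | head -n 20     # recent sessions with full dates/times
# Any user whose login session spans the Modify timestamp is a candidate;
# this narrows the list down but proves nothing.
```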
I am aware that stat or ls -lrt will give the last user who modified the file.
No. Modifying a file does not change its owner.
In general filesystems do not keep track of modification histories. If this information is crucial, the way to go is
For complete file hierarchies: a VCS (Version Control System) like Git, Subversion, Mercurial, CVS, ...
For single files, RCS or SCCS, ...
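To illustrate the VCS route: once a file lives in a git repository, per-file author history is one command away. A throwaway sketch; the repository, user names, and file name are all placeholders matching the ABC/XYZ example above:

```shell
cd "$(mktemp -d)"
git init -q .

echo "first version" > notes.txt
git add notes.txt
git -c user.name=abc -c user.email=abc@example.com commit -qm "create notes"

echo "second version" > notes.txt
git -c user.name=xyz -c user.email=xyz@example.com commit -qam "update notes"

# Who touched notes.txt, newest first -- exactly the ABC/XYZ question:
git log --format='%an: %s' -- notes.txt
```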
It is possible to configure auditing to track changes to specific files. There are some limitations:
it has to be configured before the changes of interest happen
the auditing daemon tends to refuse to start if told to watch a file which has been deleted.
Still, it can be useful. Look for auditctl. Here are some useful links discussing the topic:
Linux audit files to see who made changes to a file
Monitoring Linux File access, Changes and Data Modifications
Track file changes using auditd
The Linux Audit System, or Who Changed That File?
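A minimal sketch of such a watch. This is system configuration rather than a runnable script: it requires root and a running auditd, and the watched path and key name are illustrative, so adapt before use:

```shell
# Watch /etc/hosts for writes (w) and attribute changes (a),
# tagging matching events with the key "hosts-watch":
auditctl -w /etc/hosts -p wa -k hosts-watch

# Later: which user/process touched the file, and when?
# (-i interprets numeric UIDs into user names)
ausearch -k hosts-watch -i
```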
It is not possible to track user details, such as the username of whoever modified the file, with any particular command. We can only check the username assigned to the file with ls -l.

shell script to create backup file when creating new file in particular directory

Recently I was asked the following question in an interview.
Suppose I try to create a new file named myfile.txt in the /home/pavan directory.
It should automatically create myfileCopy.txt in the same directory.
Likewise, A.txt automatically creates ACopy.txt,
and B.txt creates BCopy.txt in the same directory.
How can this be done using a script? As far as I know, such a script should run from crontab.
Please don't use inotify-tools.
Can you explain why you want to do this?
Tools like vim can create a backup copy of a file you're working on automatically. Other tools like Dropbox (which works on Linux, Windows, and Mac) can version files, backing up all copies of a file for the last 30 days.
You could do something by creating aliases for the tools you use to create these files: you edit a file with the tools you tend to use, and the alias creates a copy before invoking the tool.
Otherwise, your choice is to use crontab to occasionally make backups.
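Under those constraints (cron, no inotify-tools), a sketch might look like this; the directory is the one from the question, and the naming rule assumes only *.txt files matter:

```shell
#!/bin/sh
# For every X.txt in the watched directory that has no XCopy.txt twin yet,
# create one. Run this from cron, e.g. every minute.
make_copies() {
    dir=$1
    for f in "$dir"/*.txt; do
        [ -e "$f" ] || continue                  # empty dir: glob stays literal
        case $f in *Copy.txt) continue ;; esac   # never copy a copy
        copy="${f%.txt}Copy.txt"
        [ -e "$copy" ] || cp -p "$f" "$copy"
    done
}

make_copies /home/pavan    # directory from the question
```

A crontab entry such as `* * * * * /path/to/this-script.sh` (hypothetical path) would then run it every minute; files created between runs are picked up on the next pass.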
Addendum
Let me explain: suppose I have the directory /home/pavan. Now I create the file myfile.txt in that directory; immediately, the file myfileCopy.txt should be automatically generated in the same folder.
paven
There's no easy user tool that could do that. In fact, the way you stated it, it's not clear exactly what you want to do and why. Backups are done for two reasons:
To save an older version of the file in case I need to undo recent changes. In your scenario, I'm simply saving a new unchanged file.
To save a file in case of disaster. I want that file to be located elsewhere: On a different computer, maybe in a different physical location, or at least not on the same disk drive as my current file. In your case, you're making the backup in the same directory.
Tools like vim can be set to automatically back up a file you're editing. This satisfies reason #1 stated above: to get back an older revision of the file. Emacs can create an infinite series of backups.
Tools like Dropbox create a backup of your file in a different location across the aether. This satisfies reason #2, keeping the file in case of a disaster. Dropbox also versions the files you save, which also covers reason #1.
Version control tools can also do both, if I remember to commit my changes. They store all changes in my file (reason #1) and can store this on a server in a remote location (reason #2).
I was thinking of crontab, but what would I backup? Backup any file that had been modified (reason #1), but that doesn't make too much sense if I'm storing it in the same directory. All I would have are duplicate copies of files. It would make sense to backup the previous version, but how would I get a simple crontab to know this? Do you want to keep the older version of a file, or only the original copy?
The only real way to do this is at the system level with tools that layer over the disk IO calls. For example, at one location, we used Netapps to create a $HOME/.snapshot directory that contained the way your directory looked every minute for an hour, every hour for a day, and every day for a month. If someone deleted a file or messed it up, there was a good chance that the version of the file exists somewhere in the $HOME/.snapshot directory.
On my Mac, I use a combination of Time Machine (which backs up the entire drive every hour, and gives me a snapshot of my drive stretching back over a year and a half) and Dropbox, which keeps my files stored on the main Dropbox server somewhere. I've been saved many times by that combination.
I now understand that this was an interview question, though I'm not sure what the position was. Did the questioner want you to come up with a system-wide way of implementing this, as for a network-tech position, or was this one of those brain leaks that someone comes up with at the spur of the moment when they interview someone, but were too drunk the night before to go over what they should really ask the applicant?
Did they want a whole discussion of what backups are for, and why backing up a file immediately upon creation into the same directory is a non-optimal solution, or were they attempting to solve an issue that came up, but aren't technical enough to understand the real issue?