Using hard links with the Kate editor - Linux

I have a problem working with the link command on Linux Mint.
I made file1 and added a new hard link to that file:
link file1 file2
I know that when I change the contents of file1, file2 should change too,
and when I edit file1 with vim or add text to it with redirection it works fine. But when
I edit file1 with the Kate editor, it's as if the editor breaks the link to file2! After that, when
I change the contents of file1 with Kate, vim, etc., file2 never changes any more.
What's the problem?
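(For reference, two names can be confirmed to be hard links to the same file by comparing inode numbers; a minimal Python sketch, using the file names from the question:)

import os

with open("file1", "w") as f:
    f.write("hello\n")

os.link("file1", "file2")          # same effect as: link file1 file2

# Both directory entries point at the same inode...
print(os.stat("file1").st_ino == os.stat("file2").st_ino)   # True

# ...so text appended through one name is visible through the other.
with open("file1", "a") as f:
    f.write("more\n")
print(open("file2").read())        # prints both lines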

I'm one of the Kate developers. The issue is as follows: whenever Kate saves, it writes the contents to a temporary file in the same folder and, on success, simply moves that file to the desired location.
This move operation is exactly what destroys your hard link: first the hard link is removed, then the temporary file is renamed into its place.
While this avoids data loss, it has its own problems, as you are experiencing. We are tracking this bug here:
https://bugs.kde.org/show_bug.cgi?id=358457 - QSaveFile: Kate removes a hard link to file when opening a file with several hard links
And in addition, QSaveFile also has two further issues, tracked here:
https://bugs.kde.org/show_bug.cgi?id=333577 - QSaveFile: kate changes file owner
https://bugs.kde.org/show_bug.cgi?id=354405 - QSaveFile: Files are unlinked upon save
The solution would be to write directly to the file in all these corner cases; that would avoid this trouble at the expense of losing data in case of a system crash, so it's a tradeoff. To fix this, we need to patch Qt, which no one has done so far.
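To illustrate the mechanism (this is not Kate's actual code, just a minimal Python sketch of the same save-to-temp-then-rename pattern):

import os

with open("file1", "w") as f:
    f.write("v1\n")
os.link("file1", "file2")               # file2 is a hard link to file1

# Save the way QSaveFile does: write a temporary file next to the
# target, then rename it over the target (os.replace is an atomic
# rename on POSIX).
with open("file1.tmp", "w") as f:
    f.write("v2\n")
os.replace("file1.tmp", "file1")

# file1 is now a brand-new inode; file2 still refers to the old one.
print(os.stat("file1").st_ino == os.stat("file2").st_ino)   # False
print(open("file2").read())                                 # still "v1"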

Different programs save files in different ways. At least two come to mind:
1. Open the existing file and overwrite its contents.
2. Create a temporary file, write the new content there, then somehow replace the original file with the new one (remove the old file and rename the new one; or rename the old file, rename the new file, then remove the old file; or use a system function to swap the files' contents (in fact, swap the names of the files), then remove the old file; or ...).
Judging from its current source code, Kate uses the latter approach (via QSaveFile, though with a direct-write fallback). Why? Usually to make file saving more or less atomic, and also to make sure that saving won't fail partway through due to, e.g., a lack of free space (although this also means that a save temporarily needs space for the sum of the old and new file sizes).
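A quick way to check which strategy a given editor uses is to compare the file's inode number before and after saving; a hedged sketch (file1 is just an example name):

import os

def inode(path):
    return os.stat(path).st_ino

before = inode("file1")
input("Edit and save file1 in your editor, then press Enter... ")
after = inode("file1")

# Same inode: the editor overwrote in place (hard links survive).
# New inode: the editor wrote a temp file and renamed it over the target.
print("in-place overwrite" if before == after else "replace via rename")

Running ls -li file1 before and after the save shows the same information.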

I don't have Kate on Linux Mint, but I have noticed issues that lead me to suspect you may have come across a "bug".
Here are two similar issues with hard links that have been reported:
Bug 316240 - KSaveFile: Kate/kwrite makes a copy of a file while editing a hard link
"Hard link not supported" with NTFS on Linux #3241


Check if file has been changed and which line in file

I am looking for a solution that helps me track any changes made to files. I am working on a Linux system where a lot of people have access to the same files. Sometimes it happens that someone changes something in a file and doesn't notify the other users. So I would like to write a script that checks whether a given file path (or files) has been changed, and if so writes something like "File changed %date, line XXX" to a file, say "controlfile_File1.txt". I know that I can use an md5 checksum for that, but that only tells me whether the file changed; I would like to know which line changed. I am also thinking about a solution that copies the file to some place and diffs the copied file against the current one.
Any ideas?
Thanks for your support.
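The copy-and-diff idea from the question can be sketched in a few lines of Python (the file names here are placeholders, and this assumes the script is run periodically, e.g. from cron):

import difflib
import os
import shutil
import time

TRACKED = "File1.txt"              # file to watch (placeholder name)
SNAPSHOT = ".File1.txt.orig"       # private copy from the previous run
LOG = "controlfile_File1.txt"

if not os.path.exists(SNAPSHOT):   # first run: just take the copy
    shutil.copyfile(TRACKED, SNAPSHOT)

old = open(SNAPSHOT).readlines()
new = open(TRACKED).readlines()

with open(LOG, "a") as log:
    # A unified diff with zero context lines; the "@@" hunk headers
    # carry the old/new line numbers of every change.
    for line in difflib.unified_diff(old, new, n=0, lineterm=""):
        if line.startswith("@@"):
            log.write("File changed %s, lines %s\n"
                      % (time.strftime("%Y-%m-%d %H:%M"), line))

shutil.copyfile(TRACKED, SNAPSHOT)  # refresh the snapshot for next time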
Your question goes beyond the possibilities of Linux as a platform: Linux can show you the last modification date of a file, and the last time a file was accessed (even without modifying it), but that's it.
What you are looking for, as already mentioned in the comments, is a version control system. As mentioned, Git is one of them, but there are also others (SourceTree, SourceSafe, ClearCase, ...), each with their own (dis)advantages.
One thing they all have in common is that modifying such a file no longer goes quite that simply: every time somebody modifies a file under the version control system, (s)he will be asked why, and this will also be recorded for later reference.

How do you make a specific folder that has subfolders uneditable in Sublime Text

How do you make a specific folder that has subfolders uneditable in Sublime Text? I know it's possible, but how? I have some old projects that I want to use as a reference to study my old code, but I'm worried that I might mistakenly edit some part of a specific module or file while I'm mindlessly touring around my code. How do I do it? I mean making a specific folder uneditable in Sublime Text, so that only deliberately removing that protection allows changing it. I tried installing this package: https://packagecontrol.io/packages/Toggle%20Read-Only
But it still gives me a prompt whenever I want to change something in a file.
Your best bet is to make sure that your source code is covered by some sort of source control, such as git or Subversion; this is just always a good idea in general and unrelated to your particular question. Having your code under source control means that when you edit a file (accidentally or on purpose) you can see exactly what you edited, or throw those edits away and go back to what you had if you want to. If you also push your code to an external server, such as GitHub (if you use git), then you also have a cheap and easy off-site backup of your code as well.
That said, if you want to make files uneditable, that's more the job of your file system than the tools that you're using to edit the files (in this case Sublime Text).
All file systems and operating systems should have the concept of a read-only file, which sounds like what you want. A file being marked as read-only stops you from accidentally modifying it; depending on the software that you use, edits are either impossible or will need to be confirmed.
In your case, you can do this in a couple of different ways. If you only want to protect a couple of files, right-click each one and choose Properties; at the bottom of the General tab there is a check box you can tick to make that file read-only.
If you have many files, you can do the same thing by editing the properties of the folder that contains the code instead.
When you do this to a folder, the property works a little differently; since you're modifying the folder as a whole, you need to click the box twice to change it from a square to a check mark. When you apply the change, you will be asked whether you want to make only the files inside that directory read-only, or all files in that folder as well as in all folders under it; choose as appropriate.
Sublime will let you open read-only files and will also let you modify their contents, but when you try to save you will get a warning dialog telling you that the file is write-protected; you need to confirm whether you actually want to save your changes (other software may not prompt with such a dialog and may just fail).
If you choose to save, the read-only attribute is removed and the file becomes normal again.
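If you'd rather script it than click through dialogs, the same read-only flag can be set recursively; a minimal Python sketch (the folder name is an assumption):

import os
import stat

ROOT = "old_projects"    # hypothetical folder you want to protect

for dirpath, dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        mode = os.stat(path).st_mode
        # Clear all write bits; on Windows this sets the read-only flag.
        os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))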
If you want to make a file completely un-editable so that you can't even accidentally change the file, you can achieve that with a simple plugin in combination with making the file read-only (see this video if you're not sure how to apply a plugin):
import sublime
import sublime_plugin

import os

# Mark any file that exists on disk but is not writable as read-only
# in Sublime, so the buffer cannot be modified at all.
class ReadOnlyListener(sublime_plugin.EventListener):
    def on_load(self, view):
        if (os.path.exists(view.file_name())
                and not os.access(view.file_name(), os.W_OK)):
            view.set_read_only(True)
EDIT: The internal View Package Files command will open package resources from sublime-package files transparently and give them a filename that represents where they would exist on disk if they were not in the package file.
The plugin from the original answer would stop you from being able to use this command by noting that the file is not writable (because it does not exist on disk) and make the view read-only, which stops the file content from being displayed because the view can't be modified.
This is rectified in the edit above by only taking action if the file actually exists on disk (the View Package File command already makes files it loads in this manner read-only if they do not exist on disk).
This makes an event listener that checks every time you open a file to see if the file is writable, and if it's not it makes the view read-only. This distinction is Sublime specific; regardless of the underlying state of the file, you just can't make any changes to such a file at all. That doesn't stop you from saving the file even if you haven't made any changes, which would have the same effect as the above.

Why is a file accessible after deleting it in Unix?

I was thinking about a concurrency issue (on Solaris): what happens if, while a file is being read, someone tries to delete it? I have a query regarding file existence on Solaris/Linux. Suppose I have a file test.txt; I opened it in the vi editor, then opened a duplicate session and removed that file, but even after deleting it I am still able to read it. So here are my questions:
Do I need to think about a locking mechanism while reading, so that no one is able to delete the file while it is being read?
What is the reason for this behavior differing from Windows (where, if a file is open in some editor, we cannot delete it)?
After removing the file, how am I still able to read it, given that I haven't closed it in the vi editor?
I am asking about files in general, but yes, platform-specific, i.e. Unix. What happens if I am using a Java program (BufferedReader) to read a file and the file is deleted while reading: can the reader still read the next chunk of the file or not?
You have basically 2 or 3 unrelated questions there. Text editors like to read the whole file into memory at the start of the editing session. Imagine every character you type being saved to disk immediately, with all characters after it in the file being rewritten one place further along to make room. That would be awful. Much better that the thing you're actually editing is a memory representation of the file (array of pointers to lines, probably with some metadata attached) which only gets converted back into a linear stream when you explicitly save.
Any relatively recent version of vim will notify you if the file you are editing is deleted from its original location with the message
E211: File "filename" no longer available
This warning is not just for unix. gvim on Windows will give it to you if you delete the file being edited. It serves as a reminder that you need to save the version you're working on before you exit, if you don't want the file to be gone.
(Note: the warning doesn't appear instantly - vim only checks for the original file's existence when you bring it back into the foreground after having switched away from it.)
So that's question 1, the behavior of text editors - there's no reason for them to keep the file open for the whole session because they aren't actually using it except at startup and during a save operation.
Question 2, why do some Windows editors keep the file open and locked - I don't know, Windows people are nuts.
Question 3, the one that's actually about unix, why do open files stay accessible after they're deleted - this is the most interesting one. The answer, guaranteed to shock you when presented directly:
There is no command, function, syscall, or any other method which actually requests deletion of a file.
Underlying rm and any other command that may appear to delete a file there is the system call unlink. And it's called unlink, not remove or deletefile or anything similar, because it doesn't remove a file. It removes a link (a.k.a. directory entry) which is an association between a file and a name in a directory. (Note: ANSI C added remove as a more generic function to appease non-unix people who had no intention of implementing unix filesystem semantics, but on unix, remove is just a rmdir if the target is a directory, and unlink for everything else.)
A file can have multiple links (see the ln command for how they are created), which means that the same file is known by multiple names. If you rm one of them, the others stick around and the file is not deleted. What happens when you remove the last link? Well, now you have a file with no name. But names are only one kind of reference to a file. There are at least 2 others: file descriptors and mmap regions. When the last reference to a file goes away, that's when the file is deleted.
Since references come in several forms, there are many kinds of events that can cause a file to be deleted. Here are some examples:
- unlink (rm, etc.)
- close of a file descriptor
  - dup2 (implicitly closes a file descriptor before replacing it with a copy of a different file descriptor)
  - exec (can cause file descriptors to be closed via the close-on-exec flag)
- munmap (unmap a memory region)
  - mmap (if you create a new memory map at an address that's already mapped, the old mapping is unmapped)
- process death (which closes all file descriptors and unmaps all memory mappings of the process)
  - normal exit
  - fatal signal generated by the kernel (^C, segfault)
  - fatal signal sent from another process (kill)
I won't call that a complete list. And I don't encourage anyone to try to build a complete list. Just know that rm is "remove name", not "remove file", and files go away as soon as they're not in use.
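You can watch this happen from any scripting language; a short Python sketch, reusing the question's test.txt as the example name:

import os

with open("test.txt", "w") as f:
    f.write("still here\n")

f = open("test.txt")       # holds an open file descriptor
os.unlink("test.txt")      # removes the *name*; the inode survives
                           # because our descriptor still references it

print(os.path.exists("test.txt"))   # False: the name is gone
print(f.read())                     # "still here": the data is not
f.close()                           # last reference gone; only now does
                                    # the kernel actually free the file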
If you want to destroy the contents of a file immediately, truncate it. All processes already using it will find that its size has suddenly become 0. (This is destruction as far as the normal file access methods are concerned. To destroy it more thoroughly so that even someone with raw disk access can't read what used to be there, you need to overwrite it. There's a tool called shred for that.)
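Truncation, for completeness, is a single call (again just a sketch, using the same hypothetical test.txt):

import os

# Every process with an open descriptor on test.txt now sees size 0;
# the name and the inode survive, but the data is gone.
os.truncate("test.txt", 0)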
I think your question has nothing to do with the difference between Windows and Linux; it's about how vi works.
When you use vi to edit a file, vi creates a .swp file, and that swap file is what you are actually editing. So if another user deletes the original file at the same time, it will not affect your editing.
When you type :w in vi, vi uses the .swp file to overwrite the original file.

Append text file with custom footer

Good day,
I am a CNC programmer, not a computer programmer. I am using CAM software to make cutting programs for our CNC router. The router is a bit old and can only take files 200-300 KB big, but we are doing carvings that require 1-2 MB text files. I am using a program called GSplit (http://www.gdgsoft.com/gsplit/) to divvy up the text file. It generates 10-25+ files with a custom header that our machine can read. The files all work fine, but I have to manually add the closing lines/footer to each file. The files that are created and used are normal .txt files, just with a specific extension, .ANC.
Is there any way to automate this process of opening each individual file, scrolling to the end, and pasting in the same 1-2 lines of code? The files are NAME[number].ANC, all in one folder. Would it be possible to just point at a folder and say "add this text to every file in this folder"?
Thanks for your time.
What OS are you using? On Unix you can do this with a simple script on the command line. If you are in the directory with the files, simply execute:
for file in *; do echo "APPEND THIS" >> "$file"; done
If you are running Windows, you should be able to do the same using Cygwin (you could probably also use PowerShell, but I don't know anything about that).
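If a small script is an option, the same append can be done in Python; the folder path and footer lines below are placeholders for your own:

import glob
import os

FOLDER = r"C:\cnc\split_output"     # hypothetical folder with the parts
FOOTER = "M30\n%\n"                 # placeholder closing lines

for path in glob.glob(os.path.join(FOLDER, "*.ANC")):
    with open(path, "a") as f:      # mode "a" appends at end of file
        f.write(FOOTER)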
I found the program Notepad++ (apparently I'm the last person to find it...). I used its Find/Replace in Files option with the regular expression "\s+\z" (not sure exactly what these are, but I'm sure you guys do) as the thing to look for. It finds the trailing whitespace at the end of each file and then adds the code I need. Easy, free, and I don't need to write any computer code. Thanks for the attempt to help me, Dirkk! :)

Shell script to create a backup file when creating a new file in a particular directory

Recently I was asked the following question in an interview.
Suppose I try to create a new file named myfile.txt in the /home/pavan directory.
It should automatically create myfileCopy.txt in the same directory.
If I create A.txt, it automatically creates ACopy.txt;
if B.txt, then BCopy.txt, in the same directory.
How can this be done using a script? I gather that this script should run from crontab.
Please don't use inotify-tools.
Can you explain why you want to do this?
Tools like Vim can automatically create a backup copy of a file you're working on. Other tools like Dropbox (which works on Linux, Windows, and Mac) can version files, backing up all the copies of the file for the last 30 days.
You could do something by creating aliases for the tools you use to create these files: you edit a file with the tools you tend to use, and the alias creates a copy before invoking the tool.
Otherwise, your choice is to use crontab to make backups periodically.
Addendum
Let me explain: suppose I have the directory /home/pavan. Now I create the file myfile.txt in that directory; immediately, a myfileCopy.txt file should automatically be generated in the same folder.
pavan
There's no easy user tool that could do that. In fact, the way you stated it, it's not clear exactly what you want to do and why. Backups are done for two reasons:
To save an older version of the file in case I need to undo recent changes. In your scenario, you're simply saving a copy of a new, unchanged file.
To save a file in case of disaster. I want that file to be located elsewhere: On a different computer, maybe in a different physical location, or at least not on the same disk drive as my current file. In your case, you're making the backup in the same directory.
Tools like Vim can be set to automatically back up a file you're editing. This satisfies reason #1 stated above: getting back an older revision of the file. Emacs can create an infinite series of backups.
Tools like Dropbox create a backup of your file in a different location across the aether. This satisfies reason #2, which is keeping the file in case of a disaster. Dropbox also versions the files you save, which also covers reason #1.
Version control tools can also do both, if I remember to commit my changes. They store all changes in my file (reason #1) and can store this on a server in a remote location (reason #2).
I was thinking of crontab, but what would I back up? Backing up any file that had been modified (reason #1) doesn't make much sense if I'm storing the backup in the same directory; all I would have are duplicate copies of files. It would make sense to back up the previous version, but how would a simple crontab job know which version that is? Do you want to keep the older version of a file, or only the original copy?
The only real way to do this is at the system level with tools that layer over the disk I/O calls. For example, at one location we used NetApp filers to create a $HOME/.snapshot directory that contained the way your directory looked every minute for an hour, every hour for a day, and every day for a month. If someone deleted a file or messed it up, there was a good chance that a version of the file existed somewhere in the $HOME/.snapshot directory.
On my Mac, I use a combination of Time Machine (which backs up the entire drive every hour and gives me a snapshot of my drive stretching back over a year and a half) and Dropbox, which keeps my files stored on the main Dropbox server somewhere. I've been saved many times by that combination.
I now understand that this was an interview question. I'm not sure what the position was. Did the questioner want you to come up with a system-wide way of implementing this, as for a network tech position, or was this one of those brain leaks that someone comes up with at the spur of the moment when interviewing someone, having been too drunk the night before to go over what they should really ask the applicant?
Did they want a whole discussion of what backups are for, and why backing up a file immediately upon creation in the same directory is a non-optimal solution, or were they attempting to solve an issue that came up, but aren't technical enough to understand the real problem?
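For what it's worth, since inotify-tools is ruled out, here is a hedged sketch of the cron-driven polling approach (the directory and naming scheme come from the question; everything else is an assumption):

import os
import shutil

WATCH_DIR = "/home/pavan"

for name in os.listdir(WATCH_DIR):
    base, ext = os.path.splitext(name)
    if ext != ".txt" or base.endswith("Copy"):
        continue                    # skip non-.txt files and the copies
    src = os.path.join(WATCH_DIR, name)
    dst = os.path.join(WATCH_DIR, base + "Copy" + ext)
    if not os.path.exists(dst):     # only files without a copy yet
        shutil.copyfile(src, dst)

Run every minute from crontab (* * * * * /usr/bin/python3 /home/pavan/mkcopies.py, where the script name is hypothetical), this approximates "immediately" to within a minute.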
