Linux - copying files like on Windows (ignoring case in filenames)

How can I copy files to ignore their case?
For example, I want to copy the file test.txt into a target directory that already contains Test.txt (same name, different case). I don't want it copied as a new file; I want it to replace the existing one.
I want the same copy behavior as on Windows: Windows filesystems are case-insensitive, so files and directories with the same name but different case are overwritten.
I would like to streamline installing mods for games run under Wine/Proton. At the moment I have to use Windows applications like Total Commander to get everything copied properly.
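On a normal case-sensitive Linux filesystem there is no copy flag for this, but the Windows-style overwrite can be scripted by looking up the existing name case-insensitively before copying. A minimal Python sketch (the function name `copy_case_insensitive` is mine; it handles a single file, not recursive directory trees):

```python
import shutil
from pathlib import Path

def copy_case_insensitive(src, dst_dir):
    """Copy src into dst_dir, overwriting any existing file whose name
    matches case-insensitively (Windows-style copy semantics)."""
    src = Path(src)
    dst_dir = Path(dst_dir)
    target = dst_dir / src.name  # default: no case-variant present
    for existing in dst_dir.iterdir():
        if existing.is_file() and existing.name.lower() == src.name.lower():
            target = existing  # reuse the existing file's casing
            break
    shutil.copy2(src, target)
    return target
```

For mod installs you would need the same lookup on each directory component as well, since `Dir/` and `dir/` are also distinct names on Linux.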

Related

Extract from single zip archive into multiple destination folders using WinRar

Is there an efficient method, preferably using WinRar on Windows 10, to extract selected files and folders from a single zip archive into multiple destination folders simultaneously, instead of one operation at a time?
I was sure the 'extract one to many' concept would exist in the software, but nothing is mentioned in the Help facility on how to select multiple destination folders.
Thank you.
I normally open the archive (using WinRar on Windows 10 Home), select all the file and folder content within the parent folder but not the parent itself (using Select All), and then use 'Extract to' to navigate to and select the destination folder.
Once the archive has been extracted and any existing files and folders overwritten, I repeat the process manually, about 10-15 times, extracting the same selection into several more destination folders until everything is done.
I would like to be able to select the multiple destination folders in one go, making multiple copies simultaneously.

Copying a file, but appending index if file exists

I have several directories with filenames being the same, but their data inside is different.
My program identifies these files (among many others) and I would like to copy all the matches to the same directory.
I am using shutil.copy(src, dst), but I don't want to overwrite files that already exist in that directory (previous matches) if they have the same name. I'd like to append an integer if the name is already taken, similar to the "keep both versions" behavior when you copy in Windows 10.
So for example, if I have file.txt in several places, the first time it would copy into dst directory it would be file.txt, the next time it would be file-1.txt (or something similar), and the next time it would be file-2.txt.
Are there any flags for shutil.copy or some other copy mechanism in Python that I could use to accomplish this?
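shutil.copy itself has no flag for this, but the keep-both behavior is a short wrapper. A sketch (the name `copy_keep_both` is mine; note it is not race-safe if several processes write into dst at once):

```python
import shutil
from pathlib import Path

def copy_keep_both(src, dst_dir):
    """Copy src into dst_dir; if the name is taken, append -1, -2, ...
    before the extension instead of overwriting."""
    src = Path(src)
    dst = Path(dst_dir) / src.name
    i = 1
    while dst.exists():  # probe file.txt, file-1.txt, file-2.txt, ...
        dst = Path(dst_dir) / f"{src.stem}-{i}{src.suffix}"
        i += 1
    shutil.copy(src, dst)
    return dst
```

Calling it three times with the same source yields file.txt, file-1.txt, and file-2.txt in the destination.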

Remote SSH copy files with filenames that contain certain strings

I have two remote servers: the one I am currently connected to, and one I am trying to copy a lot of files to (10.10.0.13).
I have a series of files I need to copy within various directories of the format:
/opt/DR/output/1/a/csva1file.csv
/opt/DR/output/1/a/csva2file.csv
/opt/DR/output/1/b/csvb1file.csv
/opt/DR/output/1/b/csvb2file.csv
/opt/DR/output/1/b/csvb3file.csv
/opt/DR/output/1/b/csvb4file.csv
/opt/DR/output/1/c/csvc1file.csv
...
/opt/DR/output/30/a/csva1file.csv
And this continues for output/1 to output/40 folders. All the folders inside are identical and all the filenames inside will all contain similar strings, just with slight differences depending on the folder they are in.
I want to copy all the files that contain "a1" from any directory to a folder in a remote server:
root@10.10.0.13:/data/landing/a/a1/
Similarly, I want to do this for all b1, c1, c2 etc. files and copy them to their respective places on the remote server.
I cannot seem to find a way to do this that doesn't involve writing multiple lines of code.
I have tried
cd /opt/DR/output/1/a/
scp -r -v *a1* root@10.10.0.13:/data/landing/a/a1/
which works, but I want to copy ALL the a1 csv files rather than doing them one directory at a time.
I have looked into globbing but don't think it can be used for my case. I have also looked at using paramiko/glob for python but I couldn't get that to work either.
Ideally, I would like to do this with a bash shell script, but a python script would also work.
Hope this makes sense. Any help would be greatly appreciated. I can copy via SFTP or SCP.
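Globbing can in fact handle this: a `*` in the directory position of the pattern expands across every output/<n> folder, so one transfer per token is enough. A Python sketch of that idea (the helper names and the dry-run flag are mine; the host and paths are copied from the question, and scp must be able to authenticate non-interactively, e.g. via keys):

```python
import glob
import subprocess

def matching_files(token, subdir, root="/opt/DR/output"):
    """All CSVs containing `token` under every <root>/<n>/<subdir> folder."""
    return sorted(glob.glob(f"{root}/*/{subdir}/*{token}*.csv"))

def scp_all(token, subdir, host="root@10.10.0.13",
            root="/opt/DR/output", dry_run=False):
    """Copy every matching file to /data/landing/<subdir>/<token>/ on host."""
    files = matching_files(token, subdir, root)
    cmd = ["scp"] + files + [f"{host}:/data/landing/{subdir}/{token}/"]
    if files and not dry_run:
        subprocess.run(cmd, check=True)
    return cmd

# e.g. scp_all("a1", "a"); scp_all("b1", "b"); scp_all("c2", "c"); ...
```

The same thing works directly in the shell, one line per token, since the shell expands the glob before scp runs: `scp /opt/DR/output/*/a/*a1*.csv root@10.10.0.13:/data/landing/a/a1/`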

Is it possible to delete the "C:\cygwin64\usr\share\" directory to decrease the Cygwin installation size?

I have the Cygwin package library installed on my system (Win7 x64) at C:\Cygwin64\.
That directory contains over 185,000 files, and its size passed 5 GB this week, and that's without counting the package source directory.
Now I want to decrease that size, and of course I'm going to uninstall some packages I no longer need. But first I want to ask whether I can delete a specific directory: C:\cygwin64\usr\share
(Please forgive my ignorance if my question is silly.)
While trying to figure out the cause of that large file count, I noticed that this directory alone has more than 90,000 files!
I don't know what that directory is used for, but could someone tell me whether I can delete it safely without affecting the installed packages? Thanks :)
I cannot speak for the entirety of the folder, but awk uses it for include files, which I would miss, for example in:
delete a column with awk or sed
awk - how to delete first column with field separator
how to remove the first two columns in a file using shell (awk, sed, whatever)

How to create a copy of a directory on Linux with links

I have a series of directories on Linux and each directory contains lots of files and data. The data in those directories are automatically generated, but multiple users will need to perform more analysis on that data and generate more files, change the structure, etc.
Since these data directories are very large, I don't want several people to make a copy of the original data so I'd like to make a copy of the directory and link to the original from the new one. However, I'd like any changes to be kept only in the new directory, and leave the original read only. I'd prefer not to link only specific files that I define because the data in these directories is so varied.
So I'm wondering if there is a way to create a copy of a directory by linking to the original but keeping any changed files in the new directory only.
It turns out this is what I wanted:
cp -al <origdir> <newdir>
It will copy an entire directory tree and create hard links to the original files. If an original file is deleted, the copy still exists, and vice versa. One caveat: newdir must not already exist. Also, because hard links share their content, modifying a linked file in place would change it in both trees; as long as the original files are kept read-only (so tools have to replace files rather than edit them in place), the copy is safe.
However, since you are looking for a way that people can write back changes, UnionFS is probably what you are looking for. It provides means to combine read-only and read-write locations into one.
Unionfs allows any mix of read-only and read-write branches, as well as insertion and deletion of branches anywhere in the fan-out.
Originally I was going to recommend this (I use it a lot):
Assuming permissions aren't an issue (e.g. only reading is required), I would suggest bind-mounting them into place.
mount -B <original> <new-location>
# or
mount --bind <original> <new-location>
<new-location> must exist as a folder.
