Copying multiple files knowing their directories - Linux

I have the paths of multiple files and I need to copy them into a specific folder using the terminal. How do I do it? I also have access to the GUI of the system, as all of this is being done in a virtual machine over SSH.

According to its man page, cp can copy multiple source files to a single destination directory.
The syntax is as follows:
cp /dir1/file1 /dir1/file2 /dir2/file1_2 /outputdir/
Using this command, you can copy files from multiple directories (/dir1/ and /dir2/ in this example) to one output directory (/outputdir/).
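If the source paths are already collected in a list, they can be fed to cp without retyping each one. A minimal sketch, assuming GNU xargs and cp, and a hypothetical file paths.txt with one path per line:
# Read one path per line and copy everything into /outputdir/.
# -d '\n' keeps paths containing spaces intact; cp -t names the target directory.
xargs -d '\n' -a paths.txt cp -t /outputdir/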

Related

lftp mirror command including * wildcards to find correct corresponding Sharefile directory

I am trying to add files from certain directories on a Linux machine to a Sharefile using lftp and the mirror command. There are many instances of the files, in different directories on the Linux machine and with different codes in their names, which need to be added to the corresponding Sharefile directories.
Ideally I don't want to have to look up the full destination directory for each set of files every time I run the mirror command. Is there a way of essentially doing the below, so that it finds the matching unique directory in the Sharefile without my needing the full directory path?
mirror -R /data/path/to/files/A1/B1/ /Sharefile/Constant/Folders/**/**/A1/B1/
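One untested approach is to let lftp expand the glob on the remote side first, then pass the resolved path to mirror. In this sketch, sftp://user@sharefile.example is a made-up placeholder URL and the match is assumed to be unique:
#!/bin/bash
# Untested sketch; sftp://user@sharefile.example is a placeholder.
src=/data/path/to/files/A1/B1
# cls -1 -d prints matching remote directories one per line without
# descending into them; take the first (assumed unique) match.
dst=$(lftp -e 'cls -1 -d /Sharefile/Constant/Folders/*/*/A1/B1; exit' \
      sftp://user@sharefile.example | head -n 1)
lftp -e "mirror -R $src $dst; exit" sftp://user@sharefile.example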

How do I copy one directory's files (hidden files included) into another directory?

I'm trying to copy the files from one directory into another directory.
I have Desktop/projectOne and Desktop/projectTwo, and I'm trying to copy the projectOne files into projectTwo. I need to use the terminal for this because I need to copy hidden files too, and I'm not familiar with Linux commands...
So my question is...
What commands do I have to use to copy all files (hidden files included) from Desktop/projectOne to Desktop/projectTwo?
What commands do I have to use to copy only hidden files from Desktop/projectOne to Desktop/projectTwo?
Thanks in advance.
cp -r
Example: cp -r /oldfolder /home/newfolder
Note: if newfolder already exists, the old folder is created inside it:
/home/newfolder/oldfolder
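For the two specific cases in the question, a short sketch using the Desktop paths from the post:
# Copy everything, hidden files included: the trailing /. copies the
# directory's contents (dotfiles too) rather than the directory itself.
cp -r Desktop/projectOne/. Desktop/projectTwo/
# Copy only the hidden entries: .[!.]* matches dotfiles but skips the
# special entries "." and ".." (names starting with ".." would be missed).
cp -r Desktop/projectOne/.[!.]* Desktop/projectTwo/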

rsync multiple files from multiple directories in Linux

I have multiple directories named by date (ex: 2017-09-05) and inside those directories multiple log.gz files from BRO IDS. I am trying to enter each directory, get only specific log.gz files by name, and send those to a remote system using rsync. A log file looks like this: app_stats.00:00:00-01:00:00.log.gz. I am attempting to use wildcards to accomplish this.
Ex: rsync -avh --ignore-existing -e ssh -r /home/data/logs/2017-09-*/{dns,http}.*.log.gz root@10.23.xx.xx:/home/pnlogs/
This is close, but it's just copying all the files in each folder and ignoring my attempt at getting only the http and dns logs, as seen in the example. Is this possible to do in one line? Would I need multiple?
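One way to select only the dns and http logs in a single rsync invocation is with include/exclude filter rules. A sketch reusing the paths from the question (the host is left as in the post):
# Include the date directories and only the dns/http logs inside them;
# the final --exclude drops everything else.
rsync -avh --ignore-existing -e ssh \
  --include='2017-09-*/' \
  --include='dns.*.log.gz' \
  --include='http.*.log.gz' \
  --exclude='*' \
  /home/data/logs/ root@10.23.xx.xx:/home/pnlogs/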

How to archive files and subfolders from one location to another in Linux

I am trying to create a shell script that copies folders, and the files within them, from one Linux machine to another. After copying I would like to delete only the files that were copied. I want to retain the folder structure as is.
E.g.
Machine X has a main folder named F with subfolders A, B and C, each of which contains 10 files.
I would like to make a copy in such a way that machine Y ends up with a folder named F with subfolders A, B and C containing the same files. Once the copy of all folders and files is complete, it should delete all the files in the source folder but retain the folders.
The code below is untested. Use with care and back up first.
Something like this should get you started:
#!/bin/bash
srcdir=...
# -e aborts on the first failing command, so the delete step below is
# skipped if rsync fails; -x echoes each command as it runs.
set -ex
rsync \
    --verbose \
    --recursive \
    "${srcdir}/" \
    user@host:/dstdir/
# Delete only the files, keeping the directory structure intact.
find "${srcdir}" -type f -delete
Set the srcdir variable and the remote argument to rsync to taste.
The rsync options are just from memory, so they may need tweaking. Read the documentation, especially options regarding deletion, backup, permissions and links.
(I'd rather not answer requests that show no signs of effort, but my fingers were itching, so there you go.)
scp the files, check the exit code of the scp, and then delete the files locally.
Something like scp files user@remotehost:/path/ && rm files
If scp fails, the second part of the command won't execute.
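Tied to the folder layout from the question (the host and destination path are placeholders), a sketch of that idea:
# Copy the whole tree, then delete only the files (not the folders),
# and only if the copy succeeded.
scp -r F user@remotehost:/path/ \
  && find F -type f -delete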

Not symlinking one or two files inside symlinked directory

Is there a way not to symlink one or two files within a symlinked directory in CentOS?
I've got the entire directory symlinked, but there are two CSS files for which I'd like the website to use the local copies.
In short: no.
Another way to do this would be to symlink all the files in that directory individually, except those you want a local copy of; see the sketch below.
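A sketch of that per-file approach, with hypothetical paths and file names (style.css and theme.css stand in for the two CSS files):
# Symlink every file from the original directory individually.
ln -s /srv/orig/* /srv/site/
# Replace the two symlinks with real local copies.
rm /srv/site/style.css /srv/site/theme.css
cp /srv/orig/style.css /srv/orig/theme.css /srv/site/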
Still another way to go might be using unionfs or aufs to union-mount the original directory and a directory containing the files you need local, with the directory containing local files being "on top".
Say your original directory is orig, the directory with the files that should be local is local, and the union directory is union, and you want files from both directories to be writable. Then you can union-mount them like this:
unionfs-fuse local=RW:orig=RW union
And unmount like this:
fusermount -u union
See the unionfs man page (unionfs-fuse(8), at least on Debian) for details.
