I have multiple directories named by date (e.g. 2017-09-05), and inside those directories are multiple log.gz files from BRO IDS. I am trying to enter each directory, pick out only specific log.gz files by name, and send those to a remote system using rsync. A log file looks like this: app_stats.00:00:00-01:00:00.log.gz. I am attempting to use wildcards to accomplish this.
Ex: rsync -avh --ignore-existing -e ssh -r /home/data/logs/2017-09-* {dns,http}.*.log.gz root@10.23.xx.xx:/home/pnlogs/
This is close, but it's just copying all the files in each folder and ignoring my attempt to grab only the http and dns logs, as seen in the example. Is this possible to do in one line, or would I need multiple?
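One way this might work in a single command is to let rsync do the filtering with --include/--exclude rules instead of shell globs. This is only a sketch against the layout described above (same host and paths as in the question), not a tested solution:
rsync -avh --ignore-existing -e ssh \
    --include='2017-09-*/' --include='dns.*.log.gz' --include='http.*.log.gz' \
    --exclude='*' \
    /home/data/logs/ root@10.23.xx.xx:/home/pnlogs/
The include rules let rsync descend into the date directories and pick up only dns.*.log.gz and http.*.log.gz; the final --exclude='*' drops everything else.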
I am trying to add files from within certain directories on a Linux machine to a Sharefile using lftp and the mirror command. There are many instances of the files, sitting in different directories on the Linux machine with different codes in their names, which need to be added to the corresponding Sharefile directories.
Ideally I don't want to have to find the full directory to copy to for each set of files every time I run the mirror command, so I would like to know if there is a way of essentially doing the below, which would find the relevant unique directory in the Sharefile that matches what I have on the Linux machine, without needing to work out the full directory path?
mirror -R /data/path/to/files/A1/B1/ /Sharefile/Constant/Folders/**/**/A1/B1/
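I don't know of a way to make mirror expand ** in the target path itself, but a rough workaround is to resolve the remote directory first and then mirror into it. Everything below is an assumption: the host name, the credentials, the base path, and the exact output format of lftp's find command.
# list the remote tree, keep the first path ending in A1/B1, then mirror into it
TARGET=$(lftp -u user,pass sharefile.example.com -e 'find /Sharefile/Constant/Folders; quit' | grep -E 'A1/B1/?$' | head -n 1)
[ -n "$TARGET" ] && lftp -u user,pass sharefile.example.com -e "mirror -R /data/path/to/files/A1/B1/ $TARGET; quit"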
I have files in multiple directories and I need to copy them into a specific folder using the terminal; how do I do it? I also have access to the GUI of the system, as all of this is being done on a virtual machine over ssh.
According to its manpage, cp is capable of copying various source files to one output directory.
The syntax is as follows:
cp /dir1/file1 /dir1/file2 /dir2/file1_2 /outputdir/
Using this command, you can copy files from multiple directories (/dir1/ and /dir2/ in this example) to one output directory (/outputdir/).
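If your sources are whole directories rather than individual files, the same pattern works with the -r (recursive) flag; the directory names here are just placeholders:
cp -r /dir1 /dir2 /outputdir/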
How can I delete a directory, which contains several files and sub-directories, with curl from the command line (bash, Ubuntu)?
I only know the name of the folder I want to delete.
This is a new question: I don't want to delete a single file, I want to delete a directory whose contents I don't know.
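curl itself doesn't recurse, so the answer depends on the protocol the server speaks. If it's a WebDAV server, a DELETE on a collection removes the folder and everything inside it; the URL and credentials below are placeholders, so treat this as a sketch:
curl -u user:password -X DELETE "https://example.com/dav/folder-to-delete/"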
How can a user download many text files at once from a remote host using scp from the terminal with wildcards, and use the result of the wildcard to save each file into the local directory of the same name (let's assume it already exists)? The remote directories also contain other files with different names. For instance:
4 files on the remote host:
[Remote-host]:1Dir/File.txt -> [Local-host]:1Dir/File.txt
[Remote-host]:2Dir/File.txt -> [Local-host]:2Dir/File.txt
[Remote-host]:3Dir/File.txt -> [Local-host]:3Dir/File.txt
[Remote-host]:4Dir/File.txt -> [Local-host]:4Dir/File.txt
I have tried the following, to no avail. Please assist.
scp [remote-host]:'*Dir/File.txt' '*Dir/'
Try the following to retrieve your files:
scp user@host:~"/*Dir/*.txt" .
Or you can try:
scp user@host:"~/*Dir/*.txt" .
It really depends on how your user account is mapped in your environment.
Thanks @thatotherguy for the great answer.
For anyone else that's interested, the following rsync command works:
rsync -a --include '*Dir/' --include 'File.txt' --exclude '*' [Remote-host]: '*Dir'
This means: include all directories matching '*Dir/' and all files named 'File.txt', and exclude everything else. Note that this creates a new local directory called *Dir in which all of 1Dir, 2Dir, 3Dir, etc. are contained.
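If you would rather skip that extra *Dir wrapper and have 1Dir, 2Dir, etc. created directly in the current directory, pointing the destination at . should do it (same filters, untested sketch):
rsync -a --include '*Dir/' --include 'File.txt' --exclude '*' [Remote-host]: .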
Alright so here's the situation.
I wanted to set up a high-availability, load-balanced cluster for my 2 Linux servers, but I realised that was a little bit out of my reach, so I decided to do something similar to save myself some work.
So the plan is: I have a server hosting web, FTP and mail, called mars.
I have copied the whole server to a new one with similar specs, called higgs.
Now I'm going to change all the configuration files to use the proper IP and hostname, and I will change my MX entries on GoDaddy to use higgs instead of mars. The idea is that mars still does web but will be able to do mail if higgs goes down, and vice versa. So I want daily cron jobs that rsync my web files, FTP files and mail files, and also the password files if possible, while excluding .conf files. Is there a way this is possible?
Rsync is ridiculously powerful. If you just need to exclude *.conf files, you can call rsync with the --exclude flag. Note that rsync can't copy between two remote hosts in a single call, so run it on one of the servers, e.g. on mars, pushing its home directory to higgs:
rsync -a ~/ higgs: --exclude='*.conf'
I prefer to keep my excludes in a separate file under version control, then reference that from the rsync command. Your exclude file (rsync_exclude.txt) might look like this:
*.conf
/home/fake_dir_to_exclude
Then call rsync like this:
rsync -a ~/ higgs: --exclude-from=rsync_exclude.txt
One other helpful thing for debugging rsync is the -n, --dry-run flag combined with the -v, --verbose flag. When called with these, rsync will print the file list without transferring anything.
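For example, still running on mars and pushing to higgs as above:
rsync -avn ~/ higgs: --exclude-from=rsync_exclude.txt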