Only get folders using smbclient - linux

I'm struggling to retrieve only the subfolders from a remote Windows share with the following directory structure using smbclient. Is there a way of issuing a command to get only folders? The command I have so far is:
smbclient //$host/$share -U"$USER%$PASSWORD" -c 'cd RootFolder; prompt; recurse; mget Test*\'
RootFolder/
    Test001/
        Revisions.txt
    Test002/
        Revisions.txt
    Test003/
        Revisions.txt
    Test001=2012_12_05.log
    Test001=2012_12_06.log
    Test001=2012_12_07.log
    Test001=2012_12_08.log
    ... more log files here

You could pipe the output of your command through grep, looking for lines that end with /.
smbclient ... | egrep '/$'
Alternatively, you could mount the remote Windows file system and then use the find command, which can recursively search for directories only. This would be my recommended approach. Assuming you mount the Windows filesystem as /mnt/win_host...
find /mnt/win_host -type d
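As a quick local illustration (the /tmp paths below are made up to stand in for the mounted share), `find -type d` prints only the directories and skips the log files:

```shell
# On a real system you would first mount the share (requires root and cifs-utils),
# e.g.: mount -t cifs //host/share /mnt/win_host -o username=USER,password=PASSWORD
# Here we fake the mounted tree under /tmp instead:
mkdir -p /tmp/win_host/RootFolder/Test001 /tmp/win_host/RootFolder/Test002
touch /tmp/win_host/RootFolder/Test001=2012_12_05.log

# Only directories are printed; the .log file is skipped
find /tmp/win_host -type d
```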


I have a question about some commands in Linux

I'm learning Linux and this is my homework:
Show all files and directories in /usr and save the result in the file usr.txt.
Show all files in /usr/bin and stop when the screen is full. Hint: use a pipe and the more command.
Find all files in /etc whose file name contains the word "log". Hint: use the grep command.
1. I searched for a copy command on Google, but I only found commands that copy files, not their contents; I also tried the Y1G command, but nothing happened.
2. I have absolutely no idea how to express "the screen is full" with commands.
3. I tried find path -log /etc but got no correct results.
Show all files and directories in /usr and save the result in file usr.txt:
cd /usr && ls -l > usr.txt
Show all files in /usr/bin and stop when the screen is full:
Hint: use command pipe and more
ls -l /usr/bin | more
Find all files in /etc whose file name contains the word "log":
ls /etc | grep log
(Note that grep -ir "log" /etc would search file contents rather than file names.)
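A small self-contained illustration of the filename filter (the demo directory and file names under /tmp are invented):

```shell
# Stand-in for /etc with a couple of invented file names
mkdir -p /tmp/etc_demo
touch /tmp/etc_demo/syslog.conf /tmp/etc_demo/logrotate.conf /tmp/etc_demo/hosts

# Only names containing "log" are printed; "hosts" is filtered out
ls /tmp/etc_demo | grep log
```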

Is there a way to download files matching a pattern through SFTP in a shell script?

I'm trying to download multiple files through SFTP on a Linux server using
sftp -o IdentityFile=key <user>@<server> <<END
get -r folder
exit
END
which will download all the contents of a folder. It appears that find and grep are invalid commands inside sftp, and so are for loops.
I need to download files having a name containing a string e.g.
test_0.txt
test_1.txt
but no file.txt
Do you really need the -r switch? Are there really any subdirectories in the folder? You do not mention that.
If there are no subdirectories, you can use a simple get with a file mask:
cd folder
get *test*
Are you required to use sftp? A tool like rsync that operates over ssh has flexible include/exclude options. For example:
rsync -a <user>@<server>:folder/ folder/ \
    --include='test_*.txt' --exclude='*.txt'
This requires rsync to be installed on the remote system, but that's very common these days. If rsync isn't available, you could do something similar using tar:
ssh <user>@<server> tar -cf- folder/ | tar -xvf- --wildcards '*/test_*.txt'
This tars up all the files remotely, but then only extracts files matching your target pattern on the receiving side.
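Since the filtering happens entirely on the receiving tar, the idea can be sketched locally (the /tmp paths and file names are invented; --wildcards is a GNU tar option):

```shell
# Fake the remote folder with a few files
mkdir -p /tmp/tar_demo/folder /tmp/tar_out
touch /tmp/tar_demo/folder/test_0.txt /tmp/tar_demo/folder/test_1.txt /tmp/tar_demo/folder/file.txt

# Pack everything, but extract only the members matching the pattern
cd /tmp/tar_demo
tar -cf- folder/ | tar -xf- -C /tmp/tar_out --wildcards '*/test_*.txt'

# Only test_0.txt and test_1.txt land in /tmp/tar_out/folder
ls /tmp/tar_out/folder
```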

How to download all files from a Linux server using SCP which contain a given String

I need to download all 492 files from a Linux directory which contain a given String within the file. I can't quite manage to find a command which can accomplish this from my searching so far. Could anybody help me out?
Cheers.
Use grep -l to list the files that contain the given string, and loop over them to scp, like this:
for file in $(grep -l <some-pattern> <directory>/*); do scp "$file" <remote>; done
If you also need to include files in subdirectories of the directory, use grep -rl <some-pattern> <directory> instead.
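The selection half of this can be tried locally (the demo files are invented, and the scp call is stubbed out with echo since it needs a real remote host):

```shell
mkdir -p /tmp/grep_demo
echo "needle in here" > /tmp/grep_demo/a.txt
echo "nothing" > /tmp/grep_demo/b.txt

# -l prints only the names of files whose contents match the pattern
for file in $(grep -l needle /tmp/grep_demo/*); do
    echo "would scp $file"   # replace echo with scp user@host: in real use
done
```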

Scp bulk files from current directory to another directory

I need to transfer a bunch of files from a production host to my local machine. I'm already in the directory that I need to transfer the files from. I know the names of log files that I need to transfer to my local machine. They are log.timestamp.hostnames and these tend to be long. How can I transfer in bulk using scp ? Is there an easier way than just typing the long file names ? Can I get it out from a filename ?
Use wildcards:
scp log.* user@host:/target/directory
If you don't want to copy all the files in the current directory (which would just be ./*), you can have find match the log.timestamp.hostname names with a regular expression and pass the result to scp. Something along the lines of:
scp $(find . -regextype sed -regex ".*/log\.[a-z0-9-]*\.[a-z0-9-]*") user@remote:~/
You will probably want to adjust the regex for your exact file names.
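A local sketch of the find half (the file names are invented; -regextype sed is GNU find, and the character classes carry a * so multi-character timestamp and hostname fields match):

```shell
mkdir -p /tmp/find_demo
touch /tmp/find_demo/log.20240101.host-a /tmp/find_demo/notes.txt

# Matches log.<timestamp>.<hostname> style names only; notes.txt is skipped
find /tmp/find_demo -regextype sed -regex '.*/log\.[a-z0-9-]*\.[a-z0-9-]*'
```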
This command-line option helped solve my issue of transferring a subset of files. As AIX's find does not provide the -regextype option, I used the grep command instead to retrieve the files tab1.msg to tab9.msg:
scp $(find . -name "*" | grep "tab.\.msg") user@host:/tmp

Verify copied data between Windows & Linux shares?

I just copied a ton of data from a machine running Windows 7 Ultimate to a server running Ubuntu Server LTS 10.04. I used the robocopy utility via PowerShell to accomplish this task, but I couldn't find any information online about whether Robocopy verifies a copied file's integrity once it reaches the server.
First of all, does anyone know if this is done inherently? There is no switch that explicitly adds verification to a file transfer.
Second, if it doesn't, or there is uncertainty about whether it does, what would be the simplest method to accomplish this for multiple directories with several files/sub-directories?
Thanks!
The easiest mechanism I know relies on md5sum and a Unix-like find utility on your Windows machine.
You can generate a manifest file of filenames / md5sums:
find /source -type f -exec md5sum {} \; > MD5SUM
Copy the MD5SUM file to your Linux machine, and then run:
cd /path/to/destination
md5sum --quiet -c MD5SUM
You'll get a list of files that fail:
$ md5sum --quiet -c /tmp/MD5SUM
/etc/aliases: FAILED
md5sum: WARNING: 1 of 341 computed checksums did NOT match
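The whole workflow can be sketched locally (the /tmp paths below are invented stand-ins for the Windows source and the Linux destination):

```shell
# Stand-in source and destination trees
mkdir -p /tmp/md5_demo/src /tmp/md5_demo/dst
echo "some data" > /tmp/md5_demo/src/a.txt

# Build the manifest on the source side
cd /tmp/md5_demo/src
find . -type f -exec md5sum {} \; > /tmp/md5_demo/MD5SUM

# "Copy" the data, then verify it on the destination side
cp a.txt /tmp/md5_demo/dst/
cd /tmp/md5_demo/dst
md5sum -c /tmp/md5_demo/MD5SUM
```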
A much easier method is to use the Unix utilities diff and rsync.
With diff you can compare two files, but you can also compare two directories. With diff I would recommend this command:
diff -r source/directory/ destination/directory
-r forces diff to analyse the directories recursively.
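A minimal local example (the directories are made up): diff stays silent when the trees match and reports anything extra or different.

```shell
mkdir -p /tmp/cmp_a /tmp/cmp_b
echo same > /tmp/cmp_a/f.txt
echo same > /tmp/cmp_b/f.txt
echo extra > /tmp/cmp_a/only_in_a.txt

# diff exits nonzero when the trees differ, so don't let that abort a script
diff -r /tmp/cmp_a /tmp/cmp_b || true
```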
The second option is to use rsync, which is meant to sync files or directories, but with the -n option you can also use it to analyse the differences between directories. rsync also works when the files are not on the same host: one side can be a remote host, accessed over ssh. Plus, rsync is really flexible, with many options available.
rsync -avn --itemize-changes --progress --stats source/directory/ destination/directory/
-n makes rsync do a "dry run", meaning it makes no changes on the destination
-a turns on recursive mode and preserves symbolic links, file permissions, file timestamps, and file owner and group
-v increases verbosity
--itemize-changes outputs a change summary for all updates
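A local dry-run sketch, assuming rsync is installed (the /tmp directories are invented): the report lists what would be copied, while the destination is left untouched.

```shell
mkdir -p /tmp/rs_src /tmp/rs_dst
echo hello > /tmp/rs_src/f.txt

# -n (dry run) reports what would be transferred without changing the destination
rsync -avn /tmp/rs_src/ /tmp/rs_dst/
```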
Here you can find even more ways to compare directories:
https://askubuntu.com/questions/12473/file-and-directory-comparison-tool
On the rsync Wikipedia page you can find Windows alternatives to rsync.
