I think it should be possible to use a combination of the scp command and the head command to copy only the first line of a file to a remote system, but I haven't been able to come up with the right command to make it happen.
Given an scp command like this:
scp /shared/myfolder/myfile.txt myuser@myserver:/newlocation/myotherfolder/myfile.txt
I'd like to send only the first line of myfile.txt to the remote system. I could use this command first:
head -1 myfile.txt > myfile2.txt
and then scp myfile2.txt, but it would be helpful to have this in a single command.
scp doesn't read from stdin and doesn't work with bash process substitution, but you can try hacks like:
head -1 myfile.txt | ssh myuser@myserver "cat > /newlocation/myotherfolder/myfile.txt"
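The pipe half of that hack can be checked locally before involving ssh (myfile.txt and its contents are made up here; only what head emits ever reaches the remote cat):

```shell
# Create a sample two-line file.
printf 'first line\nsecond line\n' > myfile.txt

# head -1 prints only the first line; in the hack above this output is
# piped into ssh's stdin, and the remote cat writes it to the target file.
head -1 myfile.txt
# → first line
```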
I have a filter command saved in a file (say filter.txt). The command could be something similar to the following:
grep "PASS"
Then I also have an output file of a testcase (say output.log). I want to use the filter command saved in filter.txt on output.log.
I was looking for something like
cat output.log | `cat filter.txt`
But it seems like it does not work. Is there a proper way to do this?
This works:
cat output.log | bash filter.txt
You need some program (like bash) that interprets the lines in filter.txt as commands to be executed.
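A complete, minimal run, using the filenames from the question and some made-up log contents:

```shell
# The filter file holds an ordinary shell command line.
printf 'grep "PASS"\n' > filter.txt

# A sample testcase log.
printf 'test1 PASS\ntest2 FAIL\ntest3 PASS\n' > output.log

# bash interprets filter.txt as a script; the log arrives on its stdin.
cat output.log | bash filter.txt
# → test1 PASS
# → test3 PASS
```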
I am learning sed, and I was using a live editor so I could experiment and see changes.
sed -nf '/START FROM HERE/,${/NEXTLINE/{n;p;q}}'
When trying to run the same code on Linux, I receive the error "No such file or directory" when I execute it as ./xxx.sed text0.txt.
I've tried a couple of things but I am not sure how to use sed like this.
The -f option means that the next argument is the name of a file containing the sed commands, so sed treated your quoted script as a filename and failed to open it. You need to put
/START FROM HERE/,${/NEXTLINE/{n;p;q}}
in the file xxx.sed. Then you do:
sed -nf xxx.sed text0.txt
If you want to be able to execute xxx.sed as a command, it needs a shebang line:
#!/usr/bin/sed -nf
/START FROM HERE/,${/NEXTLINE/{n;p;q}}
Then you can make the file executable and do:
./xxx.sed text0.txt
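Putting it together; the contents of text0.txt are invented here so there is something to run against:

```shell
# Write the sed script with its shebang and make it executable.
printf '#!/usr/bin/sed -nf\n/START FROM HERE/,${/NEXTLINE/{n;p;q}}\n' > xxx.sed
chmod +x xxx.sed

# Sample input: the script prints the line *after* the first NEXTLINE
# found at or below START FROM HERE, then quits.
printf 'junk\nSTART FROM HERE\nNEXTLINE\ntarget\nmore\n' > text0.txt

./xxx.sed text0.txt
# → target
```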
I am trying to copy files from a remote machine to my local machine using scp:
scp -r username@hostname:/directory .
I want only the files to be copied, not the directory structure, i.e. given:
directory
|directory2
| file1
| file2
file12
After copying, all the files should be laid out flat like this:
localdirectory
|file1
|file2
|file12
Is this possible using scp?
Sergius is right: you can use find and scp in conjunction to achieve this. However, you need to run find on the remote machine over ssh first and then scp the results.
You can combine find and scp, something like this:
find localdirectory | xargs scp {your parameters}
find returns all the files, and xargs collects their full paths and passes them as arguments to scp.
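The remote ssh variant can't be demonstrated offline, but the flattening idea can be sketched locally with cp standing in for scp (directory names are the ones from the question; for a real remote you would pipe ssh username@hostname 'find /directory -type f' into the copy step instead):

```shell
# Rebuild the sample tree from the question.
mkdir -p directory/directory2 localdirectory
touch directory/directory2/file1 directory/directory2/file2 directory/file12

# find -type f lists only files, skipping directories; each path is
# copied into localdirectory, discarding its subdirectory prefix.
find directory -type f | xargs -I{} cp {} localdirectory/

# localdirectory now contains file1, file2 and file12, laid out flat.
ls localdirectory
```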
try:
scp -r username@hostname:{/directory/directory2/file1,/directory/directory2/file2,/directory/file12} localdirectory
or just scp one by one
I searched the Internet, but maybe I used the wrong keywords; I couldn't find the syntax for my very simple problem below:
How do I redirect a file's contents as command-line arguments to the Linux command touch? I want to create a file as with "touch abc.txt", but the filename should come from the file "file.txt" (which contains "abc.txt"), not be typed in manually.
[root@machine ~]# touch < file.txt
touch: missing file operand
Try `touch --help' for more information.
[root@machine ~]# cat file.txt
abc.txt
Try
$ touch $(< file.txt)
to expand the content of file.txt and give it as an argument to touch.
Alternatively, if you have multiple filenames stored in a file, you could use xargs, e.g.,
xargs touch <file.txt
(It would work for just one filename, but is more flexible than a simple "echo".)
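Both forms can be checked in a scratch directory; file.txt and abc.txt are the names from the question, and names.txt is a made-up second list to show xargs handling several names at once:

```shell
# file.txt holds the name of the file to create.
printf 'abc.txt\n' > file.txt

# $(< file.txt) is a bash-ism that expands to the file's contents.
touch $(< file.txt)

# xargs passes each whitespace-separated name as a separate argument.
printf 'one.txt\ntwo.txt\n' > names.txt
xargs touch < names.txt

# abc.txt, one.txt and two.txt now all exist.
ls
```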
I am writing a shell script for the first time, and I want to download the latest created file from an FTP server.
I want to download the latest file in a specific folder. Below is my code for that, but it downloads all the files in the folder, not just the latest one.
ftp -in ftp.abc.com << SCRIPTEND
user xyz xyz
binary
cd Rpts/
mget ls -t -r | tail -n 1
quit
SCRIPTEND
Can you help me with this, please?
Try using the wget or lftp utility instead; lftp compares file time/date, and AFAIR its purpose is ftp scripting. Switch to ssh/rsync if possible; you can read a bit about using lftp instead of rsync here:
https://serverfault.com/questions/24622/how-to-use-rsync-over-ftp
Probably the easiest way is to link the last version on the server side to "current" and always fetch the file it points to. If you're not the admin of the server, you need to list all the files with date/time, grab that information, parse it, and decide which one is newest; in the meantime the state on the server can change, and you find yourself with a more complicated solution than it's worth.
The point is that "ls" sorts its output in some way, and time may not be the default. There are switches to sort it, e.g. based on modification time, but even when the server responds OK to ls -t, you can't be sure it really supports sorting; it can just ignore all switches and always return the same list. That's why admins usually use a "current" link (ln -s). If there's no "current", to make sure you have the right file, you need to parse the list anyway (ls -al).
http://www.catb.org/esr/writings/unix-koans/shell-tools.html
Looking at the code, the line
mget ls -t -r | tail -n 1
doesn't do what you think. It actually grabs all of the output of ls -t, and then tail processes the output of mget. You could replace this line with
mget $(ls -t -r | tail -n 1)
but note that inside an unquoted heredoc the $( ... ) is expanded by your local shell before ftp ever sees it, so the ls would run on your local machine, not on the server, and I am not sure plain ftp supports such a call at all...
Try using an FTP client other than ftp. For example, curlftpfs, available at curlftpfs.sourceforge.net, is a good candidate, as it allows you to mount an FTP server onto a directory as if it were a local folder and then run different commands on the files there (including find, grep, etc.). Take a look at this article.
This way, since the output comes from a local command, you'd be more certain that ls -t returns a properly sorted list.
Btw, it's a bit less convoluted to use ls -t | head -1 than ls -t -r | tail -1. They produce the same result, but why reverse and grab from the tail when you can just grab the head? :)
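That equivalence is easy to check locally with a few dummy files given staggered timestamps (the filenames here are made up):

```shell
# Work in a fresh directory so ls sees only our three files.
mkdir lsdemo && cd lsdemo

# touch -t sets the modification time (CCYYMMDDhhmm).
touch -t 202001010000 old.txt
touch -t 202001020000 mid.txt
touch -t 202001030000 new.txt

ls -t | head -1      # newest first, take the first entry
ls -t -r | tail -1   # oldest first, take the last entry
# both print new.txt
```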
If you use curlftpfs, then your script would be something like this (assuming server ftp.abc.com and user xyz with password xyz):
mkdir /tmp/ftpsession
curlftpfs ftp://xyz:xyz@ftp.abc.com /tmp/ftpsession
cd /tmp/ftpsession/Rpts
cp -Rpf $(ls -t | head -1) /your/destination/folder/or/file
cd -
umount /tmp/ftpsession
My solution is this:
curl 'ftp://server.de/dir/'$(curl 'ftp://server.de/dir/' 2>/dev/null | tail -1 | awk '{print $(NF)}')
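Here the inner curl fetches the directory listing, tail -1 takes its last line, and awk '{print $NF}' keeps only the last whitespace-separated field (the filename); note this assumes the server's listing happens to put the newest file last. The field extraction can be checked on a canned ls -l-style line (the line below is invented):

```shell
# A typical FTP LIST line; $NF is awk's last field, i.e. the filename.
line='-rw-r--r--   1 ftp  ftp   1024 Jan 01 12:00 report.txt'
printf '%s\n' "$line" | awk '{print $NF}'
# → report.txt
```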