I am having an issue. I am using the scp command to transfer files from the desktop of my Mac OS X machine to my virtual server. I ran the command and successfully transferred one file from my desktop over to the server, no problem.
So I use the same command, which is:
scp filename_I_want_to_transfer user@serverip:
So basically that looks like scp test user@10.0.0.0:
(I just used a random IP as an example.)
Anyway, on the second file I'm trying to transfer, which is also a document, I continually get "No such file or directory".
Any ideas on why this might be happening?
To send a file from the local host to another server use:
scp /path/to/file.doc user@<IP or hostname>:/path/to/where/it/should/go/
To get a file from another server to the local host use:
scp user@<IP or hostname>:/path/to/file.doc /path/to/where/it/should/go/
This is the format I reliably use for copying from a location to another location. You can use absolute path, or relative/special character path, such as
scp suiterdev@fakeserver:~/folder/file .
which would mean "securely copy the file named file in $HOME/folder/ (~ is equivalent to ~suiterdev or $HOME) as user suiterdev from host fakeserver to the current directory (.)".
However, you'll have to take care that special characters (see the shell's filename-expansion mechanism) used in the remote path are not expanded locally, because that is typically not what you want.
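The expansion pitfall is easy to demonstrate without any remote host at all; here is a small sketch (the file name host:a.txt is made up) showing the local shell expanding an unquoted pattern before scp would ever see it:

```shell
# Work in a throwaway directory
cd "$(mktemp -d)"
# A local file that happens to match the remote-looking pattern
touch 'host:a.txt'

# Unquoted: the LOCAL shell expands host:*.txt before the command runs
printf '%s\n' host:*.txt

# Quoted: the pattern is passed through literally, for the remote side to expand
printf '%s\n' 'host:*.txt'
```

This is why scp 'host:*.txt' . is the safe form: the glob reaches the remote shell intact.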
Well, I am using Ubuntu 15.10, and this is what worked for me:
scp user@host.com:path/to/file.txt /home/to/local/folder/
instead of
scp user@host.com:/path/to/file.txt /home/to/local/folder/
Note that after user@host.com I do not include the forward slash; I immediately append the path after the ":".
scp uses the target user's home directory as the default directory (paths after the colon are resolved relative to it), so when you need an absolute path, use one starting with a slash (/).
I know this is way too late to help you, but it may help others who had the same problem as me.
In my case, my PC is set up to use backslashes "\" instead of forward slashes "/", and changing to backslashes removed the errors.
But I only had to change the slashes to backslashes in my PC's local directory path, as my Raspberry Pi uses forward slashes.
I know it is a bit confusing, but it worked for me.
scp -r /Users/Brain/Desktop/tree.png account@address:/home/directory
I successfully connect to server and enter password, but receive this message "/Users/Brain/Desktop/tree.png: No such file or directory found"
I know the file exists, it is sitting on my desktop and I can open it.
Any guidance would be much appreciated!!
I tried looking at this post, but it did not help: scp files from local to remote machine error: no such file or directory
Typo? For a location like /Users, the odds favor a person named Brian over one named Brain. After swapping the vowels, what happens with this command?
ls -l /Users/Brian/Desktop/tree.png
When presented with unexpected errors for a file known to exist, there's usually an issue with one pathname component. Start with the full path and drop trailing components until there's no error, e.g.:
ls /Users/Brain/Desktop/tree.png
ls /Users/Brain/Desktop
ls /Users/Brain
ls /Users
Some shells can trim a pathname component from the previous command with :h; try repeating this:
!!:h
After typing the above once, another possible shortcut is UP-arrow, RETURN.
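The same drop-one-component-at-a-time check can also be scripted with dirname; a minimal sketch (the path is just the example from above; substitute your own):

```shell
#!/bin/sh
# Walk a path upward until an existing prefix is found,
# dropping the trailing component each round (like !!:h).
p="/Users/Brain/Desktop/tree.png"   # example path
while [ -n "$p" ] && [ "$p" != "/" ] && [ ! -e "$p" ]; do
    p=$(dirname "$p")               # trim one trailing component
done
echo "first existing prefix: $p"
```

The first component printed that differs from what you expected is the one with the typo.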
I have to create a full website copy / mirror of https://www.landesmuseum-mecklenburg.de. This copy has to run locally on a Windows 7 system which has no network connection. Windows 7 has a path-length limitation (255 characters), which is exceeded in my case. How do I circumvent this?
I create a static website copy on a Debian system with: wget --mirror --adjust-extension --convert-links --restrict-file-names=windows --page-requisites -e robots=off --no-clobber --no-parent --base=./ https://www.landesmuseum-mecklenburg.de/.
This way I got nearly everything I need. Some special URLs / images are downloaded via a separate URL list with: wget -c -x -i imagelist.txt
After I create an archive of these files, transfer them to my Windows 7 test system, and extract them to a local web server (called "MiniWebserver"), I can visit http://localhost/ and everything seems to work.
But some deep links, especially images, end up with a path length in the Windows NTFS filesystem of over 255 characters. None of these images are displayed in the local webpage.
I tried the -nH --cut-dirs=5 options of wget, but with no acceptable result (the index.html gets overwritten each time).
Another idea was to use the DOS 8.3 short-name compatibility feature, so that long directory names would be translated to 8-character names, e.g. longdirname to LONGDI~1. But I have no idea how to automate this.
Update: yet another idea
One more thing that came to mind was to use a hash of the entire path (e.g. MD5) and use it instead of the full path + filename. Additionally, all URLs in the downloaded .html files would have to be substituted too. But again: I have no idea how to accomplish this using Debian command-line tools.
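For what it's worth, that hashing idea can be sketched with standard Debian tools (md5sum, find, sed). This is only a rough illustration under assumptions (a mirror directory named mirror-root, link targets appearing verbatim in the HTML, no URL-encoding involved), not a drop-in solution:

```shell
#!/bin/sh
# Sketch: flatten each non-HTML file to <md5-of-relative-path>.<ext>
# and rewrite plain-text references to it in all .html files.
cd mirror-root                          # example directory name
find . -type f ! -name '*.html' | while IFS= read -r f; do
    rel=${f#./}                         # path relative to the mirror root
    hash=$(printf '%s' "$rel" | md5sum | cut -d' ' -f1)
    ext=${rel##*.}
    mv "$f" "$hash.$ext"                # short name in the top directory
    # naive rewrite: treats $rel as literal text, ignores regex metachars
    find . -name '*.html' -exec sed -i "s|$rel|$hash.$ext|g" {} +
done
```

Since every file lands in the top directory under a 32-character name, no path can exceed the NTFS limit.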
You can try the following:
wget --trust-server-names <url>
--trust-server-names
If this is set to on, on a redirect the last component of the
redirection URL will be used as the local file name. By default
the last component of the original URL is used.
I have a problem with deleting or even accessing folders on a Linux server.
The folders are located in wp-content of WordPress.
The problem is I can't open the server folder listing in WinSCP because the folders have weird names.
Example names:
If I execute ls -l I can see that I have the required permissions, and names like:
??? < - folder name example
??
I tried opening them in FileZilla, which successfully connects to the wp-content folder (WinSCP can't even do that), but after entering wp-content I can't open the above-mentioned folders or even rename them.
I tried SSH-ing into the Linux server, but I can't manage to cd into those folders because it says it can't find the file/directory.
What are the options for deleting files with special characters?
I tried using single quotes and backslashes, but when pressing Tab nothing happens...
Is it possible to delete all folders except the required ones? Then I could name which ones to keep and delete all the others.
You will need to use double quotation marks around the name of the file and the asterisk wildcard, as far as I know (the asterisk matches zero or more characters).
Have you tried this:
rm -rf -- *" ### "
where ### are the special characters
This website might be helpful:
https://www.computerhope.com/unix/urm.htm
Good luck!
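When the name can't be typed or tab-completed at all, another standard option (not mentioned above) is to delete by inode number; a sketch, where 123456 is a placeholder for whatever inode ls prints:

```shell
# Show inode numbers next to names
ls -li
# Delete the entry with that inode (123456 is a placeholder)
find . -maxdepth 1 -inum 123456 -exec rm -rf -- {} +
```

This sidesteps quoting entirely, since the shell never has to spell the weird name.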
Can anyone help me write a shell script to download files from a Linux/UNIX system?
Regards
On UNIX systems, such as Linux and OSX, you have access to a utility called rsync. It is installed by default and is the tool to use to download files from another UNIX system.
It is a drop-in replacement for the cp (copy) command, but it is much more powerful.
To copy a directory from a remote system to yours, using SSH, you would do this:
rsync username@hostname:path/to/dir .
(Notice the dot at the end; it means 'place everything here, please'. You can also give the name of the local directory where the files should be placed.)
To download only some specific files, use this:
rsync 'username@hostname:path/to/dir/*.txt' .
(notice the quotes: if you omit them, your shell will try to expand the *.txt part locally, will fail and give you an error.)
Useful flags:
--progress: show a progress bar
--append: if a file has only partially downloaded, resume it where it left off
I find the rsync utility so useful, I've created an alias for it in my shell and use it as a 'super-copy':
alias cpa='rsync -vae ssh --progress --append'
With that alias, copying files between machines is just as easy as copying files locally:
cpa user@host:file .
Making it even better
Since rsync is using SSH, it helps to setup a private/public key pair, so you don't have to type in your password every time:
How do I setup Public-Key Authentication?
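A minimal key setup might look like this (the user, host, and the choice of an empty passphrase are assumptions for brevity):

```shell
# Generate an ed25519 key pair; -N '' means no passphrase (consider using one)
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519 -N ''
# Install the public key into the remote host's authorized_keys
ssh-copy-id user@hostname     # example user and host
```

After this, both ssh and rsync-over-ssh connect without a password prompt.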
Furthermore, you can write down your username in your .ssh/config file and give the remote host a short name: read about it here.
For example, I have something like this:
Host panda
Hostname panda.server.long.hostname.com
User rodin
With this setup, my command to download files from the panda server is just:
cpa panda:path/to/my/files .
And there was much rejoicing.
On a computer with an IP address like 10.11.12.123, I have a folder called document. I want to copy that folder to my local folder /home/my-pc/doc/ using the shell.
I tried like this:
scp -r smb:10.11.12.123/other-pc/document /home/my-pc/doc/
but it's not working.
So you can use the command below to copy your files:
scp -r <source> <destination>
(-r: Recursively copy entire directories)
eg:
scp -r user@10.11.12.123:/other-pc/document /home/my-pc/doc
To identify your current location you can use the pwd command, e.g.:
kasun@kasunr:~/Downloads$ pwd
/home/kasun/Downloads
If you want to copy from B to A while logged into B, then:
scp /source username@a:/destination
If you want to copy from B to A while logged into A, then:
scp username@b:/source /destination
In addition to the comment above: when you look at your host-to-host copy options on Linux today, rsync is by far the most robust solution around. It is brought to you by the Samba team[1] and continues to enjoy active development. Most distributions include the rsync package by default (if not, you should find an easily installable package for your distro, or you can download it from rsync.samba.org).
The basic use of rsync for host-to-host directory copy is:
$ rsync -uav srchost:/path/to/dir /path/to/dest
-uav recursively copies only new or changed files (-ua), preserving file and directory times and permissions, while providing verbose output (-v). You will be prompted for the username/password on 10.11.12.123 unless you have set up SSH keys to allow public/private key authentication (see ssh-keygen for key generation).
If you notice, the syntax is basically the same as that for scp, with a slight difference in the options (e.g. scp -rv srchost:/path/to/dir /path/to/dest). rsync uses ssh for secure transport by default, so you will want to ensure sshd is running on your srchost (10.11.12.123 in your case). If you have name resolution working (or a simple entry in /etc/hosts for 10.11.12.123), you can use the hostname for the remote host instead of the remote IP. Regardless, you can always transfer the files you are interested in with:
$ rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/
Note: do NOT include a trailing / after document if you want to copy the directory itself. If you do include a trailing / (i.e. 10.11.12.123:/other-pc/document/), you are telling rsync to copy only the contents (i.e. the files and directories under document) into /home/my-pc/doc/, without creating the document directory itself.
The reason rsync is far superior to other copy apps is it provides options to truly synchronize filesystems and directory trees both locally and between your local machine and remote host. Meaning, in your case, if you have used rsync to transfer files to /home/my-pc/doc/ and then make changes to the files or delete files on 10.11.12.123, you can simply call rsync again and have the changes/deletions reflected in /home/my-pc/doc/. (look at the several flavors of the --delete option for details in rsync --help or in man 1 rsync)
For these, and many more reasons, it is well worth the time to familiarize yourself with rsync. It is an invaluable tool in any Linux user's hip pocket. Hopefully this will solve your problem and get you started.
Footnotes
[1] the same folks that "Opened Windows to a Wider World", allowing seamless connection between Windows/Linux hosts via the native Windows Server Message Block (SMB) protocol. samba.org
If the two directories (document and /home/my-pc/doc/) you mentioned are on the same 10.11.12.123 machine,
then:
cp -ai document /home/my-pc/doc/
else:
scp -r document/ root@10.11.12.123:/home/my-pc/doc/