microsoft ftp ls wrong filename result - iis

When I create a file in my FTP directory with the following name:
$^_°9+[µù§#é'(².txt
the name returned when I use 'ls' is this:
$^_░9+[Á¨º#Ú'(▓.txt
UTF-8 is enabled.
How can I fix this so that the correct name is returned?

I cannot reproduce your problem.
mkdir $^_°9+[µù§#é'(².txt
The above command creates the $^_°9+[µù§#é'(².txt folder on the FTP server.
Windows 10 IIS server. Windows 10 ftp.exe client.
My FTP script is obviously in UTF-8 encoding.
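For what it's worth, ░ Á ¨ º Ú ▓ are exactly what OEM code page 850 displays for the ANSI bytes of ° µ ù § é ², so the mangling may be happening in the ftp.exe console window rather than on the server. As a sketch of one thing to try (ftp.exe's Unicode support is limited, so this may not be enough on its own):
chcp
chcp 65001
The first command shows the console's active code page (typically 850 or 437); the second switches the console to UTF-8 before you start ftp.exe.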

Related

Sending .csv file from linux to windows

I want to send files (txt or csv) from Linux to Windows.
I already have a script that gathers information and puts it into a .txt or .csv file, and I have tried many ways to send this file from Linux to my computer.
The server can ping my computer's IP, but when I use the commands below I get:
ssh: connect to host 10.10.X.X port 22: Connection timed out
scp -r fname.lname@10.10.X.X:/home/ test.txt
or
scp test.txt fname.lname@10.10.X.X:/C:/Data
Could you please help? I simply want to have a copy of a file that is on the server on my computer, so I can use it.
There are some similar questions here with no answer.
You need an SSH server installed on Windows. Windows does not currently ship with an out-of-the-box SSH server; Microsoft is considering adding one in a future release of Windows 10.
Have a look at this link https://winscp.net/eng/docs/guide_windows_openssh_server
Also, if the transfer you want is a one-time transfer, you can use PuTTY with a reverse scp to retrieve the file, or you can use WinSCP (https://winscp.net/eng/download.php).
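A sketch of the one-time "pull" approach from the Windows side, using PuTTY's pscp (linux-server and the remote path /home/fname.lname/test.txt are placeholders; adjust them to your setup):
pscp fname.lname@linux-server:/home/fname.lname/test.txt C:\Data\
Because the connection is initiated from Windows, only the Linux server needs to run an SSH server, which sidesteps the "Connection timed out" on port 22 that you get when the Linux box tries to reach the Windows machine.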
I usually use the 'nc' command for file transfers.
But since you would have to install Cygwin to use nc on Windows, I think the simplest solution is the following.
On Linux, go to the directory containing the files and type:
python -m SimpleHTTPServer 1234
Then, on Windows, you can visit 10.10.X.X:1234 in your browser and download the files.
Note that 1234 can be replaced by any other port that is not currently in use on the Linux machine.
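If the Linux machine only has Python 3, the SimpleHTTPServer module was renamed to http.server, so the equivalent one-liner (same placeholder port) is:
python3 -m http.server 1234
As before, open that address and port in the Windows browser and click the file to download it.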

FTP an error occurred opening that folder on the FTP server

I have set up FileZilla FTP on my server correctly; however, I cannot connect to it from Windows 10. You can see the issue that appears when I try to connect here: http://i.imgur.com/6mudn5c.png
Additionally, here are the server logs: http://i.imgur.com/M5GIvfy.png
Try disabling/enabling "Use Passive FTP" in Internet Options > Advanced.
It worked for me.
Check the file permissions. The FTP root folder (usually named after the username) should be accessible to FTP with file read permission.

Using SCP command to download files from Linux server to client server

I'm creating files on a Linux server that I'm logged into, and I'm adding the ability for the user to download these files from the Linux server onto the connecting computer. I'm writing a script and using the scp command to download the files:
scp data.txt user@usraddress:/home/usr
However, I don't want "user@usraddress:/home/usr" to be hard-coded to just my computer. I want whoever is logged onto the Linux server to be able to download these files. Is there a way to get the address of the connecting computer?
How would I do this?
Forgive me if this seems elementary, I'm very new to scripting.
When you open a remote session on a GNU/Linux machine, the SSH server sets the environment variable SSH_CONNECTION with some connection information. You can use this variable, together with the $USER variable, to fill in those parameters:
scp data.txt $USER@${SSH_CONNECTION%% *}:/home/$USER
Note that, as far as I know, you can't assume the client's home directory is under /home. As chepner said, you can omit the destination directory to use the default location, which is the home directory:
scp data.txt $USER@${SSH_CONNECTION%% *}:
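For reference, SSH_CONNECTION holds four space-separated fields (client IP, client port, server IP, server port), so ${SSH_CONNECTION%% *} keeps just the client IP. A minimal sketch of a script built on this idea (data.txt is the file name from the question; the rest is plain shell):
#!/bin/sh
# The first field of SSH_CONNECTION is the address of the connecting machine.
client_ip=${SSH_CONNECTION%% *}
# Copy into the connecting user's default (home) directory on that machine.
scp data.txt "$USER@$client_ip:"
Like the answer above, this assumes the connecting computer runs an SSH server of its own and that the user name there matches $USER.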

Apache and selinux, use of linux command in retrieving files

How can we access an Apache server using a Linux command to retrieve a file?
The file that is to be retrieved has been copied to a directory.
Without knowing more about what you are doing, I would suggest looking into wget or curl.
For example, to copy a file available on a web server via its URL to the current directory using the wget command:
wget http://www.example.com/path/to/file.txt
or, if you are accessing your web server by IP address:
wget http://192.168.1.1/path/to/file.txt
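If you prefer curl, the -O flag saves the file under its remote name in the current directory (same placeholder URL as above):
curl -O http://www.example.com/path/to/file.txt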

ftp: Name or Service not known

In the command line:
> ftp ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
This works on one computer but does not work on my other one. The error returned is:
ftp: ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/: Name or service not known
I also tried the raw IP address, which is:
> ftp ftp://130.14.250.10/1000genomes/ftp/data/
But it didn't work either.
What is the problem here, and how can I fix it?
The ftp command accepts the server name, not a URL. Your session likely should look like:
ftp ftp-trace.ncbi.nih.gov
(Server asks for login and password)
cd /1000genomes/ftp/data/
mget *
This depends on the ftp client you are using. On Mac OS X, for example, the default command-line ftp client (the BSD one) accepts the full URL, while in CentOS the default client doesn't, and you need to connect with just the hostname. So it depends on the flavor of Linux and the default ftp client that is installed.
Default ftp client in CentOS (ARPANET):
ftp ftp-trace.ncbi.nih.gov
cd 1000genomes/ftp/data
If you want to use the full url in CentOS 5.9 or Fedora 18 (where I tested it), you could install an additional ftp client. For example ncftp and lftp have the behavior you are looking for.
ncftp, available through yum or your favorite package manager:
ncftp ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
NcFTP 3.2.2 (Aug 18, 2008) by Mike Gleason (http://www.NcFTP.com/contact/).
Connecting to ...
...
Logged in to ftp-trace.ncbi.nih.gov.
Current remote directory is /1000genomes/ftp/data
lftp, also available through your favorite package manager:
lftp ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
cd ok, cwd=/1000genomes/ftp/data
lftp ftp-trace.ncbi.nih.gov:/1000genomes/ftp/data>
Another, often more efficient, way to retrieve files is to use wget or curl. These work with HTTP, FTP, and other protocols.
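For instance (a sketch using the directory from the question; note that -r mirrors the whole remote tree, which is very large for this dataset):
wget -r ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
curl ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
With a trailing slash, the curl command just prints the directory listing to the terminal, which is a quick way to confirm the URL is reachable before downloading anything.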
It looks to me like the computer that isn't working is already adding the ftp: to the URL. Have you tried removing it from yours and seeing if that works?
> ftp ftp-trace.ncbi.nih.gov/1000genomes/ftp/data
