FTP: an error occurred opening that folder on the FTP server - IIS

I have set up FileZilla FTP on my server correctly; however, I cannot connect to it from Windows 10. You can see the issue that appears when trying to connect here: http://i.imgur.com/6mudn5c.png
Additionally, here are the server logs: http://i.imgur.com/M5GIvfy.png

Try disabling/enabling 'Use Passive FTP' under Internet Options > Advanced.
It worked for me.

Check the file permissions. The root folder (usually named after the username) should be accessible to the FTP user with file read permission.
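A generic way to inspect and grant that permission on the Windows server is icacls (the folder path and the account the FTP service runs under are placeholders, not details from the question):
icacls "C:\ftp\username"
icacls "C:\ftp\username" /grant "ftpservice:(OI)(CI)R"
The first command shows the current ACL; the second grants read access that is inherited by files and subfolders.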

Related

jenkins.plugins.publish_over.BapPublisherException: Failed to connect and initialize SSH connection Message [Auth fail]

I am learning to use Jenkins to deploy a .NET 5.0 application on an AWS EC2 server. This is the first time I am using a Linux server and Jenkins for .NET (I'm a lifelong Windows guy), and I am facing an error while trying to publish my artifacts over SSH to the web server.
My setup:
The Jenkins server is an AWS EC2 Linux AMI server.
The web server is also an AWS EC2 Linux AMI server.
My Jenkins is correctly installed and working. I am able to build and run unit test cases without any issues.
For deployment, I am using the 'Publish Over SSH' plugin, and I have followed all the steps to configure this plugin as described here: https://plugins.jenkins.io/publish-over-ssh/.
However, when I try 'Test Configuration', I get the error below:
Failed to connect or change directory
jenkins.plugins.publish_over.BapPublisherException: Failed to connect and initialize SSH connection. Message: [Failed to connect session for config [WebServer]. Message [Auth fail]]
I did a ping test from the Jenkins server to the web server, and it succeeded.
I'm using the .pem key in the 'Key' section of 'Publish over SSH'. This key is the same key I use to SSH into the web server.
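For reference, connecting directly from a terminal with that key works, along these lines (the key file name and address are placeholders):
ssh -i my-key.pem ec2-user@<web-server-address>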
The link below suggests many different solutions, but none works in my case.
Jenkins Publish over ssh authentification failed with private key
I was looking at the link below, which describes the same problem:
Jenkins publish over SSH failed to change to remote directory
However, in my case I have left 'Remote Directory' empty. I don't know whether I have to specify any directory here. Anyway, I tried creating a new directory under the home directory of the ec2-user user, '/home/ec2-user/publish', and used that path as the Remote Directory, but it still didn't work.
Screenshot of my settings in Jenkins:
I would appreciate it if anyone could point me in the right direction or highlight any mistake in my configuration.
In my case, the following steps solved the problem (the solution is based on Ubuntu 22.04).
Add these two lines to /etc/ssh/sshd_config:
PubkeyAuthentication yes
PubkeyAcceptedKeyTypes +ssh-rsa
Then restart the sshd service:
sudo service sshd restart
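To confirm the change took effect, you can ask sshd to validate and print its effective settings (assuming sudo access on the web server):
sudo sshd -t
sudo sshd -T | grep -i pubkey
sshd -t checks the configuration syntax, and sshd -T prints the effective values, so you can verify that public-key authentication and the ssh-rsa key type are enabled.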
You might consider the following:
a. From the screenshot you've provided, it seems that you have checked the Use password authentication, or use different key option, which will require you to add your key and password (the inputs from these fields will be used when connecting to your server over SSH). If you use the same SSH key and passphrase/password on all of your servers, you can uncheck/untick that box and just use the config you have specified above.
b. You might also check whether port 22 of your web server allows inbound traffic from the security group where your Jenkins server/EC2 instance is running. See the reference here.
c. Also, make sure that the remote directory you have specified exists, otherwise the connection may fail; a couple of quick checks are sketched below.
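For points b and c, a couple of quick checks from the Jenkins box can rule out networking and missing-directory problems (the key file name and address below are placeholders; the directory is the one from the question):
nc -zv <web-server-address> 22
ssh -i my-key.pem ec2-user@<web-server-address> "mkdir -p /home/ec2-user/publish"
If both succeed, the 'Auth fail' message points back at the key/passphrase configuration rather than at the network or the directory.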
Here's the sample config

ftp -s:filename login error

I'm trying to FTP files from a Windows computer to a Linux server (Red Hat 6.1 with vsftpd). I can successfully log in from the Windows computer using:
ftp *servername*
and entering my credentials. However, I need to FTP to the server and place a file in a directory, because the users in the company will only be responsible for placing the needed file on the server. I am using the
ftp -s:filename.ftp *servername*
command and get an error during the login:
530 Login incorrect.
Login failed.
Here is the code in the filename.ftp file:
user *username* *password*
sudo cd /dbx2/ekiexport
bi
put myfile.csv
bye
I have even tried:
user *username* pass *password*
sudo cd /dbx2/ekiexport
bi
put myfile.csv
bye
I know I can just FTP to the server and place the files there myself, but I need the users to run the script when they need to access the file. That's why I need to use ftp -s:filename.
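For reference, a common pattern with the Windows ftp client is to pass -n so that auto-login is suppressed and the script supplies the credentials itself; sudo is not an FTP command, so the cd is issued directly. A minimal sketch using the same placeholders as above:
ftp -n -s:filename.ftp *servername*
with filename.ftp containing:
user *username* *password*
binary
cd /dbx2/ekiexport
put myfile.csv
bye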

Fix file permissions to allow read/write access to the Squid configuration via PHP code on a CentOS Linux server with a LAMP setup

For my new project I have to configure a CentOS Linux server with a LAMP setup and install the Squid proxy server. The installed machine will act as a server on the client's side. Its main purpose is to count bandwidth usage; each client's MAC address and IP address will be logged on the server, which acts as a proxy. Every user will be assigned a bandwidth quota, total browsing hours, a username, a password, etc.
Each user can access the Internet via the installed proxy server after logging in with the username and password assigned to them.
User management and fetching of MAC and IP addresses will all be done with PHP code that runs Linux commands to read the MAC address from the client machine and to maintain a blacklist website filter.
For every such action, the project has to access the Squid configuration file located at /etc/squid/squid.conf to enable Internet access with the MAC filter, and iptables as well.
But when my PHP code tries to access /etc/squid/squid.conf for processing, it is unable to open the file for read, write, or append operations.
On the server side I have set the file permissions for /etc/squid/squid.conf to read/write by executing this on my Linux server:
chmod 777 /etc/squid/squid.conf
Even after assigning this permission, the code running on the server is still unable to access the Squid file for processing.
I even tried:
chmod 666 /etc/squid/squid.conf
but it didn't help.
How do I have to configure the file permissions for /etc/squid/squid.conf so that PHP code running under the Apache server can access the file for processing?
Is SELinux running? You can temporarily disable it with
setenforce 0
SELinux may block Apache from reading your Squid config because the two are in different contexts, even if the file is 777 and Apache is running as root (a bad idea). Here is some Red Hat documentation about Squid and SELinux: doc
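To confirm whether SELinux really is the blocker before disabling it, a few standard checks help (generic commands, not taken from the question):
getenforce
ls -Z /etc/squid/squid.conf
sudo ausearch -m avc -ts recent
If getenforce reports Enforcing and ausearch shows AVC denials for httpd touching squid.conf, SELinux is the cause; note that setenforce 0 only lasts until reboot, and setenforce 1 turns enforcement back on after testing.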

ftp: Name or Service not known

In the command line:
> ftp ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
works on one computer but does not work on my other one. The error returned is:
ftp: ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/: Name or service not known
I also tried the raw IP address which is
> ftp ftp://130.14.250.10/1000genomes/ftp/data/
But it didn't work.
What is the problem here? How can I fix it?
The ftp command accepts the server name, not a URL. Your session likely should look like:
ftp ftp-trace.ncbi.nih.gov
(Server asks for login and password)
cd /1000genomes/ftp/data/
mget *
This depends on the ftp client you are using. On Mac OS X (whose ftp client comes from BSD), for example, the default command-line ftp client accepts the full URL, while in CentOS the default client doesn't, and you need to connect just to the hostname. So it depends on the flavor of Linux and the installed default ftp client.
Default ftp client in CentOS (ARPANET):
ftp ftp-trace.ncbi.nih.gov
cd 1000genomes/ftp/data
If you want to use the full url in CentOS 5.9 or Fedora 18 (where I tested it), you could install an additional ftp client. For example ncftp and lftp have the behavior you are looking for.
ncftp, available through yum or your favorite package manager:
ncftp ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
NcFTP 3.2.2 (Aug 18, 2008) by Mike Gleason (http://www.NcFTP.com/contact/).
Connecting to ...
...
Logged in to ftp-trace.ncbi.nih.gov.
Current remote directory is /1000genomes/ftp/data
lftp, also available through your favorite package manager:
lftp ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
cd ok, cwd=/1000genomes/ftp/data
lftp ftp-trace.ncbi.nih.gov:/1000genomes/ftp/data>
Another, often more efficient, way to retrieve files is to use wget or curl. These work for HTTP, FTP, and other protocols.
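For example, with this FTP directory (a trailing slash makes curl list the directory, and wget -r -np mirrors it without ascending to the parent):
curl "ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/"
wget -r -np "ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/"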
It looks to me like the computer that isn't working is already adding the ftp: to the URL. Have you tried removing it from yours and seeing if that works?
> ftp ftp-trace.ncbi.nih.gov/1000genomes/ftp/data

FTP getting 550 Permission denied Apache Ubuntu Server 11

I just set up FTP today, and I am getting a 550 Permission denied error.
I tried chmod-ing the directory (/var/www/site1).
Any ideas?
Check whether your username, password, and port number are correct.
Another reason can be that you may not have permission to use FTP on your network.
I had the same error message transferring files from the server to my laptop. I changed the firewall settings to allow the server IP address. The error no longer appears.
In my case changing the chmod permissions did not affect it.
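If the credentials and firewall are fine, it is also worth checking who owns the target directory and whether the FTP daemon allows writes. A generic sketch (ftpuser is a placeholder, and the write_enable check only applies if the server runs vsftpd, which the question does not state):
ls -ld /var/www/site1
sudo chown ftpuser:ftpuser /var/www/site1
sudo chmod 755 /var/www/site1
grep write_enable /etc/vsftpd.conf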
