We have a non-root account on a Linux server, so we cannot install packages.
We have to use FTPS from this server to another machine.
If I try this:
> ftp [machineName]
after entering the username, it shows the following error:
534 policy requires ssl ftp
How can I handle FTP over SSL without external packages such as lftp, curl, etc.?
My Linux: SUSE Linux version 11
Most *nix servers come with some scripting language installed, and both PHP and Python have FTPS (FTP over TLS/SSL) functions.
So you can write a PHP or Python script for your FTPS task.
PHP: How to Send File over secure FTP SSL Protocol.
Python: FTPES - FTP over explicit TLS/SSL in Python
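For instance, here is a minimal sketch using Python's standard ftplib, assuming the Python on the server is 2.7/3.2 or newer (FTP_TLS is not available in older versions); the host name, credentials, and file names below are placeholders:

# Explicit FTPS (FTP over TLS/SSL) with only the standard library.
# Host, credentials and file names are placeholders -- adjust for your server.
from ftplib import FTP_TLS

ftps = FTP_TLS('machineName')          # connects on port 21
ftps.login('username', 'password')     # AUTH TLS is negotiated before login, avoiding the 534 error
ftps.prot_p()                          # switch the data channel to TLS as well
with open('localfile.txt', 'rb') as f:
    ftps.storbinary('STOR remotefile.txt', f)
ftps.quit()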
I have been using the Bitvise SSH Client to transfer files, etc., on a Linux server.
I now have a new Linux server and need to change my level of access via sudo. I'm not sure how to do that with this client.
Is there a configuration I can use that automatically elevates my access using sudo?
I have a WebDAV server on my Linux machine with SSL authentication. I can mount this WebDAV share on localhost, on a remote Linux machine, and also on a remote macOS machine. It asks me to accept the certificate and then mounts.
Now I am trying to mount it on Windows 10. First, Windows does not show any certificate to accept. I extracted the certificate from the remote Linux machine like this:
echo -n | openssl s_client -connect 145.117.144.230:443 | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > /etc/irods/ssl/145.117.144.230.crt
and installed it on Windows 10, for the local machine and all users. I have tried both the "Trusted Root Certification Authorities" and the "Intermediate Certification Authorities" certificate stores, and each time Windows reports that the certificate was imported successfully. But when I try to map the network drive https://ugp-repmed.fedora20.ebiocloud.amc.nl with my username/password credentials, it shows this message:
The mapped network drive could not be created because the following error has occurred: Mutual Authentication failed. The server's password is out of date at the domain controller.
If I connect through the browser, I can accept the certificate and get access, but read-only. I need to map the network drive, and this error is blocking me. Does anybody know how to solve it? Thanks.
Is the authentication to the WebDAV server BASIC or DIGEST? With Windows 10, I believe the redirector only supports DIGEST unless you modify the registry.
Try setting a DWORD BasicAuthLevel in HKLM\SYSTEM\CurrentControlSet\Services\WebClient\Parameters to "1"
Valid values are 0 (default) through 2:
0 - Basic authentication disabled
1 - Basic authentication enabled for Secure Sockets Layer (SSL) shares only
2 or greater - Basic authentication enabled for SSL shares and for non-SSL shares
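If you prefer to script the change (from an elevated prompt), a sketch along these lines using Python's winreg module should do it; the key path and value are the ones above, and the rest is just plumbing:

# Set BasicAuthLevel = 1 (Basic auth over SSL shares only) for the WebDAV redirector.
# Run from an elevated (administrator) prompt; the WebClient service typically
# needs a restart afterwards for the change to take effect.
import winreg

key_path = r"SYSTEM\CurrentControlSet\Services\WebClient\Parameters"
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "BasicAuthLevel", 0, winreg.REG_DWORD, 1)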
For my new project, I have to configure a CentOS Linux server with a LAMP setup and install a Squid proxy server. The installed machine will act as a server on the client side. The main purpose is to count the amount of bandwidth used; each client's MAC address and IP address will be logged on the server, which will act as a proxy server. Every user will be assigned a bandwidth limit, total browsing hours, a username, a password, etc.
Each user can access the Internet via the installed proxy server after logging in with the username and password defined for them.
User management and fetching of MAC and IP addresses will all be done using PHP code, with Linux commands used to obtain the MAC address from the client machine and also for the blacklist website filter.
For every action in this project, I have to access the Squid configuration file located at /etc/squid/squid.conf to enable Internet access with MAC filtering, and even iptables.
But when my PHP code tries to access /etc/squid/squid.conf for processing, it is unable to open the file for read, write, or append operations.
On the server side I have set the file permissions for /etc/squid/squid.conf to allow read/write access, running this on my Linux server:
chmod 777 /etc/squid/squid.conf
Even after assigning these permissions, the user logged in to the server is still unable to access the Squid file for processing.
I even tried
chmod 666 /etc/squid/squid.conf
but that did not help.
How do I have to configure the file permissions for /etc/squid/squid.conf so that a user logged in via the Apache server can access the file for processing?
Is SELinux running? You can temporarily disable it with
setenforce 0
SELinux may block Apache from reading your Squid config because they are in two different contexts, even if the file is mode 777 and Apache is running as root (a bad idea). Here is some Red Hat documentation about Squid and SELinux: doc
I am trying to access a source code repository through Cygwin, right after I have successfully checked out files through Windows Explorer, but it keeps giving me a 'host not found' error in Cygwin.
I am trying to execute an svn co command for the checkout in Cygwin.
In the TortoiseSVN network settings I have already included my proxy server and username/password.
Please let me know what I am missing here.
Below is the exact scenario from my Cygwin bash session:
bash-3.2$ cd
bash-3.2$ svn --version
svn, version 1.6.3 (r38063)
compiled Jul 9 2009, 10:30:30
Copyright (C) 2000-2009 CollabNet.
Subversion is open source software, see http://subversion.tigris.org/
This product includes software developed by CollabNet (http://www.Collab.Net/).
The following repository access (RA) modules are available:
* ra_neon : Module for accessing a repository via WebDAV protocol using Neon.
- handles 'http' scheme
- handles 'https' scheme
* ra_svn : Module for accessing a repository using the svn network protocol.
- with Cyrus SASL authentication
- handles 'svn' scheme
* ra_local : Module for accessing a repository on local disk.
- handles 'file' scheme
* ra_serf : Module for accessing a repository via WebDAV protocol using serf.
- handles 'http' scheme
- handles 'https' scheme
bash-3.2$ svn co http://repositoryname/checkoutfile
svn: OPTIONS of 'http://repositoryname/checkoutfile': Could not resolve hostname `repositoryname': Unknown host (htp://repositoryname.com)
As far as I'm aware, the Cygwin tooling won't use the proxy settings you've set up for TortoiseSVN.
Try setting the proxy in your shell before running the commands:
export http_proxy="http://<user>:<password>@<proxy>:<port>"
svn ...
If that works, you can add the export ... line to your .bash_profile.
In the command line:
> ftp ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
This works on one computer but does not work on my other one. The error returned is:
ftp: ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/: Name or service not known
I also tried the raw IP address which is
> ftp ftp://130.14.250.10/1000genomes/ftp/data/
But it didn't work.
What is the problem here? How can I fix this?
The ftp command accepts the server name, not a URL. Your session likely should look like:
ftp ftp-trace.ncbi.nih.gov
(Server asks for login and password)
cd /1000genomes/ftp/data/
mget *
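If you would rather script it than type the session interactively, the same steps can be done with Python's standard ftplib (a sketch; anonymous login is assumed, as in the session above):

# Connect with the server name only (no ftp:// prefix), change directory, list contents.
from ftplib import FTP

ftp = FTP('ftp-trace.ncbi.nih.gov')
ftp.login()                              # anonymous login
ftp.cwd('/1000genomes/ftp/data/')
print(ftp.nlst())                        # directory listing; fetch individual files with retrbinary()
ftp.quit()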
This depends on the FTP client you are using. On Mac OS X (whose ftp client comes from BSD), for example, the default command-line client accepts the full URL, while in CentOS the default client doesn't, and you need to connect to just the hostname. So it depends on the flavor of Linux and the installed default FTP client.
Default ftp client in CentOS (ARPANET):
ftp ftp-trace.ncbi.nih.gov
cd 1000genomes/ftp/data
If you want to use the full URL in CentOS 5.9 or Fedora 18 (where I tested it), you could install an additional FTP client. For example, ncftp and lftp have the behavior you are looking for.
ncftp, available through yum or your favorite package manager:
ncftp ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
NcFTP 3.2.2 (Aug 18, 2008) by Mike Gleason (http://www.NcFTP.com/contact/).
Connecting to ...
...
Logged in to ftp-trace.ncbi.nih.gov.
Current remote directory is /1000genomes/ftp/data
lftp, also available through your favorite package manager:
lftp ftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/data/
cd ok, cwd=/1000genomes/ftp/data
lftp ftp-trace.ncbi.nih.gov:/1000genomes/ftp/data>
Another, more efficient way to retrieve files is to use wget or curl. These work for HTTP, FTP, and other protocols.
It looks to me like the computer that isn't working is already adding the ftp: prefix to the URL. Have you tried removing it from yours and seeing if that works?
> ftp ftp-trace.ncbi.nih.gov/1000genomes/ftp/data