FileZilla uploads files with 000 permissions [closed] - file-permissions

I've been having problems with FileZilla for a few days now. I have a CentOS server running Apache etc., and I'm uploading WordPress to a new domain on it. I've done this many times before and am currently running three other properly working WordPress sites on the same server.
However, when I upload the files to the new domain's folder, every file receives 000 permissions. Folders get the normal 755. Manually chmodding the files works, but I'm not going to chmod 2,000 files and figure out which ones need more permissions.
I have no idea why the files suddenly have no permissions; I've changed nothing in the way I connect to the server. I connect over SFTP with the same installation of FileZilla I always use.
I'm on OS X 10.10.1 with the most recent version of FileZilla. I downloaded the latest WordPress .zip, extracted it, and am uploading the files by drag and drop.

This is a known bug in FileZilla 3.10.0-beta3 through 3.10.0.1.
https://forum.filezilla-project.org/viewtopic.php?t=34953
Either upgrade to 3.10.0.2 or later, or use another SFTP client.
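If the broken upload has already happened, you can also fix the permissions in bulk over SSH rather than chmodding file by file. A minimal sketch, assuming SSH access to the CentOS box and the usual WordPress permissions of 755 for directories and 644 for files (the path below is a hypothetical example of the new domain's docroot):
# restore sane permissions under the new domain's folder
# /var/www/newdomain.com is a placeholder; use your actual docroot
find /var/www/newdomain.com -type d -exec chmod 755 {} +
find /var/www/newdomain.com -type f -exec chmod 644 {} +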

Related

How can I access FTP through a Linux server's SSH? [closed]

I recently took over a project from a former colleague, but the only information he gave me was the SSH key for a server and a Bitbucket repository. I need FTP access to the server so I can change the website's files.
I have zero experience with SSH or console commands. I have the repository, but I don't know how to upload it. A friend of mine said it's possible to pull a repo to the server, but I don't even know how to navigate the server's folders. All I have is the console.
The server's image is ubuntu-1604-xenial-v20180127.
These are the only options I have: http://prntscr.com/p31inf
Also note that the website runs on Magento and I have no idea how it works. I'm a WordPress developer.
What you want is SFTP: https://www.digitalocean.com/community/tutorials/how-to-use-sftp-to-securely-transfer-files-with-a-remote-server
So just use an FTP client like FileZilla and select SFTP as the protocol.
By the way, Magento is a bit different from WordPress. Good luck ;)
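If you'd rather stay in the console you already have, the OpenSSH sftp client works too. A minimal sketch, assuming the key file, user and host below are placeholders for the ones your colleague gave you:
# connect over SFTP with the private key (paths and host are hypothetical)
sftp -i ~/.ssh/project_key user@server.example.com
# once connected:
#   ls, cd        browse remote folders
#   lls, lcd      browse local folders
#   get file      download a file
#   put -r dir    upload a directory (recursive put needs a reasonably recent OpenSSH)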

Website migration from a server with cPanel to one without? [closed]

I want to migrate some websites (files and databases) from a server with cPanel accounts to a new Ubuntu server that doesn't have cPanel.
What would be the quickest and simplest method?
Thank you in advance!
Since the new server doesn't have a control panel, manual migration is your only option.
Set up a web server and add the websites.
Assign a separate home directory to each website.
Set up a database server and migrate the databases.
You can generate a full cPanel backup and then extract it on the new server. You will get the web content from the homedir folder and the databases from the mysql folder.
The quickest way is to do a complete backup, and download the file. It should be in the root directory for that user.
The path and filename look like:
/home/username/backup-todaysdate_username.tar.gz
Download that, and extract the files.
You'll want the homedir folder (which is where the website is); the mysql folder contains a backup of the database in .sql format.
Then copy everything up to your new server, and import the .sql dump into MySQL using the MySQL CLI. Here are some instructions on that: Importing large sql file to MySql via command line
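A minimal sketch of those last steps on the new Ubuntu server; the backup filename follows the pattern above, and the database and docroot names are hypothetical placeholders:
# unpack the full cPanel backup
tar -xzf backup-todaysdate_username.tar.gz
# web content is in the homedir folder, database dumps in the mysql folder
cp -a backup-todaysdate_username/homedir/public_html/. /var/www/example.com/
# recreate the database and import the dump with the MySQL CLI
mysql -u root -p -e "CREATE DATABASE example_db"
mysql -u root -p example_db < backup-todaysdate_username/mysql/example_db.sql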

How to download a whole web directory with multiple download streams? [closed]

I want to download a directory with hundreds of large files (500 MB to 1.5 GB each). The only problem is that there is a download speed limit, and it takes nearly an hour just to download a single file.
What built-in command or package on Linux can download all files in a web directory with multiple download streams? If I ever have to restart, I need the program to skip files that have already been downloaded.
Read the man page for wget. It supports exactly what you want.
Note that most sites will ban you for downloading too many files too quickly. Someone is paying for that bandwidth, and if you leech too much, it becomes a tragedy of the commons.
In addition to wget, you can use axel to open multiple streams for a single file, in case you need more speed on individual files as well.
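A rough sketch of the kind of invocations meant here (the URLs are placeholders; check the wget and axel man pages for the flags that fit your site):
# mirror a web directory, skipping files already downloaded
#   -r recursive, -np don't ascend to the parent directory, -nc no-clobber (skip existing files)
wget -r -np -nc http://example.com/files/
# download one large file over several parallel connections with axel
axel -n 8 http://example.com/files/big-file.iso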

Uninstall CouchDB completely on Mac OS X [closed]

I've played around with my CouchDB configuration and I would like to start fresh. Removing the app and downloading a new one didn't remove the configuration completely: I still have to log in with my username and password, and my databases are still there.
Looking around on the internet, I can't find a solution for Mac OS X (Mavericks).
So far I've tried things like:
removing the cache folder from ~/Library/Caches
removing any related file/folder in ~/Library/Application Support/CouchDB
checking the config file ~/Library/Preferences/couchdb-server.ini, which seems to be an alias to the config file in the Application Support CouchDB folder
I also removed a couple of files whose names I don't remember.
Still, installing a freshly downloaded CouchDB doesn't give me a working application, and now I am getting this JSON message:
{"error":"unauthorized","reason":"Name or password is incorrect."}
P.S.: My system is Mac OS X Mavericks and CouchDB 1.5.1.
Well, it seems I found the missing file soon after I posted my question. For anyone who runs into this problem in the future, this is the culprit file:
/Users/kouiti/Library/Preferences/org.apache.couchdb.plist
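Putting the question's clean-up steps and this file together, a rough sketch of a full reset (quit CouchDB first; the exact cache folder name is an assumption, so double-check each path before deleting):
# remove CouchDB state left behind in the user Library (paths from the question and answer above)
rm -rf ~/Library/Caches/CouchDB                        # cache folder (name assumed)
rm -rf ~/Library/Application\ Support/CouchDB          # config and databases
rm -f ~/Library/Preferences/couchdb-server.ini         # alias to the config file
rm -f ~/Library/Preferences/org.apache.couchdb.plist   # the culprit file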

Vagrant file access issues with NFS [closed]

I'm running Vagrant with an Ubuntu Server 10.04 LTS box. The local folder on my Mac is mounted with NFS and everything works fine except for one thing.
I'm developing a PHP project that uses Twig as its template system, but whenever I modify a template on my Mac and save it, it takes two refreshes before Twig can load the template. On the first refresh I get an exception telling me that the file cannot be found; on the second refresh the template loads just fine.
When switching back from NFS to the default VirtualBox shared filesystem everything is fine, but my dev site becomes terribly slow, almost to the point of being unworkable.
Does anyone have an idea which direction to look in, or a clear solution?
