I want to automatically upload/transfer files from my Linux VPS to my web host every 5 minutes.
What I'm trying to do is upload some log files generated on my VPS to my web host so administrators can access them with an .htaccess file.
Use wput along with cron to FTP the files to your host:
wput [options] [file]... ftp://[username[:password]@]hostname[:port][/[path/][file]]
You will probably have to install the tool, as it's not included by default (at least it hasn't been on most of my installs).
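For example, a crontab entry along these lines would push a log file every 5 minutes (the log path, credentials, and host are placeholders):
# upload /var/log/myapp.log every 5 minutes (placeholder path and credentials)
*/5 * * * * wput /var/log/myapp.log ftp://ftpuser:secret@example.com/logs/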
You'll want to set up a cron job for this. The Wikipedia page for this has a nice overview of how the crontab file is laid out. However, you should check your distribution's documentation for better information (they could be using a different version or a completely different cron daemon).
The line you'd add to the system crontab (/etc/crontab, whose format includes a user field) would look something like this:
*/5 * * * * <user to run command as> <your command>
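A concrete (hypothetical) system crontab entry that runs an upload script every 5 minutes as root might be:
*/5 * * * * root /usr/local/bin/upload_logs.sh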
See also: http://www.unixgeeks.org/security/newbie/unix/cron-1.html
Hopefully your web host provides SCP or FTP servers to allow you to copy files over. How do you transfer files when you're uploading your web site files?
If it's FTP, use the ftp command:
ftp -u user:password@host/destination_folder/ sourcefile.txt
If it's SCP, use the scp command:
scp foobar.txt username@host:/some/remote/directory
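Either command can then go into a cron entry so it runs every 5 minutes, e.g. (a sketch assuming key-based SSH authentication is already set up; paths are placeholders):
*/5 * * * * scp /var/log/myapp.log username@host:/some/remote/directory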
I have two machines, both running CentOS 7 Linux.
I have installed the rsync packages on both of them and I am able to sync a directory from one machine to the other.
Right now I am doing the syncing manually; each time I want to sync I run the following line:
rsync -r /home/stuff root@123.0.0.99:/home
I was wondering if there is a way to configure rsync to sync automatically, either every so often or, preferably, whenever a new file or subdirectory appears in the home directory?
Thank you; any help would be appreciated.
If you want to rsync at a fixed interval, you can use a cron job, which can be configured to run a specific command on a schedule. If you want to run rsync whenever there is an update or modification, you can use lsyncd; see this article about using lsyncd.
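For the cron approach, an entry like this (the 10-minute interval is just an example) would run the same rsync command on a schedule:
*/10 * * * * rsync -r /home/stuff root@123.0.0.99:/home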
Update:
As links might get outdated, I will add this brief example (you are free to adjust it to whatever works best for you):
First create an SSH key on the source machine and then add the public key to the ~/.ssh/authorized_keys file on the destination machine.
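For example (a minimal sketch, assuming the defaults suit you and the destination still accepts password logins for the initial copy):
# on the source machine
ssh-keygen -t rsa -f ~/.ssh/id_rsa
ssh-copy-id root@123.0.0.99    # appends the public key to ~/.ssh/authorized_keys on the destination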
On the source machine, update the ~/.ssh/config file with the following content:
# ~/.ssh/config
...
Host my.remote.server
    IdentityFile ~/.ssh/id_rsa
    IdentitiesOnly yes
    HostName 123.0.0.99
    User root
    Port 22
...
Then configure lsyncd with the following and restart the lsyncd service (see the restart command after the config block):
# lsyncd.conf
...
sync {
    default.rsyncssh,
    source      = "/home/stuff",
    host        = "my.remote.server",
    targetdir   = "/home/stuff",
    excludeFrom = "/etc/lsyncd/lsyncd.exclude",
    rsync = {
        archive = true,
    }
}
...
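To apply the configuration, restart lsyncd; on a systemd-based system such as CentOS 7 that would be something like (the service name may differ on your install):
systemctl restart lsyncd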
You can set up an hourly cron job to do this.
rsync in itself is quite efficient in that it only transfers changes.
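A minimal sketch of such an hourly entry, reusing the command from the question:
0 * * * * rsync -r /home/stuff root@123.0.0.99:/home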
You can find more info about cron here: cron
I've just set up my first cron job to run a stock script every night.
Running it manually works fine.
It's stored in /admin/stock_update.php
The command I'm running is /usr/bin/php -q /admin/stock_update.php
But I'm getting emails saying no input file is specified?
Any ideas?
Cheers
Network services almost never expose actual paths on the server's hard disk, and even if they could, it isn't a behaviour you can rely on. So the fact that your file is located at /admin/stock_update.php on the FTP server doesn't say much about its actual location on disk, which is what local command-line utilities expect.
In PHP, you can find the on-disk path of the current file with the __FILE__ magic constant. You can create a test script:
<?php
var_dump(__FILE__);
... upload it to the same FTP location and execute it through the web server. If that's not an option because files in your FTP account are not visible from the web, you can run the file from cron and check the email.
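Once you know the real path, use it in the cron command; for example, if __FILE__ reports something like /home/youruser/public_html/admin/stock_update.php (a hypothetical path), the crontab command would become:
/usr/bin/php -q /home/youruser/public_html/admin/stock_update.php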
Do you have the CloudLinux kernel installed on that server and the CageFS filesystem? If so, try running this:
cagefsctl -w cpaneluser; cagefsctl -m cpaneluser
Then try running the cron job again.
I don't know if this was an effect of the shellshock attack which my server was victim to (or another attack that worked) but it basically enabled the hacker to overwrite my SSH config file when the server rebooted.
This new file used wget to load in a file from a website, then another library of hack functions which I guessed he then used to run hacks/DOS from my server. I caught it pretty fast and ideally want to upgrade but because I have cancer and just had a big operation it is too much effort at the moment.
Therefore I did a lot of housekeeping: changing passwords, removing shell access, reverting back to DASH, replacing the default shell for root and any other users to another folder with symbolic links, restoring the config file for SSH, and removing CGI functionality from config files, e.g.:
ScriptAlias /cgi-bin/ /home/searchmysite/cgi-bin/
allow from all
Removed AWStats and Webalizer for all Virtualmin sites.
I already had DenyHosts and Fail2Ban installed.
I also blocked in/outbound traffic to the IPs of the sites he was getting the files from.
However, it seems that since this change I have lost the visual cron manager in Webmin.
When I go to the menu item "Scheduled Cron Jobs", it says, "The command crontab for managing user Cron configurations was not found. Maybe Cron is not installed on this system?"
However I can see in the file system it exists.
When I run crontab -l or crontab -e I get "Permission Denied"
whoami shows "root"
I did think at the time of the hack this was all related and he had used SSH and a Cron job to get his hack running.
What I want to know is how I can get the CronTab manager back.
All the cron jobs are still running such as importing feeds into my websites, running scheduled emails and so on, what I don't know is how to resolve this without a full rebuild.
If I had the time and energy I would do that but I am totally drained and before this hack everything was just running smoothly and my websites which bring me in money were working fine.
They currently are still working fine and I regularly check my logs for IPs that look odd, have strong .htaccess rules for XSS/SQL/path traversal/file hacks, and ban whole countries via Cloudflare, which the site sits behind. So I don't "think" the machine is compromised at the moment even if it is old - could be wrong though!
Details of the box:
Operating system: Debian Linux 5.0
Virtualmin version: 3.98.gpl GPL
Webmin version: 1.610
Kernel and CPU: Linux 2.6.32.9-rscloud on x86_64
So if anyone can help me get my crontab manager back that would be great.
Thanks
1) Check that chattr exists; if not, download a fresh copy.
2) Type whereis crontab, then run chattr -isa /path/to/crontab (usually /usr/bin/crontab), then chmod crontab back to its original settings (see the sketch after this list).
3) Navigate to /var/spool/ and run:
chattr -isa cron
cd cron
chattr -isa crontabs
4) Remove the rogue cron entry: look in /etc/cron.weekly for any new entries the attacker added and delete them.
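A minimal sketch of steps 2 and 3, assuming the usual Debian paths (compare permissions against a known-good machine before restoring them):
whereis crontab                          # locate the binary, e.g. /usr/bin/crontab
lsattr /usr/bin/crontab                  # check for the immutable (i) attribute the attacker may have set
chattr -isa /usr/bin/crontab             # clear the immutable/append-only/secure-delete attributes
chattr -isa /var/spool/cron /var/spool/cron/crontabs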
How can we access an Apache server using a Linux command to retrieve a file?
The file to be retrieved has been copied to a directory.
Without knowing more about what you are doing, I would suggest looking into wget or curl.
For example, to copy a file available on a web server via URL to the current directory using the wget command:
wget http://www.example.com/path/to/file.txt
or, if you are accessing your web server by IP address:
wget http://192.168.1.1/path/to/file.txt
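The curl equivalent would be (-O saves the file under its remote name):
curl -O http://www.example.com/path/to/file.txt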
What is the purpose of "WGETRC" in the commands below?
-sh-3.00$ WGETRC=/hom1/spyga/spp/wgetrc_local wget --directory-prefix=/home1/spyga/spp/download ftp://127.0.0.1/outgoing/DATA.ZIP
The wgetrc_local file contains the credentials for the FTP server.
Normally I download the files from the FTP server using the command below:
-sh-3.00$ wget --ftp-user=xyz --ftp-password=12345 ftp://localhost/outgoing/DATA.ZIP
What is the difference between the above commands?
Please help me understand the commands.
Thank you.
The first command simply specifies an alternative configuration file to use instead of the default ~/.wgetrc. You could also specify it using --config=/hom1/spyga/spp/wgetrc_local as an argument to wget.
This file can contain wgetrc commands that change the behaviour of wget. In this case it's probably done so the username and password don't have to be supplied on the command line. Especially on multiuser systems it is a security risk to pass passwords on the command line, as they can possibly be viewed by other users, so it's a little better to store them in a file with restricted access permissions instead. This way only processes started by the owner of the file can access it.
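For example, the wgetrc_local file probably contains something along these lines (a sketch; ftp_user and ftp_password are standard wgetrc commands, the values are placeholders):
# wgetrc_local
ftp_user = xyz
ftp_password = 12345
Restricting it with chmod 600 ensures only the owner can read it.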
Another use of the wget startup file is to change its default settings, user agent, etc.
It's all documented here.