I have a bash script that backs up a database and sends it to another server. When I run the script over SSH (as root) it sends the file correctly, but when it runs from the cPanel cron I get this error:
cd: Fatal error: pseudo-tty allocation failed: No such file or directory
put: Fatal error: pseudo-tty allocation failed: No such file or directory
It looks like it fails when lftp changes to the uploads folder.
Cron
/bin/sh /home/test/backup/script.sh >> /home/test/backup/log.txt 2>&1
Bash
/bin/lftp sftp://user:pass@domain.com:22/uploads -e "put $FILE2; bye"
I assume your issue is this: you are logging in via SFTP using an SSH key (otherwise every SFTP attempt would prompt for a password, which would break the cron job). The key is probably saved under the root user, but the cron job runs as a cPanel user (unless you put it directly in root's crontab). If it runs as a cPanel user and that user doesn't have the SSH key, the cron job hangs waiting for the SFTP password. Make sure the private key you use to SFTP as root is also available to the cPanel user the cron job runs under. It should work then.
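If the job must keep running as the cPanel user, lftp can also be pointed at a specific private key through its sftp:connect-program setting. A minimal sketch, assuming a hypothetical key at /home/test/.ssh/id_rsa and the same connection details as above:
# Sketch: tell lftp's sftp backend which private key to use, so the
# cron user does not depend on root's default key location.
# /home/test/.ssh/id_rsa is a hypothetical path -- adjust to your setup.
/bin/lftp -e "set sftp:connect-program 'ssh -a -x -i /home/test/.ssh/id_rsa'; put $FILE2; bye" sftp://user:pass@domain.com:22/uploads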
Related
I am using a bash script on Linux to transfer files to a server. The script runs from cron and I have redirected its output to a file, but the logs don't tell me whether the file actually reached server B or not.
This is the cron:
1>>/home/owais/script_test/logs/res_sim_script.logs 2>>/home/owais/script_test/logs/res_sim.logs
And the FTP is as below:
cd ${dir}
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
lcd $dir
cd $destDir
bin
prompt
put FILENAME
bye
END_SCRIPT
The only thing that I get in the logs is:
Local directory now Directory_Name
Interactive mode off.
Instead of using FTP, there is rsync. Rsync is a fast and extraordinarily versatile file-copying tool. It can copy locally, to or from another host over any remote shell, or to or from a remote rsync daemon.
More information at the following webpage, https://linux.die.net/man/1/rsync
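A minimal sketch of what that could look like here, with an assumed destination host and path (user, serverB, and /destination/dir are placeholders, not from the original script):
# Sketch: copy the file over SSH with rsync, log the transfer, and
# record success or failure so the cron log answers the original question.
rsync -avz --log-file=/home/owais/script_test/logs/rsync.log \
    "${dir}/FILENAME" user@serverB:/destination/dir/ \
    && echo "transfer OK" \
    || echo "transfer FAILED"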
I have used ftp -inv Host << EOF >> LogFilePath and it worked. Thank you all for the support
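For reference, a minimal sketch of that approach (the log file name is a placeholder; host, user, and directories are the same variables as in the script above):
# Sketch: -n suppresses auto-login, -i turns off interactive prompting,
# and -v makes ftp print every server response, so the log shows
# whether the put actually succeeded.
ftp -inv $HOST <<END_SCRIPT >> /home/owais/script_test/logs/ftp_transfer.log 2>&1
user $USER $PASSWD
lcd $dir
cd $destDir
bin
put FILENAME
bye
END_SCRIPT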
I am trying to send a MySQL backup from server1 to server2 for backup storage.
My command in the shell:
scp myfile user@root:/file destination /
I use ssh-agent for the key passphrase.
In crontab I just call my shell script. When I run it manually with bash myshell.sh the file is sent successfully, but from crontab it fails and root gets mail saying Permission denied (publickey,gssapi-with-mic,password).
root /usr/Backup/myshell.sh
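Since cron jobs don't inherit the ssh-agent environment, one common workaround (sketched below with hypothetical key path, file, and host names) is to point scp at a key file that needs no passphrase instead of relying on the agent:
#!/bin/bash
# myshell.sh -- sketch: use an explicit private key instead of ssh-agent,
# because the agent's environment variables are not available under cron.
# /root/.ssh/backup_key, myfile.sql, and server2 are hypothetical names.
scp -i /root/.ssh/backup_key /usr/Backup/myfile.sql root@server2:/backups/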
I want to transfer some files from my local machine to a remote one, the way GitHub does it, and I want it to happen smoothly from a shell script. I tried creating a shell script that automates passwordless SSH authentication, but the first run exposes my remote server password. I don't want to do it that way; with git we never see the server's password. Is there any possible way to do this?
I used this article script to automate ssh login. http://www.techpaste.com/2013/04/shell-script-automate-ssh-key-transfer-hosts-linux/
As I mentioned, you can use the scp command, like this:
scp /local_dir/some*.xml remote_user@remote_machine:/var/www/html
This requires that you can connect to the remote machine without a password, using SSH key authentication only.
Here is a link: http://linuxproblem.org/art_9.html to help you.
The important steps (automatic login from host A / user a to host B / user b):
a@A:~> ssh-keygen -t rsa
a@A:~> ssh b@B mkdir -p .ssh
a@A:~> cat .ssh/id_rsa.pub | ssh b@B 'cat >> .ssh/authorized_keys'
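On systems where OpenSSH's ssh-copy-id is available, the last two steps can usually be collapsed into a single command:
a@A:~> ssh-copy-id b@B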
I want to run a shell script over SSH that uses a resource from another machine, while the script itself lives on a different machine; all of them are on the same network. I don't want to copy the resource to the local machine.
Note: the shell script takes a .txt file as input.
If you have script.sh on server1 and file.txt on server2, you can connect through ssh to server1, and then do:
[user@server1]$ ssh user@server2 "cd mydir && cat file.txt" | ./script.sh
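Note that this pipes the contents of file.txt into script.sh, so the script needs to read its input from stdin rather than expecting a file name as an argument. A minimal sketch of such a script (the processing line is just a placeholder):
#!/bin/bash
# script.sh -- sketch: consume the .txt data arriving on stdin.
while IFS= read -r line; do
    echo "processing: $line"   # placeholder for the real work
done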
Try this:
ssh USER_NAME@HOST_ADDRESS "BASH_SCRIPT_FILE_PATH"
You will need to provide the password whenever required.
If your script is on Machine A, you can't run it on Machine B without copying it over. First, copy the script to Machine B using scp:
[user@machineA]$ scp /path/to/script user@machineB:/home/user/path
Then, just run the script:
[user@machineA]$ ssh user@machineB "/home/user/path/script"
This will work if you have given the script executable permission.
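Put together, the same idea as a one-liner (using the placeholder paths and names above):
[user@machineA]$ scp /path/to/script user@machineB:/home/user/path/ && ssh user@machineB "chmod +x /home/user/path/script && /home/user/path/script"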
OR
Try this one:
<hostA_shell_prompt>$ ssh user@hostB "ls -la"
That will prompt you for a password, unless you have copied your hostA user's public key into the authorized_keys file in the ~/.ssh directory of user's home on hostB. That allows passwordless authentication (if it is accepted as an auth method in the SSH server's configuration).
I don't fully understand your question. The other answers cover "How do I run a remote script?", but I think the question is that the remote script has to take a remote file, though I'm not sure about this.
Log in to the remote PC using ssh.
Install sshfs if it is not already installed.
Then mount the other remote machine's directory (the one containing the file your script needs) onto a local directory. This can be done using sshfs.
Then run the script with the file from the locally mounted directory.
Then unmount the directory when you have finished.
It is a somewhat long procedure; a sketch is shown below.
Mounting a remote directory with sshfs:
man sshfs
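A minimal sketch of those steps (host names, paths, and the mount point are placeholders):
# 1. mount the remote directory that holds the input file
mkdir -p ~/remote_data
sshfs user@machineC:/data ~/remote_data
# 2. run the script against the locally mounted path
./script.sh ~/remote_data/input.txt
# 3. unmount when finished
fusermount -u ~/remote_data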
I am running an automation script that automates the login and runs some other commands on a remote target using plink. I used the following approach to log in automatically and save the RSA host key:
user@ubuntu~$ echo -e 'y\n' | plink root@<target ip> -pw <password> "pwd"
This command saves the key when run from the command line, but when run from the script it is inconsistent about saving the RSA key. Assuming the username and password passed are correct, it errors out with Connection refused, because the 'y' is not delivered to the prompt.
Many times it prompts to accept the key again and again, since my script runs many consecutive plink commands. Ideally it should not ask for user input more than once. I checked, and the 'sshhostkeys' file was not present in the ~/.putty folder, which is why plink prompts for input every time it is run.
Has anyone faced this problem before? Is there any fix, hack, or workaround for it?
P.S.: Using expect scripts, manually saving a profile with PuTTY, or manually running the plink command once and saving the key are ruled out (not to be considered).
I got the solution. The issue was actually with the permissions on the $HOME/.putty directory: the folder was owned by root, so when I ran
user@ubuntu~$ echo -e 'y\n' | plink root@<target ip> -pw <password> "pwd"
I kept getting the '(y/n)' prompt because the key was not getting saved in the .putty folder due to the permission issue. Each run tried to create the sshhostkeys file but could not, since the user lacked the needed permission, so plink asked to save the key again every time. The issue is resolved either by giving read/write/execute permission to others on the directory (sudo chmod 707 ~/.putty) or by changing the directory's ownership to the user running the script with chown.
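For reference, the ownership approach sketched as a command (assuming the script runs as the non-root user shown in the prompt above):
# give the normal user ownership of the PuTTY state directory so plink
# can create and update the sshhostkeys file there
sudo chown -R user:user ~/.putty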