I want to execute some linux commands like docker run nginx on a remote Ubuntu server. Let's say host A using my client interface on another host B developed in symfony4 and then the server (host A) will send some info after executing the command to the client interface on host B to be displayed on it.
How to achieve this?
First, you need to log in:
ssh username@ip_address_of_server_with_symfony4
Then
cd /path/to/symfony4
Then
docker exec symfony_container_php php bin/console command:name --arguments
If you need to run a single command from your local computer, you can also use ssh -t:
ssh -t user@ip "first command; second command; third command"
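To send the command's output back to the Symfony client on host B, capture it from the ssh call and display it. A minimal sketch of that pattern follows; run_remote, deploy, and hostA are hypothetical names, and `sh -c` stands in for the real ssh invocation so the example runs locally:

```shell
#!/bin/sh
# Run a command "remotely" and capture its output for display.
# In production, replace `sh -c` with: ssh deploy@hostA (key-based auth assumed).
run_remote() {
    sh -c "$1"    # stand-in for: ssh deploy@hostA "$1"
}

OUT=$(run_remote "echo nginx: started")
echo "$OUT"
```

The captured text in OUT is what the Symfony controller would render on host B.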
I just pulled the Jenkins image from Docker Hub using the command below.
sudo docker pull jenkins
sudo docker run -p 7071:7071 -p 50000:50000 jenkins
This command runs the Jenkins image inside a Docker container.
Now the problem is that I just want to open the Jenkins console to create some sample test jobs. When I try to access it from a Windows machine, I'm not able to connect and it returns a 404 error code.
Tried to connect the Jenkins from Windows machine.
http://<ip address>:7071 -> this is failing to connect.
http://<ip address>:50000 -> this returns the Jenkins agent protocol details
Output:
Jenkins-Agent-Protocols: JNLP-connect, JNLP2-connect, JNLP4-connect,
Ping Jenkins-Version: 2.60.3 Jenkins-Session: cba34bd8 Client:
XXXXXXXX Server: YYYYYYYY
Can someone help me? I'm new to the Docker + Jenkins world and want to know how to connect to the dockerized Jenkins hosted in the Linux box.
Thanks in advance.
The main port used by Jenkins for access via the web console is 8080, so add:
docker run ... -p 8080:8080
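Put together, the corrected mapping looks like the sketch below, which just assembles and prints the command so the port change is explicit (the container web UI listens on 8080; 50000 is only the agent port):

```shell
#!/bin/sh
# Build the corrected run command: the first -p must target container port
# 8080, where the Jenkins web UI actually listens.
CMD="docker run -d -p 8080:8080 -p 50000:50000 jenkins"
echo "$CMD"
```

After running it for real, http://<ip address>:8080 should reach the console instead of returning 404.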
The situation is like this:
There's a public shared folder containing a command file on a remote server, call it ServerRemoteA. From my laptop's Win10 cmd terminal I can run \\ServerRemoteA\sharedfolder\Windows\SomeCommand.exe successfully.
Now I have SSH-logged into another Linux host, and from its terminal I need to run the Linux version, sharedfolder/Linux/SomeCommand, on the same host ServerRemoteA.
The server refused my attempt to use ssh user@host "command", although I can run the .exe version from the Windows machine.
How can I do this?
If you are connected to a remote host, the program runs on the remote host, so it does not matter whether the client is Linux or Windows. Thus, if you connect from:
Linux/Windows -> Windows: \\ServerRemoteA\sharedfolder\Windows\SomeCommand.exe
Linux/Windows -> Linux: sharedfolder/Linux/SomeCommand
If the SSH server is on Linux and you just want to run one command, you can use:
ssh -l <username> ServerRemoteA "sharedfolder/Linux/SomeCommand"
Or, if the server is on Windows, from a Linux SSH client:
ssh -l <username> ServerRemoteA "C:\\sharedfolder\\Windows\\SomeCommand.exe"
New user to GitLab and trying to set my Project up for the first time.
I've set up GitLab with Docker and think I've created a local server for it (using Docker?).
I've then created my project and added an SSH key, but when I try to use the command ssh -T git@gitlab.com it fails.
I think it's because I have a different domain instance name.
My problem is: what is my domain instance name and how do I find it out?
To access GitLab I just type localhost in the browser; besides that, I think it's linked to one of my emails, but neither works in the command.
If you reach the web GUI at localhost, then your hostname/domain for the SSH connection is localhost as well.
You should make sure to open the SSH port (22) to the container when you run it.
Add -p 2222:22 to the docker run command; this maps the container's SSH port 22 to host port 2222, because port 22 on the host is already taken by the host's own SSH daemon.
Edit
Just tested it on my computer.
Open the port for the GitLab container with something like this:
docker run -dit --name gitlab -p 2222:22 -p 8080:80 gitlab/gitlab-ce
Note the -p 2222:22; that is probably what you are missing.
You should be able to connect using SSH with this command:
ssh -T git@localhost -p 2222
Good luck.
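To avoid passing the port on every SSH or Git command, you can also add an entry to ~/.ssh/config (the gitlab-local alias is a hypothetical name):

```
Host gitlab-local
    HostName localhost
    Port 2222
    User git
```

After that, ssh -T gitlab-local works without -p, and Git remotes can use gitlab-local in place of git@localhost.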
To deploy an application on a Linux Ubuntu server, I have a bunch of SSH commands that I currently run using PuTTY. The server has a local account, serviceaccount1. In PuTTY I connect to the server as serviceaccount1 and execute the following commands one by one:
cd /home/serviceaccount1/cr-ml
script /dev/null
screen -S data_and_status
cd cr-ml/notebooks
source activate crml
unset XDG_RUNTIME_DIR
jupyter kernelgateway --api='kernel_gateway.notebook_http' --seed_uri='data_and_status_api.ipynb' --port 8894 --ip 0.0.0.0
...
...
and so on
Now I want to automate this using Jenkins. I installed the SSH plugin and configured a credential for SSH username serviceaccount1 with a private key.
Then I created a new Jenkins project, added a build step "Execute shell scripts on remote host using ssh", and added all the above commands.
When I build the Jenkins project, it gets stuck executing the 2nd command, script /dev/null.
I see the following console output:
To me, it seems the culprit is the screen -S data_and_status command. Once you start a screen, I don't think you would be able to execute the subsequent commands over the SSH connection.
Alternatively, you can try using a tool like Ansible to run a bunch of commands against a remote server.
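If Ansible is not an option, the screen step can usually be replaced by detaching the long-running process with nohup, so the non-interactive SSH session can run each command and exit cleanly. A local sketch of the pattern follows; start_detached is a hypothetical helper, and over SSH you would wrap the real jupyter command the same way:

```shell
#!/bin/sh
# Start a command in the background, detached from the terminal, with its
# output sent to a log file; print the background PID so it can be tracked.
start_detached() {
    nohup "$@" >/tmp/job.log 2>&1 &
    echo $!
}

PID=$(start_detached sleep 1)
echo "started pid $PID"
```

Because nothing waits on an interactive terminal, the Jenkins build step finishes instead of hanging the way screen does.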
I've created an Amazon EC2 AMI running CentOS Linux 5.5 and PostgreSQL 8.4. I'd like to be able to create an SSH tunnel from my machine to the instance so I can connect to the PostgreSQL database from my development machine (JDBC, Tomcat, etc.) I haven't modified the PostgreSQL configuration at all as of yet. I can successfully SSH into the instance from a command line, and have run the following command to try and create my tunnel:
ssh -N -L2345:<My instance DNS>:5432 -i <keypair> root@<My instance DNS>
I don't receive any errors when initially running this command. However, when I try and use psql to open up a connection on localhost:2345, the connection fails.
Any thoughts as to why this is happening?
The first <My instance DNS> should be localhost. And you probably don't want or need to SSH as root.
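The reason is that the middle field of -L is resolved on the instance, so local_port:localhost:remote_port tunnels to the instance itself, where PostgreSQL listens on 5432. The sketch below just prints the corrected command (key file name and ec2-user are illustrative, not from the original post):

```shell
#!/bin/sh
# -L <local_port>:<host_as_seen_from_the_instance>:<remote_port>
FWD="2345:localhost:5432"
echo "ssh -N -L ${FWD} -i keypair.pem ec2-user@INSTANCE_DNS"
# afterwards, connect with: psql -h localhost -p 2345 <dbname>
```

Note that PostgreSQL may still need listen_addresses/pg_hba.conf adjustments if it only accepts local socket connections.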