Can't connect to "Jenkins-On-Azure" - azure

I created a Jenkins Linux VM on Azure in a new resource group.
I followed the steps described here:
Create a Jenkins server on an Azure Linux VM from the Azure portal.
So I ran the command ssh -L 127.0.0.1:8080:localhost:8080 jenkinsadmin@jenkins2517454.eastus.cloudapp.azure.com
(changing the username and DNS name to my own) on my Linux VM, and it seems fine (no errors).
Now, whenever I try to connect from my own computer (not on Azure) on port 8080, I get the following message on the Linux VM: channel 2: open failed: administratively prohibited: open failed, and it doesn't let me log in to Jenkins.
How can it be solved?
Thank you

This is not an NSG issue. You don't need to add port 8080 to the Azure NSG rules.
If you want to connect from your computer via http://localhost:8080/, you need to create an SSH tunnel on your local computer. You could do it with PuTTY.
Configure the Tunnel
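If you use PuTTY, the tunnel settings under Connection > SSH > Tunnels map to the same thing as the ssh -L command from the question. A minimal sketch using plink (PuTTY's command-line client), reusing the username and DNS name from the question:

# run this on your local computer, not on the Azure VM:
# forward local port 8080 to port 8080 on the Jenkins VM
plink -ssh -L 8080:localhost:8080 jenkinsadmin@jenkins2517454.eastus.cloudapp.azure.com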
Also, you could install Linux on Windows. Please refer to the following steps:
1. Install Linux on Windows.
2. Open PowerShell and execute bash.
3. Execute sudo -i and ssh -L 127.0.0.1:8080:localhost:8080 jenkinsadmin@jenkins2517454.eastus.cloudapp.azure.com
Now I can access http://localhost:8080/ on my local computer. (The default user name is admin.)
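To verify the tunnel is up before opening a browser, a quick check from the local machine (assuming the tunnel above is active) is:

# should return an HTTP response from Jenkins through the tunnel
curl -I http://localhost:8080/login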

In order to access Jenkins from an external network, you need to "add inbound port rule" as follows:
For more details, refer to "Create a Jenkins server on an Azure Linux VM from the Azure portal".
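If you do want to reach Jenkins directly from an external network, the inbound port rule can also be created with the Azure CLI instead of the portal. A rough sketch, with placeholder resource group and NSG names that you would replace with your own:

# allow inbound TCP 8080 on the VM's network security group (names are placeholders)
az network nsg rule create --resource-group MyResourceGroup --nsg-name MyJenkinsNsg \
  --name allow-jenkins-8080 --priority 1010 --access Allow --protocol Tcp \
  --destination-port-ranges 8080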

Related

Nginx refuses ssh, locked outside

I deployed an Ubuntu VM in the Microsoft Azure cloud. I might have forgotten to allow SSH connections when I set up the firewall. So now whenever I try to SSH into my VM I get an SSH timeout. I checked using nmap and see that only the HTTP and HTTPS ports are open. Is there any way to reconfigure the firewall so I can allow SSH? Note it's not Azure that's blocking the SSH connection, it's my nginx server itself. I set nginx to auto-restart when restarting the VM, so that won't be a solution.
In the worst case I would just delete the VM and make a new one, but that would mean I have to reinstall everything.
Thanks for helping.
Edit: I will just make a new VM.
There is a panel inside the Azure portal where you can execute a remote command. I used this to execute sudo ufw allow ssh, and now I can connect to the VM again!
Since the Run command option might not be available for your resource group and your access rights, another possibility is to use the Azure CLI command-line tool: if you have it installed, log in via az login, and then type:
az vm run-command invoke -g YourResourceGroup \
-n YourVM_Name --command-id RunShellScript \
--scripts "sudo ufw allow ssh"

Connecting to Azure Container Services from Windows 8.1 Docker

I've been following this tutorial to set up an Azure container service. I can successfully connect to the master load balancer via putty. However, I'm having trouble connecting to the Azure container via docker.
~ docker -H 192.168.33.400:2375 ps -a
error during connect: Get https://192.168.33.400:2375/v1.30/containers/json?all=1: dial tcp 192.168.33.400:2375: connectex: No connection could be made because the target machine actively refused it.
I've also tried
~ docker -H 127.0.0.1:2375 ps -a
This causes the docker terminal to hang forever.
192.168.33.400 is my Docker machine IP.
My guess is I haven't set up the tunneling correctly, and this has something to do with how Docker runs on Windows 8.1 (via a VM).
I've created an environment variable called DOCKER_HOST with a value of 2375. I've also tried changing the value to 192.168.33.400:2375.
I've tried the following tunnels in putty,
1. L2375 192.168.33.400:2375
2. L2375 127.0.0.1:2375
3. L22375 192.168.33.400:2375
4. L22375 127.0.0.1:2375 (as shown in the video)
Does anyone have any ideas/suggestions?
Here are some screenshots of the commands I ran:
We can follow these steps to set up the tunnel:
1. Add the Azure Container Service FQDN to PuTTY:
2. Add the private key (PPK) to PuTTY:
3. Add the tunnel information to PuTTY:
Then we can use cmd to test it:
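For reference, the PuTTY tunnel in the screenshots is equivalent to a plain ssh port forward run from the local machine. A sketch, assuming a Docker Swarm-based ACS cluster whose master load balancer accepts SSH on port 2200 (substitute your own username and master FQDN):

# forward local port 2375 to the Docker endpoint on the cluster, staying in the background
ssh -fNL 2375:localhost:2375 -p 2200 azureuser@mymastermgmt.westus.cloudapp.azure.com
# point the Docker client at the local end of the tunnel
docker -H localhost:2375 ps -a

Equivalently, setting DOCKER_HOST to tcp://localhost:2375 (rather than just 2375) lets a plain docker ps -a use the tunnel.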

Unable to connect with Azure Container Services - Kubernetes

I am working on setting up an environment for deploying microservices.
I have gotten as far as building my code and deploying it to a registry, but I am having problems running it in Azure Container Services.
I am following this guide to connect to ACS: https://learn.microsoft.com/en-us/azure/container-service/container-service-connect
But I fail on the step: Download Cluster Credentials.
Using the given command
az acs kubernetes get-credentials --resource-group=<cluster-resource-group> --name=<cluster-name>
Of course, changing the resource group and cluster name to the correct names from my portal. I get an error:
[WinError 10049] The requested address is not valid in its context
(If I change the resource group or cluster name to something else I get other errors, so it seems it can find those at least.)
When I try to search for the error it seems to be some IP address problem, but I can't figure out what to do. I tried running the same command from another network (from home) to make sure the work firewall is not blocking something, but I get the same error.
Any help appreciated!
This command copies the cluster credentials to your machine. In the background, it uses SSH to connect to your cluster VM and copies the credentials.
So you should ensure you can SSH to the master VM manually. If you cannot SSH to the master VM manually, the az command cannot do it either. You can get your master-dns-name in the Azure portal.
ssh -i id_rsa <user>@<master-dns-name>
Note: if the az command does not work but you can SSH to the master VM, you can download the credentials to your machine yourself; they are the same. You can check your link about this.
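If you do copy them by hand, a sketch of that manual step (assuming the kubeconfig sits at .kube/config in the admin user's home directory on the master, as the ACS guide describes) is:

# copy the cluster credentials from the master into your local kubectl config
scp -i id_rsa <user>@<master-dns-name>:.kube/config ~/.kube/config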
You also need to check your Azure CLI version. You can use the following command:
az --version
My version is 2.02. It works for me.
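Once the credentials are in place (via az acs kubernetes get-credentials or the manual copy above), a quick way to confirm the connection works is:

# should list the master and agent nodes of the cluster
kubectl get nodes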

Enabled UFW, now can't connect to my ubuntu azure vm

I turned on UFW on my Ubuntu Linux VM in Azure, and now I can't connect to it over SSH (or anything else), on either the private or the public IP.
Is there any way for me to connect to my VM now? E.g., the equivalent of an iLO interface for physical machines?
Here is an MSDN blog describing exactly this situation.
In a nutshell:
Log on to the Azure portal
VM Name > Extensions > Add > Select “Custom Script for Linux” > Create
Upload the bash script I've appended below. Call it ufw_disable.sh
Set the command as sh ufw_disable.sh
Click OK, and wait for the script to deploy and execute.
The script will be run as root, so there is no need to use sudo inside the script (in fact, this will cause things to fail).
ufw_disable.sh:
ufw disable
ufw status
Even simpler with the latest Azure...what a lifesaver:
Azure Portal > Your VM > Run command > RunShellScript
In the textbox for Linux Shell Script, type:
ufw disable
ufw status
Click Run button. Done.
You can allow your new IP address from the Azure Cloud Shell:
az vm run-command invoke -g VMResourceGroup -n VM --command-id RunShellScript --scripts "ufw allow from 74.125.90.78 to anyport 22"
Change the IP address and the port to your own values.
The Azure portal provides the easiest way to get into the serial console. Follow these steps:
Azure Portal > Your VM > Support + Troubleshooting > Serial Console.
Now, even if your firewall is blocking SSH, you can access this serial console and simply disable the firewall using sudo ufw disable, or add port 22 (SSH) to the firewall rules.
The method provided by @Vince somehow did not work for me.
An even simpler solution:
Set up your serial console
(Pic of the serial console path)
First, set up a username and password by going to Support & Troubleshooting -> Reset password
Open the serial console -> log in
sudo ufw disable [bamm...]
Run this on the machine as root:
ufw allow ssh
A stricter approach would be:
ufw limit ssh
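Either way, it is worth confirming what UFW ended up with (again as root):

# show the active rules, including the SSH rule just added
ufw status verbose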

Could not connect to VM created with Azure command line tools

I am trying to use the Azure Command Line Tools (http://www.windowsazure.com/en-us/manage/linux/how-to-guides/command-line-tools/) to create an Ubuntu 12.04 VM.
I am issuing the following commands:
azure vm create xxxxxxxxxx.cloudapp.net b39f27a8b8c64d52b05eac6a62ebad85__Ubuntu-12_04_1-LTS-amd64-server-20121218-en-us-30GB azureuser mypassword --location "West Europe"
azure vm endpoint create xxxxxxxxxx 22 22
azure vm start xxxxxxxxxx
This seems to create and start the VM successfully.
I try to connect via SSH to the VM using the following command (on Mac OS X)
ssh azureuser#xxxxxxxxxx.cloudapp.net
However, when I try to SSH into the VM, it seems that password authentication is disabled on the VM as I am getting the following error:
Permission denied (publickey).
I would like to add that connecting via SSH to an Ubuntu VM created through the Azure Management portal works absolutely fine. This issue only appears when the VM was created through the Azure command line tools.
Has anybody encountered a similar issue and knows how to solve it?
You need to use the --ssh switch on your azure vm create command to enable SSH. Adding the endpoint has no effect.
According to the Windows Azure command-line tool for Mac and Linux documentation, you can only add SSH connectivity via the Azure CLI when the virtual machine is created.
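For illustration, the create command from the question would look roughly like this with SSH enabled; the --ssh switch takes the public SSH port, but treat the exact syntax as an assumption and check azure vm create --help for your CLI version:

azure vm create xxxxxxxxxx.cloudapp.net b39f27a8b8c64d52b05eac6a62ebad85__Ubuntu-12_04_1-LTS-amd64-server-20121218-en-us-30GB azureuser mypassword --location "West Europe" --ssh 22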
