I have deployed a LAMP stack to a virtual machine in Azure, by following this tutorial:
https://learn.microsoft.com/en-us/azure/virtual-machines/linux/tutorial-lamp-stack
It's all up and running. However, I can't figure out how to manage the files on the server, and/or copy/upload files to the server.
I can SSH into the VM using the Azure Cloud Shell, but I don't seem to have access to my local files that way. So I installed the Azure CLI on my local machine, but when I try to open an SSH session to the server I get 'Permission denied (publickey)'.
I've looked into secure copy (scp) and have tried connecting to the server with PuTTY and WinSCP, but the error I get is 'No supported authentication methods available (server sent: publickey)'.
I'm new to Apache and just can't figure out how to list the files on the server or manage them at all...
When you use secure copy (scp), there is one point you should pay attention to. If you created the Azure VM with azureuser as the user name, you can use the command scp /path/file azureuser@domainName:/home/azureuser/filename to copy a file. Because you only have the permissions of the user "azureuser", you can only copy files from outside into the VM directory /home/azureuser, whether you authenticate with a password or an SSH public key.
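For example, a minimal copy from a local machine could look like the following sketch (the file name and VM DNS name are placeholders, not values from the question):
# copy a local file into azureuser's home directory; -i points at the matching private key
scp -i ~/.ssh/id_rsa ./backup.tar.gz azureuser@myvm.westeurope.cloudapp.azure.com:/home/azureuser/backup.tar.gz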
Update
If you created the Azure VM with an SSH public key, you need to store the matching private key on the machine from which you want to connect to the VM. For example, if you want to connect to the VM from a local Windows 10 machine, the key should be stored in the directory "C:\Users\charlesx\.ssh". Then you can connect to the VM, and the scp command works the same way.
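As a hedged sketch of what that looks like in practice (the user name and IP address below are placeholders):
# connect from Windows 10, pointing at the private key explicitly
ssh -i C:\Users\charlesx\.ssh\id_rsa azureuser@40.112.0.1
# scp takes the same -i flag for the key
scp -i C:\Users\charlesx\.ssh\id_rsa .\index.html azureuser@40.112.0.1:/home/azureuser/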
I solved this by using PuTTY and WinSCP. Before, I had been using Azure Cloud Shell commands to create the VM and generate the SSH keys, so I could connect to the VM from Cloud Shell; but since I didn't know where the auto-generated keys were stored, I couldn't connect from my local machine.
My solution was to create the VM through the Azure portal UI. I used PuTTYgen to generate an SSH key pair on my local machine, then I pasted the public key into the Azure UI when creating the VM. Once the VM was provisioned in Azure, I could connect to it using PuTTY and install LAMP and any other command-line stuff that way.
I also used WinSCP to copy the files to where I wanted - I could have done it command-line with scp, but I'm a visual person and it was useful to be able to see the directory structure that had been created. So a combination of the two worked well for me.
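For reference, the command-line equivalent with PuTTY's pscp would look roughly like this (the key file, folder, and host name are placeholders):
# recursive copy of a local folder, using the .ppk private key generated by PuTTYgen
pscp -i C:\keys\myvm.ppk -r .\site azureuser@myvm.westeurope.cloudapp.azure.com:/home/azureuser/site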
I'm logged in as root on my Google Cloud Compute instance (Linux) and want to transfer a file to my local desktop. I can do this through the browser as specified here https://cloud.google.com/compute/docs/instances/transfer-files, but only if I'm not root; I cannot do it as root.
Why is this, and how can I allow downloading as root? At the moment I have to cp the file from root to my own user as an intermediate step before downloading.
SSH from the browser lets you connect to a Compute Engine virtual machine (VM) instance from within the Google Cloud Console.
Compute Engine manages your SSH keys for you whenever you connect to a Linux instance from your browser, creating and applying SSH key pairs when needed. You cannot manage the SSH keys that are used to connect from the browser. Instead, user access to connect from the browser is controlled by Cloud Identity and Access Management roles [1].
To connect through the browser, you must be a project member who is a compute instance admin [2]. The "root" user in the VM is internal to the VM; it is not a project member, and Google does not have any access to the VM as its "root" user.
After you (a project member) have been granted access, you can connect to a Linux instance directly from your web browser in the Cloud Console, transfer files [3] to or from the VM, and then copy the files to the required path (see the sketch after the references below).
[1] https://cloud.google.com/iam/docs
[2] https://cloud.google.com/compute/docs/ssh-in-browser
[3] https://cloud.google.com/compute/docs/instances/transfer-files
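As a hedged sketch of the intermediate-copy workaround (the instance name, zone, and file path are placeholders):
# inside the VM: copy the file out of root's home and hand it to your own user
sudo cp /root/report.txt ~/report.txt && sudo chown "$USER" ~/report.txt
# on your local machine: pull it down with the Cloud SDK
gcloud compute scp my-instance:~/report.txt ./report.txt --zone=us-central1-a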
I am converting a service to run on a Linux Container. Currently, the service runs in IIS in a Windows VM.
It runs as a Lan User that has permissions to the database. Thus the connection string uses Integrated Security.
But Containers cannot join a domain. So, as I understand it, that option is out.
I researched this for Windows Containers and found that it supports running as a Group Managed Service Account (gMSA) on the container host, and that calls made as "Network Service" are swapped to the gMSA. (Allowing use of a domain user via the container host.)
But I cannot seem to find a similar feature for Linux containers.
Do all processes run in Linux containers just put usernames and passwords in to their database connection strings?
Or is there a better way to convey identity in a Linux Container?
To give a few more details on my particular setup:
Running a Linux container
Running .NET Core 2.2
Running in Kubernetes (eventually)
Database is Microsoft SQL Server Running on Windows
It would help to know a bit more about your setup, but with the information at hand there are three options as I see it.
Option 1:
Manage the credentials with Docker secrets, as per
https://docs.docker.com/engine/swarm/secrets/
# rotate the MySQL password inside the container, reading old and new values from Docker secrets
docker container exec <CONTAINER_ID> \
bash -c 'mysqladmin --user=wordpress --password="$(< /run/secrets/old_mysql_password)" password "$(< /run/secrets/mysql_password)"'
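To show how a secret reaches the container in the first place, a minimal sketch (requires Swarm mode; the secret value, secret name, and image are placeholders):
# create the secret from stdin
printf 'S3cretValue' | docker secret create mysql_password -
# attach it to a service; it appears inside the container as /run/secrets/mysql_password
docker service create --name db-client --secret mysql_password myimage:latest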
Option 2:
Depending on what kind of DB you're using, you could add the password to the client configuration file, for example in my.cnf for MySQL.
[client]
password = 123
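With the password stored that way (e.g. in ~/.my.cnf, readable only by the service user), the client no longer needs it on the command line; a small sketch with placeholder host and database names:
# lock the file down, then connect without -p; mysql reads the [client] password from the config
chmod 600 ~/.my.cnf
mysql -h db.example.com -u wordpress wordpress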
Option 3:
Depending on your network stack, you could set the permissions in the database instead, allowing the container's IP address access to the database. I would, however, recommend one of the other options.
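For a MySQL-flavoured sketch of host-based permissions (the subnet, user, password, and database names are placeholders; for SQL Server you would use firewall rules instead):
# allow a database user to connect only from the container subnet
mysql -u root -p -e "CREATE USER 'app'@'10.0.0.%' IDENTIFIED BY 'S3cret!'; GRANT SELECT, INSERT, UPDATE ON appdb.* TO 'app'@'10.0.0.%';"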
Since it is a Linux environment and I believe you want to use Windows authentication, you can use the equivalent AD authentication.
Check here
This tutorial explains how to configure SQL Server on Linux to support Active Directory (AD) authentication, also known as integrated authentication. For an overview, see Active Directory authentication for SQL Server on Linux.
This tutorial consists of the following tasks:
Join SQL Server host to AD domain
Create AD user for SQL Server and set SPN
Configure SQL Server service keytab
Secure the keytab file
Configure SQL Server to use the keytab file for Kerberos authentication
Create AD-based logins in Transact-SQL
Connect to SQL Server using AD Authentication
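Once those steps are done, the client side is small; a hedged sketch (the realm, host, and user are placeholders):
# obtain a Kerberos ticket for the AD user, then connect with integrated authentication
kinit aduser@CONTOSO.COM
sqlcmd -S sqlhost.contoso.com -E
# -E tells sqlcmd to use the Kerberos ticket instead of a username and password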
Update
I don't think Windows domain integrated authentication can be used here, so there can't be any integrated authentication in the container. Try a DSN so that your code does not contain a username and password: https://www.easysoft.com/products/data_access/odbc-sql-server-driver/getting-started.html
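A minimal DSN sketch in /etc/odbc.ini, assuming the Microsoft ODBC driver is installed and registered in /etc/odbcinst.ini (the server and database names are placeholders; Trusted_Connection only works if Kerberos is set up as above):
[MSSQL_APP]
Driver = ODBC Driver 17 for SQL Server
Server = sqlhost.contoso.com,1433
Database = appdb
Trusted_Connection = yes
The connection string in code then shrinks to "DSN=MSSQL_APP".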
I'm new to Azure; I wanted to take advantage of being able to run PrestaShop (e-commerce software), and the Azure Marketplace has a single-VM plan. I followed this video and got it up and running. The trouble is that to log in to the site's admin interface you need to know the secret folder that is randomly created by the installer. I have tried Azure Storage Explorer, but nothing useful is displayed. I also tried to log in using PuTTY and SSH, but keep getting access denied. I suspect I need to configure an endpoint for port 22, as described here, in order to get FTP working, but apparently this is not possible with a free subscription (?).
Any help as to how I can find that folder name would be appreciated.
With an Azure Free Trial subscription, I can successfully log in to the PrestaShop Azure Linux VM without any issue.
Note: No need to configure an endpoint for port 22.
To connect to your Linux virtual machine using SSH, use the following command: ssh username@IPAddress, then enter your password.
If you are facing an issue with your login, you can reset the password.
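If needed, the password can be reset from the Azure CLI, and once you are logged in, the renamed admin folder can be located with find; a hedged sketch (the resource group, VM name, and web root path are assumptions and vary by image):
# reset the VM user's password from your local machine
az vm user update --resource-group myRG --name myVM --username azureuser --password 'NewP@ssw0rd!'
# then, over SSH inside the VM, look for PrestaShop's randomly named admin directory
sudo find /var/www/html -maxdepth 2 -type d -name 'admin*'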
I deployed Magento directly on the Microsoft Azure platform using Bitnami (Linux platform). Everything works fine, except that I can't access the physical directories of the Linux machine. Because of that, I'm having trouble installing a theme.
If you want access to the filesystem inside the machine where you're running your application, you should connect to your Azure instance via SSH. For example, you can follow this guide.
Also, if you want to know how to obtain your credentials, you can follow this guide.
Get your key file or credentials to access the server via SSH / SFTP.
Confirm the SSH port (22 or 8888).
In Bitnami, Magento is located at /opt/bitnami/apps/magento/htdocs/.
Make sure the firewall settings allow the SSH port.
You can find all the information here.
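Putting the list above together, a typical session looks roughly like this (the key file and server IP are placeholders; Bitnami images usually use the bitnami user):
# connect with the downloaded key, then inspect the Magento web root
ssh -i bitnami_key.pem bitnami@203.0.113.10 -p 22
ls -la /opt/bitnami/apps/magento/htdocs/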
I have created a new Ubuntu VM running Docker on Azure. I used a password as the authentication method. I received the message that the VM has been created and is running. However, the Connect button is disabled and only Restart and Stop are enabled. I'm not sure how to resolve this issue.
RDP (Remote Desktop Protocol) is a proprietary protocol used for Windows, so it cannot be used to connect to a Linux VM (virtual machine) remotely; that is why you cannot connect to the Linux VM that way.
You'll need to install an SSH client on the computer you want to use to log on to the virtual machine. There are many SSH client programs that you can choose from.
Please follow the links below:
https://azure.microsoft.com/en-in/documentation/articles/virtual-machines-linux-classic-log-on/
https://azure.microsoft.com/en-in/documentation/articles/virtual-machines-linux-classic-remote-desktop/
http://www.codeproject.com/Articles/682948/Remote-desktop-connection-to-Ubuntu-VM-in-the-Azur