I am trying to get credentials for my Azure Kubernetes cluster. I ran the script to fetch the details in Azure Cloud Shell and got a .config file. Is there a way to download the file from my Azure Cloud Shell session?
At the cloud shell prompt just type: download yourfilename
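For example, assuming the credentials landed in the default ~/.kube/config (the path az aks get-credentials writes to), you could copy the file under a plain name and pull it down:

    # copy the kubeconfig (assumed default path) to a plainly named file
    cp ~/.kube/config ./kubeconfig.yaml
    # Cloud Shell's built-in download command sends the file to your browser
    download kubeconfig.yaml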
When you use Azure Cloud Shell, you need to create an Azure File Share or use an existing one. Cloud Shell mounts the file share into the session; you can run the mount command to see the mount path.
You can copy the .config file to a path under that mount, for example /home/RG/clouddrive/.cloudconsole, and then download the file from the File Share.
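A one-line sketch, assuming the kubeconfig sits at the default ~/.kube/config:

    # copy the kubeconfig into the clouddrive mount so it lands in the Azure File Share
    cp ~/.kube/config /home/RG/clouddrive/.cloudconsole/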
List your files:
[Screenshot: Azure Cloud Shell file listing]
You may notice the clouddrive folder in the listing above. This folder is mounted to an Azure File Share, as described here: https://learn.microsoft.com/en-us/azure/cloud-shell/persisting-shell-storage#how-cloud-shell-storage-works
Get Azure File Share information:
[Screenshot: Azure File Share]
With the Get-CloudDrive command you obtain the Azure File Share metadata that will help you find the download page for your files in the Azure Portal. Alternatively, you may use the df command.
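For example, in the PowerShell experience of Cloud Shell (df works in the bash experience):

    # PowerShell Cloud Shell: show the storage account, file share, and resource group backing clouddrive
    Get-CloudDrive
    # bash Cloud Shell: the file share appears as a mounted filesystem
    df -h | grep clouddrive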
Log in to the Azure Portal and locate the Azure File Share:
[Screenshot: Azure File Share in the Azure Portal]
Related
I have a requirement to copy a file from the C: drive of a remote desktop (RDP) server to Azure Blob Storage.
The RDP server is accessible only through the Jump box.
How can I get the file that is on the RDP server to Azure Storage, and which linked service can I use to create the connection? Is there a straightforward way in Azure to do this, or does some workaround need to be done?
Thanks in advance!
Upload the file to a file share
The easiest way would be to mount a file share directly to your machine.
You can find detailed instructions on how to do so:
https://learn.microsoft.com/en-us/azure/storage/files/storage-files-quick-create-use-windows
To sum the article up, the instructions are:
In the Azure portal, create a file share in the storage account.
From the storage account, click Connect.
A pane will pop up on the right. Choose a drive letter that is unused on your VM and then copy the command (the sketch after these steps shows roughly what it looks like).
Paste the command into a PowerShell terminal in the virtual machine.
Once the file share is mounted, you can simply copy your file to the drive and it will be uploaded to Azure.
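As a rough sketch, the classic equivalent of the generated command looks like this (storage account name, share name, and key are placeholders; the portal gives you the exact PowerShell):

    :: map the file share to an unused drive letter, then copy the file onto it
    net use Z: \\mystorageaccount.file.core.windows.net\myshare /user:AZURE\mystorageaccount <storage-account-key>
    copy C:\data\myfile.txt Z:\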
Upload the file to Blob Storage
You would have to install the Azure CLI:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-cli
and then use the az storage blob upload command that the quickstart walks through.
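A minimal sketch, with placeholder account, container, and file names:

    # upload the file to a blob container; replace the placeholder names with yours
    az login
    az storage blob upload --account-name mystorageaccount --container-name mycontainer --name myfile.txt --file C:\data\myfile.txt --auth-mode login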
I have the Matomo Docker image from https://github.com/bitnami/bitnami-docker-matomo that I run in a Web App for Containers on Azure with my own Azure Container Registry (ACR).
Also, I have an Azure Storage Account with a File Share available.
What I would like to achieve is to mount persistent storage (the File Share from the Azure Storage Account) to it so I don't lose the Matomo config and the installed plugins.
I tried using the Mount Storage (Preview), but I couldn't get it to work.
Name: matomo_data
Storage Type: Azure Files
Mount path: /bitnami
As described in: https://github.com/bitnami/bitnami-docker-matomo#persisting-your-application
This didn't work.
I also tried the setting WEBSITES_ENABLE_APP_SERVICE_STORAGE = true on the Web App for Containers, but it doesn't seem to do anything either.
I would appreciate any hints here, as otherwise I would have to build a custom Docker image with a custom Docker Compose file and push it to the registry, which I would like to avoid.
Thanks a lot in advance for any hints on this!
Regarding mounting the Azure File Share to the Web App for Containers: as I understand it, this is not simple persistent storage, it's a share action. See the Caution below:
Linking an existing directory in a web app to a storage account will delete the directory contents. If you are migrating files for an existing app, make a backup of your app and its content before you begin.
So, if you want to mount the file share to the web app to persist the storage, you need to upload all the needed files to the file share first. The steps to mount the Azure File Share to the Web App are here; they are shown for Windows, but the way is the same for Linux.
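As a sketch, the CLI equivalent of that mount would look roughly like this (resource group, app, storage account, share, and key are placeholders):

    # attach the file share as a path mount on the Web App for Containers
    az webapp config storage-account add --resource-group myResourceGroup --name myMatomoApp --custom-id matomo_data --storage-type AzureFiles --account-name mystorageaccount --share-name matomoshare --access-key <storage-account-key> --mount-path /bitnami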
But I would suggest you use persistent storage following the steps here instead. That way the persistent storage is created at the beginning, and the directory contents are not deleted.
I have a web-based .NET application whose artifacts are being uploaded to Azure through the FTP Upload task. The issue is that it does upload the artifact, but as a zip file. How can I have it unzipped at the target location, given that the FTP Upload task has no unzip option?
I do not have the FQDN or IP of the Azure server, as it is PaaS-based infrastructure; all I have is the FTP location.
You cannot unzip a file on an FTP server, no matter what client/library/framework you are using. The FTP protocol simply does not allow it.
See also:
Can we unzip file in FTP server using C#
How to unzip files via an FTP connection?
Based on my understanding, if you want to use the Azure DevOps FTP Upload task, you need an FTP server address, username, and password.
If that is the case, you could use the Azure Logic App FTP trigger (when a file is added or modified) and then extract the file.
If that is not working for you and Azure Storage is acceptable, my workaround would be to use the Azure File Copy task to copy the file to your Azure Storage. Then you can control it yourself; for example, you could use an Azure Function blob trigger to extract the file with your customized code.
The question is quite vague, but it sounds like you might be trying to upload to an Azure Web App, which has FTP and also zip deploy functionality that uses the Kudu interface.
https://learn.microsoft.com/en-us/cli/azure/webapp/deployment/source?view=azure-cli-latest#az-webapp-deployment-source-config-zip
Using this Azure CLI command will push your zip and deploy/unpack it into the Web App for you.
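For example (resource group, app name, and zip path are placeholders):

    # push the zip; Kudu deploys and unpacks it into the web app
    az webapp deployment source config-zip --resource-group myResourceGroup --name myWebApp --src ./artifact.zip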
PS: It's impossible to FTP without a DNS name or IP, so you will have one of them specified in the FTP location you've been given.
I have successfully implemented Jenkins to deploy to a server hosted locally, but now I need to create a job to deploy to an Azure-hosted website running on PaaS. Both the Jenkins host and the website hosts are Windows machines.
I have found a link for setting up a virtual machine template for the Azure Slave plugin, but there is no VM because the site is PaaS, and I don't have additional slaves in this case.
I am asking about the plugins and process flow, please:
Which Azure Plugin should I use in Jenkins (if any)?
E.g. Azure PublisherSettings Credentials plugin
Do I use Get-AzurePublishSettingsFile and Import-AzurePublishSettingsFile?
Would these contain all the relevant details required for Jenkins to know where to copy to?
Would I create a zip file of the build, upload the zip to blob storage, and then extract it to the website?
Is it possible to upload a zip file and then proceed to extract the files once the whole file has been uploaded? If the connection is interrupted at any stage while uploading 1000 individual files, the website will be left unstable, so I need to investigate a single-file upload with extraction afterwards.
So if I were you I'd do the following:
1. Install the Jenkins PowerShell plugin and the Azure PowerShell cmdlets.
2. Create a job in Jenkins that creates the zip file and uploads it to Azure Storage (a sketch of this step follows below).
3. Create an ARM template to deploy the Azure Web App from the zip file in Azure Storage.
4. Create a job to deploy said template.
So the ARM template would take the zip file from storage and deploy it to the Azure Web App, and the Web App would handle all the hassle with the zip file internally.
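A rough sketch of step 2, run from the Jenkins job on the Windows host (storage account and container names are placeholders):

    # archive the build output, then upload it to blob storage for the ARM template to reference
    Compress-Archive -Path .\build-output\* -DestinationPath .\site.zip -Force
    az storage blob upload --account-name mydeploystorage --container-name deployments --name site.zip --file site.zip --auth-mode login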
I have a VM I want to copy files to, and a console app I want to run on the VM. How do I do this, as Remote Desktop won't let me copy files?
You should just be able to copy/paste like normal. You can check to make sure rdpclip.exe is running on the VM.
If that doesn't work, you can always open your local drive using \\tsclient\c from within the RDP session. To share your local drive, save the RDP file from the Management Portal website, then right-click the .rdp file and select Edit. Switch to the Local Resources tab, click More under 'Local devices and resources', and check the drives that you want to share.
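Once the local drive is shared, you can copy from a command prompt inside the RDP session, for example:

    :: copy a file from your local C: drive (exposed as \\tsclient\c) onto the VM
    copy \\tsclient\c\temp\myapp.zip C:\temp\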
In my case a simple restart of rdpclip.exe did the job, so try it.
From my point of view the simplest and most reliable way is to use an Azure File Share.
Create a new storage account.
Create a File Share in the storage account.
Navigate to the File Share.
Click "Connect" and paste the commands to the PowerShell console on your client and on your Azure VM. Commands for Linux and MacOS are available as well.
Transfer files to and from the File Share.
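The Connect pane generates roughly the following PowerShell (storage account, share name, and key are placeholders):

    # store the storage account credentials, then map the share to a persistent drive letter
    cmdkey /add:mystorageaccount.file.core.windows.net /user:AZURE\mystorageaccount /pass:<storage-account-key>
    New-PSDrive -Name Z -PSProvider FileSystem -Root "\\mystorageaccount.file.core.windows.net\myshare" -Persist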
The process is more or less automatable. I wrote about it on my blog:
https://engineerer.ch/2020/08/16/copy-large-files-to-an-azure-vm/