Creating a custom image in Azure fails

I'm new to Azure. I'm trying to upload my application's VHD file to Azure Storage and create an image from it so that I can launch a VM. I was able to upload the file to storage, but creating an image fails with InvalidParameter.

You can create a managed image from the VHD file that you upload to Azure Storage. Follow the steps in this document; it walks through the PowerShell commands. If you need more help, please let me know.
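If it helps, here is a minimal PowerShell sketch of that flow using the Az module, with placeholder resource group, location, VHD path, and image names that you would replace with your own. InvalidParameter at this step is often caused by the source disk not being a fixed-size .vhd uploaded as a page blob; Add-AzVhd handles the conversion and upload for you.

```powershell
# Placeholder values - replace with your own resource group, location, storage URI, and file path
$rgName   = "myResourceGroup"
$location = "eastus"
$vhdUri   = "https://mystorageaccount.blob.core.windows.net/vhds/myapp.vhd"

# Upload the local VHD as a fixed-size page blob (Add-AzVhd converts dynamic VHDs for you)
Add-AzVhd -ResourceGroupName $rgName -Destination $vhdUri -LocalFilePath "C:\vhds\myapp.vhd"

# Build an image configuration from the OS VHD and create the managed image
$imageConfig = New-AzImageConfig -Location $location
$imageConfig = Set-AzImageOsDisk -Image $imageConfig -OsType Windows -OsState Generalized -BlobUri $vhdUri
New-AzImage -ImageName "myAppImage" -ResourceGroupName $rgName -Image $imageConfig
```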

Related

Copy file from Remote Desktop (RDP) to Azure blob storage

I have a requirement to copy a file from the C: drive of a remote desktop (RDP) server to Azure Blob Storage.
The RDP server is accessible only through a jump box.
How can I get the file that is on the RDP server into Azure Storage, and which linked service can I use to create the connection?
Is there a straightforward way to do this in Azure, or does it need a workaround?
Thanks in advance!
Upload the file to a file share
The easiest way would be to mount a file share directly to your machine.
You can find detailed instructions on how to do so:
https://learn.microsoft.com/en-us/azure/storage/files/storage-files-quick-create-use-windows
To sum the article up, the instructions are:
In the Azure portal, create a file share in the storage account.
From the storage account, click Connect.
A pane will pop up on the right. Choose a drive letter that is unused on your VM and then copy the command.
Paste the command into a PowerShell terminal on the virtual machine (a sketch of what the portal generates is shown below).
Once the file share is mounted, you can simply copy your file to the drive and it will be uploaded to Azure.
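For reference, the command the Connect pane generates looks roughly like this; the storage account, share name, and key below are placeholders that the portal fills in for you:

```powershell
# Placeholder values - the portal's Connect pane generates these for your storage account
$storageAccount = "mystorageaccount"
$shareName      = "myfileshare"
$storageKey     = "<storage-account-key>"

# Persist the credentials so the share reconnects after a reboot
cmdkey /add:"$($storageAccount).file.core.windows.net" /user:"localhost\$($storageAccount)" /pass:$storageKey

# Mount the share as drive Z: (use any unused letter)
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\$($storageAccount).file.core.windows.net\$($shareName)" -Persist

# Copy the file from the server's C: drive to the mounted share
Copy-Item "C:\path\to\file.txt" "Z:\"
```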
Upload the file to Blob Storage
You would have to install the Azure CLI:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-cli
and then use the az storage blob upload command, as shown in the quickstart.
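A rough example of the upload, with placeholder account, key, container, and file names:

```powershell
# Placeholder values - replace with your storage account, key, container, and local file path
az storage blob upload `
    --account-name mystorageaccount `
    --account-key "<storage-account-key>" `
    --container-name mycontainer `
    --name file.txt `
    --file "C:\path\to\file.txt"
```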

How to mount a volume (Azure File Share) to a bitnami-based docker image on Azure (Web App for Container)?

I have the Matomo Docker Image from https://github.com/bitnami/bitnami-docker-matomo that I run in a Web App for Container on Azure with my own Azure Container Registry (ACR).
Also, I have an Azure Storage Account with a File Share available.
What I would like to achieve is to mount persistent storage (a File Share from the Storage Account) to it so I don't lose the Matomo config and installed plugins.
I tried using the Mount Storage (Preview), but I couldn't get it to work.
Name: matomo_data
Storage Type: Azure Files
Mount path: /bitnami
As described in: https://github.com/bitnami/bitnami-docker-matomo#persisting-your-application
This didn't work.
I also tried setting WEBSITES_ENABLE_APP_SERVICE_STORAGE = true on the Web App for Containers, but it doesn't appear to do anything either.
I would appreciate any hints here; otherwise I would have to build a custom Docker image with a custom docker-compose file and push it to the registry, which I would like to avoid.
Thanks a lot in advance for any hints on this!
To mount the Azure File Share to the Web App for Container: as I understand it, this is not simple persistent storage, it's a share action. See the Caution from the documentation:
Linking an existing directory in a web app to a storage account will delete the directory contents. If you are migrating files for an existing app, make a backup of your app and its content before you begin.
So, if you want to mount the file share to the web app to persist storage, you need to upload all the needed files to the file share first. The steps to mount the Azure File Share to the Web App are here; they are shown for Windows, but the same approach works for Linux.
But I would suggest you use persistent storage by following the steps here instead. That way the persistent storage is created from the beginning and the directory contents are not deleted.
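For completeness, if you do go the File Share mount route and the Mount storage (Preview) blade won't cooperate, the same mount can be configured from the Azure CLI. This is only a sketch, with placeholder resource group, web app, storage account, share, and key values:

```powershell
# Placeholder values - replace with your resource group, web app, storage account, share, and key
az webapp config storage-account add `
    --resource-group myResourceGroup `
    --name my-matomo-webapp `
    --custom-id matomo_data `
    --storage-type AzureFiles `
    --account-name mystorageaccount `
    --share-name matomo-share `
    --access-key "<storage-account-key>" `
    --mount-path /bitnami
```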

Copy Azure blob to local machine as soon as blob is created

I'm trying to create a Windows service that will detect when a new blob is uploaded to a certain container on Azure and download them onto the local machine immediately. I know I can have a blob trigger running locally but there doesn't seem to be any way to put this into a service. Does anyone have any ideas?
You should be able to do this by using the standard WebJobs SDK with a blob trigger, but running as a Windows service instead of a console app.
You can find more information about using the blob trigger with the SDK directly here: https://github.com/Azure/azure-webjobs-sdk/wiki/Blobs

Upload to blob storage from TFS 2017

We are trying to upload build artifacts to blob storage from a TFS build server. The AzCopy task needs the Azure subscription details, which are not available to us. We need to upload the artifacts to Azure Blob Storage using a storage connection string. Is there a way to upload files to blob storage using only a connection string?
Anything you can do from PowerShell you can do from build and release. There is a task named "PowerShell" and one named "Azure PowerShell". If you don't have the Azure subscription details, I doubt you will be able to use the "Azure PowerShell" task. However, if you have a PowerShell script that works when you run it locally, you might be able to simply run it as part of your build with the "PowerShell" task (see the sketch below).
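For example, a script along these lines could run in the plain "PowerShell" task, assuming the Az.Storage module is available on the build agent and the connection string is passed in as a secret build variable; on older agents the AzureRM equivalents, New-AzureStorageContext and Set-AzureStorageBlobContent, take the same parameters:

```powershell
# Assumptions: the connection string is mapped into an environment variable by the build,
# and the container name and artifact file name below are placeholders
$connectionString = $env:STORAGE_CONNECTION_STRING
$containerName    = "artifacts"
$artifactPath     = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY "drop.zip"

# Build a storage context from the connection string alone - no subscription details required
$context = New-AzStorageContext -ConnectionString $connectionString

# Upload the artifact to blob storage
Set-AzStorageBlobContent -File $artifactPath -Container $containerName -Blob "drop.zip" -Context $context -Force
```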
Option two is to have someone who knows the details create an Azure service endpoint for you. They never have to share the details with you to make the connection; once the connection is created, you can use it without ever knowing the details.

Image folder overwritten on Windows Azure

I am new to Windows Azure web application deployment.
I developed an MVC web application and published it to the Windows Azure cloud platform.
I have a folder named Messages that contains the images users upload through the application once it is published to the cloud.
The next time I republish the application to the cloud, the contents (images) of that Messages folder are removed.
Can you please help me resolve this?
Regards, Brijesh vaidya
This is the expected behavior. Anytime you redeploy your application, new VMs are created for it. You should not store anything that you want to persist on the VM; instead, store it in blob storage. So in your case, once a user uploads an image through the app, transfer it to blob storage. You may want to check out this hands-on lab in the Azure training kit: https://github.com/WindowsAzure-TrainingKit/HOL-IntroToCloudServices-VS2012
