Connect to On-Prem Network share drive from Azure

I have user credential and folder path such as the following:
User: UserName
Password: Password
FilePath: \\website.com\Private\ClientServices\
Using these, I can map the path as a network drive in Windows Explorer on my local machine. However, I have a Logic App in Azure that I would like to use to access this folder structure. I have tried the FileShare connector, but it doesn't seem to be able to pull or push files to this location.
How can I go about doing this?
I tried using FileShare but I can't connect to the on-prem drive from it, and I am unclear how I might actually accomplish this.
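For reference, the drive mapping described above can be reproduced from a PowerShell prompt roughly like this (a sketch only, assuming the UNC path uses the usual double backslash and the placeholder credentials from the question):
# Map the share to Z: using the credentials from the question (adjust drive letter and path as needed)
net use Z: \\website.com\Private\ClientServices Password /user:UserName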

Related

Unable to Publish artifacts to file share in Azure Devops

We are trying to set up CI/CD for a WinForms app developed using .NET 6. We have set up the build pipeline and it produces the correct set of artifacts, but the problem is we are unable to publish these artifacts to the network file share (a DFS server). While connecting to the network file share we get the error "Incorrect username or password". Do we need a service account to connect to the network file share? If a service account is needed, we don't see under which task in the Azure DevOps release pipeline we would pass the username and password, since the Publish Artifacts task lets us define the network location without any credentials. Can anyone please suggest?
Do we need a service account which would connect to network file share?
Seems like you asked here too. For a self-hosted agent, yes, we need a service account that can connect to the network file share.
Basically, you can try to set up the agent with an account that has the correct permission (write permission) to access the network share, i.e. specify that account as the agent service account.
Alternatively, you can use the Windows Machine File Copy task to copy the artifacts to the network share. In that task you can specify the username and password used to access the network share.
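As a rough sketch of the scripted alternative (not from the original answer), a PowerShell task in the pipeline could map the share with the service-account credentials and copy the staged artifacts; the share path and the ShareUser/SharePassword secret variables below are placeholders:
# Build a credential object from (assumed) secret pipeline variables
$securePwd = ConvertTo-SecureString "$(SharePassword)" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ("$(ShareUser)", $securePwd)

# Map the DFS share, copy the build output, then remove the mapping
New-PSDrive -Name ArtifactShare -PSProvider FileSystem -Root "\\dfs-server\builds" -Credential $cred
New-Item -ItemType Directory -Path "ArtifactShare:\MyApp\$(Build.BuildNumber)" -Force | Out-Null
Copy-Item -Path "$(Build.ArtifactStagingDirectory)\*" -Destination "ArtifactShare:\MyApp\$(Build.BuildNumber)" -Recurse
Remove-PSDrive -Name ArtifactShare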
This can also be accomplished with a service principal or agent account that has write permission to the drive. You can then create scripts in the build pipeline to transfer the files to an Azure Storage account blob container.
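A minimal sketch of that idea, assuming a service principal that holds the Storage Blob Data Contributor role and placeholder account/container names (none of these values come from the original post):
# Sign in with the service principal (credentials supplied as pipeline secrets / environment variables)
az login --service-principal -u $env:SP_APP_ID -p $env:SP_SECRET --tenant $env:SP_TENANT_ID

# Push the staged build output to a blob container
az storage blob upload-batch `
    --account-name mystorageacct `
    --destination artifacts `
    --source "$(Build.ArtifactStagingDirectory)" `
    --auth-mode login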

How to upload the files on Azure App Service over FTPS using PowerShell

I am trying to upload files from my local system to Azure app service over FTPS.
I am able to achieve this using Filezilla.
Now, I want to automate this process using PowerShell.
The configuration of my app service looks like:
URL: ftps://waws-prod-dm1-017.ftp.azurewebsites.windows.net/site/wwwroot
Scope: App Credentials
Username: appname$appname
Password: aq8----------------ddddF-------rssssg--------------z
More information:
I want to download the App Service artifacts to my local system, and then upload those same artifacts back to the App Service if required.
I have used the PowerShell from here, but I am unable to do the same.
Can you please guide me regarding the PowerShell?
Try the following solution for PowerShell; I think it will work for an Azure Web App as well:
Upload file to SFTP using PowerShell
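Note that the linked answer covers SFTP, while App Service deployment uses FTPS. A minimal FTPS upload sketch using .NET's FtpWebRequest from PowerShell might look like the following; the endpoint comes from the question, while the file names and password are placeholders, and the exact username format should be taken from the publish profile:
# FTPS endpoint and app credentials (placeholders / values from the question)
$ftpUri    = "ftp://waws-prod-dm1-017.ftp.azurewebsites.windows.net/site/wwwroot/myfile.txt"
$username  = 'appname$appname'          # use the exact username shown in the portal / publish profile
$password  = '<app-credential-password>'
$localFile = 'C:\temp\myfile.txt'

# FtpWebRequest takes an ftp:// URI; EnableSsl upgrades the connection to FTPS
$request = [System.Net.FtpWebRequest]::Create($ftpUri)
$request.Method      = [System.Net.WebRequestMethods+Ftp]::UploadFile
$request.Credentials = New-Object System.Net.NetworkCredential($username, $password)
$request.EnableSsl   = $true
$request.UseBinary   = $true

# Write the local file into the request stream to perform the upload
$bytes  = [System.IO.File]::ReadAllBytes($localFile)
$stream = $request.GetRequestStream()
$stream.Write($bytes, 0, $bytes.Length)
$stream.Close()
$request.GetResponse().Close()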

Copy file from Remote Desktop (RDP) to Azure blob storage

I have a requirement to copy a file from the C: drive of a remote desktop (RDP) server to Azure Blob storage.
The RDP server is accessible only through a jump box.
How can I get the file that is on the RDP server into Azure Storage, and which linked service can I use to create the connection?
Is there a straightforward way in Azure to do this, or does it need some workaround?
Thanks in advance!
Upload the file to a file share
The easiest way would be to mount a file share directly to your machine.
You can find detailed instructions on how to do so:
https://learn.microsoft.com/en-us/azure/storage/files/storage-files-quick-create-use-windows
To sum the article up, the instructions are:
In the Azure portal, create a file share in the storage account.
From the storage account, click Connect.
A pane will pop up on the right. Choose a drive letter which is unused on your VM and then copy the command.
Paste the command into a PowerShell terminal in the virtual machine.
Once the file share is mounted, you can simply copy your file to the drive and it will be uploaded to Azure; a sketch of what the mount command looks like is shown below.
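The portal generates the exact command with your account name and key already filled in; a rough equivalent with placeholder values looks like this:
# Persist the storage account credentials, then map the share to an unused drive letter
cmdkey /add:mystorageacct.file.core.windows.net /user:localhost\mystorageacct /pass:"<storage-account-key>"
net use Z: \\mystorageacct.file.core.windows.net\myshare /persistent:yes

# Copying a file to the mounted drive uploads it to the Azure file share
Copy-Item -Path "C:\data\report.csv" -Destination "Z:\"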
Upload the file to Blob Storage
You would have to install the Azure CLI:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-cli
and then use the az storage blob upload command (the quickstart walks through it); the standalone AzCopy tool is another option.
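For example, a sketch assuming placeholder storage account and container names, and that you have signed in with az login and hold a data-plane role on the account:
# Upload a single file to a blob container using the Azure CLI
az storage blob upload `
    --account-name mystorageacct `
    --container-name backups `
    --name report.csv `
    --file "C:\data\report.csv" `
    --auth-mode login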

How to mount a volume (Azure File Share) to a bitnami-based docker image on Azure (Web App for Container)?

I have the Matomo Docker Image from https://github.com/bitnami/bitnami-docker-matomo that I run in a Web App for Container on Azure with my own Azure Container Registry (ACR).
Also, I have an Azure Storage Account with a File Share available.
What I would like to achieve is to mount persistent storage (a file share from the Azure Storage account) to it so I don't lose the config and installed plugins of Matomo.
I tried using the Mount Storage (Preview), but I couldn't get it to work.
Name: matomo_data
Storage Type: Azure Files
Mount path: /bitnami
As described in: https://github.com/bitnami/bitnami-docker-matomo#persisting-your-application
This didn't work.
I also tried the setting WEBSITES_ENABLE_APP_SERVICE_STORAGE = true on the Web App for Containers, but it doesn't appear to do anything either.
I would appreciate any hints here, as otherwise I would have to build a custom Docker image with a custom docker-compose file and push it to the registry, which I would like to avoid.
Thanks a lot in advance for any hints on this!
As I understand it, mounting an Azure File Share to a Web App for Containers is not simple persistent storage; it is a share action. See the caution below:
Linking an existing directory in a web app to a storage account will delete the directory contents. If you are migrating files for an existing app, make a backup of your app and its content before you begin.
So, if you want to mount the file share to the web app to persist the storage, you need to upload all the needed files to the file share first. The steps to mount the Azure File Share to the Web App are here; they are shown for Windows, and the same way works for Linux.
But I would suggest you use the persistent storage by following the steps here. That way the persistent storage is created at the beginning and the directory contents are not deleted.
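If the portal's Mount storage (Preview) blade keeps failing, the equivalent CLI call may be worth trying. This is only a sketch with placeholder resource names, reusing the matomo_data identifier and /bitnami mount path from the question:
# Mount an Azure Files share into the Web App for Containers at /bitnami
az webapp config storage-account add `
    --resource-group my-rg `
    --name my-matomo-webapp `
    --custom-id matomo_data `
    --storage-type AzureFiles `
    --account-name mystorageacct `
    --share-name matomo-share `
    --access-key "<storage-account-key>" `
    --mount-path /bitnami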

Client-Side: Accessing Windows Azure Drive?

I am developing an Azure application, part of which involves users browsing an online filesystem. To do this, I am trying to use a Windows Azure drive, but I can't figure out how to access it from the client side, or how to make it accessible on the server side.
At the moment, I only know how to make the drive:
// Development storage account and blob client (legacy Windows Azure SDK 1.x APIs)
CloudStorageAccount devStorage = CloudStorageAccount.DevelopmentStorageAccount;
CloudBlobClient client = devStorage.CreateCloudBlobClient();
CloudBlobContainer container = new CloudBlobContainer("teacher", client);
container.CreateIfNotExist();

// Local cache resource declared in the service definition (the name here is an example)
LocalResource localCache = RoleEnvironment.GetLocalResource("DriveCache");
CloudDrive.InitializeCache(localCache.RootPath, localCache.MaximumSizeInMegabytes);

// Create a 50 MB page-blob-backed drive; Mount() would return the drive letter
CloudDrive drive = new CloudDrive(container.GetPageBlobReference("drive1").Uri, devStorage.Credentials);
drive.Create(50);
I am using C# as my development language.
All help is greatly appreciated!
There are a couple of things you need to understand about Windows Azure Cloud Drive:
Cloud drives are actually page blobs stored in Windows Azure Blob storage and mounted as a drive (you get a drive letter depending on the machine's existing drives) on a machine that provides the Windows Azure runtime environment.
Programmatically it is very easy to mount a cloud drive in your code, as you showed in your example; the one thing missing is making sure the Windows Azure runtime environment is available where this code runs.
I have written a utility to mount an Azure drive within a Windows Azure VM (Web, Worker, or VM Role), located here:
http://mountvhdazurevm.codeplex.com/
You can run the above tool directly in a Windows Azure VM, and you can also run the exact same code in your Compute Emulator (Windows Azure Development Fabric), so the bottom line is that as long as you can provide the Windows Azure runtime environment, you can mount a page blob VHD as a drive.
I have seen several cases where someone asked me to mount a Windows Azure page blob as a drive on a local machine (client or server, anywhere), and the actual hurdle was bringing the Windows Azure runtime into the local environment, because it is not available there. In some cases people went ahead and tried to use the Windows Azure SDK to make the runtime available on their desktop, created a dummy web role, and then mounted the VHD so that it appeared locally with a drive letter. I am not sure about that kind of solution, because this is not what the Windows Azure compute emulator is designed for.
Hope this description provides you some guidance.
I'm not sure I understand your question properly, but it sounds like you want multiple client applications - presumably on machines that are not on Azure - to access your Azure drive?
Unfortunately, Azure drives can only be accessed from Azure web/worker or VM role instances.
I've written a WebDAV server which runs on an Azure Website and allows clients, including Windows Explorer and Office, to connect to Azure Storage. It uses a combination of Table and Blob storage to store the file structure and files. I've tested it with Windows Explorer and Word 2013. Although this isn't a CloudDrive solution, it still uses Azure Storage as a backend and is accessible from WebDAV clients. You might find it useful.
https://github.com/ichivers/AzureDAV
One additional point to the existing answers: you can always download the blob backing your Cloud Drive and mount it on a local system, since the blob is really just a VHD. However, the download time isn't going to be trivial unless the drive is small.
Erick
