Azure Linux Web App Service - System.IO.IOException: Read-only File Error

The web app was already running on .NET Core 3.1 LTS with IIS (Windows Server 2019).
Recently I deployed it as an Azure Web App Service, but I encountered a file write error.
The application tries to create a new file for a business requirement.
Error message: System.IO.IOException: Read-only file system
Has anyone encountered this problem on an Azure Linux Web App?
Solved: The Azure Linux Web App Service doesn't support writing files directly to a regular folder under wwwroot. If you are running on the Linux Web App Service, you need to use Blob Storage, Azure Files, or a similar service.

For an Azure Web App, it is recommended that the wwwroot folder have read and write permissions; this is the same whether it's Linux or Windows.
In addition, it is not recommended to keep the output of file operations in the project directory, for example Excel files generated by some business process or uploaded image files sitting among the deployment files. Such files should be stored in Azure Storage instead.
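A minimal sketch of the Azure Storage approach suggested above, using the Azure.Storage.Blobs package; the container name, connection string, and blob name here are placeholders, not values from the question:

```csharp
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class BlobUploadSketch
{
    // "uploads" and the connection string are placeholders for your own values.
    static async Task UploadAsync(string connectionString, Stream content)
    {
        var container = new BlobContainerClient(connectionString, "uploads");
        await container.CreateIfNotExistsAsync();

        // Overwrites any existing blob with the same name.
        var blob = container.GetBlobClient("generated-report.xlsx");
        await blob.UploadAsync(content, overwrite: true);
    }
}
```

Writing the file to a blob instead of a folder under wwwroot sidesteps the read-only file system entirely.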

From the information you have provided, we understand that there is an issue creating a file in a specific directory that you don't have access to.
Refer to this SO thread, where we got the insight on how to disable read-only mode, thanks to MarkXA.
Make sure that you have provided the correct path, whether it is a storage account or local storage, where you are getting the files. Also, if possible, elaborate your question with the full error trace and the path you are using to access the files.
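If the failure persists, a small diagnostic like the following can capture both the full path and the complete exception text that the answer above asks for; the path here is a placeholder:

```csharp
using System;
using System.IO;

class WriteDiagnosticsSketch
{
    static void Main()
    {
        // Placeholder path; substitute the directory your code actually writes to.
        var path = Path.Combine(Directory.GetCurrentDirectory(), "output", "report.txt");

        try
        {
            Directory.CreateDirectory(Path.GetDirectoryName(path));
            File.WriteAllText(path, "test");
            Console.WriteLine($"Wrote {path}");
        }
        catch (IOException ex)
        {
            // Log the full path and error trace so the failing location is unambiguous.
            Console.WriteLine($"Failed to write {path}: {ex}");
        }
    }
}
```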

Related

Access VM Shared Directory from Linux App Service

We have a new ASP.NET Core web application running on Azure as an App Service.
For backward compatibility, we have a bunch of files (from the old version of the application) stored on a Windows VM, also running in Azure. Those files must stay there!
We need to access them from the Linux App Service as files and directories, exactly as they are.
We wanted to use a file share, but because of the App Service sandbox that is not possible.
Any help?
As of now, your option is to mount and use Azure Storage with the Linux App Service:
https://learn.microsoft.com/en-us/azure/app-service/configure-connect-to-azure-storage?tabs=portal&pivots=container-linux
You could consider moving your file system from the Azure VM to Azure Storage and then mounting that storage into the Linux App Service.
The article above contains a video covering every step of how to do that.
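Once the share is mounted, the app can use ordinary System.IO calls against the mount path; a small sketch, assuming a hypothetical mount path of /mounted/legacy configured in the portal:

```csharp
using System;
using System.IO;

class MountedShareSketch
{
    static void Main()
    {
        // "/mounted/legacy" is a hypothetical mount path configured for the
        // Linux App Service (Configuration -> Path mappings).
        const string mountPath = "/mounted/legacy";

        // The mounted share behaves like a local directory tree.
        foreach (var path in Directory.EnumerateFiles(mountPath, "*", SearchOption.AllDirectories))
        {
            var info = new FileInfo(path);
            Console.WriteLine($"{info.FullName} ({info.Length} bytes)");
        }
    }
}
```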

Azure web hosting using FTP / MS WebDeploy

Can I host a web application built on .NET Core 2.1, with SQL Server as the database, on Azure Web App Service using CI tools / MS WebDeploy?
The following points I want to take care of:
The application uses the file system for temp storage and file storage
Deployment should be managed by a CI tool such as Jenkins
After deployment, the app settings file should be modified with some keys/server details
Log files (stored in the app root) should be accessible to the application administrator
Is there a way to create a virtual directory, the same as in IIS, and upload files using FTP or a similar protocol?
All your concerns about deploying a .NET Core 2.1 web app are achievable.
Suppose our project is complete and has been uploaded to GitHub.
Questions and explanations about your concerns:
About the database connection configuration: you can configure it directly in web.config (or appsettings.json). If you are using Azure SQL, find the connection string, set up the firewall, and once it passes the SSMS test you can test the connection in code. It can also be added under Configuration -> Application settings -> Connection strings in the portal. A value added there takes priority over the configuration in the file; it overrides that configuration at runtime without modifying the file itself.
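As a rough illustration of that override behaviour (the connection string name here is a placeholder): a connection string added in the portal reaches the app as an environment variable, so the standard configuration builder picks it up ahead of the value in the file:

```csharp
using System;
using System.IO;
using Microsoft.Extensions.Configuration;

class ConnectionStringSketch
{
    static void Main()
    {
        var config = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            .AddJsonFile("appsettings.json", optional: true)
            // Portal connection strings arrive as environment variables
            // (e.g. SQLAZURECONNSTR_<name>) and take precedence over the JSON file.
            .AddEnvironmentVariables()
            .Build();

        // "Default" is a hypothetical connection string name.
        Console.WriteLine(config.GetConnectionString("Default"));
    }
}
```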
Regarding file storage, you can use Azure Storage services or not, depending on the business. For example, very small pictures, documents, and other files can be stored in the directory the program runs from, which keeps the code consistent with the original development. When publishing, you need to include the MyFiles folder in the published output, add the folder manually in Kudu after publishing completes, or let the program check for it itself. Letting the program check is recommended, so that a subsequent upgrade does not lose data.
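One way to let the program handle this, as suggested above, is to create the folder at startup if it is missing; a small sketch, where "MyFiles" stands in for whatever folder name you use:

```csharp
using System.IO;

static class FileStoreSetup
{
    // contentRootPath would typically come from IHostingEnvironment.ContentRootPath.
    public static string EnsureMyFilesFolder(string contentRootPath)
    {
        var myFilesPath = Path.Combine(contentRootPath, "MyFiles");
        Directory.CreateDirectory(myFilesPath); // no-op if the folder already exists
        return myFilesPath;
    }
}
```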
The confidential information in the app settings file can be configured in web.config or appsettings.json. Make sure the project runs properly offline while you are debugging locally, and then publish it. The rest is configured in the portal, as in the first explanation.
Storing log files this way can fully achieve the effect you want. It should be enough to set the appropriate owner permissions for this App Service; for details, please refer to the official documentation.
For virtual directories and virtual applications, I have a more detailed answer in another post here, which you can refer to.
Steps:
First of all, create a web app in the portal and select .NET Core 2.1 as the runtime. Once the App Service is created, click Deployment Center.
Follow the prompts step by step, then wait until the GitHub Action completes and the release succeeds.

Azure File Storage - IIS - ASP.NET application

I have a legacy ASP.NET application (e.g. www.mycompany.com). There are around 10 folders inside that application, and one of them holds a lot of images (reads/writes) (e.g. www.mycompany.com/images/1.jpg), around 3 TB in total.
We are migrating this application to an Azure VM. What we are trying to do is keep 9 of the application's directories on the VM disk and move the images folder alone to Azure Storage.
So we created an Azure file share, created a local account with the same credentials as the Azure Storage account, added the local account to the IIS_USR group, and ran the web application under that user.
We created a virtual directory called "images" inside the web application and linked it to "\\XXXX.file.core.windows.net\images".
The problem I am facing now is that we are able to read the files and show them in the web browser, but we are unable to upload a new image. When trying to upload an image from the web browser (through the web application), it actually creates a folder called "images", because the code behind it uses Server.MapPath.
Is there any alternative implementation that avoids a code change?
We ended up creating a symbolic link for the images folder that points to Azure Storage. We created a local VM user with the same credentials as the Azure Storage account and ran IIS under that local user.
Everything worked fine.

How do I protect a target Directory in Azure from deployment

I am using VS Team Services to build and deploy my ASP.NET MVC application to Azure. When the application is running, users can upload files to a directory.
The problem is that when I run a new build and deploy task in Team Services, it overwrites or deletes the files that were uploaded by my users.
It seems like Team Services erases the target location before it does the deployment.
Is there a way in Team Services to tell it not to delete a specific directory when it is deploying an update to the application?
If I can't do that, is there a way to automate copying the original files off before the deployment and writing them back afterwards?
I've been searching Google most of the day and can't seem to find an answer.
Thanks
Tony
I believe there is a better approach. Your user data should be saved to Azure Blob Storage, not uploaded to the web deployment location.
Here is a good getting-started tutorial on using Azure Storage: https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-blobs
It is not recommended to store user files in the same directory as the web site. Even if you simply keep them on file storage on your VM, they should be in their own location.
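One possible shape for that approach in a classic ASP.NET MVC app, using the Azure.Storage.Blobs package; the container name and app setting key are placeholders rather than anything from the question:

```csharp
using System.Configuration;
using System.IO;
using System.Threading.Tasks;
using System.Web;
using System.Web.Mvc;
using Azure.Storage.Blobs;

public class UploadsController : Controller
{
    // "StorageConnectionString" and "user-uploads" are placeholder names.
    private static readonly BlobContainerClient Container = new BlobContainerClient(
        ConfigurationManager.AppSettings["StorageConnectionString"], "user-uploads");

    [HttpPost]
    public async Task<ActionResult> Upload(HttpPostedFileBase file)
    {
        await Container.CreateIfNotExistsAsync();

        // The blob name reuses the client file name; overwrite replaces existing content.
        var blob = Container.GetBlobClient(Path.GetFileName(file.FileName));
        await blob.UploadAsync(file.InputStream, overwrite: true);

        return RedirectToAction("Index");
    }
}
```

Because the uploads live in a storage container instead of the deployment folder, a redeploy from Team Services no longer touches them.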
There is also a Remove Additional Files at Destination option under Additional Deployment Options in the Azure App Service Deploy step/task.
If you check this option, it will delete additional files on the Azure RM Web App; otherwise, it won't delete additional files.

FTP'ing a Suave app to Azure

Having never used Azure before, I'm attempting to deploy a simple F# Suave app to Azure using FTP. Ultimately I want to deploy via GitHub, but I initially thought FTP'ing it would be an easy first step. According to https://suave.io/azure-app-service.html it should be straightforward.
These are the steps I followed:
1. Created a new web app in Azure, including a resource group and app service plan, all on the Free tier.
2. Downloaded the publishsettings XML file that Azure created.
3. Cloned this repo: https://github.com/isaacabraham/fsharp-demonstrator
4. Used FileZilla to connect via FTP using the creds from step 2.
5. Uploaded the files (via FTP) from fsharp-demonstrator/src/SuaveHost (which includes the necessary web.config file) in the repo cloned at step 3 to site\wwwroot on Azure.
6. Navigated to the Azure site URL.
Then I receive the error:
The specified CGI application encountered an error and the server terminated the process.
(When I look at the folders on Azure under site\wwwroot I don't see any obj or bin folders. I don't think any msbuild process occurred. That doesn't seem right.)
Anybody got any idea what the problem is?
I suspect the issue is that when you deploy via FTP, Azure does not automatically run the deployment script specified in the .deployment file.
The build.fsx script uses the Kudu service to deploy the built files, so it might be easier to just use GitHub deployment rather than FTP; that way, Azure will do the deployment for you.
If you want to deploy via FTP, you'll need to build the project locally and upload the output. I'm not sure how best to do this with Isaac's Kudu-based demo, though (ultimately, you need a web.config that points to your built executable, like this).
