Cannot delete wwwroot on Azure - Node.js

I am trying to run a Node.js app on Azure App Service for Linux. After a failed deployment I would like to delete the wwwroot folder and start from scratch, but I am not able to delete it.
I have tried using SSH, Bash, FTP and the Kudu REST API, but every time the result is the same.
This is the response from the DELETE call:
{"Message":"Cannot delete directory. It is either not empty or access is not allowed."}
It looks like a hidden .bin folder inside /node_modules is blocking the delete operation.
Any hints?

I ran into this same issue today and resolved it by deleting the following App settings from Azure: WEBSITE_RUN_FROM_ZIP and WEBSITE_RUN_FROM_PACKAGE (both settings do the same thing, just under different names). A command-line sketch follows at the end of this answer.
This was following the instructions in Vikas Gupta's blog post of 17 Aug.
Not sure if you are using VSTS (Azure DevOps), but it may be helpful for some people to know that version 4 of the Azure App Service Deploy task automatically sets the WEBSITE_RUN_FROM_PACKAGE flag.
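For reference, a minimal sketch of removing those settings with the Azure CLI from a PowerShell prompt (MyWebApp and MyResourceGroup are placeholder names):
    # Remove the run-from-package settings so wwwroot becomes writable again
    az webapp config appsettings delete `
        --name MyWebApp `
        --resource-group MyResourceGroup `
        --setting-names WEBSITE_RUN_FROM_PACKAGE WEBSITE_RUN_FROM_ZIP
    # Restart so the change takes effect before retrying the delete
    az webapp restart --name MyWebApp --resource-group MyResourceGroup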

Related

VSTS deployment not working properly

I've got a .NET solution built in VSTS. My release also succeeded, but I am not able to see my web application running. Find the screenshot attached.
Please find my Deploy Azure App Service task screenshot attached as well.
Please suggest if I have missed any configuration.
I see that the package or folder to deploy in your Deploy Azure App Service task points to the drop folder. Instead, it should point to the zip file inside the drop folder.
E.g. $(System.DefaultWorkingDirectory)/TestApp/drop/TestApp.zip
Try this and report back if you hit any other issue.
What you are seeing is the content of hostingstart.html, which is in the wwwroot folder of the App Service. You can check the files by going to https://[web app name].scm.azurewebsites.net/DebugConsole and opening the site\wwwroot folder.
If the App Service can't recognize the web project (e.g. there is no Global.asax) and none of the default page files (configured in the App Service's Application settings) are present, it will display the hostingstart.html page.
So you need to check the files in the release artifact.
If the artifact is a Web Deploy package (zip file), you need to publish it through Web Deploy: check the Publish using Web Deploy option and specify the package path in the Package or folder box (e.g. $(System.DefaultWorkingDirectory)/**/*.zip). You can clear additional files by checking the Remove additional files at destination option.

Azure: 409 Conflict: Cannot delete directory. It is either not empty or access is not allowed

I use Azure with a Node.js App Service.
When I try to delete an empty folder inside node_modules using FTP or the Kudu app, I get the error: 409 Conflict: Cannot delete directory. It is either not empty or access is not allowed. The folder is completely empty. How can I delete it?
Check the App settings of the Web App and delete the 'WEBSITE_RUN_FROM_PACKAGE' setting, or change its value to 0 (a command-line sketch follows below).
0 - write mode;
1 - read-only mode
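If you prefer to keep the setting and just switch back to write mode, a minimal Azure CLI sketch (again with placeholder app and resource group names) would be:
    # Switch the app back to write mode; MyWebApp and MyResourceGroup are placeholders
    az webapp config appsettings set `
        --name MyWebApp `
        --resource-group MyResourceGroup `
        --settings WEBSITE_RUN_FROM_PACKAGE=0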
All Azure Web Apps (as well as Mobile App/Services, WebJobs and Functions) run in a secure environment called a sandbox. Each app runs inside its own sandbox, isolating its execution from other instances on the same machine as well as providing an additional degree of security and privacy which would otherwise not be available. For more details, refer to this article.
For directory level access please refer the below:
Home directory access (d:\home):
Every Azure Web App has a home directory stored/backed by Azure Storage. This network share is where applications store their content. This directory is available for the sandbox with read/write access.
The sandbox implements a dynamic symbolic link in kernel mode which maps d:\home to the customer's home directory. This removes the need for the customer to keep referencing their own network share path when accessing the site. No matter where the site runs, or how many sites run on a VM, each can access its home directory using d:\home.
Local directory access (d:\local):
This is a temporary directory and can be deleted when no longer needed. This directory is a place to store temporary data for the application. The application naturally has read/write access to this directory.
Note that the d:\local folder in the scm site (where Kudu runs) is not the same as the one in the main site (where the web app runs). As a result, they cannot see each other's local files.
In case you haven't tried this already, use the rmdir directoryname /s /q command in the Kudu console to delete the directory and see if that works.
You can also run this command from the Web App console. To access it, go to Web App -> Development Tools -> Console.
Hope this helps.
There is a Process Explorer tab in the Kudu header.
Open Process Explorer.
Kill the process that is using the folder or file.
You will then be able to delete the folder in question.
This is how I resolved the same issue.
Here is my solution: add this option to the step that deploys to the App Service.
Hope it helps.
You may use the Kudu command API to do that: craft a CMD or PowerShell command that cleans your wwwroot and then execute it, as in the sketch below.
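A minimal PowerShell sketch of that approach; the app name and deployment credentials are placeholders, and on a Linux App Service the command string would be a bash command (e.g. rm -rf) instead:
    # Call the Kudu command API to clean up wwwroot
    $user = '$MyWebApp'                       # deployment user name (placeholder)
    $pass = '<deployment-password>'           # deployment password (placeholder)
    $auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$($pass)"))
    $body = @{
        command = 'rmdir /s /q node_modules'  # clean-up command to execute
        dir     = 'site\wwwroot'              # working directory for the command
    } | ConvertTo-Json
    Invoke-RestMethod -Method Post `
        -Uri 'https://MyWebApp.scm.azurewebsites.net/api/command' `
        -Headers @{ Authorization = "Basic $auth" } `
        -ContentType 'application/json' `
        -Body $body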
I tried all of the above solutions and none of them worked.
I stopped the App Service, removed the files, and started it again. That worked; the commands below show the idea.
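A minimal Azure CLI sketch of that stop-clean-start sequence (placeholder names again):
    # Stop the app so nothing holds file locks, clean up, then start it again
    az webapp stop  --name MyWebApp --resource-group MyResourceGroup
    # ...delete the files via FTP, Kudu, or the command API sketch above...
    az webapp start --name MyWebApp --resource-group MyResourceGroup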

How to update a folder in an Azure App Service upon check-in into a VSO TFS repository

I have a website hosted as an Azure web app. It's an ASP.NET website that lives in a VS solution. One folder of that website holds my product's documentation, all of it static resources (HTML and images). These static resources are located in a folder in another VS solution (the actual product's solution). Both solutions are TFS-based in VSO.
As of now, I have a WebJob running in the context of the website that is basically doing a "tfs get" on the documentation folder and placing its contents into the documentation folder on the website. This is working; however, the VMs the website runs on change quite frequently, and the mechanism for creating a workspace is bound to the machine, not to the disk drive. Thus, I cannot get only the changes but must always get all the content, which right now takes about 20 minutes and creates unnecessary load on the website. (This is why I'm only running this WebJob once a week.)
Now I'm looking for a better way to do this. I would like to get only the files that have changed, making this a lot faster and less CPU/drive costly.
I did not find a way to create a workspace on the web server that doesn't vanish each time the web server's VM changes. (If it were possible to somehow attach the workspace to the drive instead of to the machine name, that would solve the problem.)
I was also looking at the continuous build definition that I have running for the product's solution. As part of that, I think it's possible to create a deployment where the documentation folder is copied to the App Service's documentation folder. This way I could get rid of the "special" WebJob, but I'd still copy all the docs files each time. (Also, the build agent for that runs on premises, so I'd also have to copy those files from on premises up to the cloud when they're actually already there inside VSO.) So basically, I don't think this option is of much use for my case.
Obviously, if I moved the static docs resources from the product's solution to the website's solution, I could simply use the automatic deployment that is available for website projects from VSO to an Azure web app. Unfortunately, for various other reasons (one of which being that the static resources are partially generated automatically from the .cs sources in the product's solution), I simply cannot move the docs folder from the product's solution to the website's solution.
So does anybody have a suggestion for a method by which I could update the documentation folder in the web app based on changes in the corresponding VSO folder?
You can upload the updated files to the Azure App Service by using the Kudu API.
Simple steps:
Create a Continuous Integration build
Check the Allow Scripts to Access OAuth Token option on the Options tab
Add a PowerShell step/task to check for changes with the REST API. (Refer to Calling VSTS APIs with PowerShell)
Add an Azure PowerShell step/task to upload files to the App Service by using the Kudu API; a minimal upload sketch follows these steps. (Refer to Remove files and folders on Azure before a new deploy from VSTS)
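For the upload step, a minimal Azure PowerShell sketch using the Kudu VFS API; the app name, deployment credentials, and file paths are placeholders:
    # Upload one changed file to the App Service via the Kudu VFS API
    $user = '$MyWebApp'                      # deployment user name (placeholder)
    $pass = '<deployment-password>'          # deployment password (placeholder)
    $auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$($pass)"))
    $localFile  = 'docs\index.html'                            # file produced by the build
    $remotePath = 'site/wwwroot/documentation/index.html'      # target path on the app
    # The If-Match: * header lets the PUT overwrite an existing file
    Invoke-RestMethod -Method Put `
        -Uri "https://MyWebApp.scm.azurewebsites.net/api/vfs/$remotePath" `
        -Headers @{ Authorization = "Basic $auth"; 'If-Match' = '*' } `
        -InFile $localFile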
Here is what I ended up doing:
Created a ServiceHook in VSO that is wired to "Web Hooks". The hook is called upon each check-in and filtered based on the directory I want. (All of this can be done using the existing functionality in VSO.)
The hook calls an Azure Function App (which is easy to do, because Function Apps have an HTTP trigger mechanism that fits in nicely here); a rough sketch of this function follows the list.
The hook passes the id of the check-in's changeset to the Function App.
The Function App puts that id into an Azure (Storage) queue.
This triggers an Azure WebJob which listens on that queue. The WebJob uses the changeset id to get the changes from VSO and acts on the change type for each change (e.g. downloads or deletes a file).
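A rough PowerShell sketch of the Function App step, assuming the standard HTTP-triggered PowerShell function layout with a queue output binding named outQueue defined in function.json (all names and the payload field are illustrative, not the exact implementation):
    # run.ps1 of an HTTP-triggered PowerShell function
    using namespace System.Net
    param($Request, $TriggerMetadata)
    # The VSO web hook POSTs a JSON payload; the field below is an assumption,
    # adjust it to the actual shape of your hook's payload
    $changesetId = $Request.Body.resource.changesetId
    if ($changesetId) {
        # Enqueue the id; a WebJob listening on the queue does the actual sync
        Push-OutputBinding -Name outQueue -Value "$changesetId"
        $status = [HttpStatusCode]::OK
    } else {
        $status = [HttpStatusCode]::BadRequest
    }
    Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
        StatusCode = $status
        Body       = "Queued changeset $changesetId"
    })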

How do I protect a target directory in Azure from deployment

I am using VS Team Services to build and deploy my Asp.net MVC application to Azure. When the application is running, users can upload files to a directory.
The problem is that when I run a new build and deploy task in Team Services it overwrites or deletes the files that were uploaded by my users.
It seems like Team Services erases the target location before it does the deployment.
Is there a way in Team Services to tell it not to delete a specific directory when it is deploying an update to the application?
If I can't do that, then is there a way to automate copying the original files out before the deployment and writing them back after the deployment?
I've been searching Google most of the day and can't seem to find an answer.
Thanks
Tony
I believe there is a better approach: your user data should be saved to Azure Blob storage, not uploaded to the web deployment location.
Here is a good getting started tutorial on using Azure storage: https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-blobs
It is not recommended to store user files in the same directory as the web site. Even if you simply put them in file storage on your VM, they should be in their own location; a small sketch of the blob approach follows.
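For example, a hedged PowerShell sketch of pushing an uploaded file into a blob container instead of wwwroot (inside the MVC app itself you would use the .NET storage SDK from the tutorial above; the account, container, and file names here are placeholders):
    # Upload a user file to Blob storage instead of the web app's wwwroot
    $ctx = New-AzStorageContext -StorageAccountName 'mystorageaccount' `
                                -StorageAccountKey  '<account-key>'
    # Create the container once (ignore the error if it already exists)
    New-AzStorageContainer -Name 'user-uploads' -Context $ctx -ErrorAction SilentlyContinue
    # Store the upload under its own blob name, keeping it out of the deployment path
    Set-AzStorageBlobContent -File 'C:\temp\invoice.pdf' `
                             -Container 'user-uploads' `
                             -Blob 'tony/invoice.pdf' `
                             -Context $ctx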
There is a Remove additional files at destination option under Additional Deployment Options in the Azure App Service Deploy step/task.
If you check this option, the deployment will delete additional files on the Azure Web App; otherwise, it won't delete them.

FTP'ing a Suave app to Azure

Having never used Azure before, I'm attempting to deploy a simple F# Suave app to Azure using FTP. Ultimately I want to deploy via GitHub, but I initially thought FTP'ing it would be the easy first step. According to https://suave.io/azure-app-service.html it should be straightforward.
These are the steps I followed
Created a new web app in Azure, including a resource group and app service plan. All on the Free Tier.
Downloaded the publishsettings XML file that Azure created.
Cloned this repo: https://github.com/isaacabraham/fsharp-demonstrator
Used FileZilla to connect via FTP using the creds from step 2.
Uploaded the files (via FTP) from fsharp-demonstrator/src/SuaveHost (which includes the necessary web.config file) in the repo cloned at step 3 to site\wwwroot on Azure.
Navigated to Azure site url.
Then I receive the error:
The specified CGI application encountered an error and the server terminated the process.
(When I look at the folders on Azure under site\wwwroot I don't see any obj or bin folders. I don't think any msbuild process occurred. That doesn't seem right.)
Anybody got any idea what the problem is?
I suspect the issue is that when you deploy via FTP, Azure does not automatically run the deploy script specified in the .deployment file.
The build.fsx script uses the Kudu service to deploy the built files, so it might be easier to just use GitHub deployment rather than FTP - this way, Azure will do the deployment for you.
If you want to deploy via FTP, you'll need to build the project locally and upload the output. I'm not sure how best to do this with Isaac's Kudu-based demo though (ultimately, you need a web.config that points to your built executable like this).
