iisnode.yml disappears during deployment - azure-web-app-service

I am using VSTS's Deploy Azure App Service task to deploy a node.js app to an Azure App Service Web App, supplying a custom iisnode.yml, web.config, and deploy.cmd.
My VSTS build artifacts look fine in the explorer:
site
config
dist
public
.deployment
deploy.cmd
iisnode.yml
package.json
server.js
web.config
yarn.lock
The task has these options checked:
Publish using Web Deploy
Remove additional files at destination
The task's Deploy Azure App Service log shows the following (it says Updating instead of Adding for iisnode.yml because I manually added one before this deploy):
...
2017-05-07T05:51:51.4939189Z Info: Updating file (MyRepo\iisnode.yml).
2017-05-07T05:51:51.4939189Z Info: Updating file (MyRepo\package.json).
...
And deploy.cmd is essentially the script you can generate with the Azure CLI, just using yarn instead of npm.
However, in the Azure debug console, it does not list iisnode.yml:
PS D:\home\site\wwwroot> ls
Directory: D:\home\site\wwwroot
Mode LastWriteTime Length Name
---- ------------- ------ ----
d---- 5/5/2017 5:27 PM config
d---- 5/5/2017 5:27 PM dist
d---- 5/7/2017 7:07 AM node_modules
d---- 5/5/2017 5:27 PM public
-a--- 5/7/2017 6:43 AM 32 .deployment
-a--- 5/7/2017 6:43 AM 3657 deploy.cmd
-a--- 5/7/2017 6:43 AM 4509 package.json
-a--- 5/7/2017 6:43 AM 148 server.js
-a--- 5/7/2017 6:43 AM 2556 web.config
-a--- 5/7/2017 6:43 AM 269807 yarn.lock
This causes the node process to run without the desired settings. What is removing iisnode.yml?

The workaround I found was to either put all the settings in web.config, or keep just this line in web.config:
<iisnode configOverrides="iisnodeoverride.yml"/>
and move the settings into iisnodeoverride.yml. It seems that changing the file name away from the default made it stick.
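For reference, the override file uses the same flat key: value format as a regular iisnode.yml; a minimal sketch of what iisnodeoverride.yml might contain (these particular settings are illustrative, not taken from the question):
loggingEnabled: true
devErrorsEnabled: false
nodeProcessCountPerApplication: 1
watchedFiles: web.config;*.js;iisnodeoverride.yml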

Related

Azure DevOps Pipeline - Configure Unit Tests for UWP Project

I'm pretty new to Azure DevOps and I am trying to get it to run the Unit tests (MSTest) as part of the pipeline. I'm using the default generated yaml for UWP. According to the documentation for unit tests I should have something like:
- task: VSTest@1
  displayName: Unit tests
  inputs:
    testAssembly: '**/*test*.dll;-:**\obj\**'
This is a high-level view of the file structure in question (relative to the yaml file):
Pipeline.yml
Project (folder)
  Project.sln
  ProjectDatabase (folder)
    bin (folder)
    obj (folder)
    ProjectDatabase.csproj
  ProjectDatabase.Test (folder)
    bin (folder)
    obj (folder)
    ProjectDatabase.Test.csproj
  ProjectDataAccess (folder)
    bin (folder)
    obj (folder)
    ProjectDataAccess.csproj
  ProjectDataAccess.Test (folder)
    bin (folder)
    obj (folder)
    ProjectDataAccess.Test.csproj
I've tried varying the path each time, but running the pipeline just returns:
##[warning]No test assemblies found matching the pattern: '**/**/*test*.dll;-:**\**\obj\**'.
Am I even going down the right path and if so, am I missing something? Thanks in advance and I greatly appreciate any assistance.
I was not thinking and ran the unit tests separately from the build itself, so the .appxrecipe files were not present.
To have the tests run I had to use:
- task: VSTest@2
  displayName: Unit Tests
  inputs:
    platform: 'x64'
    configuration: '$(BuildConfiguration)'
    testSelector: 'TestAssemblies'
    testAssemblyVer2: | # Required when testSelector == TestAssemblies
      **\Release\ProjectName.UWP.Test.build.appxrecipe
      **\Release\ProjectName.DataAccessUWP.Test.build.appxrecipe
      **\Release\ProjectName.DataUWP.Test.build.appxrecipe
      !**\*TestAdapter.dll
      !**\obj\**
    searchFolder: '$(Build.SourcesDirectory)'
    resultsFolder: '$(System.DefaultWorkingDirectory)\TestResults'
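Note that the .appxrecipe files only exist if the solution is built earlier in the same job; a minimal sketch of the preceding build step, assuming the stock UWP template's VSBuild task and variable names (these are assumptions, not taken from the question):
- task: VSBuild@1
  displayName: Build solution
  inputs:
    solution: '**/*.sln'
    platform: 'x64'
    configuration: '$(BuildConfiguration)'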

Do I need to add node_modules to .gcloudignore when running "gcloud builds submit ./project-folder"?

This is my project structure:
project/
  node_modules/
  src/
  .gcloudignore
  cloudbuild.yaml
  Dockerfile
  package.json
Here is how I'm building it:
gcloud builds submit ./project --config=./project/cloudbuild.yaml --project=$PROJECT_ID # AND SOME SUBSTITUTIONS
This is my cloudbuild.yaml file:
steps:
  # BUILD IMAGE
  - name: "gcr.io/cloud-builders/docker"
    args:
      - "build"
      - "--tag"
      - "gcr.io/$PROJECT_ID/$_SERVICE_NAME:$_TAG_NAME"
      - "."
    timeout: 180s
  # PUSH IMAGE TO REGISTRY
  - name: "gcr.io/cloud-builders/docker"
    args:
      - "push"
      - "gcr.io/$PROJECT_ID/$_SERVICE_NAME:$_TAG_NAME"
    timeout: 180s
  # DEPLOY CONTAINER WITH GCLOUD
  - name: "gcr.io/google.com/cloudsdktool/cloud-sdk"
    entrypoint: gcloud
    args:
      - "run"
      - "deploy"
      - "$_SERVICE_NAME"
      - "--image=gcr.io/$PROJECT_ID/$_SERVICE_NAME:$_TAG_NAME"
      - "--platform=managed"
      - "--region=$_REGION"
      - "--port=8080"
      - "--allow-unauthenticated"
    timeout: 180s
# DOCKER IMAGES TO BE PUSHED TO CONTAINER REGISTRY
images:
  - "gcr.io/$PROJECT_ID/$_SERVICE_NAME:$_TAG_NAME"
And here is my Dockerfile:
FROM node:12-slim
WORKDIR /
COPY ./package.json ./package.json
COPY ./package-lock.json ./package-lock.json
COPY ./src ./src
RUN npm ci
Since nothing in my configuration copies the node_modules folder, it seems unnecessary to add node_modules to .gcloudignore. But is it?
I'm asking this because I saw this answer that said:
When you run gcloud builds submit... you provide some source code and either a Dockerfile or a configuration file. The former is a simple case of the second, a configuration file containing a single step that runs docker build....
Configuration files (YAML) list a series of container images with parameters that are run in series. Initially Cloud Build copies a designated source (can be the current directory) to a Compute Engine VM (created by the service) as a directory (that's automatically mounted into each container) as /workspace.
If it copies the source, will it copy node_modules as well? Should I add it to .gcloudignore or is it not necessary?
Yes, you can skip node_modules because you don't use it in your build (and it is huge and slow to upload). Your npm ci command downloads the dependencies anyway, so add node_modules to your .gcloudignore (and to your .gitignore as well).
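For illustration, a minimal .gcloudignore along those lines (entries besides node_modules are just common additions, not taken from the question):
# Dependencies are reinstalled by npm ci during the Docker build
node_modules/
# Typical extras
.git
.gitignore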

Deploying Angular Universal 9 to Google App Engine

I am not sure whether the situation has changed since, but it seems I am stuck with the versions I am using.
Previously, in Angular 7, we were able to generate the Angular Universal server files at the root level, so we could put node main.js in app.yaml and Google App Engine just knew how to run our web application. This no longer seems possible with Angular 9.
We are using Angular SSR for our production web site. It compiles all the server files into the dist-server/ folder. There is a Dockerfile to deploy it on Google App Engine:
FROM node:12-alpine as buildContainer
WORKDIR /app
COPY ./package.json ./package-lock.json /app/
RUN npm install
COPY . /app
RUN npm run build:ssr # Creates the dist/ and dist-server/ folders inside the image
FROM node:12-alpine
WORKDIR /app
COPY --from=buildContainer /app/package.json /app
COPY --from=buildContainer /app/dist /app/dist
COPY --from=buildContainer /app/dist-server /app/dist-server
EXPOSE 4000
CMD ["npm", "run", "serve:ssr"]
In package.json we have:
"serve:ssr": "node dist-server/main.js",
To start the deployment we run gcloud app deploy in the terminal, and that process works fine. The main problem is that it takes almost 25 minutes to finish, and the main bottleneck is the compilation.
I thought we could compile the repo on our local dev machine, copy only the dist/ and dist-server/ folders into the Docker image, and run node dist-server/main.js from the Dockerfile. But whenever I tried to copy only the dist and dist-server folders, I got the error below:
COPY failed: stat /var/lib/docker/tmp/docker-builder{random numbers}/dist: no such file or directory
I also tried to compile main.js (the main Angular Universal server file) at the same level as app.yaml, which I assumed is required by the Google App Engine Node.js deployment rules since there is an example repo from Google. But I cannot compile our main.js into the root folder; it gives the error below:
An unhandled exception occurred: Output path MUST not be project root directory!
So I am looking for a solution that does not require Google App Engine to rebuild our repo, since we can build on our dev machine and upload the compiled files to save time and make the deployment process faster.
Thanks for your help
I found that the .dockerignore file listed the dist and dist-server folders. After removing those entries, I am able to build and deploy the Docker image on Google App Engine.
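For illustration, the fix amounts to removing the dist entries so the COPY steps can see the prebuilt output; a sketch of the resulting .dockerignore (the remaining entries are typical examples, not taken from the question):
# .dockerignore
node_modules
.git
# dist and dist-server must NOT be listed here when the image copies prebuilt output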

Azure App Service setting ASPNETCORE_ENVIRONMENT=Development result in 404

I noticed this issue when deploying my ASP.NET Core MVC application Docker image to Azure App Service. Whether I set ASPNETCORE_ENVIRONMENT in the Dockerfile (ENV ASPNETCORE_ENVIRONMENT Development) or in docker-compose:
environment:
  - ASPNETCORE_ENVIRONMENT=Development
I always get a 404 when accessing the website.
The weird part is that if I set ASPNETCORE_ENVIRONMENT to any other value (Staging, Production, etc.), the 404 goes away and the website can be accessed normally.
How to reproduce:
Create an ASP.NET Core MVC project (just a bare-bones project; don't change any code or add any logic)
Build the project on the local machine (the Dockerfile is below)
FROM microsoft/dotnet:2.2-sdk AS build-env
WORKDIR /app
# Copy necessary files and restore
COPY *.csproj ./
RUN dotnet restore
# Copy everything else and build
COPY . ./
RUN dotnet publish -c Release -o out
# Build runtime image
FROM microsoft/dotnet:2.2-aspnetcore-runtime
COPY --from=build-env /app/out .
# Start
ENV ASPNETCORE_ENVIRONMENT Development
ENTRYPOINT ["dotnet", "CICDTesting.dll"]
Push the image to Azure Container Registry
I have a webhook attached to the App Service, so deployment is triggered automatically
From the Log Stream I can see the image is pulled successfully and the container is up and running
Access the website; it returns a 404
If I change ENV ASPNETCORE_ENVIRONMENT Development to ENV ASPNETCORE_ENVIRONMENT Staging and repeat the build and deploy steps, the website becomes accessible.
It's the same if I remove ENV ASPNETCORE_ENVIRONMENT Development from the Dockerfile and configure it in docker-compose instead.
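For reference, a minimal sketch of the docker-compose variant described above (the image name, registry, and port mapping are placeholders, not taken from the question):
version: '3.4'
services:
  web:
    image: myregistry.azurecr.io/cicdtesting:latest
    ports:
      - "80:80"
    environment:
      - ASPNETCORE_ENVIRONMENT=Development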

pm2 deploy master latest

I have to be missing something. I have a basic ecosystem.json layout, with just one extra post-deploy step that runs webpack to build the production bundle.
I get no errors from pm2 deploy ecosystem.json staging, yet the target never ends up on the current ref. The only time I get the latest code is when I wipe the directories and run pm2 deploy ecosystem.json staging setup again.
I've tried pm2 deploy ecosystem.json staging update with no luck.
Deployment is stuck at the original deploy commit. I confirm this with pm2 deploy ecosystem.json current.
What am I missing?
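For reference, a sketch of the kind of ecosystem.json deploy block being described, with placeholder user, host, repo, and path values (not taken from the question):
{
  "apps": [{ "name": "app", "script": "server.js" }],
  "deploy": {
    "staging": {
      "user": "deploy",
      "host": "staging.example.com",
      "ref": "origin/master",
      "repo": "git@example.com:user/repo.git",
      "path": "/var/www/app",
      "post-deploy": "npm install && npm run build && pm2 reload ecosystem.json --env staging"
    }
  }
}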
