Bitbucket-Laravel-Azure Web App: Artifacts not uploaded - bitbucket-pipelines

Artifacts are not uploaded after a successful test pipeline. What's wrong with the code below?

You are using a variable in your artifact name, which is currently not supported. Since the artifacts section is a separate post-script behaviour, your Test step still shows as perfectly green.
To solve the issue, use a constant name for your artifact. Alternatively, you can combine the packaging and deployment steps into a single one:
script:
  - echo "Packaging and deploying to test environment"
  - zip -r example-$BITBUCKET_BUILD_NUMBER.zip .
  - pipe: ...

This advice works (https://stackoverflow.com/a/72570848/14740671), but I need to keep the deployment separate. I also tried uploading the file using this code:
- pipe: atlassian/bitbucket-upload-file:0.3.2
  variables:
    BITBUCKET_USERNAME: $BITBUCKET_UPLOAD_USERNAME
    BITBUCKET_APP_PASSWORD: $BITBUCKET_UPLOAD_PWD
    FILENAME: "example-1.zip"
It runs without errors, but the file is still not visible under Artifacts, and the Azure deployment fails with "FileNotFoundError: [Errno 2] No such file or directory: '/opt/atlassian/pipelines/agent/build/example-1.zip'".
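Note that the atlassian/bitbucket-upload-file pipe publishes to the repository's Downloads section, not to a step's Artifacts tab, so it will not hand the zip to a later step either. To keep packaging and deployment separate, declare the zip as an artifact with a constant name so the next step can consume it. A minimal sketch, with the deployment pipe elided as in the answer above:

pipelines:
  default:
    - step:
        name: Test and package
        script:
          - echo "Packaging for the test environment"
          - zip -r example.zip .
        artifacts:
          - example.zip        # constant name; variables are not supported here
    - step:
        name: Deploy
        script:
          - pipe: ...          # your deployment pipe, consuming example.zip

If the build number must appear in the final name, rename the file inside the deploy step after the artifact has been restored.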

Related

Does Azure WEBSITE_RUN_FROM_PACKAGE really mean don't unpack the zip archive?

I'm doing a zip deploy of a .NET Framework web app to an Azure App Service via a GitHub workflow.
I have set WEBSITE_RUN_FROM_PACKAGE to 1 in the Azure console's Settings / Configuration / Application settings page. I've also tried setting WEBSITE_RUN_FROM_ZIP to 1 there just in case (although I think this is an obsolete flag).
The package is building correctly in GitHub and I can see it showing up in my Kudu debug console, under C:\home\site\wwwroot (as MyPackageName.zip) as well as in C:\home\data\SitePackages (as 20220512205318.zip, for example).
The deploy portion of my YAML is:
deploy:
  runs-on: windows-latest
  needs: build
  environment:
    name: 'Test'
    url: ${{ steps.deploy-to-webapp.outputs.webapp-url }}
  steps:
    - name: Download artifact from build job
      uses: actions/download-artifact@v2
      with:
        name: ASP-app
    - name: Deploy to Azure Web App
      id: deploy-to-webapp
      uses: azure/webapps-deploy@v2
      with:
        app-name: ${{ env.AZURE_WEBAPP_NAME }}
        publish-profile: ${{ secrets.AZUREAPPSERVICE_PUBLISHPROFILE_XYZsecret }}
        package: .
And the .PublishSettings I've uploaded to GitHub looks like:
<publishData>
  <!-- Which one of these 3 profiles is my YAML using? I don't actually know. -->
  <publishProfile profileName="mywebappname-test - Web Deploy" publishMethod="MSDeploy" etc="foobar">
    <databases/>
  </publishProfile>
  <publishProfile profileName="mywebappname-test - FTP" publishMethod="FTP" etc="foobar">
    <databases/>
  </publishProfile>
  <publishProfile profileName="mywebappname-test - Zip Deploy" publishMethod="ZipDeploy" etc="foobar">
    <databases/>
  </publishProfile>
</publishData>
The zip package is not getting automatically unpacked. The MSFT support rep I talked to suggested that this was the problem, and indeed, when I download the package to my machine and drop it into Kudu's Tools / Zip Push Deploy page, I see that the package is unpacked, and I can get the site to work by setting the appropriate Physical Path to match the '/' Virtual path. Specifically, the Kudu Zip Push Deploy causes my web.config and favicon.ico etc. files to show up in:
C:\home\site\wwwroot\Content\D_C\a\foo\bar\good\boy\obj\Test\Package\PackageTmp
and I can go to the Azure console for my app service, navigate to Settings / Configuration / Path Mappings, Virtual applications and directories, and edit the existing entry to:
Virtual path: /
Physical Path: site\wwwroot\Content\D_C\a\foo\bar\good\boy\obj\Test\Package\PackageTmp
Type: Application
and then see my site come up in a browser.
However, when I don't do anything to unpack the archive and I leave the entry as:
Virtual path: /
Physical Path: site\wwwroot
Type: Application
I can't see my site in a browser and instead just see "You do not have permission to view this directory or page." When I then dig into the logs in Kudu, I see 403.14 - Forbidden errors on my main site and a 404.0 - Not Found error on C:\home\site\wwwroot\favicon.ico. (Like the rest of my files, favicon.ico is still inside the zip archive at [...]\foo\bar\good\boy\obj\Test\Package\PackageTmp\favicon.ico.)
My questions are:
Should my web app be able to run at all with just my zip file sitting there as C:\home\site\wwwroot\MyPackageName.zip? Or does it really need to be unpacked as the MSFT rep indicated?
If it is supposed to run this way, any ideas on what I'm missing? I assume it's something in my YAML (which of the 3 publishProfile settings is it actually choosing here?) or in Settings / Configuration / Path Mappings or Application settings, but I have no idea what at this point and I'm running out of ideas.
Thanks, Eric
Should my web app be able to run at all with just my zip file sitting there as C:\home\site\wwwroot\MyPackageName.zip?
Pretty much, yes, just not in wwwroot. When WEBSITE_RUN_FROM_PACKAGE is enabled, the application is run directly from the archive as a read-only directory mount. Nothing is copied to wwwroot or anywhere else.
If it is supposed to run this way, any ideas on what am I missing?
My understanding is that package deploy from GitHub is not supported, or rather that GitHub-built archives are incompatible with run-from-package on App Service.
dwellman's response was the correct answer to my original question, but I'll add some more detail here on how I used this information to get my deployment to work properly. I feel like the inability to read the zip archive's internal index XML file(s) to find the correct relative path is an Azure flaw, but until it's addressed, I hope others may find this useful.
My first step was to abandon the idea of deploying as a zip file. It's possible I could have still made this work by doing some post-processing to zip things up in a different format without the nested folders, but I decided in my case that the benefits of a single-file deployment weren't worth the cost. To stop deploying as a zip file, I manually edited the .pubxml file I was passing in as msbuild option /p:PublishProfile=AzureCI.pubxml. The changes I made were to change PackageAsSingleFile from true to false, and change DesktopBuildPackageLocation from a zip file path to a folder path.
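As an aside, the same two properties can be overridden on the msbuild command line instead of editing the .pubxml. A hedged sketch of what that build step could look like in the GitHub workflow, assuming msbuild is already on the PATH; the solution name and output folder here are hypothetical:

- name: Publish as loose files instead of a zip package
  run: >
    msbuild MySolution.sln
    /p:DeployOnBuild=true
    /p:PublishProfile=AzureCI.pubxml
    /p:PackageAsSingleFile=false
    /p:DesktopBuildPackageLocation=.\Archive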
This alone was enough to get my site deployed to Azure as individual files instead of a zip archive. The files were still buried in an ugly folder structure, but I could at least see them in Kudu and get the site to work by applying the same Settings / Configuration / Path Mappings, Virtual applications and directories adjustment I describe in my original question.
I could have stopped there, but I wanted to be able to just use the default virtual path and not have my Azure configuration be so dependent on my upstream processes. In other words, I wanted my web.config and favicon.ico etc. to land directly in C:\home\site\wwwroot instead of deep in the weeds of a subfolder structure. To make this work, I changed the package argument to webapps-deploy in my YAML from . to the appropriate path, as follows:
- name: Deploy to Azure Web App
  id: deploy-to-webapp
  uses: azure/webapps-deploy@v2
  with:
    app-name: ${{ env.AZURE_WEBAPP_NAME }}
    publish-profile: ${{ secrets.AZUREAPPSERVICE_PUBLISHPROFILE_XYZsecret }}
    package: .\Archive\Content\D_C\a\foo\bar\good\boy\obj\Test\Package\PackageTmp
This caused the deployment process to pick off just the files I needed from the build and drop them into C:\home\site\wwwroot. I could then revert the path mapping kludge and be on my way.

Azure Devops path for Power Platform CD

I'm trying to set up a Release CD pipeline for D365 (or Power Platform) using the "Power Platform Deploy Package" task:
https://learn.microsoft.com/en-us/power-platform/alm/devops-build-tool-tasks
I can see the build and release are flowing correctly except for the very last part.
I can tell it works all the way through the "Download Artifact" step.
Do I need anything else besides those two tasks?
I get this "Package File not specified or not found" error:
error
I get the same error with several combinations of environment variable paths; I tried the exact path and it still does not work.
Am I forgetting anything?
The code is produced with the VS CRM Package template and compiles just fine. I only updated the .NET Framework version to 4.7.2.
If you are using a Hosted agent instead of a Self-hosted agent, you need to specify the path using the predefined variable $(Build.ArtifactStagingDirectory) instead of a hard-coded path like D:\a\r1\a; then it should work.
We could also add a Copy files task to filter the .dll files, copy them to another path, and then specify that path as the Package File.
Note: I'm sharing the YAML sample; you could enter the same values in the classic (UI) edit mode.
- task: CopyFiles@2
  displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)'
  inputs:
    Contents: |
      $(Build.ArtifactStagingDirectory)/_Demo-CI/drop/bin/Debug/*.dll
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
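For completeness, a hedged sketch of the deploy task pointing at the copied package. The task and input names follow the Power Platform Build Tools docs linked above, but the task versions, service connection name, and package file name are assumptions to verify against your setup:

- task: PowerPlatformToolInstaller@2      # required before other Power Platform tasks
- task: PowerPlatformDeployPackage@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'MyServiceConnection'                         # hypothetical service connection
    Environment: '$(BuildTools.EnvironmentUrl)'                     # target environment URL
    PackageFile: '$(Build.ArtifactStagingDirectory)/MyPackage.dll'  # hypothetical name of the copied package .dll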

What is the best way to have .env file at pipeline job level in azure devops

Could you please suggest the best way to have a .env file available to my Azure DevOps pipeline? (Note: we are not pushing the .env file to the Git repository.)
My Node.js utility application's code base is in Azure DevOps Git.
I have an Azure build pipeline (YAML version), which is a CI pipeline (doing just compile & test).
The unit tests make API calls that need an API secret token to run.
These tokens are stored in a .env file (I use the dotenv package for Node.js).
But we are not pushing the .env file to Git.
So how should I make the .env file available to my CI pipeline?
You can use secure files on Azure DevOps.
1. First, upload the .env file as an Azure DevOps secure file:
go to your Azure DevOps project portal, then Pipelines --> Library --> Secure files --> + Secure file.
2. Then add a Download Secure File task to your YAML pipeline to download the .env file to the agent:
- task: DownloadSecureFile@1
  inputs:
    secureFile: '.env'
If the task is given the name mySecureFile, its path can be referenced in the pipeline as $(mySecureFile.secureFilePath). Alternatively, downloaded secure files can be found in the directory given by $(Agent.TempDirectory).
3. Then you can use a Copy Files task to copy the .env file to the right place.
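Putting the pieces together, a minimal sketch of the CI job fragment, assuming a Linux agent (hence cp) and that the utility reads .env from the repository root; the task name mySecureFile is arbitrary:

steps:
  - task: DownloadSecureFile@1
    name: mySecureFile                 # makes $(mySecureFile.secureFilePath) available
    inputs:
      secureFile: '.env'
  - script: cp $(mySecureFile.secureFilePath) $(Build.SourcesDirectory)/.env
    displayName: 'Copy .env to the repository root'
  - script: npm ci && npm test
    displayName: 'Compile and test'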

Gulp-compiled CSS folder missing from the Azure DevOps pipeline Build Artifact

A little background...
I have a small dotnet core application that is hosted on Azure and is being built and deployed using Azure DevOps Pipelines. Before we started using DevOps Pipelines, the CI was hooked up directly to Azure, which compiled fine but took an actual lifetime to deploy, hence the decision to move.
However, the build pipeline no longer compiles or outputs the sass/css folder.
Everything else works okay - I check in, the Build pipeline picks up my commits and has the following steps:
Restore [.NET Core]
Build [.NET Core]
Publish [.NET Core]
Publish Build Artifact
Part of step 3 (Publish) uses a Gulp task:
gulp.task('prod', function (callback) {
    runSequence('clean', 'set-prod',
        ['icon-sprite', 'logo-sprite', 'images', 'sass', 'modernizr', 'mainjs', 'adminjs'],
        callback);
});
And locally (and previously) this generated five folders:
icons
img
js
logos
css (now mysteriously missing in action)
Variations I've tried
I've tried deleting my local css folder and running the CLI dotnet publish exactly the same way the Pipeline does, and that appears to work fine locally.
I've also stripped the sass task way back in case it was causing an issue somewhere in the pipeline, so it now looks like this:
return gulp.src('src/sass/style.scss')
    .pipe(sass({outputStyle: 'compressed'}))
    .pipe(gulp.dest('wwwroot/dist/css'));
I can see all of the output in the console logs on the Pipeline and it successfully executes the sass task:
2019-01-02T14:43:51.3558593Z [14:43:51] Starting 'sass'...
2019-01-02T14:43:51.9284145Z [14:43:51] Finished 'sass' after 524 ms
There are no other errors or warnings in the build script and everything completes and fires off the Release pipeline (which copies the artifact up to the Azure site).
Speculation
I would expect an error somewhere... but nothing - all of the green ticks are downright cheerful... so I'm a little stumped at what may or may not be happening! I can only think that there must be some dependency or something missing in the Pipeline environment? Orrrrr maybe I'm missing a Pipeline step?
Any help or nudges or ideas would be greatly appreciated! Thank you for sticking it out through my small essay and for any help you can provide :)
Something I've done in this situation before is change the Publish Build Artifact task to upload everything in the build folder. My guess is that right now the 'Path to Publish' value in that task is set to $(build.artifactStagingDirectory). Change it to $(build.SourcesDirectory). After running the build again you'll see that the entire build directory was uploaded, including your source code and any other folders like you have in your local environment. From there you can figure out whether the CSS folder is actually missing or ended up in some other folder location.
If the folder ends up in a weird location you can either add a file copy task to move the CSS folder to the proper folder in $(build.artifactStagingDirectory) or make a change to the Gulp task. Whatever is better for your scenario.
Once you find the location, you can fix the Publish Build Artifact task.
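If the pipeline is defined in YAML rather than the classic editor, the same diagnostic change looks roughly like this (a sketch; 'drop' is just the conventional artifact name):

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.SourcesDirectory)'   # temporarily publish everything to locate the css folder
    ArtifactName: 'drop'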
I was having the exact same issue. I was able to get everything working locally without issue: gulp would generate the css folder just fine, and dotnet publish -c release would do the same. However, when run through the pipeline, there was no css folder.
The thing I find strangest is that there is a sibling folder (scripts) that is used in the same way as the css gulp task, but that folder makes it through just fine. Here's my css task:
gulp.task('min', function() {
    return gulp.src('wwwroot/css/**/*.css')
        .pipe(cssnano({zindex: false}))
        .pipe(gulp.dest('wwwroot/dist/css/'));
});
but this task does work, both locally and in the pipeline:
gulp.task('build-js', function() {
    return gulp.src('wwwroot/scripts/**/*.js')
        .pipe(concat('site.bundle.js'))
        .pipe(uglify())
        .pipe(gulp.dest('wwwroot/dist/scripts/'));
});
I ended up just giving up, since this is legacy code anyway, and settled on a workaround:
Add a Copy Files task right after your gulp task with the configuration below, shown here in YAML:
steps:
- task: CopyFiles@2
  displayName: 'Copy Files to: wwwroot/dist/css'
  inputs:
    SourceFolder: wwwroot/css
    Contents: '*.css'
    TargetFolder: wwwroot/dist/css

Gitlab CI Web Deployment

So we are currently moving away from our current deployment provider, Beanstalk, which is great, but we are on the top tier and we keep running out of space or hitting our repository limits. Since we are moving away regardless, please do not suggest another SaaS provider.
I personally use GitLab for my own projects and a few company projects, and it's amazing; we use a self-hosted version on a local server in our company building.
We have CI set up and are currently using the following deployment code (I have trimmed it down to just the development deployment). This uses the shell executor for deploying, as we deploy to an existing Linux server.
variables:
  HOSTNAME: '<hostname>'
  USERNAME: '<username>'
  PASSWORD: '<password>'
  PATH_DEV: '/path/to/www'

# Define the stages (we can add as many as we want)
stages:
  # - build
  - deploy

# The code for development deployment
deploy_dev:
  stage: deploy
  script:
    - echo "Deploying to development environment..."
    - rm .gitlab-ci.yml
    - rsync -urltvz --filter=':- .gitignore' --exclude=".git" -e "sshpass -p\"$PASSWORD\" ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" * $USERNAME@$HOSTNAME:$PATH_DEV
    - echo "Finished deploying."
  environment:
    name: Development
    url: http://dev.domain.com
  only:
    - envdev
The Problem:
When we use the above code to deploy, it's perfect and works really well: it deploys all the code after optimisation etc. But we have found a little bug here.
When you delete a file, the rsync command will not delete it from the server. I did some searching and found the --delete flag you can add, and it worked, but it deleted all the user-uploaded content as well. I then added the .gitignore into the filtering, so it would ignore the files listed there (which are usually user-generated content, configuration files, and/or libraries such as npm packages). This was fine until a user started uploading files through the media manager in our framework, which stores them in a folder that is not in the .gitignore file, and it can't be, because that folder also contains files of our own that the user should be able to edit. So now I am unsure how to manage this.
What we are looking for is a CI setup that uploads only file changes to the server: it would search through the latest commits, find the files that have changed, and push only those files up. Of course I would like to keep doing this with GitLab CI, so any ideas, examples, or tutorials would be amazing.
Thanks in advance.
~ Danny
Maybe this helps: https://github.com/banago/PHPloy
It looks like this tool was designed for PHP projects, but I think it can be used for other kinds of web deployment.
How it works:
PHPloy stores a file called .revision on your server. This file contains the hash of the commit that you have deployed to that server. When you run phploy, it downloads that file and compares the commit reference in it with the commit you are trying to deploy, to find out which files to upload. PHPloy also stores a .revision file for each submodule in your repository.
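If you would rather stay in GitLab CI without an extra tool, here is a hedged sketch of the same idea using plain git and rsync. It assumes a full (non-shallow) clone and that $CI_COMMIT_BEFORE_SHA points at the previously deployed commit, which holds for simple push-triggered pipelines but not for a branch's first push:

deploy_dev:
  stage: deploy
  script:
    # List files changed since the last push, excluding deletions
    - git diff --name-only --diff-filter=d "$CI_COMMIT_BEFORE_SHA" "$CI_COMMIT_SHA" > changed.txt
    # Upload only those files, preserving their relative paths
    - rsync -urltvz --files-from=changed.txt -e "sshpass -p\"$PASSWORD\" ssh -o StrictHostKeyChecking=no" . $USERNAME@$HOSTNAME:$PATH_DEV
  only:
    - envdev

Deleted files would still need separate handling (for example, a git diff with --diff-filter=D fed to an ssh rm), which is exactly the gap PHPloy's .revision bookkeeping closes.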
