I have just started with Azure DevOps.
I set up a CI/CD pipeline for our .NET project. CI completes successfully and produces the artifact, but CD always fails; the deployment type is configured as zip.
I am completely new to Azure, so if anyone has seen this issue before and sorted it out, please share your experience.
The log output just above the red line says "There is not enough space on the disk". Please check the disk on the target machine: storing or replacing the ZIP probably fails because there is no free space left.
From the steps in the screenshot you shared, I do not see any step that archives the artifact files into a ZIP file.
Before the deployment, use the Archive Files task to archive all the required artifact files into a ZIP file, then run the deployment with that ZIP file.
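For example, a minimal sketch (not your exact pipeline; the folder names are assumptions) of a build step that produces a ZIP the deployment can then use:

- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)/drop'   # assumed location of your build output
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    replaceExistingArchive: true

The deployment step can then point at the resulting ZIP file.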
I am working on a script solution to capture status changes of an Azure DevOps repo.
Basically here is what we need:
The script is supposed to create a file list that contains ALL files in an Azure repo and save it to a local drive.
If the newly generated file list has changes, i.e., a file gets removed/added, the script should create a separate file list that records these changes.
Step 2 I can take care of myself.
But since I am not familiar with the DevOps API, could anyone help me with this?
Thank you in advance.
For the first question, you can clone the repo to a local drive and then list the files in that directory.
Once the repo is on the local drive, you can also run git log -1 to get the latest commit.
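If it helps, here is a rough sketch of that approach in Python (the repo URL and local paths are placeholders, and it assumes git is installed and can authenticate to the repo):

import os
import subprocess

REPO_URL = "https://dev.azure.com/yourorg/yourproject/_git/yourrepo"  # placeholder
LOCAL_DIR = r"C:\repo-mirror"                                         # placeholder
FILE_LIST = r"C:\repo-file-list.txt"                                  # placeholder

# clone on the first run, update afterwards
if os.path.isdir(os.path.join(LOCAL_DIR, ".git")):
    subprocess.run(["git", "-C", LOCAL_DIR, "pull"], check=True)
else:
    subprocess.run(["git", "clone", REPO_URL, LOCAL_DIR], check=True)

# 'git ls-files' lists every tracked file in the repo
files = subprocess.run(
    ["git", "-C", LOCAL_DIR, "ls-files"],
    check=True, capture_output=True, text=True,
).stdout.splitlines()

with open(FILE_LIST, "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(files)))

# latest change, as mentioned above
print(subprocess.run(
    ["git", "-C", LOCAL_DIR, "log", "-1", "--oneline"],
    check=True, capture_output=True, text=True,
).stdout)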
I apologize for my English; I am using a translator.
I need to set up continuous deployment in Azure DevOps, but I have to exclude a file from the repository (I cannot change the repository) and then deploy to IIS.
There is a WebService file in the repository that I cannot ignore or delete from git. I need Azure DevOps to ignore it and then run the continuous deployment.
Another approach is to use an .artifactignore file. With this file you can describe which files should be ignored before the artifact package is built. The tricky part is placing the file correctly.
Where you save the .artifactignore file depends on which path you have specified for the Publish Pipeline Artifact task in your pipeline definition.
Here is a good example that helped me use it:
https://www.jfe.cloud/control-pipeline-artifacts-with-artifactignore-file/
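As a rough illustration (the file name is a placeholder for your WebService file), an .artifactignore placed in the folder you publish could look like this; everything not listed is still uploaded:

# .artifactignore uses .gitignore-style patterns
**/MyWebService.asmx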
You can use the Delete Files task, as covered below:
# Delete files
# Delete folders, or files matching a pattern
- task: DeleteFiles@1
  inputs:
    #SourceFolder: # Optional
    #Contents: 'myFileShare'
    #RemoveSourceFolder: # Optional
The SourceFolder input can be a folder such as $(rootFolder) or $(Build.ArtifactStagingDirectory).
I've had a similar problem and solved it this way, especially for the .git and .vs folders.
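For the scenario above, a filled-in sketch could look like this (the file name is a placeholder; run the step after the build and before publishing the artifact):

- task: DeleteFiles@1
  displayName: Remove the WebService file before publishing
  inputs:
    SourceFolder: '$(Build.ArtifactStagingDirectory)'
    Contents: '**/MyWebService.asmx'   # placeholder for the file to exclude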
During the release pipeline I'm launching Selenium tests. Those tests take screenshots when they fail. I'm looking for a way to upload the screenshots so I can look through them and check what went wrong.
I managed to zip them, but unfortunately none of the upload methods are working in the release pipeline.
Is there a way to save/upload files during a release pipeline?
Here's an old answer I've found useful:
How do you publish files back to VSTS Release Management as part of a release?
So if you zip the images and upload them like this, the zip file appears as part of the log files.
On a sidenote, I think you can explore these for more options:
https://github.com/microsoft/azure-pipelines-tasks/blob/master/docs/authoring/commands.md
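A minimal sketch of that approach, using the task.uploadfile logging command documented at the link above (the zip path is an assumption; in a classic release pipeline the same line goes into a PowerShell task):

- powershell: |
    Write-Host "##vso[task.uploadfile]$(System.DefaultWorkingDirectory)\screenshots.zip"
  displayName: Attach Selenium screenshots to the release logs
  condition: always()   # upload even when the test step failed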
The current setup: we use gulp to build our VS solution with MSBuild, and an Azure DevOps release pipeline deploys the build artifacts to our Azure App Service via the Kudu zip deploy API (called from PowerShell).
Kudu appears to copy files that are unchanged, which causes unnecessary slowness on the target server because the copy triggers a restart. Here's one example:
The contents of this file have not changed (nor have other binary files), but the timestamp probably has, due to the way we're generating/regenerating some of these artifact files.
I have tried to see if Kudu can be configured to ignore timestamps, but there doesn't seem to be an option for it, and it might also not be a good solution. According to the Kudu zip deploy docs:
Efficient file copy: Files will only be copied if their timestamps don't match what is already deployed. Generating a zip using a build process that caches outputs can result in faster deployments.
Other ideas include it being a misconfiguration in the solution / file settings, or an issue with the way we're building via gulp. Any ideas on how I can prevent these unchanged files from being copied?
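One workaround I'm considering (a sketch, not something the Kudu docs prescribe; the folder name is a placeholder) is to make each file's timestamp a pure function of its content before zipping, so unchanged files keep identical timestamps between builds and only genuinely changed files get copied:

import hashlib
import os

ARTIFACT_DIR = "artifacts"   # placeholder: folder that gets zipped for deployment
BASE_TIME = 1_500_000_000    # arbitrary fixed epoch offset

for root, _dirs, files in os.walk(ARTIFACT_DIR):
    for name in files:
        path = os.path.join(root, name)
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        # same content -> same mtime; changed content -> different mtime
        mtime = BASE_TIME + int(digest[:8], 16) % 31_536_000
        os.utime(path, (mtime, mtime))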
I have a solution in which two projects produce XML documentation that I need copied to the bin folder of the web root when I deploy to Azure.
Locally, I notice that when I build my solution, those two XML files get copied along with the DLLs into my web/bin folder. When I run my unmodified deploy.cmd locally, I also notice that KuduSync picks them up and happily puts them into my artifacts/wwwroot/bin folder.
But when I deploy to Azure by pushing to GitHub, the deployment temp folder on Azure doesn't contain the XML files, so they don't get picked up. I added some post-build "DIR" commands to deploy.cmd to see what is going on, and the XML files simply aren't there in the %DEPLOYMENT_TEMP%\bin\ folder.
Anyone know what's going on here?
Aha - it's because when you build with MSBuild, it doesn't generate the XML docs for the related projects. I was getting them locally only because at some point I'd built from Visual Studio, which generated them.
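If it helps anyone later, a sketch of one possible fix (assuming a classic .csproj; the file name is a placeholder) is to enable XML documentation for the configuration the deployment build uses, typically Release, so MSBuild generates the files as well:

<!-- inside the .csproj of each project that should emit XML docs -->
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <DocumentationFile>bin\MyProject.xml</DocumentationFile>   <!-- placeholder name -->
</PropertyGroup>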