I am developing a VSTS task and I am having problems with references to different modules.
My first question: when building a task, you need to add the VstsTaskSdk. Do I need to do this by copying the module into TaskRoot/ps_modules, or is there a certain flag when building the task that can do this?
If I need to copy it into the root, how am I supposed to handle multiple tasks? Copy it into the root of every task? Is there a neater way, such as storing the SDK in one place and copying it from there somehow?
I have used the https://github.com/Microsoft/vsts-tasks repo for samples and noticed that shared code lives in "Tasks/Common". Where are its manifest files? I would also like to have a common folder that I can reference (and copy into the task package on build). Any ideas how?
It seems you are using the VSTS DevOps Task SDK. The SDK should be packaged with the task in a ps_modules folder, which should sit in the root of the task folder.
Example layout: consider the following structure, where MyTask is the root folder for the task.
MyTask
│   MyTask.ps1
│   task.json
└───ps_modules
    └───VstsTaskSdk
            [...]
            VstsTaskSdk.psd1
Instead of "Tasks/Common", you can use MSBuild Task as an example. More information about manifest files, please refer to this article.
Trying to deploy a multi-project app to Azure using Pipelines. After trying various combinations (the pipeline log shows about 75-80 runs in the last couple of days), it looks like the problem is either with the Dockerfile generated by Visual Studio 2019 or somewhere in the Azure Pipeline.
Here's what I've narrowed it down to:
Project A:
1. Create a VS ASP.NET Core web app project, say test1sp.
2. Select the checkbox that says "create solution and project in the same folder".
3. Select Docker support (I selected Linux), or add it later.
4. Add no code; the boilerplate code runs fine as-is.
5. Add it to GitHub.
6. Create a project/pipeline in Azure, with GitHub as the source; I use the classic editor without YAML.
7. Create a Docker build/push task and set it up; I chose the most basic options, subscriptions, etc.
8. The build works great. I also added a deploy-to-App-Service task, and it deploys to the App Service.
Project B (my project is called demo8):
Same as Project A, except for step #2: do NOT select "create solution and project in the same folder". Follow the rest of the steps and you should get this error:
Step 7/17 : COPY ["demo8/demo8.csproj", "demo8/"]
...
...
##[error]COPY failed: file not found in build context or excluded by .dockerignore: stat demo8/demo8.csproj: file does not exist
It works fine on localhost with Docker, so I'm guessing either VS2019 uses some more tolerant implementation that papers over this, or there's a problem with Azure's implementation, or something else entirely.
I am relatively new to Dockerfile editing and see lots of detailed/complex scenarios; hopefully someone can shed some light on how to get this working.
Here's a screenshot of the project files/structure:
UPDATE:
Moving the Dockerfile to the solution folder in Project B makes it work in Azure, BUT then it does NOT work in Visual Studio (no debugging, etc.). The workaround is to keep a copy of the Dockerfile in both the project and parent folders (and keep them in sync). BUT if your solution has multiple projects like mine, then you have to name each Dockerfile differently so they have unique names in the parent folder, and modify the pipelines to use the respective file names.
Posting it here in case it helps someone.
The paths in the Dockerfile must be relative to the build context, which here is the folder the Dockerfile is in.
There are two ways to solve the problem. One is to change the code in the Dockerfile; for Project B, you can change the COPY instruction like this:
COPY ["demo8.csproj", "demo8/"]
The other way is to move the Dockerfile to a path that matches your COPY instruction. But doing so may affect other instructions in your Dockerfile.
In general, it is better to plan your projects in separate folders and to create the Dockerfile for each project in its own folder.
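To illustrate (a sketch of the Project B layout, where Visual Studio generates COPY paths relative to the solution folder):

demo8.sln          <- solution folder
demo8/
    Dockerfile     <- VS puts the Dockerfile here
    demo8.csproj

If the build context is demo8/ (the folder the Dockerfile is in), the COPY must drop the folder prefix:

COPY ["demo8.csproj", "demo8/"]

If the build context is the solution folder instead, which is what Visual Studio uses, the generated COPY ["demo8/demo8.csproj", "demo8/"] works unchanged.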
I just ran into this problem with a default console app in VS2022. I fixed it by changing the "Build context" parameter in the ADO build pipeline's Docker "build and push" step from the default of ** to **/.., so that the working directory was the solution folder, which matches VS (AFAIK).
Similar to Josh's answer above: I explicitly set
buildContext: '$(Build.SourcesDirectory)'
in the Docker task in the Azure Pipelines YML file, and it worked like a charm.
Something for the Azure pipelines template maintainer(s) to look into perhaps?
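For context, a minimal sketch of that Docker task in the pipeline YAML (inputs other than buildContext are illustrative placeholders):

- task: Docker@2
  inputs:
    command: 'buildAndPush'
    repository: 'demo8'
    dockerfile: '**/Dockerfile'
    buildContext: '$(Build.SourcesDirectory)'
    tags: '$(Build.BuildId)'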
We have hundreds of similar projects in GitLab which have the same structure inside.
To build these projects we use one common TeamCity build. We trigger it and pass the project's GitLab URL, along with other parameters, to the build via the API, so the TeamCity build knows exactly which project needs to be fetched/cloned. The TeamCity VCS root accepts the target URL via a parameter.
The question is how to replace the existing TeamCity build with a GitLab pipeline.
I see that the general approach is to have a CI/CD configuration file (.gitlab-ci.yml) directly in the project. Since the structure of the projects is the same, duplicating the same CI/CD config file across all projects is not an option.
I'm wondering: is it possible to create a common pipeline for several projects which can accept the target project URL as a parameter?
You can store the full CI/CD config in one repository and put a simple .gitlab-ci.yml in each of your projects that includes the shared file.
With this approach there is no redundant definition of the jobs.
Still, you can add other, project-specific jobs (in the respective .gitlab-ci.yml files, or by defining variables in a project and using some jobs conditionally). You can also include multiple other definition files, e.g. if you have several groups of similar projects.
cf. https://docs.gitlab.com/ee/ci/yaml/#include
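For illustration, a minimal sketch of such a per-project .gitlab-ci.yml (the template project path and file name are assumptions):

include:
  - project: 'my-group/ci-templates'
    ref: main
    file: '/templates/common-pipeline.yml'

# project-specific jobs can still be added alongside the include
extra-job:
  script:
    - echo "only in this project"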
With GitLab 13.9 and later, there are even more referencing methods possible: https://docs.gitlab.com/ee/ci/yaml/README.html#reference-tags
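A sketch of those !reference tags (job and key names are illustrative):

.shared-setup:
  script:
    - echo "common setup"

build-job:
  script:
    - !reference [.shared-setup, script]
    - echo "project-specific build"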
As MrTux already pointed out, you can use includes.
You can either use them to include a whole CI file, or to include just certain steps. In "Having GitLab projects calling the same gitlab-ci.yml stored in a central location" you can find a detailed explanation with examples of both usages.
I'm developing two Node.js-based tasks as contributions in my Azure DevOps extension. In my experience, every task must be self-contained in terms of files and cannot depend on files brought in by other contributions (correct?).
If that's correct, what is the best practice in the situation below?
The source tree looks as below:
TaskA
|___TaskA files
TaskB
|___TaskB files
Common
|___Common files
How can I package the "Common files" as a part of both tasks? Do I need to copy them to the TaskA and TaskB folders before creating my VSIX file or there is some creative way to cause the extension to deploy them to a common directory and then let the tasks' code "require" this common location?
Thank you in advance,
Moshe.
There is no concept of "shared folders" between tasks in Azure Pipelines. It's your job to put all the files in the task folder prior to packaging.
You can use whatever scripting technology you like to copy the files into a dist folder prior to packaging. In the Azure DevOps Extension tasks I used a trick in TypeScript to always include the files in the common folder as part of the output.
I've seen others install the common package from the filesystem into the node_modules folder of the task packages. An example can be found in this project's make-util.js.
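For illustration, a minimal Node sketch of the copy approach, run before packaging the VSIX (folder names match the layout above; fs.cpSync requires Node 16.7+):

// copy-common.js: give each task its own private copy of the shared files
const fs = require('fs');

for (const task of ['TaskA', 'TaskB']) {
  fs.cpSync('Common', `${task}/Common`, { recursive: true });
}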
In my CruiseControl.NET (CC.NET) project config file I have many MSBuild tasks. Each task is used to build one solution.
The problem is that this requires maintenance of the CC.NET config file each time a new project is added to or deleted from the repository.
My idea is to pass CC.NET a dynamic list of solutions that should be built, and to execute the builds one by one, as is done now with the "static / old-fashioned" manually maintained config file.
Has anyone already prepared such a configuration?
Presuming you have something akin to the following:
On disk:
./solution1.sln
./solution2.sln
./solutionN.sln
And a single ccnet project:
Msbuild task -> solution1.sln
Msbuild task -> solution2.sln
Msbuild task -> solutionN.sln
What you are asking is for ccnet to react to what is outside of its environment. That isn't possible; however, it is possible to get another tool to do so.
Possible options:
1. Custom MSBuild project
Create a specific MSBuild project which finds all solution files and invokes MSBuild on each of them, and call MSBuild on this project alone. It should be possible to do this with vanilla MSBuild (see https://msdn.microsoft.com/en-us/library/z7f65y0d.aspx); a sketch follows after this list.
2. Batch files
Find all solution files, and for each one execute MSBuild. Ensure the output is logged to an XML file (there is an MSBuild switch for this, I believe) and merge the results in the publishers section.
See How to do something to each file in a directory with a batch script
3. Single solution
Create a single solution that contains all the projects from all solutions (1 to N), and call MSBuild on it once.
However, this solution file would need to be updated each time a new project comes along.
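Returning to option 1, a minimal sketch of such a wrapper project (the file name BuildAll.proj and the glob are illustrative):

<!-- BuildAll.proj: builds every solution found beneath this folder -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <Solutions Include="**\*.sln" />
  </ItemGroup>
  <Target Name="BuildAll">
    <MSBuild Projects="@(Solutions)" Targets="Build" />
  </Target>
</Project>

The single ccnet MSBuild task then points at BuildAll.proj alone, and new solutions are picked up automatically.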
I'm trying to set up a build definition for a solution that has multiple web projects in it, but I want a specific one to be deployed as a result of the build (as far as I understand, it takes the first web project by default). I know there have been discussions around this issue, but they are quite old now. I wonder if there is still no solution.
Take a look at customizing deployments using a .deployment file.
Basically, to achieve what you want, you need to create a .deployment file in the root of your repo that contains something like the following:
[config]
project = MyOtherWebProject/MyOtherWebProject.csproj