Proper file structure while developing several Azure DevOps custom tasks

I'm developing two Node.js-based tasks as contributions of my Azure DevOps extension. In my experience, every task must be self-contained in terms of files and cannot depend on files shipped by other contributions (correct?).
If I'm correct, what is the best practice in the situation below?
The source tree looks like this:
TaskA
|___TaskA files
TaskB
|___TaskB files
Common
|___Common files
How can I package the "Common files" as part of both tasks? Do I need to copy them into the TaskA and TaskB folders before creating my VSIX file, or is there some creative way to make the extension deploy them to a common directory and then let the tasks' code "require" that common location?
Thank you in advance,
Moshe.

There is no concept of "shared folders" between tasks in Azure Pipelines. It's your job to put all the files in the task folder prior to packaging.
You can use whatever scripting technology you like to copy the files to a dist folder prior to packaging. In the Azure DevOps Extension tasks I used a trick in TypeScript to always include the files in the common folder as part of the output.
I've seen others install the common package from the filesystem into the node_modules folder of the task packages. An example can be found in this project's make-util.js.
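As a concrete illustration of the copy approach, here is a minimal Node sketch (the folder names match the tree above; the script name is illustrative) that stages the shared files into each task before packaging:

// copy-common.js - run before creating the VSIX
const fs = require('fs');
const path = require('path');

const tasks = ['TaskA', 'TaskB'];
for (const task of tasks) {
  const dest = path.join(task, 'Common');
  // fs.cpSync (Node 16.7+) copies a directory tree recursively
  fs.cpSync('Common', dest, { recursive: true });
  console.log('Copied Common -> ' + dest);
}

Run it as part of the packaging step, e.g. node copy-common.js && tfx extension create, so each task folder is self-contained when the VSIX is built.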

Related

How to create a common pipeline in GitLab for many similar projects

We have hundreds of similar projects in GitLab which have the same structure inside.
To build these projects we use one common TeamCity build. We trigger the build and pass the project's GitLab URL along with other parameters via the API, so the TeamCity build knows which exact project needs to be fetched/cloned. The TeamCity VCS root accepts the target URL via a parameter.
The question is how to replace the existing TeamCity build with a GitLab pipeline.
I see the general approach is to have the CI/CD configuration file (.gitlab-ci.yml) directly in the project. Since the structure of the projects is the same, duplicating the same CI/CD config file across all projects is not an option.
I'm wondering whether it is possible to create a common pipeline for several projects that can accept the target project URL via a parameter?
You can store the full CI/CD config in one repository and put a simple .gitlab-ci.yml in each of your projects that includes the shared file.
With this approach there is no redundant definition of the jobs.
Still, you can add other, project-specific jobs (in the respective .gitlab-ci.yml files, or define variables in a project and use some jobs conditionally). You can also include multiple other definition files, e.g. if you have several groups of similar projects.
cf. https://docs.gitlab.com/ee/ci/yaml/#include
With the latest GitLab (13.9) there are even more referencing methods available: https://docs.gitlab.com/ee/ci/yaml/README.html#reference-tags
As @MrTux already pointed out, you can use includes.
You can use them either to include a whole CI file or to include just certain steps. In Having Gitlab Projects calling the same gitlab-ci.yml stored in a central location you can find a detailed explanation with examples of both usages.
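A minimal sketch of such a per-project .gitlab-ci.yml (the group/project path and file name are placeholders):

# .gitlab-ci.yml in each similar project: pull in the shared definition
include:
  - project: 'my-group/ci-templates'    # placeholder repository
    ref: main
    file: '/common-pipeline.yml'

# project-specific jobs can live alongside the shared ones
extra-check:
  stage: test
  script:
    - echo "only needed in this project"

The shared jobs in common-pipeline.yml can read per-project CI/CD variables, which take over the role of the parameters previously passed to the TeamCity build.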

Share and manage rc files (or config files) between multiple projects

I'm working on several Node-based projects, and one thing I always have to do is create the configuration files in every project, since they all share a lot of configuration; for example, in all projects I use commitlint, lint-staged, husky, ESLint, nodemon, and TypeScript, among other settings.
How could I share all these settings across projects so that, if I update any of them, the update reaches all projects?
The first thing that occurs to me is to create an npm package with all the configurations and several scripts that copy/update these configuration files into the root directory of the project the user is in, something like
> myscript update
> myscript init
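A minimal sketch of what such a package could look like (all names are hypothetical); its package.json would declare "bin": { "myscript": "bin/myscript.js" }, and the script copies the bundled config files into the consumer's working directory:

// bin/myscript.js - hypothetical CLI shipped inside the config package
const fs = require('fs');
const path = require('path');

// config templates bundled with the package
const templates = path.join(__dirname, '..', 'templates');
for (const file of fs.readdirSync(templates)) {
  // in this sketch, `init` and `update` both just (over)write the files
  fs.copyFileSync(path.join(templates, file), path.join(process.cwd(), file));
  console.log('Wrote ' + file);
}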
Another option would be to consume the configurations programmatically: instead of a .rc file, use a .js file. But this would force me to manage the dependencies in each project and to create a .rc file that loads the configuration from the .js file living in the configuration package.
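For instance, a minimal sketch of the programmatic approach for ESLint (the package name @myorg/shared-config is hypothetical):

// .eslintrc.js in each project
// '@myorg/shared-config' is a hypothetical npm package holding the actual rules
module.exports = require('@myorg/shared-config/eslint');

Bumping the shared package's version in a project then pulls in the updated rules without copying files around.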
Another option is to create a GitHub repository as a template, but if I update this repository, the projects I have created from this template are not updated, right?
What do you think is the best way to do it?
Even though git submodules seem to be discouraged lately, I think they're the most reasonable choice here (assuming all of your projects are git-based): https://git-scm.com/book/en/v2/Git-Tools-Submodules
In your case, you'd have a common repository with the shared configuration, e.g. configuration, and two projects, projectA and projectB. Both of them would have the submodule added:
git submodule add <your_git_repo_server>/configuration
Note, however, that a submodule refers to a particular commit (not a branch or tag: a commit). You always have to take care to keep your dependencies synchronized correctly.
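The day-to-day commands for keeping the pinned commit current look like this (a sketch, using the names above):

git clone --recurse-submodules <your_git_repo_server>/projectA
git submodule update --init                   # after a plain clone
git submodule update --remote configuration   # move the pin to the latest commit
git add configuration
git commit -m "Bump shared configuration"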

Bamboo plan: Compress the artifact after build and uncompress after deployment to server

This is my first time both learning and implementing automated CI/CD pipelines, in Atlassian Bamboo. I have a NodeJS project whose build and deployment plan I configured after much R&D on the net.
In the deployment process, I observed that the deployment takes a lot of time because there are many files to transfer, probably due to node_modules. I would like to compress the artifact generated by the build steps and decompress it on the server side once the transfer is complete.
I tried to find a ZIP task among the built-in tool tasks, but there isn't one. Is this possible some other way? Is doing it via the command line feasible?
I have a little experience with Linux commands.
Any help would be highly appreciated.
In my company we use an Ant build (including Ivy) to prepare, zip, and publish our projects as artifacts. In the deployment we use an SCP task to copy the artifact onto our server and an SSH task to unzip it.
So our whole build is implemented in Ant, and the only thing our Bamboo build does is check out a git repository and run the Ant script.
That workflow is used for a lot of different projects including nodejs, python, java, c++ or pure text file setups, and it works really well.
But a normal script task for zipping should also do the job, and depending on the scale of your projects, Ant may be overkill.
I think it's possible to use Windows/Linux commands to achieve this. You would need to write a task to compress the files; you can use a shell plugin or any other suitable plugin. Once the artifact is sent to the server, you would need a polling batch program to unzip the artifact on the server end.
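A minimal sketch of the two script steps (archive name and paths are placeholders):

# Bamboo script task after the build: compress the build output
tar -czf app.tar.gz dist/ node_modules/ package.json

# on the target server, after the transfer (e.g. in an SSH task):
tar -xzf app.tar.gz -C /var/www/myapp    # placeholder deployment path

Transferring one compressed archive avoids the per-file overhead that makes copying node_modules so slow.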

Creating a custom VSTS-Task

I am developing a VSTS task and I am having problems with references to different modules.
My first question: when building a task, you need to add the VstsTaskSdk. Do I need to do this by copying the module into TaskRoot/ps_modules? Or is there a certain flag when building the task that can do this?
If I need to copy it into the root, how am I going to handle multiple tasks? Copy it into the root of every task? Is there a neater way to do this: store the SDK in one place and copy it somehow?
I have used the https://github.com/Microsoft/vsts-tasks repo for samples and noticed that shared code is available in "Tasks/Common". Where are the manifest files? I would also like to have a common folder and be able to reference it (and copy it into the task package on build); any ideas how?
It seems you are using VSTS DevOps Task SDK. The SDK should be packaged with the task in a ps_modules folder. The ps_modules folder should be in the root of the task folder.
Example layout: Consider the following layout where MyTask is the root folder for the task.
MyTask
│   MyTask.ps1
│   task.json
└───ps_modules
    └───VstsTaskSdk
            [...]
            VstsTaskSdk.psd1
Instead of "Tasks/Common", you can use MSBuild Task as an example. More information about manifest files, please refer to this article.

How does a bin folder that is excluded from a project affect automated builds?

We do automated builds using NAnt and CruiseControl.net. I'm very green when it comes to the process. While looking into some things, I noticed that for most (all?) of the solutions involved in the automated build process, the bin folders are included in the project. Is this a requirement for automated builds? If the bin folder is excluded, will the folder and any files in it need to be copied to the deployment servers manually?
Thanks.
If you are referring to the /bin/debug/ folder under a project, you should not need those checked into your source control. If you have external libraries (log4net.dll for example) they should be checked into source control along with your code, but in a separate folder (named "ThirdParty" or "DLLs" for example.) When CruiseControl.net runs, it should compile any assemblies that have been modified, and copy output to the /bin/debug/ folder in the same way as VisualStudio copies those files on your box.
It is better to include the bin folder in the automated build process, since it contains external DLLs such as AjaxControlToolkit along with internal DLLs.
Here we excluded the Debug folder and user option files (*.suo) from the automated build.
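If you do exclude the build output, the ignore rules (sketched here as .gitignore-style patterns; adapt them to your VCS) would look roughly like:

# ignore compiler output and per-user files; ThirdParty/ stays under source control
bin/
obj/
*.suo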
