CruiseControl with a static website?

I maintain a classic ASP web application, and I'd like to try to automate some of the merging to and from branches using CruiseControl (CruiseControl.NET specifically), but I don't have any use for a build tool. Basically I would want a commit on /trunk to automagically get merged to a bunch of feature branches that would exist as CC projects.
Is this possible/recommended or is there an easier way to do this?
Some parts of the repository do contain VB6 or .NET code, so I would like to be able to automate builds for those parts in the future, but for the time being I would basically just be using it to automate keeping my static website clean.

It is not the normal responsibility of CruiseControl to affect source control, short of tagging commits. You could have another, say release, branch that CruiseControl monitors. The merging can then be done manually to this branch as required - resolving any merge conflicts at point of merge. CruiseControl would then just 'build' these merge commits.
CruiseControl is basically a tool for executing commands in a given order based on a trigger; in your case, a source control trigger.
You can set up CruiseControl to execute MSBuild, or not, as required.
As you are trying to deploy/package a classic ASP site, I guess you only need to copy/zip the source files and deploy them to your web server. This can be achieved like so (a configuration sketch follows the list):
Use NAnt to copy and zip your source files, executed via a nant task
Use curl, executed via an exec task, to send the zip file over HTTP(S)
If required, you can precede this with an msbuild task to build your .NET projects.
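As a rough sketch only (not a drop-in configuration - the paths, URLs, build file and target names are made-up assumptions), the project block in ccnet.config might look something like this:

    <project name="StaticSite">
      <sourcecontrol type="svn">
        <trunkUrl>http://svn.example.com/repo/trunk</trunkUrl>
        <workingDirectory>C:\ccnet\StaticSite</workingDirectory>
      </sourcecontrol>
      <triggers>
        <!-- poll source control every 60 seconds -->
        <intervalTrigger seconds="60" />
      </triggers>
      <tasks>
        <!-- NAnt build file with copy/zip targets (file and target names are illustrative) -->
        <nant>
          <executable>C:\nant\bin\nant.exe</executable>
          <buildFile>package.build</buildFile>
          <targetList>
            <target>zip</target>
          </targetList>
        </nant>
        <!-- push the resulting zip to the web server over HTTPS -->
        <exec>
          <executable>C:\tools\curl.exe</executable>
          <buildArgs>-T site.zip https://deploy.example.com/upload</buildArgs>
        </exec>
      </tasks>
    </project>

If you later need to build the .NET parts, an msbuild task can be slotted in ahead of the nant task.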

Related

How to create a common pipeline in GitLab for many similar projects

We have hundreds of similar projects in GitLab which have the same structure inside.
To build these projects we use one common TeamCity build. We trigger it and pass the project's GitLab URL, along with other parameters, to the build via the API, so the TeamCity build knows exactly which project needs to be fetched/cloned. The TeamCity VCS root accepts the target URL via a parameter.
The question is how to replace the existing TeamCity build with a GitLab pipeline.
I see the general approach is to have the CI/CD configuration file (.gitlab-ci.yml) directly in the project. Since the structure of the projects is the same, duplicating the same CI/CD config file across all projects is not an option.
I'm wondering whether it is possible to create a common pipeline for several projects which can accept the target project URL as a parameter?
You can store the full CI/CD config in a central repository and put a simple .gitlab-ci.yml in all your projects which includes the shared file.
With this approach there is no redundant definition of the jobs.
Still, you can add specific other jobs to specific projects (in the respective .gitlab-ci.yml files, or define variables in a project and use some jobs conditionally) - you can also include multiple other definition files, e.g. if you have several groups of similar projects.
cf. https://docs.gitlab.com/ee/ci/yaml/#include
With the latest GitLab (13.9) there are even more referencing methods possible: https://docs.gitlab.com/ee/ci/yaml/README.html#reference-tags
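A minimal sketch of what that could look like, assuming a shared repository called my-group/ci-templates with the common jobs in templates/common-pipeline.yml (both names, and the variable, are made up for illustration):

    # .gitlab-ci.yml in each individual project
    include:
      - project: 'my-group/ci-templates'        # shared repository (illustrative name)
        ref: main
        file: '/templates/common-pipeline.yml'  # the common job definitions

    variables:
      APP_NAME: "service-a"                     # project-specific parameter used by the shared jobs

    # extra jobs that only this project needs can still be added below
    lint-docs:
      stage: test
      script:
        - echo "runs only in this project"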
As MrTux already pointed out, you can use includes.
You can either use it to include a whole CI file or to include just certain steps. In Having Gitlab Projects calling the same gitlab-ci.yml stored in a central location you can find a detailed explanation with examples of both usages.

Bamboo plan: Compress the artifact after build and uncompress after deployment to server

This is my first time both learning and implementing automated CI/CD pipelines in Atlassian Bamboo. I have a Node.js project whose build and deployment plan I configured after much R&D on the net.
During deployment, I observed that it takes a very long time because a large number of files have to be transferred, probably due to node_modules. I would like to compress the artifact generated by the build steps and decompress it on the server side once the transfer is complete.
I tried to find a ZIP task among the built-in tool tasks, but there isn't one. My question is: is it possible in any other way? Is doing it via the command line feasible?
I have a little experience with Linux commands.
Any help would be highly appreciated.
In my company we use an Ant build (including Ivy) to prepare, zip and publish our projects as artifacts. In the deployment we use an SCP task to copy the artifact onto our server and an SSH task to unzip it.
So our whole build part is implemented in Ant, and the only thing our Bamboo build does is check out a Git repository and run the Ant script.
That workflow is used for a lot of different projects including Node.js, Python, Java, C++ or pure text file setups, and it works really well.
But a normal Script task for zipping should also do the job, and depending on the scale of your projects, Ant may be overkill (a sketch follows below).
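As a rough sketch under assumed paths and file names (not a tested plan), a Script task at the end of the build job could produce the compressed artifact, and an SSH task in the deployment project could unpack it after the SCP step:

    # Script task at the end of the build job
    cd "${bamboo.build.working.directory}"
    # adjust the file list to whatever your build actually produces
    tar -czf app.tar.gz dist/ node_modules/ package.json

    # (share app.tar.gz as the build artifact; the SCP task copies it to the server)

    # SSH task in the deployment environment, run on the target server
    cd /opt/myapp && tar -xzf app.tar.gz && rm app.tar.gz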
I think it's possible to use Windows/Linux commands to achieve your requirement. You would need to write a task to compress the files; you can use the shell plugin or any other suitable plugin. Once the artifact is sent to the server, you would need a polling batch program to unzip the artifact at the server end.

How to update repository with built project?

I’m trying to set up GitLab CI/CD for an old client-side project that makes use of Grunt (https://github.com/yeoman/generator-angular).
Up to now the deployment worked like this:
run ’$ grunt build’ locally which built the project and created files in a ‘dist’ folder in the root of the project
commit changes
changes pulled onto production server
After creating the .gitlab-ci.yml and making a commit, the GitLab CI/CD job passes, but the files in the ‘dist’ folder in the repository are not updated. If I define an artifact, I get the changed files in the download; however, I would prefer the files in the ‘dist’ folder in the repository to be updated so we can carry on with the same workflow, which suits us. Is this achievable?
I don't think committing into your repo from inside a pipeline is a good idea. Version control wouldn't be as clear, and some people have pipelines triggered automatically when their repo is pushed to, which would start a loop of pipelines.
Instead, you might reorganize your environment to use Docker; there are numerous reasons for using Docker in professional and development environments. To name just a few: it would enable you to save the freshly built project into a registry and reuse it whenever needed, exactly in the version you require and with the desired /dist inside. That way you can easily run it in multiple places, scale it, manage it, etc.
If you changed to Docker you wouldn't actually have to do a thing to keep dist persistent - just push the image to the registry after the build is done (a sketch follows).
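For illustration only (the job name, Docker image tags and stage are assumptions), a build-and-push job against GitLab's built-in container registry could look roughly like this:

    # hypothetical job in .gitlab-ci.yml: build the image (with dist/ baked in by your Dockerfile) and push it
    build-image:
      stage: build
      image: docker:24
      services:
        - docker:24-dind
      script:
        - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
        - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
        - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"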
But to actually answer your question:
There is a feature request that has been open for a very long time for exactly the problem you asked about: here. Currently there is no safe and professional way to do it, as GitLab members state. Although you can push changes back, as one of the GitLab members (Kamil Trzciński) suggested:
git push http://gitlab.com/group/project.git HEAD:my-branch
Just put it in the script section of your .gitlab-ci.yml file, for example:
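As a hedged sketch only (the token variable, placeholder username, branch handling and commit message are all assumptions, and this inherits the risks discussed below), such a job could look roughly like this:

    # assumes GITLAB_PUSH_TOKEN is a project access token stored as a masked CI/CD variable
    build:
      stage: build
      script:
        - npm ci
        - grunt build
        - git config user.email "ci@example.com"
        - git config user.name "CI"
        - git add dist/
        # "[skip ci]" keeps this commit from re-triggering the pipeline
        - git commit -m "Update dist [skip ci]" || echo "nothing to commit"
        - git push "https://ci-push:${GITLAB_PUSH_TOKEN}@gitlab.com/group/project.git" "HEAD:${CI_COMMIT_REF_NAME}"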
There are more hacky methods presented there, but be sure to acknowledge the risks that come with them (pipelines become more error-prone and, if configured in the wrong way, they might for example publish confidential information or trigger an infinite pipeline loop, to name a few).
I hope you found this useful.

Build Definition: Copying the build output to the server?

When configuring the build definition, I have the option of copying the build output to the server:
We are using the VSO Host Build Controller and Azure's Continuous Integration build template to release to our development environment after every check-in.
Is there any reason why we need to have this value set? How could it ever be useful?
The Copy build output to the server option attaches the output of the build to the build record as a zip file that can be downloaded later.
In a situation where you don't care about the build output because you have it set up to continuously deploy to Azure (or some other build-based deployment), you would not use this option.
If you however needed to download the output, for example a Windows Store App that you need to publish to the Store manually, then you could use this to get the application.
In VSO you most likely don't have a Drop Server (unless you have invested in Azure heavily) so you have 2 choices:
Put them in source control. This only works in TFVC, not Git, and it also fills up your repo with large files.
Attach them to the build as a Zip.
The second scenario is exactly where you would use this.

Maven - copying properties file from SVN to Linux based App Server machines

We are using Maven and Jenkins for our automated build and deployment needs. Our build engineer has left and it is now up to me (a Java architect) to implement a few remaining pieces. I have tried a lot of things to resolve the issue we are having. The problem statement is -
We have made a separate project in Eclipse to store properties files. The developers check the properties files into SVN whenever they make changes. Now we want Maven, when triggered to do a deploy, to do the following -
1. Take the latest properties files from SVN, from the project used to store them.
2. Copy them onto the /conf/ folder of the Linux-based JBoss app server.
3. Carry on with its deployment task.
We would like a solution to points 1 and 2 above.
I don't know the exact answer, but it is quite doable. A quick Google search did not turn up any SVN-related plugin to retrieve properties, but you can always write your own Maven plugin to do that task. For example, if you want to retrieve the properties files from an SVN location to the local file system, just write a simple Maven plugin [1] using SVNKit [2].
For point 2, you can use the wagon-maven-plugin [3] to transfer any artifact to a destination. Given that it supports SCP, I would go with that (just like doing an scp to a remote Linux machine); a rough configuration sketch follows.
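A minimal sketch of such a configuration, where the hostnames, paths, versions and the serverId are illustrative assumptions (the serverId must match a credentials entry in your settings.xml):

    <!-- in pom.xml: copy the properties files to the JBoss conf folder before deployment -->
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>wagon-maven-plugin</artifactId>
      <version>2.0.2</version>
      <dependencies>
        <!-- SCP provider for Wagon -->
        <dependency>
          <groupId>org.apache.maven.wagon</groupId>
          <artifactId>wagon-ssh</artifactId>
          <version>3.5.3</version>
        </dependency>
      </dependencies>
      <executions>
        <execution>
          <id>copy-properties</id>
          <phase>prepare-package</phase>
          <goals>
            <goal>upload</goal>
          </goals>
          <configuration>
            <fromDir>${project.basedir}/config</fromDir>
            <includes>*.properties</includes>
            <url>scp://appserver.example.com</url>
            <toDir>/opt/jboss/server/default/conf</toDir>
            <serverId>jboss-app-server</serverId>
          </configuration>
        </execution>
      </executions>
    </plugin>

The username/password (or SSH key) for jboss-app-server would go in the servers section of settings.xml.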
HTH.
[1] http://maven.apache.org/guides/plugin/guide-java-plugin-development.html
[2] http://svnkit.com/
[3] http://mojo.codehaus.org/wagon-maven-plugin/usage.html
