I'd like to make the CodePipeline build number (CODEBUILD_BUILD_NUMBER) available to my Node code that is being deployed. Currently there are only two stages in the pipeline: pull from Bitbucket, then deploy to Elastic Beanstalk, so I don't know how this would work.
Alternatively, if I could get the most recent commit number available to my Node.js code, that would be OK.
This demonstrates how to specify an artifact name that is created at build time: example.
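More directly to the question: if you add a CodeBuild stage between the source and deploy stages, CodeBuild exposes CODEBUILD_BUILD_NUMBER and CODEBUILD_RESOLVED_SOURCE_VERSION (the commit ID) as environment variables, and the buildspec can write them into a file your Node code reads. A minimal sketch, where the build-info.json file name is my own assumption:

```yaml
# Minimal buildspec.yml sketch; build-info.json is an assumed file name.
version: 0.2
phases:
  build:
    commands:
      # Both variables are standard CodeBuild environment variables.
      - echo "{\"buildNumber\":\"$CODEBUILD_BUILD_NUMBER\",\"commit\":\"$CODEBUILD_RESOLVED_SOURCE_VERSION\"}" > build-info.json
artifacts:
  files:
    - '**/*'
```

The Node code can then read the file at runtime, e.g. require('./build-info.json'), since it is bundled into the artifact that Elastic Beanstalk deploys.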
I have a service running on ECS Fargate, which gets deployed by CodePipeline / CodeBuild.
Terraform:
Defines ECS Fargate Task
Defines CodePipeline
Defines CodeBuild Step
CodePipeline:
Fetches changes from GitHub
Runs CodeBuild
Deploys to ECS using the CodeBuild artifacts
CodeBuild:
Builds a new Docker image
Pushes it to an ECR repo
Returns an imagedefinitions.json as an artifact
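For reference, imagedefinitions.json is the small JSON file the ECS deploy action expects, mapping each container name to the image URI that was just pushed; the container name and URI below are assumptions:

```json
[
  {
    "name": "app",
    "imageUri": "123456789012.dkr.ecr.us-east-1.amazonaws.com/app:latest"
  }
]
```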
The issue with this is that after a commit to GitHub, CodePipeline deploys a new version (which also creates a new revision of the task definition), which in turn causes the Terraform state to be out of sync (the task definition is a required parameter of aws_ecs_service).
I see two possible solutions:
Introduce a shared Terraform state and make CodeBuild / CodePipeline write to it (this seems overly complicated for a simple setup).
Use ignore_changes to ignore any changes to the task definition (I guess that means ignore_changes has to be commented out whenever the task definition is deliberately updated through Terraform, which will be forgotten at some point); see the sketch below.
Are there any better solutions? If not, which would be the preferred way to do this?
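If it helps, a minimal sketch of option 2, with assumed resource names:

```hcl
# Sketch of option 2; service, cluster, and task definition names are assumptions.
resource "aws_ecs_service" "app" {
  name            = "app"
  cluster         = aws_ecs_cluster.main.id
  task_definition = aws_ecs_task_definition.app.arn
  desired_count   = 1
  launch_type     = "FARGATE"

  lifecycle {
    # CodePipeline deployments register new task definition revisions;
    # without this, every plan would try to roll the service back.
    ignore_changes = [task_definition]
  }
}
```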
We have two servers in our organisation.
1) a server with GitLab
2) a build server
I would like automated builds to happen on the second machine (the build server) for the source code hosted on the GitLab server.
How can I achieve this using GitLab?
Thanks,
siva
If you are moving from a "pull" continuous integration system (e.g. a kind of crontab that regularly checks whether the source code in the versioning system has changed and starts the configure/build/test/deploy stages if it has), then know that GitLab has a much better way of doing this.
GitLab's approach is to configure a "push" system: every time the code is updated (in any branch) on the Git repository, the script defined in your .gitlab-ci.yml is read to see whether continuous integration jobs have to be launched. Jobs are sent to your configured GitLab runners; the runners are defined on your build server(s) and pick up jobs as they arrive.
What to do is also described in the .gitlab-ci.yml.
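A minimal .gitlab-ci.yml sketch; the stage names and build commands are assumptions about your project:

```yaml
# Minimal .gitlab-ci.yml sketch; stages and commands are assumptions.
stages:
  - build
  - test

build_job:
  stage: build
  script:
    - ./configure
    - make

test_job:
  stage: test
  script:
    - make test
```

Any runner registered to the project on your build server will pick these jobs up on each push.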
Here is a list of documentation to start learning about GitLab CI:
the official documentation can be helpful
A general introduction to GitLab CI using Docker can be found in this blog article (the first slides are great). If your build server or your intended build is on Linux, I would recommend using the "docker executor" (i.e. GitLab runners execute each job inside a Docker container on your build server). It is easy and quick to set up.
Hope this helps you get started...
Is it possible to setup continuous delivery for a simple html page in under 1 hour?
Suppose I have a hello world index.html page hosted by npm serve, a Dockerfile to build the image, and an image.sh script using docker build. This is in a GitHub repo.
I want to be able to check-in a change to the index.html file and see it on my website immediately.
Can this be done in under 1 hour, on either AWS or Google Cloud? What are the steps?
To answer your question: is it possible in 1 hour? Yes.
Using only AWS,
Services to be used:
AWS CodePipeline - triggered by GitHub webhooks; sends the source files to AWS CodeBuild
AWS CodeBuild - takes the source files from CodePipeline, builds your application, and ships the build to S3, Heroku, Elastic Beanstalk, or any other service you desire
The Steps
Create an AWS CodePipeline
Attach your source (GitHub) in your pipeline (each commit will trigger the pipeline to take the new commit as its source and build it in CodeBuild)
Using your custom Docker build environment, CodeBuild reads a buildspec.yml file for the steps in your build process. Use it to build the newly committed source files and deploy your app(s) using the AWS CLI.
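For the hello-world page, the build step can be as small as syncing the file to an S3 bucket. A minimal buildspec.yml sketch; the bucket name is an assumption:

```yaml
# Minimal buildspec.yml sketch; the bucket name is an assumption.
version: 0.2
phases:
  build:
    commands:
      # Push the page to an S3 bucket configured for static website hosting.
      - aws s3 sync . s3://my-hello-world-site --exclude "*" --include "index.html"
```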
Good Luck.
I think I would start by creating a web-enabled script that acts as a GitHub commit hook, probably in Node on an AWS instance, which would then trigger the whole process of cleaning up (deleting) the old AWS instance and installing a new one with the contents of your repository.
The exact method will be largely dependent on how your whole stack is set up.
We have a perfectly running GitLab + GitLab CI installation. We are now looking at how to do cross-project builds. Our project is divided into several repositories and everything is joined during the build process via Composer.
What I would like to achieve is this: when you commit to any of those repositories, a build of the main repository is triggered. I was trying to achieve this via webhooks; unfortunately, I need a lot of information about the commit from the main repository that I don't have.
Any idea how to do it?
I updated the gitlab-ci code a little bit: https://github.com/gitlabhq/gitlab-ci/commit/7c7066b0d5a35097a04bb31848d6b622195940ed
I can now call the API.
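For anyone reading this later: current GitLab versions support this without patching, via the pipeline trigger API. A hedged sketch of a job in a sub-repository's .gitlab-ci.yml; the host, project ID (42), token variable, and ref are all assumptions:

```yaml
# Sketch: trigger the main project's pipeline after this repo builds.
# Host, project ID, token variable, and ref are assumptions.
trigger_main:
  stage: deploy
  script:
    - 'curl -X POST -F "token=$MAIN_PROJECT_TRIGGER_TOKEN" -F "ref=master" "https://gitlab.example.com/api/v4/projects/42/trigger/pipeline"'
```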
I'm using Codeship to deploy a Node.js project to Heroku. I'm doing it using this tutorial. I have set up a Git repository with the Node.js project in Bitbucket.
In one step it asks to hook Codeship up with Bitbucket. I have done that successfully, but this message keeps on coming up without going to the next step. Please help me out with this.
When you push a new commit to your Bitbucket repository, this will trigger your first build and the message will disappear.
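If you have nothing to change yet, an empty commit is enough to fire the webhook; Git supports this directly (the branch name is an assumption):

```sh
# Push an empty commit just to trigger the first Codeship build.
git commit --allow-empty -m "Trigger first build"
git push origin master
```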