I have a vanilla JavaScript application that I host on GitLab Pages. Recently, I have been making changes and bug fixes that break the site, and I don't notice until I have already pushed the changes.
In an effort to reduce user exposure to bugs, I would like to publish two sites in separate folders:
public/ : master branch; the official website
public/staging/ : staging branch; the nightly build
I would like for these to correspond to two different branches: master and staging
Reading GitLab CI for GitLab Pages, it sounds like this is not even possible. I'm hoping I'm reading this wrong.
default:
  image: node:latest

test:
  stage: test
  script:
    - npm install
    - node test.js
  only:
    - staging
    - master

staging:
  stage: deploy
  environment: staging
  script:
    - mkdir -p public/staging
    - cp -r www public/staging
  artifacts:
    paths:
      - public
  only:
    - staging

pages:
  stage: deploy
  environment: production
  script:
    - mkdir -p public
    - cp -r www public
  artifacts:
    paths:
      - public
  only:
    - master
Is this possible? Is it possible to deploy two different folders from two different branches?
Interestingly, it is possible to publish from any branch, just not from any job.
To do this, I needed to make two changes:
I need to know the current state of the published data
I need to change directory based on the current branch
GitLab has the ability to cache folders. Generally this is used to speed up builds by caching downloaded dependencies. There is no reason I could not use it to store the public folder. This way, when I make changes to staging, the state of the root application is remembered:
cache:
  paths:
    - public
The next trick is to publish pages to the appropriate folder, depending on the current branch being built. To do this, we can look at the GitLab CI/CD environment variables; in particular:
CI_COMMIT_REF_SLUG: The current branch
CI_DEFAULT_BRANCH: the default branch (master)
Knowing these two values, we can do a bit of bash to determine the correct place to write the content to.
pages:
  stage: deploy
  script:
    - dir="$CI_COMMIT_REF_SLUG"
    - if [ "$CI_COMMIT_REF_SLUG" == "$CI_DEFAULT_BRANCH" ]; then dir=""; fi;
    - dir="public/$dir"
    - echo "Deploying to $dir"
    - mkdir -p "$dir"
    - cp -r www "$dir"
  artifacts:
    paths:
      - public
  only:
    - staging
    - master
Don't forget to limit pages to only staging and master.
WARNING
I'm not satisfied with this.
I think it would be better to maintain the cache in a completely different folder and copy it in at a later stage, completely rewriting the public folder each time.
The current solution will accumulate cruft over time, but the basic idea is sound.
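A minimal sketch of that alternative, assuming a cached folder I'm calling site-cache (my own name) that keeps one sub-folder per branch and from which public is rebuilt on every run (staging is hard-coded to match the setup above):

cache:
  paths:
    - site-cache

pages:
  stage: deploy
  script:
    # keep one sub-folder per branch in the cache and replace it wholesale
    - rm -rf "site-cache/$CI_COMMIT_REF_SLUG"
    - mkdir -p "site-cache/$CI_COMMIT_REF_SLUG"
    - cp -r www/. "site-cache/$CI_COMMIT_REF_SLUG"
    # rebuild public from scratch on every run so no cruft accumulates
    - rm -rf public
    - mkdir -p public
    # the default branch goes to the site root, every other branch to a sub-folder
    - if [ -d "site-cache/$CI_DEFAULT_BRANCH" ]; then cp -r "site-cache/$CI_DEFAULT_BRANCH/." public; fi
    - if [ -d site-cache/staging ]; then mkdir -p public/staging && cp -r site-cache/staging/. public/staging; fi
  artifacts:
    paths:
      - public
  only:
    - staging
    - master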
You can only publish changes to GitLab Pages through your master branch, just as you describe. The only thing GitLab Pages does, though, is publish the files that end up in the public folder of the job called pages. These files can be whatever you want, as long as you manage to get them into that folder within the job.
You could try something like this:
pages:
  ...
  script:
    - mkdir -p public
    - cp -r www public
    - git checkout origin/staging
    - mkdir -p public/staging
    - cp -r www public/staging
I haven't tested this, so please let me know if it doesn't work!
When you run a GitLab job, it usually has all of the git history of your repo. There are settings that change this though, both in git and in GitLab, so you have to make sure that the pages job always gets the full git history. If you have a folder that hasn't been added to git, like public, git should not change anything in it when you check out another branch.
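A hedged sketch of one way to make sure the full history and the staging ref are there, assuming a shallow clone is the reason they are missing (GIT_DEPTH: 0 disables shallow cloning for the job):

pages:
  variables:
    GIT_DEPTH: "0"   # ask the runner for the full history instead of a shallow clone
  script:
    # make sure the staging branch is available locally as origin/staging
    - git fetch origin +refs/heads/staging:refs/remotes/origin/staging
    - mkdir -p public
    - cp -r www public
    - git checkout origin/staging
    - mkdir -p public/staging
    - cp -r www public/staging
  artifacts:
    paths:
      - public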
I think you should also be able to run the GitLab Pages job on a schedule, so that it runs even if only the staging branch has been updated and not the master branch.
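If you go the schedule route, a minimal sketch would be to also allow the pages job to run for scheduled pipelines (the schedule itself is created under CI/CD > Schedules in the GitLab UI):

pages:
  only:
    - master
    - schedules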
Related
I have a CI/CD job that makes an artifact and saves it in the public folder.
I need to copy some files from this artifact to another project.
What script can I use, or what is the best way to do it?
Assuming you want to copy files into another GitLab project:
Once an artifact is built in a job in one stage (i.e. build), it will be available in all jobs of the next stage (i.e. deploy).
build-artifact:
  stage: build
  script:
    - echo "build artifact"
  artifacts:
    name: "public"
    paths:
      - "public/"

deploy_artifact:
  stage: deploy
  script:
    - cd public/ # here are your artifact files
    - ls -l
    - cd ..
    # now implement your copy strategy,
    # for example into another git repo:
    - git clone https://myusername:mypassword@myrepo.gitlab.com/mygroup/another_project.git
    - cd another_project
    - cp ../public/source_file ./target_file
    - git add target_file
    - git commit -m "added generated file"
    - git push
If your destination is not a git repository, you can simply replace this with an scp directly to the server, or any other copy strategy, directly in the script.
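For example, a hedged sketch of the scp variant; SSH_PRIVATE_KEY is an assumed CI/CD variable, and deploy@example.com:/var/www/target is a placeholder destination:

deploy_artifact:
  stage: deploy
  before_script:
    # assumes the job image already has an ssh client installed
    - eval "$(ssh-agent -s)"
    - echo "$SSH_PRIVATE_KEY" | ssh-add -
    - mkdir -p ~/.ssh
    - ssh-keyscan example.com >> ~/.ssh/known_hosts
  script:
    - scp -r public/ deploy@example.com:/var/www/target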
Another way to do it is to fetch the last artifact of project A inside the CI of project B.
Please see https://docs.gitlab.com/ee/ci/pipelines/job_artifacts.html#retrieve-job-artifacts-for-other-projects
If you trigger downstream pipelines, just use needs on the job that pushes the artifacts, and they will be passed to the downstream pipeline.
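A minimal sketch of the cross-project option, using the job artifacts API from the link above; PROJECT_A_ID and API_TOKEN are assumed CI/CD variables you define yourself:

get_artifact:
  stage: build
  script:
    # download the latest artifact of the build-artifact job on master from project A
    - 'curl --location --output artifacts.zip --header "PRIVATE-TOKEN: $API_TOKEN" "https://gitlab.com/api/v4/projects/$PROJECT_A_ID/jobs/artifacts/master/download?job=build-artifact"'
    - unzip artifacts.zip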
During deployment, the GitLab runner checks out the codebase including the .git folder; I presume this is the default behaviour?
Without manually cleaning the .git folder in our deployment script, is there an option to use git archive instead?
eg.
deploy-dev:
  when: manual
  stage: deploy
  script:
    - bash deploy.sh $CI_COMMIT_REF_SLUG $CI_BUILD_TOKEN
  include-git: false
I have tried searching on google but couldn't find any relevant results (maybe due to bad search keywords)
I want to deploy a static page from a GitLab repo with plain HTML/CSS (actually SCSS). As far as I've learned, a static page needs at least a .gitlab-ci.yml file and a /public folder. At a minimum, .gitlab-ci.yml will look something like this (an example from the official docs):
pages:
  stage: deploy
  script:
    - mkdir .public
    - cp -r * .public
    - mv .public public
  artifacts:
    paths:
      - public
  only:
    - master
My question is about the script lines.
(I assume the script below will create a hidden folder named .public, copy all the files into it, then rename it from .public to public. Please correct me if I'm wrong.)
script:
  - mkdir .public
  - cp -r * .public
  - mv .public public
To me, it looks like a Linux shell script. The GitLab docs also confirm that it's run by the runner. But the problem is, how do I know which shell scripts are available in GitLab? And is it possible to make my own?
I would like to make 2 folders: src and public. The GitLab CI will run the script and compile SCSS from src then move it to public.
I'm using gitlab.com by the way.
So, a few things to consider. Each job in GitLab is run in a container. Generally you specify which image you want to use. Pages is a special case though, so you don't have to care about the container image there.
The pages job will populate your public folder, but you can alter the .gitlab-ci.yml file and add steps.
This would build an app using node:
build_stuff:
  stage: build
  image: node:11
  before_script:
    - npm install
  script:
    - npm run build
  artifacts:
    paths:
      - build

pages:
  stage: deploy
  script:
    - mkdir .public
    - cp -r build/ .public
    - mv .public public
  artifacts:
    paths:
      - public
  only:
    - master
Things to note: the first job runs the build steps to generate all the assets for your output folder. It then stores anything declared in the artifacts block, in this case the build folder, and passes it on to the next job. Adjust this step according to what you need to build your app.
The only thing I altered in the second step is that you copy the contents of the build folder instead of the entire repo into the .public folder. Adjust this to your needs as well.
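Since the question was about compiling SCSS from src into public, a hedged sketch of that adjustment might look like this (the sass package and the file layout are assumptions):

pages:
  stage: deploy
  image: node:latest
  script:
    - npm install -g sass        # Dart Sass command-line compiler
    - mkdir -p public
    - cp -r src/. public         # copy HTML and other static assets
    - sass src/:public/          # compile every .scss in src/ to .css in public/
  artifacts:
    paths:
      - public
  only:
    - master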
As for shell scripts, there are none present except for the ones you bring to the repository. The default runner supports Bash so you can execute bash commands just as you would in your terminal.
If you create the file foo.sh in your repo, you can run it with bash foo.sh, or directly with ./foo.sh if it's executable (remember to chmod +x it before pushing).
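For example, a small sketch of calling your own script from a job; build.sh is a hypothetical file committed at the root of the repository:

pages:
  stage: deploy
  script:
    - bash ./build.sh     # runs the script even without the executable bit
    # - ./build.sh        # also works, if the file was committed as executable (chmod +x)
  artifacts:
    paths:
      - public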
There are no "shell-scripts installed in Gitlab". Gitlab supports several shells and the script part in your example are just pure bash commands. Since you are most probably using the default docker runner you can execute bash commands from the script part, run scripts in other languages that are in your repo, install packages on the docker container and even prepare and run your own docker images.
I have the following .gitlab-ci.yml file:
stages:
  - build
  - test

compile:
  stage: build
  script:
    - stuff_happening

test_1:
  stage: test
  script:
    - do_something_1
  artifacts:
    when: on_failure
    name: "$CI_COMMIT_REF_NAME"
    paths:
      - /root/dir
When test_1 is executed, it creates a folder dir inside /root.
I want to add it to the artifacts, but I get an error saying:
no matching files. If I add - ls /root to the job, I can see the folder.
There is an open question in the support forum, but still no response there.
Can anyone help? Thanks
The issue here is that you are trying to upload files that are outside of the project's scope.
From the official documentation on artifacts:
You can only use paths that are within the project workspace.
The reason is obvious; if a runner were allowed to upload anything outside its workspace, it would cause serious security issues.
However, if you really want to upload something from outside the runner's workspace, you can try copying the files into the project's working directory and uploading them from there.
The runners are usually registered with the user and group gitlab-runner:gitlab-runner, so copying files from /root will likely require sudo privileges.
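A hedged sketch of that workaround, copying the folder into the project workspace (CI_PROJECT_DIR) in after_script so it is there even when the test fails:

test_1:
  stage: test
  script:
    - do_something_1
  after_script:
    # runs even if do_something_1 failed; may need sudo depending on the runner's user
    - cp -r /root/dir "$CI_PROJECT_DIR/dir"
  artifacts:
    when: on_failure
    name: "$CI_COMMIT_REF_NAME"
    paths:
      - dir/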
According to https://docs.gitlab.com/ee/ci/yaml/README.html#artifacts
You can try adding a trailing slash to your path:
paths:
  - /root/dir/
Examples from the link above:
Send all files in binaries and .config:
artifacts:
  paths:
    - binaries/
    - .config
I have a project hosted on GitLab. The project website is in the pages branch and is a Jekyll-based site.
My .gitlab-ci.yml looks like this:
pages:
  script:
    - gem install jekyll
    - jekyll build -d public/
  artifacts:
    paths:
      - public
  only:
    - pages

image: node:latest

cache:
  paths:
    - node_modules/

before_script:
  - npm install -g gulp-cli
  - npm install

test:
  script:
    - gulp test
When I pushed this configuration file to master, the pipeline executed only the test job and not the pages job. I thought maybe pushing to master didn't invoke this job because only specifies the pages branch. Then I tried pushing to the pages branch, but to no avail.
How can I trigger the pages job?
You're right to assume that the only constraint makes the job run only on the refs or branches specified in the only clause.
See https://docs.gitlab.com/ce/ci/yaml/README.html#only-and-except
It could be that there's a conflict because the branch and the job have the same name. Could you try renaming the job to something different just to test?
I'd try a couple of things.
First, I'd put in this stages snippet at the top of the YML:
stages:
  - test
  - pages
This explicitly tells the CI to run the pages stage after the test stage is successful.
If that doesn't work, then I'd remove the only tag and see what happens.
Complementing @rex's answer:
You can do either:
pages:
  script:
    - gem install jekyll
    - jekyll build -d public/
  artifacts:
    paths:
      - public
Which will deploy your site regardless of the branch name, or:
pages:
  script:
    - gem install jekyll
    - jekyll build -d public/
  artifacts:
    paths:
      - public
  only:
    - master # or whatever branch you want to deploy Pages from
Which will deploy Pages from master.
Please let me know if this helps :)