For example:
source repo on GitHub, branch master -> destination repo on GitLab, branch main
I'm new to GitHub Actions.
How should I design an entrypoint.sh script or a new pipeline? An example would be appreciated.
Note: sorry, my English is not strong.
Workflows: git-repo-sync.yml
name: build
on:
- push
- delete
jobs:
sync:
runs-on: ubuntu-latest
name: Git Repo Sync
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 0
- uses: chayanbank/git-sync@main
with:
target-url: 'https://gitlab.com/chayanbank/test-react-web.git'
target-branch: 'main'
target-username: 'githubsync'
target-token: ${{ secrets.GITHUBSYNC }}
action.yml
name: 'Git Repo Sync'
description: 'Git Repo Sync enables you to synchronize code to other code management platforms, such as GitLab, Gitee, etc.'
branding:
icon: upload-cloud
color: gray-dark
inputs:
target-url:
description: 'Target Repo URL'
required: true
target-branch:
description: 'Target Repo Branch'
required: true
target-username:
description: 'Target Repo Username'
required: true
target-token:
description: 'Target Token'
required: true
runs:
using: "composite"
steps:
- run: ${{ github.action_path }}/entrypoint.sh
shell: bash
env:
INPUT_TARGET_URL: ${{ inputs.target-url }}
INPUT_TARGET_BRANCH: ${{ inputs.target-branch }}
INPUT_TARGET_USERNAME: ${{ inputs.target-username }}
INPUT_TARGET_TOKEN: ${{ inputs.target-token }}
GITHUB_EVENT_REF: ${{ github.event.ref }}
entrypoint.sh
#!/bin/sh
set -eu

# Add the target repo as a remote, with the credentials embedded in the URL.
# ${INPUT_TARGET_URL#https://} strips the scheme so user:token can be inserted.
git remote add target "https://${INPUT_TARGET_USERNAME}:${INPUT_TARGET_TOKEN}@${INPUT_TARGET_URL#https://}"

case "${GITHUB_EVENT_NAME}" in
push)
  # Force-push every branch and tag to the target.
  git push -f --all target
  git push -f --tags target
  ;;
delete)
  # Mirror the branch/tag deletion on the target.
  git push -d target "${GITHUB_EVENT_REF}"
  ;;
*)
  # Other events: nothing to do.
  ;;
esac
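As an aside, `${INPUT_TARGET_URL#https://}` in the remote URL is POSIX parameter expansion: it strips the `https://` prefix so the credentials can be spliced in after the scheme. A minimal sketch of that expansion (the variable values here are illustrative, not from the source):

```shell
url="https://gitlab.com/example/repo.git"   # illustrative target URL
user="syncbot"                              # illustrative username
token="s3cr3t"                              # illustrative token
# ${url#https://} drops the leading "https://", leaving host/path,
# so user:token can be inserted between the scheme and the host.
authed="https://${user}:${token}@${url#https://}"
echo "$authed"   # prints https://syncbot:s3cr3t@gitlab.com/example/repo.git
```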
Related
I was trying to check out only the repository that triggers the pipeline.
resources:
repositories:
- repository: repo2
type: git
name: branching/repo2
ref: dev
trigger:
- dev
- repository: repo1
type: git
name: branching/repo1
ref: main
trigger:
- main
pool:
vmImage: ubuntu-latest
steps:
- script: echo "$(Build.Repository.Name)"
- ${{ if in(variables['Build.Repository.Name'], 'repo1') }}:
- checkout: repo1
- ${{ if in(variables['Build.Repository.Name'], 'repo2') }}:
- checkout: repo2
But each time, this only checks out the source repository.
When the pipeline is triggered from repo1, I tried to check out repo1, and when it is triggered from repo2, I tried to check out repo2.
I don't want to keep changing the name of the checkout repository in the pipeline file.
Is there another way to have the checkout task choose the triggering repository automatically?
Checking out multiple repos involves different ways of calling checkout:
Check out the triggering repo with: checkout: self
Check out the other repos with: checkout: <reponame>
For more info and options on using checkout, see:
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops#specify-multiple-repositories
Edit
Assuming your trigger repo is repo1 or repo2, your YAML example would look like:
resources:
repositories:
- repository: repo2
type: git
name: branching/repo2
ref: dev
trigger:
- dev
- repository: repo1
type: git
name: branching/repo1
ref: main
trigger:
- main
pool:
vmImage: ubuntu-latest
steps:
- script: echo "Triggered repo: $(Build.Repository.Name)"
- checkout: self
- ${{ if in(variables['Build.Repository.Name'], 'repo1') }}:
- checkout: repo2
- ${{ if in(variables['Build.Repository.Name'], 'repo2') }}:
- checkout: repo1
I know this problem has been discussed many times, but every setup is different...
I have an existing pipeline in Azure DevOps, in which a YAML file is executed. This YAML file resides in a repository in Azure Repos Git.
The content of the YAML file is the following:
parameters:
- name: 'dockerfilePath'
default: '$(Build.SourcesDirectory)/.pipelines/dbdump/Dockerfile'
type: string
- name: 'imageRepository'
default: 'databasedump'
type: string
- name: 'dockerRegistryServiceConnection'
default: 'ServiceConnectionIoTContainerRegistry'
type: string
steps:
- checkout: self
- task: Docker@2
displayName: Login to ACR
inputs:
command: login
containerRegistry: ${{ parameters.dockerRegistryServiceConnection }}
- task: Docker@2
displayName: Build and push an image to container registry
inputs:
# Only build on PR, and build and push docker image otherwise
${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
command: build
${{ else }}:
command: buildAndPush
repository: ${{ parameters.imageRepository }}
dockerfile: ${{ parameters.dockerfilePath }}
buildContext: $(Build.Repository.LocalPath)
containerRegistry: ${{ parameters.dockerRegistryServiceConnection }}
tags: '$(Build.BuildId)'
The YAML file is placed in the ".pipelines" directory; its subdirectory "dbdump" contains two files, "Dockerfile" and "fakedev.dump.sql", i.e. I have the following:
.pipelines/
build_and_push_docker.yml
dbdump/
Dockerfile
fakedev.dump.sql
And the content of the Dockerfile is the following:
FROM mariadb:10.4.26
ENV MYSQL_ROOT_PASSWORD <some_password>
ENV MARIADB_DATABASE <some_database>
COPY fakedev.dump.sql /docker-entrypoint-initdb.d/
EXPOSE 3306
And when I execute the pipeline I get:
COPY failed: file not found in build context or excluded by .dockerignore: stat fakedev.dump.sql: file does not exist
Obviously, during the build the file is searched for relative to the directory where the build is executed. I tried changing the build context, but none of
buildContext: '.'
buildContext: $(Build.Repository.LocalPath)
buildContext: $(Build.SourcesDirectory)
was helpful.
So what do I have to change in order for the SQL file to be found during the build?
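For reference (this is an assumption about the cause, not something stated in the post): `COPY fakedev.dump.sql` is resolved relative to the build context, so with the context at the repository root Docker looks for the file there rather than in `.pipelines/dbdump`. A sketch of pointing the context at the Dockerfile's own directory:

```yaml
- task: Docker@2
  displayName: Build and push an image to container registry
  inputs:
    command: buildAndPush
    repository: ${{ parameters.imageRepository }}
    dockerfile: ${{ parameters.dockerfilePath }}
    # COPY paths resolve relative to the build context, so point it
    # at the directory that actually contains fakedev.dump.sql
    buildContext: '$(Build.SourcesDirectory)/.pipelines/dbdump'
    containerRegistry: ${{ parameters.dockerRegistryServiceConnection }}
```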
I have files stored in an AWS S3 bucket. I would like to use GitHub Actions to download those files and put them into my GitHub repository. I am able to download the files, but I cannot seem to get them into my repository. Here are the attempts I have made.
steps:
- name: Download from S3
run: |
aws s3 cp --recursive s3://aws-bucket myDirectoryIWantTheFilesIn
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
AWS_DEFAULT_REGION: 'us-east-1'
I have also tried with the aws-s3-github-action:
- name: copy sitemaps
uses: keithweaver/aws-s3-github-action@v1.0.0
with:
command: cp
source: awsS3Bucket
destination: myDirectoryIWantTheFilesIn
aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws_region: us-east-1
flags: --recursive
I needed to include the checkout action and then commit the changes.
name: Fetch data.
# Controls when the workflow will run
on:
schedule:
# Runs every day at 06:00 UTC (see https://crontab.guru)
- cron: '00 6 * * *'
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "build"
build:
# The type of runner that the job will run on
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: keithweaver/aws-s3-github-action@v1.0.0 # supports the --recursive flag
name: cp folder
with:
command: cp
source: myBucket
destination: myDestination
aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws_region: us-east-1
flags: --recursive
- name: Commit changes
run: |
git config --local user.email "action@github.com"
git config --local user.name "GitHub Action"
git add .
git diff-index --quiet HEAD || git commit -m "MyCommitMessage" -a
git push origin master
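The `git diff-index --quiet HEAD ||` guard in the commit step only commits when the tree actually changed, so scheduled runs with no new files don't fail. A local sketch of the same pattern (the repo and file here are throwaway examples):

```shell
set -eu
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "test@example.com"
git config user.name "Test"
echo one > file.txt
git add . && git commit -qm "initial"
# Clean tree: diff-index exits 0, so the commit after || is skipped
git diff-index --quiet HEAD || git commit -qam "update"
echo two > file.txt
# Dirty tree: diff-index exits non-zero, so the commit runs
git diff-index --quiet HEAD || git commit -qam "update"
git rev-list --count HEAD   # prints 2
```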
I'm using docker/build-push-action@v2 together with an automated version bump action. The job runs fine until the build-and-push step, where I get this error:
/usr/bin/docker buildx build --build-arg NPM_TOKEN=*** --iidfile /tmp/docker-build-push-MO1ELt/iidfile --tag registry.digitalocean.com/xxx/xxx: --metadata-file /tmp/docker-build-push-MO1ELt/metadata-file --push ./xxx
error: invalid tag "registry.digitalocean.com/xxx/xxx:": invalid reference format
Error: buildx failed with: error: invalid tag "registry.digitalocean.com/xxx/xxx:": invalid reference format
Am I adding the tag correctly, or is something wrong with the version bump package?
My GitHub Actions workflow file:
name: deploy-auth
on:
push:
branches:
- main
paths:
- 'xxx/**'
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout main
uses: actions/checkout@v2
- name: Automated Version Bump
uses: 'phips28/gh-action-bump-version@master'
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
default: fix
tag-prefix: 'v'
- name: Output Step
env:
NEW_TAG: ${{ steps.version-bump.outputs.newTag }}
run: echo "new tag $NEW_TAG"
- name: Install doctl
uses: digitalocean/action-doctl@v2
with:
token: ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }}
- name: Log in to DigitalOcean Container Registry with short-lived credentials
run: doctl registry login --expiry-seconds 600
- name: Build and push Docker image
uses: docker/build-push-action@v2
with:
context: ./xxx
push: true
tags: registry.digitalocean.com/xxx/xxx:${{ steps.version-bump.outputs.newTag }}
build-args: |
NPM_TOKEN=${{ secrets.NPM_TOKEN }}
There's no step with the id version-bump in that workflow, which means ${{ steps.version-bump.outputs.newTag }} is empty. You can set an id field on the step to define it:
- name: Automated Version Bump
uses: 'phips28/gh-action-bump-version@master'
id: version-bump
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
default: fix
tag-prefix: 'v'
How can I do it? I added this one, but I'm not sure if it is right:
steps:
- checkout: self
fetchDepth: 1
clean: true
resources:
repositories:
- repository: MyRepo
type: git
name: Myrepo
ref: master
steps:
- checkout: MyRepo
This should work.