In my Azure repo for my function app, I included a submodule that is cloned from another Azure Repos repository. I am trying to enable a CI/CD pipeline for this function; however, when I commit a change to the submodule's original Azure repo, it does not trigger a new build and deployment of the function app. Is there a way to enable CI/CD for an Azure repo submodule change?
For this issue, you need to enable the Checkout submodules option in the Advanced section of the Get sources step.
You can refer to this document for details.
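If the pipeline is defined in YAML rather than in the classic editor, the equivalent setting (a minimal sketch, assuming a single self-checkout step) is the submodules option on the checkout step:
steps:
- checkout: self
  submodules: true   # use 'recursive' if the submodules contain submodules of their own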
I have a few CI and CD pipelines in an Azure DevOps project, in which the CI pipelines connect to a GitHub Enterprise repository to fetch the code.
The CI pipelines were triggered whenever there was a change in the main branch of the repository.
This was working fine until our Git instance was changed; all of our Git repositories were migrated to the new instance.
I updated the service connection to point to the new Git instance, manually invoked the CI pipelines, and tested them. They worked fine.
But the issue now is that the automatic triggering of the CI pipelines no longer works.
I tried removing and re-adding the Git service connection and the repository details inside the CI pipeline and enabled the trigger, but the pipelines are still not invoked automatically whenever there is a change in the repository.
What could be the reason for this? I already removed and re-added the Git repository details in the CI pipeline, and that still does not work. Is there anything I missed? Any leads appreciated!
You can check the "Override the YAML trigger from here" setting for the types of trigger (Continuous integration or Pull request validation) available for your repo." in the Triggers UI.
If it does not work, please create a new pipeline to check if it works.
For more information, you could refer to: troubleshooting failing triggers
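If the pipeline is YAML-based, it is also worth confirming that the trigger block still matches the branch names of the migrated repository; a minimal CI trigger on main looks roughly like this:
trigger:
  branches:
    include:
    - main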
Finally figured out the issue and fixed it: it was an error with the webhook in the repository.
Updated the webhook and it is working fine now.
I have a repository in GitHub with a bunch of documentation (.md) files in it.
I want to migrate the documentation into one of the Azure DevOps Wikis.
I am referring to this link.
When I use the Publish code as Wiki option, it only shows the repositories that are available inside the Azure DevOps project.
Is there a way I can publish the GitHub documentation, in repositories that are from another project, into the Azure DevOps Wikis?
Consider approaching this differently. If you are using Git for your Azure DevOps project, then the Azure DevOps Wiki is persisted to a hidden, but locatable, Git repository. Git clone the source and target repositories locally, then copy what you want into the target (the local clone of the Azure DevOps Wiki). Git add, commit, and push the added target files.
Attached images/files, if any, may be more problematic depending on how exactly they are represented in the source GitHub repo. In Azure DevOps Wiki ALL attachments are simply stored in a root .attachments folder. So, you'll need to migrate them there and "fix up" your links.
I've done this going the other direction, Azure DevOps Wiki -> GitHub Enterprise repo. You should know that you’ll likely need to “fix up” page links and that the two markdown styles have slight variations you may have to address.
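A rough sketch of that flow, using placeholder organization, project, and repository names (the wiki's backing repo is usually named {project}.wiki; you can copy the exact clone URL from the wiki's "Clone wiki" option):
git clone https://github.com/your-org/your-docs-repo.git source-docs
git clone https://dev.azure.com/your-org/your-project/_git/your-project.wiki target-wiki
cp source-docs/*.md target-wiki/
cd target-wiki
# remember to move any images into the .attachments folder and fix up the links
git add .
git commit -m "Import documentation from GitHub"
git push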
Is there a way I can publish the GitHub documentation into the Azure DevOps Wikis?
For copying documents from GitHub, you need to use Import repository from your DevOps project.
This lets you import an existing Git repo from GitHub, Bitbucket, GitLab, or another location into a new or empty existing repo in your Azure DevOps project.
For complete information, you can go through the Import Git repo link.
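If you prefer scripting the import instead of clicking through the portal, the Azure DevOps CLI extension exposes an import command; a sketch with placeholder names (verify the parameters against your az devops CLI version):
az repos create --name docs-import --organization https://dev.azure.com/your-org --project your-project
az repos import create --git-source-url https://github.com/your-org/your-docs-repo.git --repository docs-import --organization https://dev.azure.com/your-org --project your-project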
I am looking for a sample ARM template that can set up my Azure DevOps repository in Azure Databricks. This will help me deploy my master branch directly to the ADB workspace.
I tried to do it manually in the portal and it works, but the Repos path for the notebooks shows my email ID, which is not good in production.
I want to configure it through PowerShell or an ARM template while creating Databricks. I am facing the same problem with Azure Data Factory as well.
Please help me resolve it.
It's not possible as of today - there is no API for creating a checkout. It will become possible only when Databricks Repos starts to provide a corresponding API for creating checkouts of repositories, not only the "Update checkout" API that is available right now.
If you're concerned about the checkout being created in your own folder, you can just create a folder inside Repos, call it something like "Production", and then do the checkout inside that folder (pictures are taken from my demo of Repos with Azure DevOps).
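For reference, the "Update checkout" API that is available today can at least be scripted to pull an existing checkout to a given branch; a sketch with a placeholder workspace URL and repo ID:
curl -X PATCH https://<databricks-instance>/api/2.0/repos/<repo-id> \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{"branch": "main"}'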
To deploy notebooks from your master branch to another workspace, I would recommend triggering a deployment pipeline from the master branch onto the target Databricks workspace.
That way, there is no need to set up Repos in the target environment.
You use Repos in your development workspace (with your email in the path)
You commit to the branch you work on and eventually merge / PR to master
Once the change is on the master branch, a DevOps pipeline is triggered and deploys the notebooks to your target workspace at the path you want, as sketched below
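A minimal sketch of that deployment step, assuming the (legacy) databricks-cli, a secret pipeline variable holding a token for the target workspace, and placeholder paths:
pip install databricks-cli
export DATABRICKS_HOST=https://<target-workspace-url>
export DATABRICKS_TOKEN="$TARGET_WORKSPACE_TOKEN"   # mapped from a secret pipeline variable
databricks workspace import_dir ./notebooks /Production/notebooks --overwrite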
We have multiple teams working on the same API Management instance, and the current git-based configuration that API Management provides does not really facilitate a good process for us (with support for code reviews, pull requests, deploys, etc.).
Can we use a Git repository in Azure DevOps to control the configuration instead of having to use the repository provided directly by API Management?
Our primary use cases are:
Merge/sync changes from API Management into our central repository
Performing changes in a DevOps repo in separate branches, merging the changes to the main branch via pull requests, and syncing the changes to API Management
We can clone the configuration repository and push changes back using our familiar Git commands.
You can try running the following commands in a cmd task of an Azure DevOps pipeline.
git clone https://username:password@{name}.scm.azure-api.net/
git add .
git commit -m "abc"
git push
Here are the document and a similar case you can refer to.
My boss wants me to set up a pipeline in Azure DevOps for our GitLab repos. I have a few questions:
Do I set it up under "Git other"? Should I mirror the repositories into Azure DevOps?
I am supposed to set it up with a Docker image; do I need to use Docker Hub?
I've never set up a pipeline and I am just a lost intern; thanks for any advice.
Do I set it up under "Git other"?
Yes, you could use Git other to create a service connection for GitLab. And there is an extension, GitLab Integration for Azure Pipelines, which is able to download the sources from a GitLab repository (using the clone command) and use the downloaded sources in Azure Pipelines.
Should I mirror the repositories into Azure DevOps?
If you have no plans to migrate the GitLab repo to an Azure DevOps repo, you do not need to mirror the repositories into Azure DevOps. Besides, just as LJ said, since the YAML structure does not support GitLab at this moment, we cannot use the YAML structure with a GitLab repo.
I am supposed to set it up with a Docker image, do I need to use Docker Hub?
This is a matter of taste. In addition to Docker Hub, you can also use Azure Container Registry.
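As a hedged sketch of pushing to Azure Container Registry from a pipeline script step (placeholder registry and image names; BUILD_BUILDID is the environment variable Azure Pipelines exposes to scripts):
az acr login --name myregistry
docker build -t myregistry.azurecr.io/my-app:$BUILD_BUILDID .
docker push myregistry.azurecr.io/my-app:$BUILD_BUILDID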
Do I set it up under "Git other"? Should I mirror the repositories into Azure DevOps?
If you want to set up the pipeline using the YAML structure and have all the features that Azure DevOps provides, you have to mirror the repository, since it's not yet possible to use a YAML file to run pipelines directly from GitLab, and the Git other connection has some limitations.
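A one-time mirror with plain Git might look like the sketch below, with placeholder URLs (for ongoing synchronization you would repeat the push, for example from a scheduled pipeline):
git clone --mirror https://gitlab.example.com/your-group/your-repo.git
cd your-repo.git
git remote add azure https://dev.azure.com/your-org/your-project/_git/your-repo
git push azure --mirror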
I am supposed to set it up with a Docker image, do I need to use Docker Hub?
For the pipeline environment, you can use the VM images provided by Azure.