Creating a Bitbucket Server Webhook for Azure DevOps Builds

To keep it short and sweet: I am attempting to automate my CI/CD process, which includes an Azure DevOps build running automatically when code is pushed to a Bitbucket Server repo. I have not found any documentation on how to set this up. Does anyone have any experience with this process? Keep in mind (even though I've mentioned it) that I am using the server version of Bitbucket while using the PaaS version of Azure DevOps.

It's fairly straightforward. When you create a pipeline, it will ask you where your repo is.
If it's hosted by Atlassian (you access it at https://bitbucket.org), select the Bitbucket Cloud option and provide your login.
If it's self-hosted, select "Other Git".

Related

Why doesn't Azure Pipelines pick up commit on GitHub?

We have been using Azure Pipelines for our CI/CD processes for a few weeks. The CI pipeline gets code from GitHub, builds, tests, and creates a deploy package.
In the beginning I am quite certain that every commit was detected as intended, but recently that is not the case. Manual triggers and scheduled triggers work, but continuous integration does not.
What could be the causes for this?
In the pipeline, we checked the box for "continuous integration", and we use the recommended GitHub App to provide authorization. This is verified to work; we can see the authorized GitHub repos in the pipeline settings.
You can check whether the GitHub branch you committed to is included in the branch filters. If it is not included, click Add to add the branch.
Check whether there is a skip-CI command (e.g. [skip ci]) in the commit message. See here for more information.
If the CI trigger is not working even though all the settings are correct, you can try the workarounds below:
1. Disable the CI trigger, save, then re-enable it and save again.
2. Clone your build definition.
3. Create a new build pipeline with the same trigger and settings.
If none of the above works, you can check the Azure DevOps status page to see if there is a service outage.
We ended up changing how we connect from Azure Pipelines to GitHub. The recommended way is to install the Azure Pipelines app in GitHub and connect using that in Pipelines. My experience is that it worked at first, but then stopped working. I read somewhere that only the first connection works with webhooks, so maybe we tried it somewhere else or did something that broke it. I ended up using a GitHub service account to pull and listen for webhooks, and that works just as expected.

How to maintain many Azure resources and deployments in one git repo?

I have a project that consists of an Azure webapp, a PostgreSQL on Azure, and multiple Azure functions for background ETL workflows. I also have a local Python package that I need to access from both the webapp and the Azure functions.
How can I structure configuration and script deployment for those resources from a single git repo?
Any suggestions or pointers to good examples or tutorials would be very appreciated.
All the Azure tutorials that I've seen are only for small and simple projects.
For now, I've hand-written an admin.py script that does e.g. the webapp and function deployments by creating a Python package, creating ZIP files for each resource, and doing ZIP deployments. This is getting messy, and now I want to have QA and PROD versions, and I need to pass secrets so that the DB is reachable, so it's getting more complex. Is there either a nice way to structure this packaging/deployment, or a tool to help with it? For me, putting everything in Kubernetes is not the solution; at the very least, the DB already exists. Also, Azure DevOps is not an option; we are using GitLab CI, so eventually I want a solution that can run on its CI/CD.
Not sure if this will help completely, but here we go.
Instead of using a hand-written admin.py script, try using a YAML pipeline flow. For GitLab, the reference at https://docs.gitlab.com/ee/ci/yaml/ will get you started. From what you've described, I would recommend having several jobs in your YAML pipeline that build and package your web and function apps. For deployment, you can make use of environments. Have a look at https://docs.gitlab.com/ee/ci/multi_project_pipelines.html as well, which illustrates how you can create downstream pipelines.
From a deployment standpoint, the current integration I've found between Azure and GitLab leaves me with two recommendations:
Leverage the script keyword in the YAML to continue zipping your artifacts, and use the Azure CLI (I would assume you can install the tools during the pipeline) to do a ZIP deploy, as shown in the sketch after this list.
Keep your code inside the GitLab repo and utilize Azure Pipelines to handle the CI/CD for you.
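To illustrate the first recommendation, here is a minimal Python sketch that could be called from a GitLab CI `script:` step. The resource group and app names, the QA/PROD split, and the folder layout are hypothetical, and it assumes the Azure CLI is installed in the job image and already logged in (e.g. with a service principal fed from CI secret variables):

```python
#!/usr/bin/env python3
"""Sketch of a per-environment ZIP deploy, callable from a GitLab CI `script:` step.
Resource group / app names, paths and the QA vs PROD split are hypothetical;
assumes the Azure CLI is installed in the job image and already logged in."""
import os
import shutil
import subprocess
import sys

# Hypothetical mapping of environment name -> Azure resource group / web app.
TARGETS = {
    "qa":   {"group": "my-rg-qa",   "app": "my-webapp-qa"},
    "prod": {"group": "my-rg-prod", "app": "my-webapp-prod"},
}

def deploy(env: str) -> None:
    target = TARGETS[env]

    # Package the app folder (which can include the shared local Python package)
    # into build/webapp.zip.
    os.makedirs("build", exist_ok=True)
    archive = shutil.make_archive("build/webapp", "zip", root_dir="webapp")

    # Push the archive using the Azure CLI's ZIP deploy.
    subprocess.run(
        ["az", "webapp", "deployment", "source", "config-zip",
         "--resource-group", target["group"], "--name", target["app"],
         "--src", archive],
        check=True,
    )

if __name__ == "__main__":
    # The environment comes from a CI variable, e.g. DEPLOY_ENV=qa.
    deploy(os.environ.get("DEPLOY_ENV", sys.argv[1] if len(sys.argv) > 1 else "qa"))
```

The same pattern works for the function apps by swapping in `az functionapp deployment source config-zip` and the corresponding folders.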
I hope you find this helpful.

Azure DevOps deploy overview

I am looking for some help to understand how to use Azure DevOps to deploy the application I am working on to a Windows VM.
Current process: Our code is currently in an Azure Git repo, and we have two QA servers, which have already been set up. Every time, we go to the QA server and manually pull the latest code with a git pull on the command line, then run a web page to upgrade/downgrade the database if the database scripts have been updated.
Goal: Use Azure DevOps to automate this process.
Here is what I would like to know:
1) With Azure DevOps, when deploying the code to the QA server, can we copy over only the changed files? The software package is pretty big, and it would take a long time to copy the whole thing.
2) How does Azure DevOps move files to the QA server: does it use git pull or a file copy?
3) When using Azure DevOps tools, can we trigger an HTTP(S) request?
4) Is there any tool I could use to check whether the Git repo has updates?
5) Is there any tool that supports if/else logic? We would trigger the HTTP(S) request only if the Git repo has changes.
I am just trying to get an overall idea.
1) As far as I know there is no layering/caching.
2) How would it use git pull to download from a web server? It uses an HTTP request to download the package.
3) Not sure I understand the question, but you can have a script step in the deployment and do whatever you like (i.e. an HTTP(S) request); see the sketch after this list.
4) This question doesn't quite make sense; you can use the git command line, but I don't understand how that ties to the release process. You should build your code on commit and create a package that you would later consume in the release process.
5) See 3 and 4.
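Regarding 3) and 5), here is a minimal Python sketch of such a script step: it compares the checked-out commit with the one recorded at the last deploy and only fires the HTTP(S) request when something has changed. The marker file and the upgrade URL are hypothetical placeholders:

```python
#!/usr/bin/env python3
"""Sketch of a release script step: fire an HTTP(S) request only if the repo
has changed since the last deploy. The marker file and URL are hypothetical."""
import pathlib
import subprocess
import urllib.request

STATE_FILE = pathlib.Path("last_deployed_commit.txt")    # hypothetical marker kept on the QA server
UPGRADE_URL = "https://qa-server.example.com/upgrade-db"  # hypothetical database upgrade page

def current_commit() -> str:
    # Ask git which commit the pipeline checked out.
    return subprocess.run(["git", "rev-parse", "HEAD"],
                          capture_output=True, text=True, check=True).stdout.strip()

def main() -> None:
    head = current_commit()
    last = STATE_FILE.read_text().strip() if STATE_FILE.exists() else ""

    if head != last:
        # The repo changed since the last deploy: call the database upgrade page.
        with urllib.request.urlopen(UPGRADE_URL, timeout=60) as resp:
            print("upgrade request returned", resp.status)
        STATE_FILE.write_text(head)
    else:
        print("no new commits, skipping the HTTP request")

if __name__ == "__main__":
    main()
```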

Azure ARM template Continuous Delivery Release pipeline

I am developing a CD release pipeline using TFS 2015 Update 2 (on-prem instance). I'm relying on an ARM template to set up an Azure website and Azure SQL server. I'm using the FTP method for deploying the website bits from an internal build server to the Azure website. For this website deployment, I'm reading the credentials from the publishing profile of the newly created website.
Is this the right way, or can you suggest a better one? Any comments are appreciated.
P.S. The customer wants to use the FTP method and not Web Deploy.
If you really have to use FTP, and the thing you're not happy with is the process/password secret management, you could try this:
https://marketplace.visualstudio.com/items?itemName=januskamphansen.ftpupload-task
It's a VSTS extension task for release, which works with the vNext build/release system in VSTS or TFS 2015. This task lets you enter the parameters for each environment you set up and mark the passwords as secrets so they won't show up in logs or the UI.
The step basically wraps up the process of doing the FTP bit for you (roughly what the sketch below does by hand); you may want to do other steps as part of the release.
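If you script that FTP bit yourself instead, a minimal Python sketch might look like the following. The host, user, and password are the values from the site's publishing profile, read here from hypothetical environment variables so the release can inject them as secrets; the local folder name is also a placeholder:

```python
#!/usr/bin/env python3
"""Sketch of the FTP publish step itself. Host, user and password are the
values found in the web app's publishing profile; here they are read from
hypothetical environment variables so the release can inject them as secrets."""
import os
from ftplib import FTP_TLS
from pathlib import Path

FTP_HOST = os.environ["PUBLISH_FTP_HOST"]    # e.g. the *.ftp.azurewebsites.windows.net host
FTP_USER = os.environ["PUBLISH_FTP_USER"]
FTP_PASS = os.environ["PUBLISH_FTP_PASSWORD"]
LOCAL_SITE = Path("drop/site")               # hypothetical build output folder

def upload(ftp: FTP_TLS, local_dir: Path, remote_dir: str) -> None:
    # Recursively mirror the local folder into the remote site root.
    for item in local_dir.iterdir():
        remote_path = f"{remote_dir}/{item.name}"
        if item.is_dir():
            try:
                ftp.mkd(remote_path)
            except Exception:
                pass  # the directory probably exists already
            upload(ftp, item, remote_path)
        else:
            with item.open("rb") as fh:
                ftp.storbinary(f"STOR {remote_path}", fh)

if __name__ == "__main__":
    with FTP_TLS(FTP_HOST) as ftp:
        ftp.login(FTP_USER, FTP_PASS)
        ftp.prot_p()  # encrypt the data channel
        upload(ftp, LOCAL_SITE, "/site/wwwroot")
```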

Continuous deployment to Azure using Bamboo

I'm working with Atlassian Bamboo on Demand for Continuous Integration and it works great.
Now I'm trying to use the "Deploy" feature, and the problem is that I'm working with Azure (FTP, publish, Git, Mercurial... I really don't care how) and I can't find a task that can do it.
Has anyone achieved this?
I do automated deployments to AWS from Bamboo, but the concept is pretty much the same.
Bamboo has no specific options for deploying to the public cloud, so you have to build or call an existing deployment tool. At the end of the day, Bamboo deployments give you metadata about which build has been deployed to which environment, and security over who can do deploys, but it's up to you to make the actual deploy work. Bamboo does give you a totally extensible engine for controlling the "how" via scripting. The deployment engine is basically a cut-down version of the CI engine with a subset of tasks.
I resolved to build our own deployment tooling because it was fairly simple to get started and a worthwhile investment of time, since it will be used often and improved over time. Bamboo gives me authorization and access control, and my scripts give me fine-grained control of my deployments.
I'm assuming you are running a Bamboo agent on a Windows image like me, so PowerShell scripts are your friend. If you're running on Linux, you'll want to do the same with bash.
I have PowerShell scripts controlling my deployments through a controller/agent model.
The controller script is source-controlled and maintained in a Mercurial repo. It is pulled by the repository task.
The agent is a PowerShell script wrapped by a simple Web API REST service with a custom authentication mechanism. The agent is set up when an app server instance is provisioned in EC2. We use Puppet for server provisioning.
The controller does the following for a deployment (see the sketch after this list):
connects to the VPC
determines the available nodes in my web farm using EC2
selects the first node and sends it an "upgrade database" command
then proceeds to send an "upgrade app server" command to each node
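As a rough Python equivalent of that controller flow (the real scripts are PowerShell): the instance tags, agent port, and /command endpoint below are hypothetical, and it assumes boto3 credentials that can query the EC2 API inside the VPC:

```python
#!/usr/bin/env python3
"""Rough Python equivalent of the controller flow above (the original scripts
are PowerShell). Instance tags, the agent port and the /command endpoint are
hypothetical; assumes boto3 credentials that can query EC2 inside the VPC."""
import boto3
import requests

AGENT_PORT = 8080                 # hypothetical port of the agent's REST wrapper
AUTH = ("deploy-user", "secret")  # stand-in for the custom authentication mechanism

def web_farm_nodes(environment: str) -> list[str]:
    # Ask EC2 for the running instances tagged as web farm nodes for this environment.
    ec2 = boto3.client("ec2")
    reservations = ec2.describe_instances(Filters=[
        {"Name": "tag:role", "Values": ["web"]},
        {"Name": "tag:environment", "Values": [environment]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ])["Reservations"]
    return [i["PrivateIpAddress"] for r in reservations for i in r["Instances"]]

def send(node: str, command: str) -> None:
    # POST a command to the agent running on the node and fail loudly on errors.
    resp = requests.post(f"http://{node}:{AGENT_PORT}/command",
                         json={"command": command}, auth=AUTH, timeout=600)
    resp.raise_for_status()

def deploy(environment: str) -> None:
    nodes = web_farm_nodes(environment)
    send(nodes[0], "upgrade database")   # first node upgrades the database
    for node in nodes:                   # then every node upgrades its app server
        send(node, "upgrade app server")

if __name__ == "__main__":
    deploy("dev")
```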
The logic for doing the deploy is parameterized so it can be reused for deployments to different environments. I use Bamboo deploy variables to manage feeding parameters for the different environments.
DEV is deployed automatically; test, staging, and prod are all manual click-deploys that are locked down to specific users.
One option I considered, but did not invest the time to look at, was AWS Elastic Beanstalk as a deployment tool. It has a rich API for deploys. On the Azure side, it looks like Web Deploy supports deployment to Azure IIS sites.
