Can an Azure Logic App pull a repository from Bitbucket?

Is it possible to pull / clone a Bitbucket repository within Azure Logic Apps?
I am curious whether it is possible to set up some backend tests within Azure Logic Apps: pull the repo containing the tests first and then execute them via a CLI. I see that there is a Bitbucket connector in Logic Apps, but it has no option to pull the repo. Or should I look at some custom connector to run commands such as "git clone" by hand? If so, which one?

Azure Logic Apps is a cloud platform where you can create and run automated workflows with little to no code. By using the visual designer and selecting from prebuilt operations, you can quickly build a workflow that integrates and manages your apps, data, services, and systems.
From: What is Azure Logic Apps?
The key concepts here are "little to no code" and "prebuilt operations". Building your code with a CLI and running its tests is not something Logic Apps targets. It would also make executing Logic Apps a LOT more complex on the Azure end, since it would mean installing any and all frameworks, tools, etc. that are needed for building the code and running the tests.
If you look at the Bitbucket actions that are supported, you can see that they are all API calls.
What you're looking for is available for free with GitHub Actions workflows:
A workflow is a configurable automated process that will run one or more jobs. Workflows are defined by a YAML file checked in to your repository and will run when triggered by an event in your repository, or they can be triggered manually, or at a defined schedule.
or Azure Pipelines.
Azure Pipelines automatically builds and tests code projects. It supports all major languages and project types and combines continuous integration, continuous delivery, and continuous testing to build, test, and deliver your code to any destination.
Potentially interesting read: Build Bitbucket Cloud repositories.
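As a sketch of what that looks like in practice, here is a minimal Azure Pipelines YAML that clones the connected Bitbucket repository and runs a test suite. The Node.js toolchain, branch name, and VM image are assumptions for illustration, not details from the question:

```yaml
# azure-pipelines.yml — minimal sketch; assumes a Node.js project with an
# "npm test" script. When the pipeline is connected to a Bitbucket Cloud
# repository, the checkout step clones the repo before any script runs.
trigger:
  branches:
    include:
      - main          # assumed branch name

pool:
  vmImage: ubuntu-latest

steps:
  - checkout: self    # clones the Bitbucket repository onto the build agent

  - script: npm ci
    displayName: Install dependencies

  - script: npm test
    displayName: Run backend tests
```

The clone step you were looking for in the Logic Apps connector is implicit here: the pipeline agent checks out the repo before any of your commands run.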

Related

How to maintain many Azure resources and deployments in one git repo?

I have a project that consists of an Azure webapp, a PostgreSQL on Azure, and multiple Azure functions for background ETL workflows. I also have a local Python package that I need to access from both the webapp and the Azure functions.
How can I structure configuration and script deployment for those resources from a single git repo?
Any suggestions or pointers to good examples or tutorials would be very appreciated.
All the Azure tutorials that I've seen are only for small and simple projects.
For now, I've hand-written an admin.py script that does the webapp and function deployments by building a Python package, creating ZIP files for each resource, and doing ZIP deployments. This is getting messy: now I want QA and PROD versions, and I need to pass secrets so that the DB is reachable, so it's getting more complex. Is there either a nice way to structure this packaging / deployment, or a tool to help with it? Putting everything in Kubernetes is not the solution for me, not least because the DB already exists. Also, Azure DevOps is not an option, since we are using GitLab CI, so eventually I want a solution that can run on CI/CD there.
Not sure if this will help completely, but here we go.
Instead of using a hand-written admin.py script, try using a YAML pipeline. For GitLab, https://docs.gitlab.com/ee/ci/yaml/ will get you started. From what you've described, I would recommend having several jobs in your YAML pipeline that build and package your web app and function apps. For deployment, you can make use of environments. Have a look at https://docs.gitlab.com/ee/ci/multi_project_pipelines.html as well, which illustrates how you can create downstream pipelines.
From a deployment standpoint, the current integration I've found between Azure and GitLab leaves me with two recommendations:
Leverage the script keyword of the YAML to continue zipping your artifacts, then use the Azure CLI (which you should be able to install during the pipeline) to do a ZIP deploy.
Keep your code inside the GitLab repo and utilize Azure Pipelines to handle the CI/CD for you.
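A rough .gitlab-ci.yml sketch of the first option follows. The job names, paths, and the AZURE_* / WEBAPP_NAME variables are placeholders you would define yourself (the credentials as masked CI/CD variables):

```yaml
# .gitlab-ci.yml — rough sketch of the "zip + Azure CLI" option; all names,
# paths, and variables are placeholders.
stages:
  - build
  - deploy

build_webapp:
  stage: build
  image: python:3.11
  script:
    - pip install -r requirements.txt
    - zip -r webapp.zip webapp/
  artifacts:
    paths:
      - webapp.zip

deploy_qa:
  stage: deploy
  image: mcr.microsoft.com/azure-cli    # ships with the az CLI preinstalled
  environment: qa                       # GitLab environment for the QA deploy
  script:
    # Service-principal credentials are stored as masked CI/CD variables.
    - az login --service-principal -u "$AZURE_CLIENT_ID" -p "$AZURE_CLIENT_SECRET" --tenant "$AZURE_TENANT_ID"
    - az webapp deployment source config-zip --resource-group "$AZURE_RG" --name "$WEBAPP_NAME" --src webapp.zip
```

A similar deploy job per function app, plus a prod environment gated with `when: manual`, would give you the QA/PROD split you mentioned.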
I hope you find this helpful.

Running TFS Build 2015 Steps in parallel

In my build definition I deploy to multiple Azure cloud services and would like to deploy in parallel, however the build definition in 2015 doesn't allow steps to be run this way.
Is there a way I can have three groups of steps (each with a Visual Studio Build and then an Azure Cloud Service Deployment step) running in parallel?
This will save me huge amounts of time in our CI/CD builds allowing for faster feedback from the builds.
Instead of deploying from a build, deploy using the Release hub.
You can define multiple release environments, then use the "Deployment Conditions" option to set multiple environments to deploy at once. However, you would need one agent per environment so that the agents can run in parallel.
Currently, there are no parallel tasks in the VSTS build and release process. However, there is a UserVoice item for it. Please vote for it!
https://visualstudio.uservoice.com/forums/330519-team-services/suggestions/13481574-add-ability-to-run-build-steps-in-parallel

What tools exist for continuously delivery of NodeJS environments?

I'm currently attempting to implement a continuous delivery pipeline for NodeJS and want a tool that is capable of:
deploying and managing application packages
rolling back deployments
monitoring deployments for potential rollbacks
offering a REST API
not being a SaaS solution
I have tried go.cd, but it didn't have monitoring capabilities.
I think the product that may suit your needs is Codeship. There is a very good presentation about using this tool to deploy a simple web application, showing its capabilities.
As you can deduce from its features, it can:
Automate your development and deployment workflow
Run your automated tests and notify you of the results
Speed up your tests with ParallelCI, which runs your tests in parallel
Configure powerful deployment pipelines that run after successful tests to deploy your application to multiple environments
Give you access to debug builds via SSH
Integrate with the tools you are currently using through its API and webhooks
Check this out.

Visual Studio Online, multiple Cloud projects, Continuous Integration

I have a Visual Studio 2013 solution with 3 cloud service projects (1 Web Role, 2 Worker Roles) and a set of nUnit tests hosted in Visual Studio Online.
Through the Azure portal, I've set up a continuous integration build that builds and deploys my solution on check-in. Azure, however, asked me for a single Cloud Service to deploy the solution into (although my project contains 3 of them), and obviously the build only deploys one of the cloud projects (presumably the first one it finds). How can I make it deploy all three?
I'd rather not create three different builds.
From my experience, you cannot deploy all of the Azure projects within your solution to Azure at the same time using the VSO CI builds. If you look at your CI build definition under Process, you'll see that you can only specify one cloud service name to deploy to. We ended up having to create one solution, with one Azure project, and one CI build per cloud service. We use the staging environments as a temporary deployment destination until all of our services are built/deployed, then swap them all at the same time to achieve a somewhat seamless/instant update.
Hope this helps.
I was able to get this to work with the new VSO Build vNext system in TFS 2015.
You are able to build a specific *.ccproj cloud service project and from there use a specific task to publish it to Azure.
This allows me to have multiple cloud services and multiple web roles in the same solution.
You should also be able to build multiple cloud service projects and setup multiple publish tasks, all from the same build definition.
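For illustration, in the newer YAML-based Azure Pipelines the same idea (one definition, several cloud service projects, one publish task each) looks roughly like the following. The project names, service connection, and exact task inputs are assumptions, not the original TFS 2015 definition:

```yaml
# Sketch only — project names, the service connection, and the task inputs
# are placeholders, not the author's actual build definition.
steps:
  - task: VSBuild@1
    displayName: Build web role cloud service
    inputs:
      solution: MyWebRole.ccproj
      msbuildArgs: /t:Publish            # produces the .cspkg/.cscfg package

  - task: VSBuild@1
    displayName: Build worker role cloud service
    inputs:
      solution: MyWorkerRole.ccproj
      msbuildArgs: /t:Publish

  - task: AzureCloudPowerShellDeployment@1
    displayName: Deploy web role cloud service
    inputs:
      azureClassicSubscription: my-subscription   # placeholder service connection
      ServiceName: my-web-cloud-service
      CsPkg: '**/MyWebRole*.cspkg'
      CsCfg: '**/MyWebRole*.cscfg'

  - task: AzureCloudPowerShellDeployment@1
    displayName: Deploy worker role cloud service
    inputs:
      azureClassicSubscription: my-subscription
      ServiceName: my-worker-cloud-service
      CsPkg: '**/MyWorkerRole*.cspkg'
      CsCfg: '**/MyWorkerRole*.cscfg'
```

The point is structural: one definition can carry as many build/publish task pairs as you have cloud services, which is exactly what the XAML-era CI build could not do.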
From MS:
We've built a brand new scriptable build system that's web-based and cross-platform. We believe all new and most existing customers should use it instead of the XAML build system.
More info:
https://msdn.microsoft.com/Library/vs/alm/Build/overview

Continuous deployment to Azure using Bamboo

I'm working with Atlassian Bamboo on Demand for Continuous Integration and it works great.
Now I'm trying to use the "Deploy" feature, and the problem is that I'm working with Azure (FTP, publish, Git, Mercurial... I really don't care how) and I can't find a "task" that can perform it.
Has anyone achieved this?
I do automated deployments to AWS from bamboo, but the concept is pretty much the same.
Bamboo has no specific options for deploying to the public cloud, so you have to build or call an existing deployment tool. At the end of the day, Bamboo deployments provide you with metadata about which build has been deployed to which environment, and security over who can do deploys, but it's up to you to make the actual deploy work. Bamboo does give you a totally extensible engine for controlling the "how" via scripting. The deployment engine is basically a cut-down version of the CI engine with a subset of tasks.
I resolved to build our own deployment tooling because it was fairly simple to get started and a worthwhile investment of time, since it will be used often and improved over time. Bamboo gives me authorization and access control, and my scripts give me fine-grained control of my deployments.
I'm assuming you are running a Bamboo agent on a Windows image like me, so PowerShell scripts are your friend. If you're running on Linux, you'll want to do the same with Bash.
I have PowerShell scripts controlling my deployments through a controller/agent model.
The controller script is source-controlled and maintained in a Mercurial repo. It is pulled by the repository task.
The agent is a PowerShell script wrapped by a simple WebAPI REST service with a custom authentication mechanism. The agent is set up when an app server instance is provisioned in EC2. We use Puppet for server provisioning.
For a deployment, the controller:
connects to the VPC
determines the available nodes in my web farm using EC2
selects the first node and sends it an "upgrade database" command
then sends an "upgrade app server" command to each node
The logic for doing the deploy is parameterized so it can be re-used for deployments to different environments. I use Bamboo deploy variables to feed parameters to the different environments.
DEV is deployed automatically; test, staging, and prod are all manual click-deploys that are locked down to specific users.
One option I considered but did not invest the time to look at was AWS Elastic Beanstalk as a deployment tool; it has a rich API for deploys. On the Azure side, it looks like Web Deploy supports deployment to Azure IIS sites.
