I have a situation where my organization uses a specific project in our VSTS:
{organization-1}.visualstudio.com/{Project.Client}
Some clients have their own as well for the same project:
{organization-2}.visualstudio.com/{Project.Client}
Is it possible to keep these two projects in sync (work-items and code)? Assume that they are both using TFS.
Creating a service endpoint from {organization-2} to {organization-1} is possible, but it doesn't seem to provide much as far as I can tell, since nothing new shows up in the Notifications menu or on any of the work boards relating to the new endpoint.
I've tried creating a Service Hook from {organization-1} to an Azure Service Bus, but there doesn't seem to be any way to consume it from {organization-2} that I can see.
How can I get these two organizations to usefully talk to each other?
Work item change notifications, code check-ins, etc.
No, the two accounts can't be synchronized, but you can export/import work items and code from one account to the other.
To export/import work items, you can use Microsoft Excel or extensions such as the VSTS Sync Migration tools.
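If you'd rather script the export yourself, here is a minimal sketch that pulls the work item IDs from the source account via the WIQL REST API. The organization and project names are the placeholders from the question, and the PAT environment variable is an assumption; the returned IDs can then be fetched individually and re-created in the target account.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ExportWorkItems
{
    static async Task Main()
    {
        // PAT for {organization-1}; stored in an environment variable here as an assumption.
        var pat = Environment.GetEnvironmentVariable("VSTS_PAT");

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes($":{pat}")));

        // WIQL query returning the IDs of every work item in the source project.
        var wiql = "{\"query\": \"Select [System.Id] From WorkItems Where [System.TeamProject] = @project\"}";
        var response = await client.PostAsync(
            "https://organization-1.visualstudio.com/Project.Client/_apis/wit/wiql?api-version=5.1",
            new StringContent(wiql, Encoding.UTF8, "application/json"));

        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```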
I want to add functionality to my ADF pipeline that will send me an email notification in case of failure. While searching the internet, I found that Azure Logic Apps can help with this. I am trying to follow the link below to achieve this.
https://microsoft-bitools.blogspot.com/2018/03/add-email-notification-in-azure-data.html
I have tried many tutorials, guides and the official docs as well. However, all of them already have templates available in the Logic Apps Designer. I cannot find the templates, and the 'When a HTTP request is received' trigger is also not available in the drop-down.
Please let me know how to proceed.
EDIT:
If you start with a blank Logic App, search for 'HTTP' or 'Request' and select Request.
On the next screen, under triggers, select 'When a HTTP request is received' and you should be good to go.
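Once you save that trigger, the Logic App generates a callback URL. In ADF you would normally call it from a Web activity on the failure path of your pipeline; as a rough illustration of the same POST in code, with a placeholder URL and a hypothetical payload shape (the real schema is whatever you define on the trigger):

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class NotifyOnFailure
{
    static async Task Main()
    {
        // The Logic App generates this URL (including the SAS signature) after you save
        // the 'When a HTTP request is received' trigger; the value below is a placeholder.
        var callbackUrl = "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke?<sas>";

        // Hypothetical payload -- match whatever JSON schema you define on the trigger.
        var body = "{ \"DataFactoryName\": \"my-adf\", \"PipelineName\": \"my-pipeline\", \"ErrorMessage\": \"Copy activity failed\" }";

        using var client = new HttpClient();
        await client.PostAsync(callbackUrl, new StringContent(body, Encoding.UTF8, "application/json"));
    }
}
```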
EDIT:
It looks like you created a Standard Logic App, which works in a slightly different way. For instance, it can contain multiple workflows, which means you create the workflows yourself. In the Consumption model there's one workflow within a Logic App, so you can open up the editor for that one directly. If there's no explicit reason for you to use Standard, a Consumption Logic App will be easier to work with.
If you really need a Standard Logic App, go to 'Workflows' and create a new workflow. Then click the newly created workflow to edit it, go to 'Designer' and search for 'HTTP' to add an HTTP trigger.
Here's some information on the Consumption model for Logic Apps:
Resource type: Logic App (Consumption). Host environment: multi-tenant Azure Logic Apps.
Benefits: easiest to get started, pay-for-what-you-use, fully managed.
Resource sharing and usage: a single logic app can have only one workflow; logic apps created by customers across multiple tenants share the same processing (compute), storage, network, and so on.
Limits management: Azure Logic Apps manages the default values for these limits, but you can change some of these values, if that option exists for a specific limit.
See Resource type and host environment differences for a comparison with the other hosting options.
I was able to solve this. I wasn't able to see some of the functionality because of another error: Functions runtime error Microsoft.WindowsAzure.Storage: Value cannot be null. (Parameter 'connectionString').
The AzureWebJobsStorage app setting was missing, which caused the error. I added it and now I can see the triggers and everything else.
Thanks #rickvdbosch
I've been working on migrating all of the work items from one Azure DevOps (Services) project to another project in the same Organization.
I used the nkdAgility azure-devops-migration-tools to successfully copy the majority of existing work items across, but it did not grab our Shared Queries.
I played around with the Azure REST API in PowerShell to list the queries. I also looked at the az CLI suite to see if there was a way to list the queries. I was able to find a couple at the root level, but it was not the entire list of Shared Queries.
Is this possible to accomplish through either of the above methods?
My Google-fu was strong today! Here's a link to a script that does almost exactly what I want.
Migrate Azure DevOps work items queries to a new organization
The only difference is that I am staying within my organization, so I am making modifications accordingly. Also, the Azure REST API has probably evolved a bit since the original script was written, so I am updating the requests to handle that.
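For reference, a minimal sketch of the kind of request that lists the full Shared Queries tree; the organization, project and PAT variable are placeholders, and the $depth parameter is the important bit, since without it you only get the root-level folders back.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ListSharedQueries
{
    static async Task Main()
    {
        var org = "my-org";          // placeholder organization
        var project = "My.Project";  // placeholder project
        var pat = Environment.GetEnvironmentVariable("AZDO_PAT");

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes($":{pat}")));

        // $depth=2 returns the nested query folders, not just the two root folders.
        var url = $"https://dev.azure.com/{org}/{project}/_apis/wit/queries?$depth=2&$expand=all&api-version=6.0";
        Console.WriteLine(await client.GetStringAsync(url));
    }
}
```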
Thanks Josh Kewley!
My Azure-based SaaS system publishes events, and I have customers who wish to subscribe to them; webhooks seem undeniably the right architecture (and I'm currently a happy consumer of webhooks). I've found lots of great documentation and case studies on best practices (e.g. http://resthooks.org), but I've not managed to find an existing architecture, framework, project, sample or solution that implements those best practices.
I could build my own solution, however I don't want to reinvent the wheel. I was expecting to find an existing framework (e.g. on GitHub) created by people much smarter than I am, but haven't had any success.
I currently use a number of Azure services (such as Service Bus, Cosmos DB, Table Storage) internally and consume them using Azure Functions, but what I don't have is an architecture for allowing my customers to subscribe to these events.
Specifically I'm looking for best practices and code samples on how to manage potentially millions of subscribers (who are external customers) and the approach to distribute the webhooks out to each of them.
I already understand how to publish and consume webhooks where I am an individual subscriber and there are already some great samples available - https://github.com/aspnet/AspLabs/tree/master/src/WebHooks
Can anyone point me in the right direction? (Preferably to a .NET / C# based solution)
Not sure if this is the 'right' direction but here are my current thoughts on a solution.
We are currently using Cosmos DB and are leveraging the change feed to trigger an Azure Function execution. The code within the function does a specific task for all tenants in our system. This code will be changed to simply send a new event to the Event Grid topic. An 'in-house' subscription will then be added that handles what the function code is doing today.
We will then follow the subscription management guidance Zapier offers. In a nutshell, it is to expose to our customers the capability to subscribe to the events that we publish, via a few endpoints. In addition to the standard CRUD stuff, when a tenant adds/removes a subscription the code will leverage the Event Grid Management SDK to add/remove subscriptions to the appropriate topics within Event Grid (samples here). The subscriptions that get added will have filters set to ensure each tenant only receives their own events.
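As a rough sketch of that subscription-management piece, assuming the Microsoft.Azure.Management.EventGrid SDK and a hypothetical subject-prefix convention for tenant filtering (names and credential wiring are placeholders):

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Management.EventGrid;
using Microsoft.Azure.Management.EventGrid.Models;
using Microsoft.Rest;

public static class TenantSubscriptions
{
    // Called from the subscription-management endpoint when a tenant registers a webhook.
    public static async Task AddTenantSubscriptionAsync(
        ServiceClientCredentials credentials,
        string azureSubscriptionId,
        string topicResourceId,   // resource ID of the custom Event Grid topic
        string tenantId,
        string tenantWebhookUrl)
    {
        var client = new EventGridManagementClient(credentials)
        {
            SubscriptionId = azureSubscriptionId
        };

        var subscription = new EventSubscription
        {
            Destination = new WebHookEventSubscriptionDestination { EndpointUrl = tenantWebhookUrl },
            // Filter so this tenant only receives events published with its own subject prefix.
            Filter = new EventSubscriptionFilter { SubjectBeginsWith = $"tenant/{tenantId}" }
        };

        await client.EventSubscriptions.CreateOrUpdateAsync(
            topicResourceId, $"tenant-{tenantId}", subscription);
    }
}
```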
There are limits on the number of subscriptions and topics within Azure (details here). These limits are acceptable in our case, but they are something you might need to look into further if you need to reach 1 million subscribers.
Here is how I visualize it: Cosmos DB change feed -> Azure Function -> Event Grid topic -> per-tenant (filtered) subscriptions -> customer webhook endpoints.
Not 100% sure we'll build this, but if we do I'll post back here any gotchas we uncover.
Cheers!
I understand that Microservices is about independent loosely coupled services. I have read https://en.wikipedia.org/wiki/Microservices.
When it comes to Azure, I understand there are many components like Azure Service Fabric and AKS, and there is also the option of deploying containers within Azure VMs using Docker or other containerization tools. However, since microservices are about developing atomic, individually scalable services, can this also be achieved by deploying each service as an Azure Web API app within an App Service plan and configuring auto-scale based on performance metrics? (Though each API app may not be individually scalable, they can still be individually manageable in terms of deployment, configuration, etc.)
Can someone please suggest if this thought process is correct?
Microservices aren't a platform or technology, so if you can make small, independently deployable services then they are microservices. Sure, some tech helps, but it depends on your situation.
If you only need a few services you probably don't need anything complex. Make sure services are well modelled, own their own data and ideally have a good monitoring and deployment pipeline set up. Design for service failure where possible.
Do you need to scale each part independently? Ideally you should be able to, but do the services have very different requirements? You could have many small App Service plans, but that comes at the cost of unused resources, so split when you need to.
This question, and of course the answers, are going to be opinion based, but generally when thinking in terms of microservices, don't think in terms of things like loads of APIs and VMs. Instead think in terms of: when I upload an image, it needs to be resized and the table updated to give a URL for the thumbnail; or when record XXX is updated in the database, run XXX in order to create a report or update Azure Search. Each service just knows how to do a single thing only, e.g. resize an image.
Now one could say: I have a system, a repo library and some functions library. When an image is posted, I upload it, then call this, and that, etc.
With microservices, you would instead just add the image to a queue. Create an Azure Function with a queue trigger that resizes the image and saves both the large version and the thumbnail to storage. This would then either update the database or, in a true microservice, add a message to another queue with the new info; another function would watch that queue and insert into the database.
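Here is a minimal sketch of that queue-triggered function; the queue and container names are made up, and the actual resize is left to whichever imaging library you prefer:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ResizeImage
{
    [FunctionName("ResizeImage")]
    public static void Run(
        [QueueTrigger("images-to-resize")] string imageName,           // queue message = blob name
        [Blob("uploads/{queueTrigger}", FileAccess.Read)] Stream original,
        [Blob("thumbs/{queueTrigger}", FileAccess.Write)] Stream thumbnail,
        [Queue("images-resized")] out string resizedMessage,           // hand off to the next service
        ILogger log)
    {
        // Placeholder: a real implementation would shrink the image here
        // with an imaging library before writing it out.
        original.CopyTo(thumbnail);

        // Another function (or Logic App) watches this queue and updates the database.
        resizedMessage = imageName;
        log.LogInformation($"Resized {imageName}");
    }
}
```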
You can use the DB queue from anything. You can use the blob queue from anything. Your main API does not care how images are handled. You could change your functions one day to save to Dropbox instead of Azure Blob storage; all really easy, with no rebuild of the API, because the API does not care.
A good example I use it for is email and SMS. My systems don't know how to send an email or an SMS; they only know how to add to a queue. My microservices, SendEmail and SendSMS, do know how to do it, and I can change how, and with whom, I send that content really easily. I can change from Twilio to SendGrid tomorrow without ever telling the API that I've done it.
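The producer side is deliberately tiny. Here is a sketch of the API handing a message to a hypothetical send-email queue (the connection string and message shape are assumptions; the SendEmail function on the other end does the actual sending):

```csharp
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Storage.Queues;

public static class EmailDispatch
{
    // The API only knows how to enqueue a message; the SendEmail microservice
    // (a queue-triggered function) decides how the mail actually gets sent.
    public static async Task QueueEmailAsync(string storageConnectionString, string to, string subject)
    {
        var queue = new QueueClient(storageConnectionString, "send-email",
            new QueueClientOptions { MessageEncoding = QueueMessageEncoding.Base64 }); // matches the Functions queue trigger default
        await queue.CreateIfNotExistsAsync();
        await queue.SendMessageAsync(JsonSerializer.Serialize(new { To = to, Subject = subject }));
    }
}
```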
On a more complex thing: I have approval. At the moment that approval sends an email or SMS to either the user or an admin, and that can change over time. So I have an SMS service, an email service and an approval service. When approval happens, it just adds a config message to the queue; the rest is done by a Logic App that knows to send an email to XXX and an SMS to XXX and then update the database. My API is just a POST that creates a queue message.
Basically what I am saying here is, to get started, maybe by porting an existing app, start with the workflow stuff, like send an email, resize an image, create a report, create a PDF, email 50 subscribers, etc., and take all that code out and put it into its own microservice that just knows how to do one thing. Then, when you grow in confidence, create a workflow from all of these services with Logic Apps and let Azure take care of the rest; that's what they want to do.
I currently have a couple of WebApi projects that use a few class libraries such as address lookup, bank validation, image storage etc.
Currently they are all in a shared solution, but I'm planning to split them up. I thought about moving the libraries into NuGet packages so that they are separate from the API projects and properly shared.
However, if I make a change to one of these components I will need to rebuild and redeploy the API service, even though it's a separate component that has changed.
I thought about putting these components into a separate service, but that seems like a bit of overhead for what it is.
I've been looking at Azure WebJobs and think I may be able to move these components into this instead. I have two questions related to this:
Are WebJobs suitable for calling on demand (not using a queue)? The request will be initiated by a user on a web site, which calls my API service, which then calls the WebJob, so it needs to be quick.
Can a WebJob return data? I've seen examples where it does some processing and updates a database, but I need a response (ideally JSON) back to my API service.
Thanks
Based on your requirements, I'd suggest you leverage Azure Functions by creating a function with an HTTP trigger; it can be invoked by calling the function URL with parameters and it can return the response you expect. You can follow this tutorial to get started with Azure Functions.
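For illustration, here's a minimal HTTP-triggered function that returns JSON; the bank-validation example is hypothetical and simply stands in for a call into your existing class library:

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class ValidateBankDetails
{
    [FunctionName("ValidateBankDetails")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req,
        ILogger log)
    {
        string sortCode = req.Query["sortCode"];
        log.LogInformation($"Validating sort code {sortCode}");

        // Placeholder: call your existing bank-validation class library here.
        var result = new { sortCode, isValid = true };

        // OkObjectResult serializes the object to JSON in the HTTP response,
        // so the calling API service gets the data straight back.
        return new OkObjectResult(result);
    }
}
```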