How to secure an Azure serverless microservice architecture?

I am trying to build a serverless microservice architecture.
The Azure services I am using are:
Azure CDN
Azure Active Directory
Azure Logic Apps
Azure Functions
Azure Event Grid
Azure SignalR Service
Which of the tools below do I need to manage and secure my API in an Azure serverless microservice architecture?
Azure Traffic Manager
Azure Application Gateway
Azure API Management
Azure Function Proxy
Links I have referred to:
https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/serverless/web-app
https://learn.microsoft.com/en-us/dotnet/standard/serverless-architecture/serverless-design-examples
Please help
Edit:
I understand the tools above and their purpose, but what I can't work out is whether I actually need them, and if so, in what order. All I am developing is an Angular 2+ app that posts a Command Event to Azure Functions/Logic Apps over a REST API, gets back a RequestId (triggering a chain of events), and then subscribes to that RequestId to listen for the resulting Domain Event.

This is a very broad architecture question. All the services you mentioned have a specific purpose. You can even secure your Functions without using any of them by simply turning on authentication on the Function App.
I would suggest reading about all of them in detail to identify which service suits your needs here. For example, Traffic Manager is used for cross-region traffic distribution and may not be required in your case. Function Proxies and API Management overlap in a few cases, and which to use really depends on what you are trying to achieve. To get a better idea, you may need to share your architecture diagram.
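One built-in option alluded to above is function-level keys. A minimal TypeScript sketch of calling a key-protected Function from a client (the URL, key, and response shape here are placeholders, not from your app):

```typescript
// Sketch: calling an Azure Function secured with a function key.
// The "x-functions-key" header is how Azure Functions accepts key-based
// auth (a ?code= query parameter also works).

function buildFunctionAuthHeaders(functionKey: string): Record<string, string> {
  return {
    "Content-Type": "application/json",
    "x-functions-key": functionKey,
  };
}

// Post a command event and return the RequestId the function hands back.
// The { requestId } response shape is an assumption for this example.
async function postCommand(
  functionUrl: string,
  functionKey: string,
  command: object
): Promise<string> {
  const res = await (globalThis as any).fetch(functionUrl, {
    method: "POST",
    headers: buildFunctionAuthHeaders(functionKey),
    body: JSON.stringify(command),
  });
  if (!res.ok) throw new Error(`Function call failed: ${res.status}`);
  const body = (await res.json()) as { requestId: string };
  return body.requestId;
}
```

For anything beyond key-based access (per-user auth, quotas, rate limits), that is where API Management or App Service authentication come in.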

Related

Serverless SignalR Service - What do Azure Functions provide?

Say I'm working with an Azure SignalR Service running in serverless mode to implement a chat application. I'm wondering why we would use Azure Functions for this. What do they provide us? Couldn't we just build the connection with the SignalR Service on our own directly? Or say, after we negotiate an access token with an Azure Function, why can't we just use the connection we build with that token to broadcast messages, rather than relying on an additional Azure Function to broadcast them?
In the past, people used to couple SignalR into their own Web API or MVC project. The problem was that when there was a need to scale, it wasn't possible to scale things separately. Also, when it comes to SignalR, it's hard to work with sticky sessions, for example. This is why they released Azure SignalR Service, a managed service that implements the backplane pattern for you.
More info:
https://learn.microsoft.com/en-us/aspnet/signalr/overview/performance/scaleout-in-signalr
The last piece was to separate the real-time bi-directional communication from the Web API / MVC project. They added Azure Functions as they're lightweight and easier to scale compared with a Web API / MVC project.
why can't we just use the connection we build with that token to broadcast messages, rather than relying on an additional Azure Function to broadcast messages
A: It's because the Function is not executing 100% of the time; it only runs when triggered, so broadcasting has to happen inside a Function invocation rather than over a connection you keep open yourself.
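The negotiate step discussed above can be sketched client-side as follows (the payload field names match the usual serverless SignalR connection-info shape, but treat them as an assumption; the commented client code requires the @microsoft/signalr package):

```typescript
// Sketch of the serverless SignalR negotiate handshake: the client first
// asks a small Azure Function for connection info, then connects directly
// to the SignalR Service with that info.

// Shape of the payload the negotiate function typically returns.
interface NegotiateResponse {
  url: string;         // SignalR Service endpoint to connect to
  accessToken: string; // short-lived token scoped to that endpoint
}

// Validate and unpack a negotiate response body.
function parseNegotiateResponse(body: unknown): NegotiateResponse {
  const b = body as Partial<NegotiateResponse>;
  if (typeof b?.url !== "string" || typeof b?.accessToken !== "string") {
    throw new Error("Unexpected negotiate payload");
  }
  return { url: b.url, accessToken: b.accessToken };
}

// With a real client you would then do something like:
//   new HubConnectionBuilder()
//     .withUrl(info.url, { accessTokenFactory: () => info.accessToken })
//     .build();
```

The broadcast direction still goes through a Function (e.g. via the SignalR output binding), for the reason given in the answer: no always-on server process exists to do it.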

Is it possible to use the Direct Line API (normally provided through the Azure Bot Service) in an on-premises environment?

The question I have is: can you use the Microsoft Bot Framework service in an on-premises solution, ideally through a Docker container, ~~or at least an Azure Stack installation~~ (not currently available through Azure Stack)? We need a 100% on-premises solution that will utilize LUIS and other Azure services but remain on-premises when using the chat bot.
The problem is that the bot essentially requires the Direct Line API, which authenticates through a token. This token is generated by an Azure service (unless you use the secret directly), and the Direct Line API itself runs through a bot application registered with an Azure service.
Although there is LUIS container support (a localized Docker container that can pull down Azure Cognitive Services and use them locally), there doesn't seem to be any such support for the Bot Framework service, which seems bizarre: you get one without the other.
https://learn.microsoft.com/en-us/azure/cognitive-services/cognitive-services-container-support
But that's OK if an Azure Stack installation would solve a lot of the on-premises requirements. It could even be the hybrid variation, where LUIS and other aspects run through traditional cloud services but the bot service stays on-premises and is able to use the Direct Line API, if possible. Or what is another solution?
Would it have to be traditional RESTful API calls, and what would be missing from a Node.js or C# bot deployed to the cloud? Perhaps I am missing something in the architecture, but the need described is 100% on-premises.
You will want to look into offline DirectLine. This is an unofficial package, but it is open source.

Azure Web API - how to communicate between services

I'm currently developing a SOA-based architecture in Azure, using disparate Web API services (they'd probably qualify as microservices, but I'm hesitant to use the term).
I have a service which is triggered by the Azure Scheduler. It does some "stuff" and then needs to call another Web API service (via HttpClient) to trigger something else. To do this, I need to know the URI of the second service. When running locally, this is fine, as it is something like
POST http://localhost:1234/2ndService/api/action
However, when I deploy to Azure (using Internal Only as the access level), it gets an obfuscated URI, such as http://microsoft-apiapp8cf3d453-39d8-4b3b-ad00-e9d8008a9b58, which I obviously can't guess at deploy time.
Any ideas on how to solve this problem? Or have I made a fundamental error here?
Instead of relying on public HTTP endpoints, have you considered passing messages via Azure Storage queues? It's very simple to do and is going to be more robust, since you can take advantage of built-in features like at-least-once message delivery.
The overall idea is that Service A does some "stuff" then puts a message on queue ONE. Service B continuously reads from queue ONE until it picks up a new message from Service A (or any other service for that matter) and then does its "STUFF". You can continue to chain calls like this to other services that need to be notified.
If you want a more elegant solution you can look at using Service Bus Topics but the concept is basically the same.
Also, since you mentioned that your architecture is much like microservices, you can check out the new Service Fabric which is designed for your scenario.
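The chaining described in that answer can be illustrated with a small in-memory sketch (a plain array stands in for an Azure Storage queue here, and the service and message names are made up; real code would use the Storage Queue or Service Bus SDK):

```typescript
// Illustrative sketch of the queue-chaining pattern: Service A enqueues,
// Service B polls and processes, and could enqueue to the next queue in turn.

type QueueMessage = { requestId: string; payload: string };

// Stand-in for "queue ONE".
const queueOne: QueueMessage[] = [];

// Service A: does its "stuff", then puts a message on queue ONE.
function serviceA(payload: string): string {
  const requestId = `req-${queueOne.length + 1}`;
  queueOne.push({ requestId, payload });
  return requestId;
}

// Service B: drains queue ONE and does its "STUFF" with each message.
function serviceB(): string[] {
  const processed: string[] = [];
  let msg: QueueMessage | undefined;
  while ((msg = queueOne.shift()) !== undefined) {
    // Here Service B could push a result onto "queue TWO" to chain further.
    processed.push(`${msg.requestId}:${msg.payload.toUpperCase()}`);
  }
  return processed;
}
```

The decoupling is the point: Service A never needs Service B's URI, only the queue name.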
In the case of Azure Web Apps, you can always see such properties by going to the web app dashboard, then Properties. When deploying from Visual Studio, you can set the URL as you want; I just checked it, and it works fine.
It's not very clear what technology you use: is it an IaaS VM? Is it Web Apps?
From my standpoint, each service should be deployed as a separate Web App (or API App, if you want). Each Web App has its own defined name, as in yourwebapp.azurewebsites.net, so once you have provisioned Web App no. 1 in Azure, you know its address and can call it from Web App no. 2.
In all cases, you should use fully qualified domain names, not local/internal ones.
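As a tiny sketch of that naming convention (the app and route names are placeholders):

```typescript
// Derive a stable service endpoint from the Web App name: every App Service
// gets the fixed hostname <appName>.azurewebsites.net.
function serviceUrl(appName: string, route: string): string {
  return `https://${appName}.azurewebsites.net/${route.replace(/^\/+/, "")}`;
}
```

Because the hostname is fixed at provisioning time, the caller can hold just the app name in configuration instead of a generated URI.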

Considerations when moving Web API to Fabric

I have an existing Web API 2 project that I'm looking to move over to Azure Service Fabric. I'm planning on creating a single stateless service within Fabric as detailed here (mainly as I don't understand actors/services at the moment!) and moving the API across.
My API uses Entity Framework for its backend and the built-in Individual Accounts using OWIN.
Is there anything I need to consider when moving the service over (I mainly thought the DB and authentication might be an issue), or is it just a case of putting a Service Fabric layer on top?
Thanks
EDIT
I've had a bit more of a thought about this and have some more questions (based on #Mihail's comment)!
So I'm going to have many stateless services (so I'm splitting my Web API project up) which will be exposed via a Web API project (based on this)
So two questions really:
I need to authenticate users based on Individual Accounts. I think I should do this on the Web API frontend so only authenticated users get through to the Fabric services, but that means the API has access to the DB and it's not just a pass-through anymore. Is this OK? Should the API call a microservice to authenticate and then call the required service, or is this overkill?
Entity Framework: if I have many services (plus the API frontend) accessing the same DB, do I have to worry about concurrent connections/locking, or will Entity Framework handle this for me?
As Mihail said, the questions around Entity Framework and authentication shouldn't be a problem, or at least not a Service Fabric specific problem.
One thing to consider though is whether Service Fabric is appropriate here if the only thing you'll have is a simple API, or whether an Azure API app would be a better fit for you.
If you went with Service Fabric, you'd have to have at least 5 VMs, so you'll need to consider whether your app warrants 5 VMs or whether that would be overkill. Also remember that you'll need to manage those VMs; you don't get the magic that a PaaS solution gives you. You'd also have to build things you'd get out of the box from an API App, like auto-scale, authentication, rate limiting, integration with SaaS applications, etc. It might be worth having a look at this SO question for a comparison between Service Fabric and the App Service.

AWS Lambda: can those events be integrated with API management tools?

What I am trying to do:
I am planning to write my mobile back-end API using AWS Lambda. From the recent releases I gathered that all events can be triggered via the AWS SDK on any platform, including Android and iOS.
I have done a POC of that, and it's working fine. But how do I manage all those events for metrics? Security (OAuth)? Metering? I know all of these can be done via an API management tool, so I planned for and chose the CA API Management Gateway.
Where I am stuck:
How can I integrate those Lambda events with the CA API Management Gateway, or any other API management gateway?
Extra questions I have:
Can this be done?
Will the API scale?
Is it a good idea to build the whole API in Lambda?
There is no easy solution for this; I'm encountering a similar problem. It would be great if Amazon could provide API gateway functionality for Lambda, but they don't seem to at present. It feels like a big hole in their service.
If your CA API Gateway has the ability to execute code, you could write an intermediary that utilizes the Amazon Mobile SDKs as detailed here; otherwise you might need to develop an intermediary application that performs a similar role.
The problem with developing an intermediary application, however, is that you will then need to utilize EC2 compute resources and configure auto-scaling yourself. It would be much better if Amazon provided this as a managed service.
Update
Amazon has just announced an API Gateway service.
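With that gateway in place, the usual shape is a handler that returns the proxy-integration response format API Gateway expects. A minimal sketch (the route and body fields are illustrative, not from the question):

```typescript
// Minimal Lambda handler behind API Gateway (proxy integration): the handler
// receives the request as an event and returns statusCode/headers/body.

interface ApiGatewayResponse {
  statusCode: number;
  headers: Record<string, string>;
  body: string; // must be a string; JSON is serialized by hand
}

async function handler(event: { body?: string }): Promise<ApiGatewayResponse> {
  const input = event.body ? JSON.parse(event.body) : {};
  return {
    statusCode: 200,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ echoed: input }),
  };
}
```

Metering, throttling, and API keys then live in the gateway configuration rather than in the Lambda code, which was the gap the answer above describes.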
