We are attempting to get Azure SignalR serverless to work with a .NET Core API application. With "default" SignalR, we ran into scaling issues in Azure: server instances behind an API App would continue to receive socket connections even as their CPU usage increased. There is currently no way to change the load-balancing behavior or to take an instance out of traffic. We therefore looked at "serverless" mode; however, all of the documentation points to using Azure Functions. Given that serverless mode uses webhooks and the like, we should be able to use anything that can take an HTTP request. We already have our APIs set up, so getting this to work against our APIs is preferred.
Update 1
Effectively, we're looking for the serverless support that Functions get, but for APIs. Functions have triggers, a ServerlessHub class to inherit from, and so on. These handle negotiate calls, deserialization of negotiation data, and all the other things SignalR has to do. We're looking for something similar for, I guess, API controllers.
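For illustration, here is a minimal sketch of what a serverless-mode negotiate endpoint has to return, written in Python to keep the examples in this post in one language (the connection-string parsing, hub name, and function name are assumptions, not our actual code); an ASP.NET Core controller action would return the same JSON shape.

    # Hypothetical sketch of a serverless-mode negotiate handler.
    # Assumes an Azure SignalR connection string of the form
    # "Endpoint=https://<name>.service.signalr.net;AccessKey=<key>;Version=1.0;"
    # and uses PyJWT to mint the client access token.
    import time
    import jwt  # PyJWT


    def negotiate(connection_string, hub, user_id=None):
        parts = dict(
            segment.split("=", 1)
            for segment in connection_string.split(";")
            if "=" in segment
        )
        endpoint = parts["Endpoint"].rstrip("/")
        access_key = parts["AccessKey"]

        # Clients connect to the service directly; the token audience must match this URL.
        client_url = f"{endpoint}/client/?hub={hub}"

        claims = {"aud": client_url, "exp": int(time.time()) + 3600}
        if user_id:
            claims["nameid"] = user_id

        token = jwt.encode(claims, access_key, algorithm="HS256")

        # This JSON body is what the SignalR client expects back from /negotiate.
        return {"url": client_url, "accessToken": token}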
Thanks!
Related
I am debating which is the more "correct" / better-performing solution for implementing a REST API on Azure that does basic CRUD against a database:
Create a FastAPI app and deploy it directly on an Azure App Service.
Create an Azure Function App where every endpoint is represented by an HTTP-trigger function; the code in each function is the basic CRUD functionality of the endpoint written in Python (nothing to do with FastAPI, essentially just the code that would otherwise sit inside a FastAPI route).
Both would eventually be wrapped with Azure API Management.
Which solution will have the better response time?
Will serverless be more resilient?
Thanks
If the functions use the Python runtime, then FastAPI is a very popular alternative.
I personally found the migration of Azure Functions to a FastAPI App Service to be very smooth and hassle-free.
Differences between the above two:
Function-independent changes: primarily, we need to import FastAPI, status, and CORSMiddleware in the import statements to avoid CORS issues.
In Azure Functions, CORS is enabled through the portal instead.
For more information about the other differences, such as function-dependent changes, return statements, and adding more than one function/endpoint, please see this article.
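As a hedged sketch of the kind of change described above (the route, origins, and handler body are placeholders, not taken from the article), CORS in FastAPI is added as middleware rather than configured in the portal:

    # Minimal FastAPI app with CORS enabled in code.
    from fastapi import FastAPI, status
    from fastapi.middleware.cors import CORSMiddleware

    app = FastAPI()

    # In Azure Functions this is configured in the portal; in FastAPI it is middleware.
    app.add_middleware(
        CORSMiddleware,
        allow_origins=["*"],  # placeholder: restrict to your front-end origin in practice
        allow_methods=["*"],
        allow_headers=["*"],
    )


    @app.get("/items", status_code=status.HTTP_200_OK)
    def list_items():
        # Placeholder CRUD logic; a real app would query the database here.
        return {"items": []}

Assuming the file is named main.py, you can run it locally with uvicorn main:app --reload.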
FastAPI ranks among the highest-performing Python web frameworks for building APIs, and it is being used more and more every day.
Not only in terms of the number of requests handled per second; the speed of development and its built-in data validation also make it an ideal choice for the backend.
There are a few other aspects you need to think about, such as scalability, cost, and ease of development:
If you use FastAPI on an App Service, you will need to configure the scaling rules yourself. An Azure Function on the Consumption or Premium plan will scale automatically.
FastAPI on an App Service will cost money even if you are not using it, but an Azure Function on the Consumption plan costs only when you use it, and you get 1 million executions free every month, so the cost is basically nothing if your execution count is < 1M.
Since FastAPI is an API framework, it comes built in with correct responses in case of errors, and you only handle whatever you need to. For an Azure Function you will need to code pretty much all of that yourself.
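For comparison, a hedged sketch of roughly the same endpoint as an HTTP-trigger function using the Python v2 programming model (route and response shape are assumptions); note that serialization, validation, and error responses are hand-rolled here rather than provided by the framework:

    # function_app.py - minimal HTTP-trigger endpoint (Python v2 programming model).
    import json

    import azure.functions as func

    app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)


    @app.route(route="items")
    def list_items(req: func.HttpRequest) -> func.HttpResponse:
        # Placeholder CRUD logic; unlike FastAPI, validation and error
        # handling have to be written by hand.
        return func.HttpResponse(
            json.dumps({"items": []}),
            status_code=200,
            mimetype="application/json",
        )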
I'm trying to figure out how to connect my Angular app to Azure Service Bus. The reason I'm trying to do this is to set up a real-time pub/sub solution for live auctions. I haven't really seen any start-to-finish documentation/tutorials on this with the MEAN stack, so I'm trying to piece it all together. In order to connect to my Node.js backend, what should I be using in Angular to make that connection? All the tutorials I see reference SignalR, but they are using .NET. Is there an equivalent library for Node.js, or do I need to be using something like this?
I appreciate any help/direction!
It's not entirely clear if you are trying to connect your Angular front end to Service Bus as a replacement for SignalR. If so, it isn't a good idea as it would create a serious security hole. Service Bus is primarily for communication between servers. In this scenario, if you had multiple back end node servers, you could use Service Bus to sync the data they are pushing to your clients.
You are going in a better direction with SignalR. The technology you are looking for in real-time server-browser communication is websockets. SignalR is just a .NET implementation of that standard. Once you start looking for websocket implementations on the MEAN stack, you should have a lot more success finding guides. Here are a couple of generic JS implementations just as an example: Link 1 Link 2.
Edit for comment response:
You don't want to connect Angular to Service Bus at all. Once you've exposed the keys publicly, anyone can read/write whatever they want to your bus. Instead, have Angular send the message to an HTTP function and have the function send the message to Service Bus.
The second problem with this plan is that a websocket connection, the part that pushes data back to the client, is a long-running connection with constant communication back and forth. The Consumption and Premium plans are not built for this. Trying to use websockets on those plans will run your costs up a lot higher than they need to be if you have any significant traffic. You'll need to choose a plan that has a flat monthly cost instead. At that point you could still use Functions, but it may be easier to use a traditional web app.
In this case your system would look like this:
Angular new message -> HTTP Function/Web App -> Service Bus -> Websocket Function/Web App -> Angular
If you are only running a single server, you can eliminate Service Bus completely.
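As a hedged sketch of the "HTTP Function/Web App -> Service Bus" hop above, written in Python with the azure-servicebus SDK (the queue name and the environment variable holding the connection string are placeholders):

    # Forward an incoming client message to a Service Bus queue.
    import os

    from azure.servicebus import ServiceBusClient, ServiceBusMessage

    CONNECTION_STRING = os.environ["SERVICE_BUS_CONNECTION_STRING"]  # placeholder setting name
    QUEUE_NAME = "auction-messages"  # placeholder queue name


    def forward_to_service_bus(payload: str) -> None:
        # Keep the Service Bus credentials on the server; the browser never sees them.
        with ServiceBusClient.from_connection_string(CONNECTION_STRING) as client:
            with client.get_queue_sender(queue_name=QUEUE_NAME) as sender:
                sender.send_messages(ServiceBusMessage(payload))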
The other option is to still use an HTTP function to receive new messages, but then use the SignalR Service (the managed service, not the .NET library) to handle pushing the data to the clients. This eliminates Service Bus as well.
This is what it would look like: Angular new message -> HTTP Function/Web App -> SignalR Service -> Angular
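A hedged sketch of that last hop (pushing to clients through the SignalR Service REST API instead of hosting a long-running websocket server yourself); the hub and target names are placeholders, and the token is an HS256 JWT signed with the service's access key whose audience is the request URL:

    # Broadcast a message to all clients on a hub via the Azure SignalR Service REST API.
    import time

    import jwt  # PyJWT
    import requests


    def broadcast(connection_string, hub, target, arguments):
        parts = dict(s.split("=", 1) for s in connection_string.split(";") if "=" in s)
        endpoint = parts["Endpoint"].rstrip("/")
        access_key = parts["AccessKey"]

        # REST API v1 broadcast endpoint for the given hub.
        url = f"{endpoint}/api/v1/hubs/{hub}"
        token = jwt.encode(
            {"aud": url, "exp": int(time.time()) + 300}, access_key, algorithm="HS256"
        )

        response = requests.post(
            url,
            json={"target": target, "arguments": arguments},
            headers={"Authorization": f"Bearer {token}"},
        )
        response.raise_for_status()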
Please help me understand why we say Azure Functions is a serverless compute service. It still requires the cloud to host and run it. The cloud is also made of servers, so why do we say it is serverless?
Serverless computing does not mean that servers are out of the picture. Servers are very much required, just as they have been for all these years; otherwise, where would your code run? The reason the phrase was coined is that, as a developer, you do not need to worry about which server your code runs on. In fact, you do not know which server it eventually runs on. Once your code is deployed, Azure assigns the responsibility of executing the code to the next available server. What Azure ensures, and what is ultimately important for you, is that your code will execute whenever required.
Ref: Serverless Computing with Azure Functions
Hope it makes sense :)
To get a better idea, here is how we evolved. Cloud providers are making sure we only have to worry about the business logic and nothing else.
IaaS (Infrastructure as a Service)
You get a running VM somewhere in the data centre, but you are required to maintain everything, from deployment to patching your VMs and anything running on them.
PaaS (Platform as a Service)
You are no longer required to maintain the platform, but you are still responsible for managing your servers in terms of load balancing, etc.
FaaS (Function as a Service)
Servers are abstracted away from you. You are only required to maintain your code, without worrying about what's under the hood or how to load-balance your servers. It is then the cloud provider's responsibility to package your code and run it for you. But the servers are still there.
Going by the official documentation of Azure's serverless computing service, Azure Functions can be defined as:
Azure Functions is a serverless compute service that enables you to run code on-demand without having to explicitly provision or manage infrastructure. Use Azure Functions to run a script or piece of code in response to a variety of events.
Azure Functions is an event driven, compute-on-demand experience that extends the existing Azure application platform with capabilities to implement code triggered by events occurring in virtually any Azure or 3rd party service as well as on-premises systems. Azure Functions allows developers to take action by connecting to data sources or messaging solutions, thus making it easy to process and react to events. Azure Functions scale based on demand and you pay only for the resources you consume.
Here "serverless compute service" is more of a metaphor: the end user doesn't need to manage the servers or infrastructure to run applications on Azure, and can instead spend time focusing on managing and improving the business logic.
A few more points to consider:
Serverless in Azure builds on an open-source foundation, the core of which is Azure Functions, an event-driven compute experience and open source project. Community contributions include support for new languages, integrations and deployment targets.
Azure Functions can be used on-premises, in hybrid environments such as Azure Stack, on IoT Edge devices and deployed on top of orchestrators such as Kubernetes – as well as in other clouds.
They enable faster time to market with lower infrastructure and operating costs.
There are heaps of definitions of serverless which you can easily google. But I will share my understanding anyway.
1. It does require the cloud to host it and run.
You are correct about this. But anything in the cloud requires the cloud, doesn't it? Azure, being one of the cloud providers, consists of hundreds of services that cater to the different needs people have of the cloud.
2. The cloud is also a server, so why do we say it is serverless?
This is not quite right. The cloud is different from a server. A server is a physical box sitting somewhere. With hundreds of thousands of servers all over the world, the cloud hosts all sorts of different services on those servers.
The reason we say Functions are serverless is that the infrastructure hosting a Function is abstracted away from developers. It is still deployed to servers, but Azure is responsible for all the resource management, configuration, load balancing, scaling, networking and so on. This allows developers to focus primarily on their code, not having to worry about servers.
I need to deploy two Node services to CF (each service in its own container).
The apps need to communicate. What is the recommended way to implement this communication? I can't find any guide that explains service-to-service communication in CF, and since it will be deployed to the cloud I need some best practices. Some examples would be very helpful.
This is a classic question that comes up when implementing any enterprise application integration pattern, and it comes down to what type of integration needs you have.
If an app wants synchronous communication to get a real-time response, RESTful APIs are the most popular integration style of this age. But you also need to consider that creating a huge number of APIs (which is the downside of going with a microservices-based architecture) brings a large overhead of maintaining the set and locating the correct one. An API gateway and a service discovery tool should help here. I am a novice with Bluemix, but you can surely host a Spring Cloud Eureka or Consul based service discovery on it to serve the purpose, and similarly Spring Cloud Zuul as an API gateway.
Another simple catch here is to ensure you do not build one central service as a fat single point of failure (SPOF) to cater to your whole microservices world, but rather have many such services, each catering to a contextually bounded set of microservices.
Along the same lines, if the need is for async communication, message brokers such as RabbitMQ or Kafka are the best and simplest integration style for apps to communicate. The same catch applies here as well: don't build one SPOF broker, but rather have separate broker instances, one for each bounded set of microservices, with these instances federated for wider communication.
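For example, a minimal hedged sketch in Python with pika (the broker host and queue name are placeholders) of one service publishing an event that another service consumes asynchronously:

    # Minimal async service-to-service messaging via RabbitMQ (pika).
    import pika

    BROKER_HOST = "rabbitmq.example.internal"  # placeholder broker address
    QUEUE = "orders"  # placeholder queue name


    def publish(event_body: str) -> None:
        connection = pika.BlockingConnection(pika.ConnectionParameters(host=BROKER_HOST))
        channel = connection.channel()
        channel.queue_declare(queue=QUEUE, durable=True)
        channel.basic_publish(exchange="", routing_key=QUEUE, body=event_body)
        connection.close()


    def consume() -> None:
        connection = pika.BlockingConnection(pika.ConnectionParameters(host=BROKER_HOST))
        channel = connection.channel()
        channel.queue_declare(queue=QUEUE, durable=True)

        def on_message(ch, method, properties, body):
            # Placeholder handler: process the event, then acknowledge it.
            print("received:", body)
            ch.basic_ack(delivery_tag=method.delivery_tag)

        channel.basic_consume(queue=QUEUE, on_message_callback=on_message)
        channel.start_consuming()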
Your answer will depend on what kind of communication you want between your apps.
If you're looking to deploy a microservice-based architecture pattern for your Node services, i.e. server code that performs an independent, granular business function, I would recommend getting started by reading the docs here and using the new Bluemix Developer Console.
There you will find a growing set of patterns and starters that you can use to understand and develop cloud-native apps that communicate with each other by exposing API endpoints compliant with the OpenAPI specification and auto-generating SDKs for your omnichannel client applications.
After downloading the selected starter, you can modify the code to expose an API that performs the business logic that you need. Subsequently, you can run your project locally in a container or deploy it to Bluemix using the bx dev command line tool.
After setting that up, you will have cross platform, language independent communication between your microservices and client applications.
I'm building a game for Windows Phone 8 and would like to use Windows Azure SQL Database for storing my users' data (mostly scores and rankings).
I have been reading Azure's documentation on SQL Database and found this link, which describes just the scenario I'm looking for (it's Scenario B in the picture): I want my clients (the game running on a user's Windows Phone) to get data from a SQL Server through a middle application also hosted on Windows Azure.
Reading further in the documentation (personally I think it's really messy and hard to find what you're looking for in there), I've learned that I could use Cloud Services for this middle application; however, I'm not sure whether I should use a background worker that provides an HTTP API or a worker with a Service Bus Relay (I discovered that I can use Service Bus in WP8 in this link).
I've got a few questions that I couldn't find an answer to:
1) What would be the "standard" way to go in this case?
2) If both ways are acceptable, are there other advantages to using a Service Bus other than an easier way to connect and send messages to my middle application? What are the disadvantages?
3) Is a cloud service really what I'm looking for (and not just a VM with the middle application code running in it)?
It's difficult to answer this sort of question as there are lots of considerations. I don't believe there is necessarily a 'standard way'.
The Service Bus relay service's purpose is to help traverse firewalls and NATs, which I suspect is not something that directly relates to your scenario.
The Service Bus, though, also includes a messaging capability which provides queues, topics and subscriptions to use to exchange messages between clients or client/server.
You could use the phone client to write and read messages to/from queues. You would then have a worker role hosting your application logic and accessing the database as needed.
Some of the advantages of using messaging include load levelling, which helps handle peaks in traffic (at the expense of latency); helping to separate concerns; and allowing you to accept requests from clients when the backend is down, which helps with resiliency.
In theory queues can also help you deliver messages to the client in the same fashion, by using a queue or subscription per client, but for a large number of clients this may become a management issue.
On the downside, you would have to work with what is a proprietary protocol and will need to understand the characteristics and limitations of Service Bus. You will need to manage the queues and topics over time. There will also be some increased latency, although that's typically not an issue, and, finally, you will have to implement asynchronous messaging on the client side, which has advantages but is also harder to implement.
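As a hedged illustration of the worker side draining the queue and updating the database (sketched in Python with the azure-servicebus SDK rather than the .NET worker role code you would most likely write; the queue name, setting name, and save_score helper are placeholders):

    # Worker loop: receive score messages from a Service Bus queue and persist them.
    import os

    from azure.servicebus import ServiceBusClient

    CONNECTION_STRING = os.environ["SERVICE_BUS_CONNECTION_STRING"]  # placeholder setting name
    QUEUE_NAME = "scores"  # placeholder queue name


    def save_score(payload: str) -> None:
        # Placeholder: write the score/ranking to SQL Database here.
        print("saving:", payload)


    def run_worker() -> None:
        with ServiceBusClient.from_connection_string(CONNECTION_STRING) as client:
            with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
                for message in receiver:  # blocks, yielding messages as they arrive
                    save_score(str(message))
                    receiver.complete_message(message)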
I would imagine that many architectures follow the Web API route by using a web role cloud service that exposes the API. The web role can then perform any business logic and connect to the database in the background.
A third option, which you didn't mention, is to use Windows Azure Mobile Services and implement your business logic as a service API there.