Azure APIM ETIMEDOUT error intermittently

I have two APIM instances (BFF and Enterprise). Both have Azure Functions as their backends. The Azure Functions run on the Premium EP1 plan with two instances running at all times.
The BFF APIM receives calls from an iOS application and routes them to a NestJS GraphQL server, which runs as an Azure Function. That function calls the Enterprise APIM, which has its own .NET Core based Azure Function behind it.
In the BFF NestJS GraphQL server, we have a middleware function that calls the Enterprise APIM before running the 'real' query. We started receiving 'ETIMEDOUT' errors in the last two weeks, but the errors have really picked up in the last couple of days and have rendered the API unusable. The error typically reports ETIMEDOUT against the Enterprise APIM on port 443.
I have already checked SNAT, and scaling up the maximum burst instances of the Azure Function made no difference. Nothing has changed in terms of system load; it has been consistent throughout the last 8 weeks. Any ideas what we can do from a diagnostics perspective?
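For reference, a minimal sketch of what the middleware's outbound call could look like with connection reuse and an explicit timeout, assuming axios as the HTTP client and a placeholder Enterprise APIM URL; one thing worth checking when diagnosing intermittent ETIMEDOUT is whether each request opens a fresh TLS connection to APIM, which keeps SNAT ports under pressure even when the raw port count looks acceptable:

```typescript
// Hypothetical sketch: reuse outbound connections from the NestJS middleware
// so each GraphQL request does not open a fresh TCP/TLS connection to APIM.
import axios from "axios";
import * as https from "https";

// Keep-alive agent: sockets are pooled and reused, which reduces SNAT port usage.
const keepAliveAgent = new https.Agent({ keepAlive: true, maxSockets: 50 });

// Placeholder URL and key; substitute the real Enterprise APIM endpoint and header.
const enterpriseApim = axios.create({
  baseURL: "https://enterprise-apim.example.net",
  httpsAgent: keepAliveAgent,
  timeout: 10_000, // fail fast with a clear error instead of hanging until ETIMEDOUT
  headers: { "Ocp-Apim-Subscription-Key": process.env.APIM_SUBSCRIPTION_KEY ?? "" },
});

export async function callEnterpriseApi(path: string): Promise<unknown> {
  const response = await enterpriseApim.get(path);
  return response.data;
}
```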

Related

How to host in Azure?

I'm looking for advice on the best and cheapest way to host in Azure. I've never used it before, so I'm finding it fairly confusing, if I'm honest. Here's where I'm at:
I have a Vue.js front end that calls an Azure Function back end (an API in .NET Core, built in Visual Studio), which is connected to a MySQL database (managed through MySQL Workbench and currently running as a Windows service) and ultimately returns all data back to my front end.
I set up a free trial tonight, created a resource group, set up an Azure Function, and pushed my API up to it. I then created an Azure Database for MySQL instance, managed to connect my database up to it (again via MySQL Workbench), and connected that to my API with credentials.
I now need to host the Vue.js app and connect it to my Azure Function, but how are the endpoints exposed?
Also, I have registered a domain, and I assume I'll have to connect it to the Vue.js app once it's hosted. Any tips?
I need to keep this as cheap as possible.
One of the cheapest ways to host your website is Azure Blob Storage. You can set it up for static website hosting; I've done this a couple of times for Angular applications. See the static website section of the Blob Storage documentation.
As for exposing your API: make sure your Azure Function has an HTTP trigger and the correct authorization level. You can obtain the function URL as described in the Azure Functions HTTP trigger documentation.
Please let me know if you need anything else :).
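To illustrate how the endpoint ends up exposed once the HTTP trigger is in place: the function is reachable at https://<app-name>.azurewebsites.net/api/<function-name>, and with authorization level "function" the caller must supply a function key. A minimal sketch of calling it from the Vue.js front end follows; the app name, route, and key are placeholders:

```typescript
// Hypothetical call from the Vue.js front end to an HTTP-triggered Azure Function.
// Replace the URL and key with the values shown for your function in the portal.
const FUNCTION_URL = "https://my-api-app.azurewebsites.net/api/GetProducts";
const FUNCTION_KEY = "<function-key-from-portal>"; // not needed if authLevel is "anonymous"

export async function fetchProducts(): Promise<unknown> {
  const response = await fetch(`${FUNCTION_URL}?code=${FUNCTION_KEY}`);
  if (!response.ok) {
    throw new Error(`Function call failed with HTTP ${response.status}`);
  }
  return response.json();
}
```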

Serverless SignalR Service - What do Azure Functions provide?

Say I'm working with an Azure SignalR Service run in serverless mode to implement a chat application. I'm wondering why would we use Azure Functions for this. What do they provide us? Couldn't we just build the connection with the SignalR Service on our own directly? Or say, after we negotiate an access token with an Azure Function, why can't we just use the connection we build with that token to broadcast messages, rather than relying on an additional Azure Function to broadcast messages?
In the past, people used to couple SignalR into their own Web API or MVC project. The problem was that when there was a need to scale, it wasn't possible to scale things separately. Also, when it comes to SignalR, things like sticky sessions are hard to deal with. This is why they released Azure SignalR Service, a managed service that implements the backplane pattern for you.
More info:
https://learn.microsoft.com/en-us/aspnet/signalr/overview/performance/scaleout-in-signalr
The last piece is to separate the real-time, bi-directional communication from the Web API / MVC project. They added Azure Functions because they are lightweight and easier to scale compared with a Web API / MVC app.
As for why you can't just use the connection built with that token to broadcast messages, rather than relying on an additional Azure Function: it's because the function is not running 100% of the time. In serverless mode there is no always-on server holding the client connections; the SignalR Service manages them, and functions are only invoked to negotiate connections and to send messages.
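To make that division of labour concrete, here is a minimal sketch (not part of the original answer) of the two functions typically involved in serverless mode, using the JavaScript/TypeScript Azure Functions model with function.json bindings; the hub name, binding names, and message target are placeholders:

```typescript
// Hypothetical sketch of the two functions usually needed with serverless SignalR
// (Azure Functions "function.json" programming model).
import { AzureFunction, Context, HttpRequest } from "@azure/functions";

// negotiate: paired in function.json with a "signalRConnectionInfo" input binding,
// e.g. { "type": "signalRConnectionInfo", "name": "connectionInfo", "hubName": "chat", "direction": "in" }.
// The binding supplies the service URL and an access token; the client then
// connects directly to the SignalR Service, not to this function.
export const negotiate: AzureFunction = async (
  context: Context,
  _req: HttpRequest,
  connectionInfo: unknown
): Promise<void> => {
  context.res = { body: connectionInfo };
};

// broadcast: paired with a "signalR" output binding named "signalRMessages".
// The function only runs when a message needs to be sent; the service fans it
// out to all connected clients.
export const broadcast: AzureFunction = async (
  context: Context,
  req: HttpRequest
): Promise<void> => {
  context.bindings.signalRMessages = [
    { target: "newMessage", arguments: [req.body] },
  ];
};
```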

Single entry point for list of APIs hosted in Azure App Service

I have 7 APIs (API Apps) hosted in Azure App Service, and a client app (an SPA) that currently consumes all of these APIs directly. For better maintainability on the client side, I would like to have a single entry point for all of these APIs, so the APIs would not be exposed to the client directly.
With the single entry point I can validate the authenticity of the client call (AD B2C token) and then route the request to the respective API. In this case, the response from any API would return through the entry point to the client app.
I have a couple of thoughts on how to implement this flow:
1. Azure API Management would be a fit for my use case; however, the cost of the service looks a little high to me.
2. Create one more API that sits in front of all the existing APIs and acts as the entry point for each request coming from the web app client. Here, a high-end App Service Plan tier would have to be chosen to process thousands of requests per second; I am expecting about one thousand requests per second.
As I have just started with Azure, I would like your suggestions on achieving this use case. Please let me know your thoughts on a suitable solution.
P.S. All the APIs are .NET Core Web API (3.1).
Since you've already crossed out API Management, I would say creating an API "traffic router" is your best solution. You could also use something like Application Gateway with URL-based routing to direct requests to a backend pool, but you'll have to check the pricing to see if that works for you.
If it's not too difficult, however, I would place your 7 APIs into one API project and just have separate controllers for each API's endpoints. I certainly understand if that's not possible. Should that be the case, I would place all the API App Services inside a VNet and restrict access so that only app services on that VNet can talk to them. That way, your "traffic router" API app can do the authorization and still talk to the other 7 API apps. A rough sketch of such a router is below.
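A minimal sketch of what that "traffic router" could look like, shown here as a Node/Express app with the http-proxy-middleware package rather than another .NET Core API; the hostnames, path prefixes, and the token check are placeholders:

```typescript
// Hypothetical single entry point: validates the caller, then proxies
// each path prefix to the matching backend API App.
import express from "express";
import { createProxyMiddleware } from "http-proxy-middleware";

const app = express();

// Placeholder auth check; in practice, validate the AD B2C JWT here.
app.use((req, res, next) => {
  const token = req.headers.authorization?.replace("Bearer ", "");
  if (!token) {
    return res.status(401).send("Missing token");
  }
  next();
});

// Route table: path prefix -> backend App Service (placeholder hostnames).
const routes: Record<string, string> = {
  "/orders": "https://orders-api.azurewebsites.net",
  "/customers": "https://customers-api.azurewebsites.net",
  // ...one entry per API App
};

for (const [prefix, target] of Object.entries(routes)) {
  app.use(prefix, createProxyMiddleware({ target, changeOrigin: true }));
}

app.listen(Number(process.env.PORT ?? 8080));
```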

Azure App Service gives another response on different instance of app service

I am setting up a web app on Azure for which I am using an azure app service. At the moment, the app service scales down to 1 instance at night, and scales up again in the morning.
When a request is sent to the app service while there are 2 instances, the response depends on which instance handles the request. I would expect a 200, but half of the time I get a 500 HTTP response.
I figured out that it depends on the instance because when I use the ARRAffinity cookie (which lets you target a specific instance of the app service), I can consistently reproduce 200 responses on one machine and 500 responses on the other.
WEBSITE_LOCALCACHE_ENABLED is false, so the app service should serve the same code from a single network share, if I am not mistaken.
Because the app behaves normally half of the time, I think this is not a code problem but an infrastructure problem on Azure.
The web app is written in .NET Core 2.2 and runs on Windows (64-bit).
It might be a problem with the instance, or it might be a problem with the code. When you see the issue, try an Advanced Application Restart from the portal and check whether that helps.
Also, while the issue is occurring, open the Diagnose and solve problems blade of the App Service and look under the Availability and Performance section; the log information there will give you a better idea of what is going on.
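As a small aid for the reproduction technique already mentioned in the question (pinning requests to one instance via the ARRAffinity cookie), here is a hedged sketch of replaying the same request against each instance; the URL and cookie values are placeholders copied from a browser session:

```typescript
// Hypothetical reproduction script: replay the same request against each
// instance by supplying the ARRAffinity cookie captured from a browser session.
const SITE_URL = "https://my-web-app.azurewebsites.net/api/health"; // placeholder
const AFFINITY_COOKIES = [
  "ARRAffinity=<value-seen-on-instance-1>", // placeholders: copy from the Set-Cookie header
  "ARRAffinity=<value-seen-on-instance-2>",
];

async function probeInstances(): Promise<void> {
  for (const cookie of AFFINITY_COOKIES) {
    const response = await fetch(SITE_URL, { headers: { Cookie: cookie } });
    console.log(`${cookie.slice(0, 30)}... -> HTTP ${response.status}`);
  }
}

probeInstances();
```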

On how many instances is my Azure WebApi running

I am trying to implement SignalR hubs on my REST service (ASP.NET Web API) hosted on Azure. I've been reading some common material about SignalR and came across the fact that it is server-bound (you can check here), which means that in order to scale it across multiple server instances I have to do some additional work. So I started to ask myself: how many instances of my REST service are currently running on Azure, and how do I know that?
So, what I did was navigate to the Azure portal and open my service > Process explorer.
Does that mean that my web app scales automatically and that I currently have 2 instances of my Web API running? I think it clearly says that there is currently only one instance of it running 2 processes, but how do I know if it will scale at some point in the future?
No, your screenshot shows your app's process and Kudu, which is a management interface you can access at https://yourappname.scm.azurewebsites.net.
You can see the instance count in your app's App Service Plan. On the Free or Shared tiers there can only be one instance. On Basic/Standard/Premium it is one by default, and if you haven't set up auto-scale, it won't scale beyond 1 instance unless you tell it to.
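A small, hedged way to confirm at runtime which instance served a given request: App Service sets a WEBSITE_INSTANCE_ID environment variable that differs per instance, so echoing it from a diagnostic endpoint shows how many distinct instances are answering. The sketch below uses Express for brevity rather than ASP.NET Web API, and the route name is a placeholder:

```typescript
// Hypothetical diagnostic endpoint: return the App Service instance id so
// repeated calls reveal how many distinct instances are serving traffic.
import express from "express";

const app = express();

app.get("/whoami", (_req, res) => {
  res.json({
    // WEBSITE_INSTANCE_ID is set by Azure App Service; locally it is undefined.
    instanceId: process.env.WEBSITE_INSTANCE_ID ?? "local",
    servedAt: new Date().toISOString(),
  });
});

app.listen(Number(process.env.PORT ?? 3000));
```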
