Azure B2C Performance Metrics - azure-ad-b2c

Is it possible to track page load times for each user journey? What other performance metrics are available for Azure B2C that we can also plug in to Azure Monitor? Can we track the total execution time of a user journey (at least for the steps where we do not wait for user input)?

I came across github.com/yoelhor/aadb2c-load-test, which might help with Azure AD B2C load testing.
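Not an authoritative answer, but one approach worth noting: if the custom policies are instrumented with Application Insights (the JourneyInsights element under UserJourneyBehaviors), journey events land in the customEvents table, and rough journey durations can be pulled into your own monitoring via the Application Insights REST API. The sketch below assumes that instrumentation is in place; the app id, API key, and the customDimensions field used for grouping are placeholders.

```python
# Minimal sketch: query Application Insights for B2C journey telemetry and
# compute a rough per-journey duration. Assumes the B2C custom policy already
# emits events to this Application Insights resource.
import requests

APP_ID = "<application-insights-app-id>"    # from the API Access blade (placeholder)
API_KEY = "<application-insights-api-key>"  # placeholder

# Group journey telemetry by correlation id and compute the span between the
# first and last event of each journey (field name is an assumption).
KUSTO = """
customEvents
| where timestamp > ago(1d)
| extend CorrelationId = tostring(customDimensions["CorrelationId"])
| summarize journeyStart = min(timestamp), journeyEnd = max(timestamp) by CorrelationId
| extend journeyDurationSec = datetime_diff("second", journeyEnd, journeyStart)
"""

resp = requests.get(
    f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query",
    headers={"x-api-key": API_KEY},
    params={"query": KUSTO},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["tables"][0]["rows"]:
    # Columns follow the query: CorrelationId, journeyStart, journeyEnd, journeyDurationSec
    print(row)
```

The same query (or the underlying metrics) can also be wired into Azure Monitor alerts, since Application Insights resources surface their data there.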

Related

Azure User - End Active Session

As an Azure admin, I created an Azure AD user: User1.
User1 is logged into the Azure Portal.
As an admin, I want to end User1's active portal session after a specific time.
How do I do that?
I have tried:
Azure AD Conditional Access - this needs Azure AD Premium P1, which seems costly since it is priced per user. I have a large number of users for whom I need to set session timeouts and kill active sessions regularly, which feels like a lot of cost for a simple task. Also, 1 hour is the minimum time that can be set here; it cannot be set lower.
Conditional Access cost details: https://azure.microsoft.com/en-in/pricing/details/active-directory/
Let me know if you know of any other method, or if my calculation for Azure AD Premium P1 is not correct.
Your calculation for Azure Active Directory Premium P1 is correct. Previously this could be done using PowerShell, as described in this link, but that approach is not recommended and is going to be deprecated.
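For what it's worth, a minimal sketch of the Graph-based alternative in Python: the revokeSignInSessions action invalidates the user's refresh tokens so they must sign in again. The tenant/client ids, secret, and user id below are placeholders, and the app registration is assumed to have an admin-consented application permission such as User.ReadWrite.All.

```python
# Sketch: revoke a user's sign-in sessions via Microsoft Graph.
import msal
import requests

TENANT_ID = "<tenant-id>"            # placeholder
CLIENT_ID = "<app-client-id>"        # placeholder
CLIENT_SECRET = "<app-client-secret>"  # placeholder
USER_ID = "user1@contoso.com"        # object id or UPN (placeholder)

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

# Invalidate the user's refresh tokens, forcing re-authentication.
resp = requests.post(
    f"https://graph.microsoft.com/v1.0/users/{USER_ID}/revokeSignInSessions",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=30,
)
resp.raise_for_status()
print("Revoke request accepted, HTTP", resp.status_code)
```

Note that this revokes on demand rather than after a specific time; existing access tokens remain valid until they expire (typically up to about an hour), and scheduling the call (for example from a timer-triggered function or an Automation runbook) is left to you.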

Accessing a mailbox with the new Graph API how to estimate cost?

I need to read a dedicated mailbox: fetch new messages and their attachments. The preferred way to do this now seems to be the newer Graph API, which requires setting up an Azure application. Forgive the newbie question, but how can I get an idea of the expected monthly cost?
Taking for granted that you already have an Office 365 (Exchange Online) subscription and an Azure AD tenant, neither the AAD application registration nor the Graph API requests will cost you anything extra.
As for the application itself, the cost will depend on the Azure service you choose to deploy it to. You've used the "azure-functions" tag in your question, so I assume that's what you are going to use. It's quite hard to estimate an Azure Function's cost before actually running it (how would you know the resource consumption of a piece of software that doesn't exist yet?), so I suggest you proceed this way:
Create a new Function App, be sure to select "Consumption Plan" as its hosting plan;
Go to your app -> Function app settings and set the "Daily Usage Quota (GB-Sec)" to 12,900 (12,900 GB-s/day × 31 days ≈ 399,900 GB-s). This way you'll ensure your app does not exceed the 400,000 GB-s of execution time included for free in your subscription;
Deploy your application and have fun with the Graph API for free;
Enable Application Insights integration for your function app and monitor the Execution Count and Function Execution Units metrics to get an idea of your function's approximate consumption.
P.S.: Please keep in mind that other Azure resources you use besides the function itself (storage account, Application Insights, outbound traffic, etc.) could result in some charges, though I doubt they will exceed a couple of bucks monthly unless you store terabytes of data as part of your app logic.
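To complement the cost discussion, here is a hedged sketch of the mailbox-reading part itself, assuming an app registration with the admin-consented Mail.Read application permission (ideally scoped to the one mailbox via an application access policy). The ids and mailbox address are placeholders.

```python
# Sketch: list unread messages in a dedicated mailbox and their attachments.
import msal
import requests

TENANT_ID, CLIENT_ID, CLIENT_SECRET = "<tenant-id>", "<client-id>", "<client-secret>"  # placeholders
MAILBOX = "shared-mailbox@contoso.com"   # placeholder
GRAPH = "https://graph.microsoft.com/v1.0"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

# List unread messages in the mailbox.
msgs = requests.get(
    f"{GRAPH}/users/{MAILBOX}/messages",
    headers=headers,
    params={"$filter": "isRead eq false", "$select": "id,subject,hasAttachments"},
    timeout=30,
).json()["value"]

for msg in msgs:
    print(msg["subject"])
    if msg["hasAttachments"]:
        # Fetch the attachments of each message separately.
        atts = requests.get(
            f"{GRAPH}/users/{MAILBOX}/messages/{msg['id']}/attachments",
            headers=headers,
            timeout=30,
        ).json()["value"]
        for att in atts:
            print("  attachment:", att["name"])
```

Running something like this on a timer-triggered Consumption-plan function is exactly the scenario the free monthly execution grant described above is meant for.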

Ingest more than 30k users details from Azure Active Directory

I am facing an issue iterating over 39k Azure AD users in Azure Active Directory.
I am able to get the AD users from the Microsoft Graph API page by page.
Because the Graph API returns results page by page, our scaling is limited by the number of records in a single page.
Thus, it takes a long time (more than an hour) to process the data for more than 10k users.
I want to know if there is another way to fetch the user pages in parallel and process those batches of data in parallel.
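Not a definitive answer, but one common pattern: the @odata.nextLink pagination of /users is inherently sequential, so instead of parallelizing the fetches you can overlap fetching with processing by handing each page to a worker pool, and request the maximum page size ($top=999 for /users). A rough Python sketch, where process_page() and the token acquisition are placeholders:

```python
# Sketch: fetch Graph user pages sequentially, process them in parallel.
from concurrent.futures import ThreadPoolExecutor

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<access-token>"   # acquired elsewhere via MSAL (placeholder)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

def process_page(users):
    # Placeholder for your own processing (DB writes, transformations, ...).
    print(f"processing {len(users)} users")

url = f"{GRAPH}/users?$top=999&$select=id,displayName,userPrincipalName"
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = []
    while url:
        page = requests.get(url, headers=headers, timeout=30).json()
        futures.append(pool.submit(process_page, page["value"]))  # overlap work with fetching
        url = page.get("@odata.nextLink")                         # next page, or None when done
    for f in futures:
        f.result()   # surface any processing errors
```

For repeated ingestions, a delta query (/users/delta) can also cut the volume down to what has changed since the last run instead of re-reading all 39k users every time.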

Increasing the data retention for activity logs (Audit and Sign-ins) in Azure Active Directory

In the Azure Portal, under Azure Active Directory, I am looking for a way to persist the Audit and Sign-in activity data for one year or longer. Azure AD Premium P1/P2 seems to allow a maximum of only 30 days. I am searching for a method, preferably within the Azure ecosystem, to store this data longer. In my attempts to Google a solution, I found the ability to export Azure Activity Log data to general-purpose storage, but I do not see that option from within Azure Active Directory.
Is the only option to create a script to move this data to a more permanent location, or is there a way to extend the data retention for these logs within Azure?
I'm new to all things Azure, so if I am missing any obvious things, please inform me.
For now, AAD doesn't support extending the data retention for audit logs within Azure Active Directory.
Depending on your license, Azure Active Directory stores activity reports for the following durations:
Report            Azure AD Free   Azure AD Premium P1   Azure AD Premium P2
Directory Audit   7 days          30 days               30 days
Sign-in Activity  7 days          30 days               30 days
If you need data for a duration longer than 30 days, you can pull the data programmatically using the reporting API and store it on your side. Alternatively, you can integrate the audit logs into your SIEM system.
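As a rough illustration of the reporting-API option, the sketch below pages through the Microsoft Graph sign-in log (auditLogs/signIns, which needs an admin-consented AuditLog.Read.All permission and a Premium tenant) and appends the events to a local JSON-lines file; swapping in blob storage or a SIEM forwarder is up to you. The token acquisition is assumed to happen elsewhere.

```python
# Sketch: archive Azure AD sign-in events beyond the built-in retention window.
import json

import requests

ACCESS_TOKEN = "<access-token>"   # acquired via MSAL client credentials (placeholder)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

url = "https://graph.microsoft.com/v1.0/auditLogs/signIns"
with open("signins.jsonl", "a", encoding="utf-8") as out:
    while url:
        page = requests.get(url, headers=headers, timeout=60).json()
        for event in page.get("value", []):
            out.write(json.dumps(event) + "\n")    # persist each sign-in event
        url = page.get("@odata.nextLink")          # follow pagination until exhausted
```

The same pattern works for /auditLogs/directoryAudits; run it on a schedule more frequent than the retention window so nothing ages out between runs.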
Hope this helps!

Questions about Azure resource management portal

I've recently been playing around with Bing's Image Search API; however, I have a concern I hope to resolve.
It has to do with the limit on the number of API requests allowed per month. After doing some reading, it seems that if I exceed this limit, my Azure account is billed based on the number of API calls over the limit. Is it possible to set up some kind of alert through the Azure management portal that stops the API from processing any more calls once a specific threshold has been passed?
If anyone has any experience using the Search API and can enlighten me, that would be great.
Try metrics monitoring: go to the service within the Azure Portal, scroll down to Monitoring -> Metrics, and then click Add Metric Alert.
You can create an alert based on the number of successful calls or total calls, and the alert can notify you via e-mail. Additionally, if you want to take action automatically after reaching the threshold, you can use webhooks to call out to a web application, or an Azure Automation runbook to run PowerShell scripts or other code to prevent overuse. You can also use Logic Apps for that. Check the following link for further details and examples at the end of the page:
https://learn.microsoft.com/en-us/azure/monitoring-and-diagnostics/insights-webhooks-alerts
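To make the webhook idea concrete, here is a hedged sketch of an HTTP-triggered Azure Function (Python) that a metric alert could call once the threshold is crossed. The payload handling and the disable_search_key() helper are placeholders; how you actually "stop" the API (regenerating the key, flipping a feature flag your app checks before calling Bing, etc.) is up to you.

```python
# Sketch: webhook receiver for a metric alert that cuts off further search calls.
import json
import logging

import azure.functions as func

def disable_search_key() -> None:
    # Placeholder: e.g. regenerate/disable the search key via the management API,
    # or set a flag your application consults before calling the Bing endpoint.
    logging.warning("Search key disabled due to quota alert")

def main(req: func.HttpRequest) -> func.HttpResponse:
    payload = req.get_json()                            # alert payload from Azure Monitor
    logging.info("Alert payload: %s", json.dumps(payload))
    disable_search_key()
    return func.HttpResponse("alert handled", status_code=200)
```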
