Mock out external services when E2E testing with Playwright - jestjs

I want to write an E2E test of our signup flow using playwright. Part of our signup flow involves authentication via the Firebase SDK/API. We have a file called FirebaseClient that is responsible for this.
I'm not testing external services, so I want to mock out the successful response that the client will return.
Is it possible to create a version of FirebaseClient that is run instead of the original only during E2E tests?
There are several flows throughout our application that interact with external services, so signup isn't the only place where this is important.
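To make it concrete, the kind of stub I'm imagining works at the network level rather than swapping the module itself: Playwright's page.route can intercept the Firebase Auth REST calls the SDK makes and fulfill them with a canned success payload. A rough sketch (the URL pattern, selectors, route paths, and response fields below are placeholders for illustration):

```ts
import { test, expect } from "@playwright/test";

test("signup succeeds with mocked Firebase auth", async ({ page }) => {
  // Intercept the Firebase Auth REST calls the SDK makes under the hood
  // and return a canned success payload (all field values are made up).
  await page.route("**/identitytoolkit.googleapis.com/**", (route) =>
    route.fulfill({
      status: 200,
      contentType: "application/json",
      body: JSON.stringify({
        idToken: "fake-id-token",
        refreshToken: "fake-refresh-token",
        expiresIn: "3600",
        localId: "fake-user-id",
        email: "test@example.com",
      }),
    })
  );

  // Placeholder selectors and paths; assumes baseURL is set in the Playwright config.
  await page.goto("/signup");
  await page.fill('input[name="email"]', "test@example.com");
  await page.fill('input[name="password"]', "correct horse battery staple");
  await page.click('button[type="submit"]');
  await expect(page).toHaveURL(/welcome/);
});
```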

Related

How to use an Azure API connector to register a user through a sign-in/sign-up user flow from an ASP.NET Core Web API?

I want to execute the sign-up user flow in Azure B2C from an ASP.NET Core Web API instead of executing it from our frontend app. To achieve this, I found an approach that uses an API connector.
The API connector documentation says:
You can use API connectors to integrate your sign-up user flows with REST APIs to customize the sign-up experience and integrate with external systems
I assume from this explanation that I can execute a user flow defined in Azure B2C for sign-in/sign-up from a Web API as an HTTP request. What is not clear is how the API connector actually works. While we are still in the development phase and want to execute the API connector (and execute the user flow through it), what type of HTTP request are we supposed to send? Will it contain the ID/password we defined when creating the API connector, along with the user details that need to be passed to the sign-up user flow? What will the URL we added when creating the API connector be used for?
Unfortunately, there is not much information available on API connectors apart from the official Azure documentation, which I found a bit confusing.
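For reference, my current understanding of what the connector-facing endpoint would look like is roughly the following; the path, credentials, and logging are my own placeholders based on the docs, not verified behavior:

```ts
import express from "express";

// Sketch of a connector endpoint; endpoint path, credential values, and
// response shape are assumptions drawn from the API connector docs.
const app = express();
app.use(express.json());

app.post("/api/signup-connector", (req, res) => {
  // B2C is documented to call the configured URL with Basic auth built from
  // the username/password entered when the API connector was created.
  const expected =
    "Basic " + Buffer.from("connector-user:connector-password").toString("base64");
  if (req.headers.authorization !== expected) {
    return res.status(401).end();
  }

  // The request body carries the attributes collected during sign-up.
  console.log("claims sent by B2C:", req.body);

  // Returning a Continue action lets the user flow proceed.
  return res.json({ version: "1.0.0", action: "Continue" });
});

app.listen(3000);
```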

Authenticating a Vue 2 Azure Static Web App Locally Against Auth0

I am researching the feasibility of porting an existing Vue 2 app to be an Azure Static Web App (SWA). A requirement is the ability to run the Vue app locally and authenticate against our Auth0 tenant to retrieve an access/bearer token to send along with our HTTP requests.
It appears that a SWA can utilize custom authentication (Auth0), and I was able to complete that successfully by following this article. However, I'm not seeing any information about capturing the access token. There is an /.auth/me/ endpoint which has user information, but it does not contain the access token.
I also looked into the Azure Static Web App Emulator which allows for defining an identity profile when running locally, but I'm not seeing a way to specify an access token here either.
Is it possible at the moment with a SWA to obtain an access token using a custom auth provider when running locally and when published live?
Managed Authentication in Azure is really only useful for fairly simple use cases. I think you're going to want to implement your security directly inside your Vue application.
https://auth0.com/docs/quickstart/spa/vuejs/01-login
You mentioned needing an access token but didn't say where it comes from or what you're doing with it. Are you trying to call an Auth0-secured API?
https://auth0.com/docs/quickstart/spa/vuejs/02-calling-an-api
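For example, a minimal sketch of handling login in the SPA with auth0-spa-js and then fetching an access token for API calls; the domain, client ID, audience, and API URL are placeholders, and the option names assume auth0-spa-js v2:

```ts
import { createAuth0Client } from "@auth0/auth0-spa-js";

// Placeholders: substitute your Auth0 tenant domain, SPA client ID, and API audience.
const auth0 = await createAuth0Client({
  domain: "your-tenant.us.auth0.com",
  clientId: "YOUR_SPA_CLIENT_ID",
  authorizationParams: {
    redirect_uri: window.location.origin,
    audience: "https://your-api-identifier",
  },
});

// Finish the login when Auth0 redirects back with ?code=...&state=...
if (window.location.search.includes("code=") && window.location.search.includes("state=")) {
  await auth0.handleRedirectCallback();
  window.history.replaceState({}, document.title, window.location.pathname);
}

if (await auth0.isAuthenticated()) {
  // Access token for the API, sent as a bearer token on requests.
  const accessToken = await auth0.getTokenSilently();
  await fetch("https://api.example.com/data", {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
} else {
  await auth0.loginWithRedirect();
}
```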

How to develop and test B2C token enrichment with an API connector locally?

I am creating a React SPA that will connect to a dotnet Web API backend. I want to use B2C to handle auth.
I want to have endpoints in the API protected based on the claims in the B2C token, and I also want to check claims in the frontend.
Following along with token enrichment docs - https://learn.microsoft.com/en-us/azure/active-directory-b2c/add-api-connector-token-enrichment?pivots=b2c-user-flow
I can use an API connector to enrich the token with additional claims before it is sent back to the client, presumably with a function to provide values from my database in a users table.
If my database will also be hosted in azure, how would I develop and test this locally?
Is this flow the best way to achieve the desired behaviour? If not what are alternatives?
I've used two methods when testing 'locally'.
Create an Azure Function echo API service in 'cold' mode to reduce cost. This will allow you to easily ping it, and it will echo the content back to you. You can also add test cases pretty easily. This isn't really 'local', but it's low cost and allows you to collaborate with multiple team members.
Use a free web service like Post Test Server. With a couple of clicks you are off and testing with lower effort. The problem with this solution is that it is public, so do not use any proprietary or confidential information. I've created very complex dot notation in the body of a REST API technical profile via custom policies, and this was effective for testing data formatting and internal business logic.
A last option could be creating a web service locally on your device. I haven't done this end to end, but you would follow the same process as hosting an application locally on localhost.
You can deploy an Azure database solely for development purposes. Or you can just mock the API response so that you can do some basic e2e testing: SPA -> B2C -> API.
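For example, a mocked enrichment response can be as small as an HTTP-triggered Azure Function that always returns a Continue action; this sketch assumes the v3 Node.js programming model, and the claim it returns is purely illustrative:

```ts
import { AzureFunction, Context, HttpRequest } from "@azure/functions";

// Stand-in for the real enrichment API while testing the flow. The returned
// claim is illustrative only.
const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> {
  context.log("claims sent by B2C:", req.body);

  context.res = {
    status: 200,
    body: {
      version: "1.0.0",
      action: "Continue",
      // Replace with whatever claim(s) your user flow is configured to expect.
      city: "Springfield",
    },
  };
};

export default httpTrigger;
```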

Google API Authentication: are there alternatives to service account keys?

I'm seeking your advice to piece together a mechanism that would facilitate authentication to Dialogflow ES and CX to allow running experiments on multiple agents (projects) from our workbench application in a smooth and error-proof manner. The workbench is an internal tool written in TypeScript (using the dialogflow RPC node module) running outside of GCP. Our users analyze the results of sending the same inputs (utterances) to multiple agents, usually going back and forth between them in the course of their work.
With proper IAM configuration, we have been able to detect intents successfully by doing a gcloud auth application-default login. However, we haven't found a way to update the quota project programmatically or to specify the quota project through the google.cloud.dialogflow library, so we haven't been able to fix the "switch easily between projects" part. It looks like editing the quota_project_id property in application_default_credentials.json once authenticated (which is what gcloud auth application-default set-quota-project <project> does) is the way to go, but we would have preferred to do this programmatically.
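For illustration, the file-editing workaround we would rather avoid looks roughly like this (assuming the default ADC location on Linux/macOS; the helper names are ours):

```ts
import { promises as fs } from "fs";
import * as os from "os";
import * as path from "path";
import { SessionsClient } from "@google-cloud/dialogflow";

// Rewrite quota_project_id in the ADC file before constructing a client.
async function setQuotaProject(projectId: string): Promise<void> {
  const adcPath = path.join(
    os.homedir(),
    ".config",
    "gcloud",
    "application_default_credentials.json"
  );
  const adc = JSON.parse(await fs.readFile(adcPath, "utf8"));
  adc.quota_project_id = projectId;
  await fs.writeFile(adcPath, JSON.stringify(adc, null, 2));
}

async function detectIntent(projectId: string, sessionId: string, text: string) {
  await setQuotaProject(projectId);
  // A fresh client picks up ADC, including the quota project we just wrote.
  const client = new SessionsClient();
  const session = client.projectAgentSessionPath(projectId, sessionId);
  const [response] = await client.detectIntent({
    session,
    queryInput: { text: { text, languageCode: "en" } },
  });
  return response.queryResult;
}
```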
Using service account keys (JSON) works as expected, and that's what we have been doing so far; it's also what we do in our CI/CD pipeline and in our agents running in production. But we aim to reduce the number of service account credential files that we share with individuals. Ideally, speech/data scientists would use their own end-user credentials to perform experiments.
We are looking for alternatives so that users would authenticate once with gcloud auth application-default login and the workbench would handle the rest behind the scenes, taking only the project ID against which the experiment must be run as an additional argument. This would eliminate the need to pause the experiment to update the quota project (using set-quota-project), or to update the GOOGLE_APPLICATION_CREDENTIALS variable when using service account keys.
Another thing we tried was service account impersonation; unfortunately, this does not seem to be supported by the google.cloud.dialogflow library, so even though we were able to successfully submit requests (with curl/Postman) to the Dialogflow RESTful API using impersonation, we haven't been able to leverage this mechanism in our code.
Has anyone been able to overcome a similar challenge? Is there any other authentication mechanism that could help us achieve this goal?

How can I safely pass an access token generated from Google OAuth to a Node.js REST API?

I am creating two applications:
1. A Chrome extension for Gmail.
2. Its iOS version.
Now, since both applications have the same behavior and use the same Google APIs extensively, I decided to create a single project in Google Cloud Platform for both. When creating credentials, what will my application type be? I see both 'iOS' and 'Chrome App' under application type. Should I generate two client IDs, one for the Chrome app and one for the iOS app?
To use a single client ID, I also tried creating a Node REST API (I created a new project and set the application type to 'web application' in Google Cloud Platform) that would be used by both of my applications to make requests to Google APIs. But the authorization process involves setting a callback URL to get the authorization code and later using that code to get the access token. I guess this is not feasible for a REST API. Where should I keep the authorization part? In the application itself, and then send the access token to my REST API? Is that possible?
I am quite confused about how I should start. Could anyone please suggest a better way to do this?
I generated two client IDs, one for iOS and one for the web. Isolating the two applications is a good way to start. Both apps generate their own ID token, pass it to the Node REST API, and later use that token to make requests to Google APIs.
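For example, the API side can verify an ID token coming from either app with google-auth-library; the client IDs below are placeholders:

```ts
import { OAuth2Client } from "google-auth-library";

// Placeholders: use the iOS and web client IDs from your Google Cloud project.
const IOS_CLIENT_ID = "ios-client-id.apps.googleusercontent.com";
const WEB_CLIENT_ID = "web-client-id.apps.googleusercontent.com";

const client = new OAuth2Client();

// Verify an ID token sent by either app before trusting the request.
export async function verifyGoogleIdToken(idToken: string) {
  const ticket = await client.verifyIdToken({
    idToken,
    audience: [IOS_CLIENT_ID, WEB_CLIENT_ID], // accept tokens minted for either client
  });
  const payload = ticket.getPayload();
  return { userId: payload?.sub, email: payload?.email };
}
```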
