How to mock Google Pubsub requests in integration tests (node)?

I'm writing integration tests for a Node.js API that publishes to Google Pub/Sub topics. I don't want to actually send those external messages in the test environment, so I need some kind of mock.
Since it's an integration test, I think the best approach would be to intercept external network calls to the pubsub service, returning a mocked response.
I tried using nock for this, but it didn't work. I suspect that's because Google's library communicates over gRPC / HTTP/2 rather than plain HTTP.
Is there any solution to this or a better approach? What am I missing?
Thanks!
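Since nock can't see gRPC/HTTP2 traffic, one common workaround is to avoid network interception altogether: either point the real client at the official Pub/Sub emulator (by setting the documented PUBSUB_EMULATOR_HOST environment variable before constructing the client), or inject the publisher so tests can substitute an in-memory fake. A minimal sketch of the second option (all names here are illustrative, not from the question):

```javascript
// Sketch: instead of intercepting gRPC traffic, inject the publisher so
// tests can swap in an in-memory fake. createOrderService and the fake
// publisher are illustrative names, not from the question's codebase.

// Production code depends on an abstract "publisher", not on
// @google-cloud/pubsub directly. In production you would pass something
// like new PubSub().topic('orders'), whose publishMessage has this shape.
function createOrderService(publisher) {
  return {
    async placeOrder(order) {
      // ...business logic would go here...
      await publisher.publishMessage({ json: { type: 'order.placed', order } });
      return { ok: true };
    },
  };
}

// In tests, the fake simply records what would have been published.
function createFakePublisher() {
  const published = [];
  return {
    published,
    async publishMessage(msg) {
      published.push(msg);
      return 'fake-message-id';
    },
  };
}

module.exports = { createOrderService, createFakePublisher };
```

The integration test then asserts against `fake.published` instead of intercepting the wire, which also keeps the test independent of the transport Google's client happens to use.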

Related

How to create provider pact verification in python

I am trying to run a pact verification against a consumer contract using Python. Basically, I am reading the consumer pact from the broker and trying to verify it. The provider's real API is hosted on GCP.
I am really confused about whether I need to create a provider mock (I thought we create one only on the consumer side) to run the verification, or whether I have to run it against the production API (hosted on GCP).
If it is a provider mock on localhost, how should I build it?
When it runs locally, I feel like I am going to hard-code the actual response, as in user-app.py. Then, when the production API changes, I have to reflect that change manually in user-app.py. I feel like I am missing something.
Here is the contract
to run the verification:
pact-verifier --provider-base-url=http://localhost:5001 --pact-url=tests/recommendations.recommendations-api-recommendations.basket.model.json --provider-states-setup-url=http://localhost:5001/_pact/provider_states
With Pact you only mock the provider on the consumer side, because you're unit testing the consumer code.
When you test the provider, Pact stands in for the consumer, so you absolutely do not mock the provider here (otherwise, what confidence would you get?)
You should:
Run the provider locally (not against a deployed environment; the kind of testing contract tests are meant to replace is end-to-end integrated testing)
Mock out any third party dependencies to increase reliability/determinism
See also this page on testing scope: https://docs.pact.io/getting_started/testing-scope
You may find these examples helpful.

nestjs, GRPC server and micro services architecture

I've started a new project with NestJS using microservices, but it's my first microservices project and I don't have enough knowledge.
While studying the documentation, I couldn't find a way to use a microservice with gRPC and HTTP at the same time.
In my architecture, I have a few microservices that have to serve a REST API for the client but also have to serve gRPC requests for "internal" purposes. Is that the right decision?
It is not quite correct to say "I can't find a way to use a microservice with gRPC and HTTP at the same time", since gRPC itself runs over HTTP/2. gRPC is an RPC framework that uses Protocol Buffers to serialize messages; by exposing HTTP endpoints you can choose between different alternatives such as XML, REST/JSON, or gRPC.
Normally, following the "hexagonal architecture" philosophy (https://en.wikipedia.org/wiki/Hexagonal_architecture_(software)), you should be able to separate the logic from the adapters, so your project can implement multiple adapters for the same logic, for example one adapter for HTTP/REST and another for gRPC. Alternatively, a way to avoid implementing multiple ports is to always use gRPC and run Envoy as a proxy between HTTP/REST and gRPC (see https://grpc.io/docs/tutorials/basic/web/), but the final solution depends on many factors.
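A minimal sketch of that separation (illustrative names only, real NestJS/gRPC framework wiring omitted): one piece of core logic, and two thin adapters that translate transport-specific requests into the same core call.

```javascript
// Core ("the hexagon"): pure logic, knows nothing about HTTP or gRPC.
function greetUser(name) {
  if (!name) throw new Error('name is required');
  return { message: `Hello, ${name}` };
}

// REST adapter: translates an HTTP-ish request object into a core call
// and wraps the result in an HTTP response shape.
function restAdapter(req) {
  const result = greetUser(req.query.name);
  return { status: 200, body: JSON.stringify(result) };
}

// gRPC adapter: translates a unary call (request + callback, the shape
// used by Node gRPC handlers) into the same core call.
function grpcAdapter(call, callback) {
  callback(null, greetUser(call.request.name));
}
```

Both adapters stay trivial, so adding or dropping a transport never touches the core logic.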

What is the best way to mock q-io/http requests?

I'm trying to write unit tests for my HTTP service. My service interacts with another remote HTTP service, and I'm using q-io/http for that interaction.
I would like to use something like the nock package to mock my calls to the remote service, but q-io/http does not seem to be compatible with nock (I'm assuming that this means that the request module is not actually used under the covers of q-io/http as I'd hoped).
Are there any other approaches to mocking q-io/http requests? There does not seem to be an http mocking capability included in Q like there is for files.
It turns out that q-io/http does indeed use the standard request module under the covers, and consequently it is possible to use nock with the q-io/http module.
For me, the problem was that nock was not matching my requests, and the exception was getting swallowed up in a catch. Using the nock log(console.log) mechanism made the matching problems obvious:
nock(documentUrl)
  .delete('/state')
  .reply(204, {})
  .log(console.log); // logs each match attempt, making mismatches visible

How would one run a Node.js script on Firebase?

I have a Node app/script that needs to constantly be running (it's a discord bot, done with discord.js, but I think that's mostly irrelevant), and I'd like to do it on Firebase.
It has its own client.on('event', () => {}) event system, so I don't believe I could use Firebase's Cloud Functions. There also seems to be a hosting-based way to run a Node.js server, but that appears to be triggered by HTTP requests.
Is there any other way I could do it?
There is no way to run arbitrary node.js code on Firebase. Unless your script can run within Cloud Functions "triggered execution" mode, you'll need your own app server to run it.
You can of course create a service that maps Discord.js events to Firebase events, such as writes to the Realtime Database, Cloud Firestore, even just direct HTTPS calls to a Cloud Functions endpoint. You could even bypass Firebase there and have your mapping service write to Cloud PubSub and use that to trigger Cloud Functions.
One thing that looks promising in the Discord.js documentation is their mention of web hooks, which is just another way of describing HTTP endpoints. But from my quick scan I couldn't figure out if those would allow you to call your HTTP triggered Cloud Function.

How to use Socket.io in aws lambda functions

I'm using Node.js with AWS Lambda functions for my web application, and I want to use WebSockets in my web app. socket.io is a very famous library for Node.js.
But I'm not sure how I can use it with AWS Lambda. Can anyone let me know how to do it? Is it possible using Lambda functions?
You can't use socket.io with Lambdas. Lambdas have a limit on how long they can execute, so holding an open connection with a client is impossible.
What you can do, though, is use SNS. Lambdas can publish messages to SNS topics, and Lambdas can be invoked by SNS.
A workaround exists for this: WebSockets over MQTT in AWS IoT. This way you can execute your Lambda functions from open socket connections.
Currently, AWS API Gateway supports WebSockets.
Unfortunately, I didn't manage to connect via socket.io, since it generates a custom URL with additional params: /?EIO=3&transport=polling&sid=<id>
But I found a tiny WebSocket wrapper, sockette (I followed this tutorial), and it works fine!