I'm using the Apollo GraphQL framework and the subscription seems to be up and running, but no events are ever triggered from it, and the documentation around Apollo Client rarely covers pure Node.js and vanilla JavaScript.
The project is a set of nodes for Node-RED that will use a vendor's GraphQL interface to fetch data on power consumption and electricity pricing. For this I need subscriptions as well. I'm quite new to GraphQL, and the Apollo framework was recommended to me.
result = tibber.getSubscription('subscription{ liveMeasurement(homeId:"c70dcbe5-4485-4821-933d-a8a86452737b"){timestamp power maxPower accumulatedConsumption accumulatedCost}}');
// returns an Observable from the apollo client subscribe()
// client.subscribe({ query: gql`${query}` });
let sub = result.subscribe({
  next(data) { console.log(data); },
  error(err) { console.log(err); }
});
I would have expected the console.log to fire with some data after a few seconds, since this subscription works with the demo account in the vendor's API explorer.
No errors are returned, and the subscription object (sub) is in the 'ready' state.
My advice: divide and conquer.
I would start with a standard React app with Apollo Client. You can use the <Query/> component for simplicity. The most important thing at this step is gaining familiarity with authentication.
Next step: use the client directly, without components. Still in React, use ApolloConsumer so your component gets access to the client prop. You can call client.query() from componentDidMount() (console.log, setState).
If you prefer vanilla JavaScript, this tutorial may be more suitable.
The next step would be working with subscriptions. Again, start with the <Subscription/> component. After that you should have a properly configured client (transport, auth).
At this point you can start moving to the Node environment, e.g. looking for examples like this one and combining them with the knowledge gained earlier.
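To illustrate that final step, here is a minimal sketch of a subscription client in plain Node.js, assuming the apollo-client v2 packages and a WebSocket transport (the endpoint URL, token, and query fields are placeholders, not the vendor's actual values):
// Minimal Node.js subscription sketch (endpoint, token and query are placeholders).
const { ApolloClient } = require('apollo-client');
const { InMemoryCache } = require('apollo-cache-inmemory');
const { WebSocketLink } = require('apollo-link-ws');
const gql = require('graphql-tag');
const ws = require('ws');

const link = new WebSocketLink({
  uri: 'wss://api.example.com/v1/gql/subscriptions', // placeholder endpoint
  options: {
    reconnect: true,
    connectionParams: { token: '<ACCESS_TOKEN>' } // auth, if the API requires it
  },
  webSocketImpl: ws // Node has no built-in WebSocket, so one must be supplied
});

const client = new ApolloClient({ link, cache: new InMemoryCache() });

client
  .subscribe({ query: gql`subscription { liveMeasurement(homeId: "<HOME_ID>") { timestamp power } }` })
  .subscribe({
    next(data) { console.log(data); },
    error(err) { console.error(err); }
  });
If the client is created with a plain HTTP link instead of a WebSocket link, subscribe() may return an observable that silently never emits, which would match the symptom described in the question.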
I am confused as to how the watch feature in the Gmail API should be implemented to receive push notifications inside a Node.js script. Should I call the method inside an infinite loop or something, so that it doesn't stop listening for email notifications once the call has been made?
Here's the sample code that I've written in Node.js:
const {google} = require('googleapis');

const getEmailNotification = () => {
  return new Promise(async (resolve, reject) => {
    try {
      let auth = await authenticate(); // my own auth helper
      const gmail = google.gmail({version: 'v1', auth});
      // stop any previous watch before starting a new one
      await gmail.users.stop({
        userId: '<email id>'
      });
      let watchResponse = await gmail.users.watch({
        userId: '<email id>',
        labelIds: ['INBOX'],
        topicName: 'projects/<projectName>/topics/<topicName>'
      });
      return resolve(watchResponse);
    } catch (err) {
      return reject(`Some error occurred`);
    }
  });
};
Thank you!
Summary
To receive push notifications through Pub/Sub you need to create a web-hook to receive them. What does this mean? You need a web application, or any kind of service, that exposes a URL where notifications can be received.
As stated in the Push subscription documentation:
The Pub/Sub server sends each message as an HTTPS request to the subscriber application at a pre-configured endpoint.
The endpoint acknowledges the message by returning an HTTP success status code. A non-success response indicates that the message should be resent.
Setting up a channel to watch for notifications can be summarized in the following steps (the documentation you refer to lists them):
Select/Create a project within the Google Cloud Console.
Create a new PUB/SUB topic
Create a subscription (PUSH) for that topic.
Add the necessary permissions; in this case add gmail-api-push@system.gserviceaccount.com as Pub/Sub Publisher.
Indicate which types of mail you want it to listen for via the Users.watch() method (which is what you are doing in your script).
Example
Here is an example using Apps Script (an easy way to visualize it, but this could be achieved from any kind of web application; as you are using Node.js, I suppose you are familiar with Express.js or related frameworks).
First I created a new Google Apps Script project; this will be my web-hook. Basically I want it to log every HTTP POST request into a Google Doc that I have previously created. For this I use doPost(), the equivalent of app.post() in Express. If you want to know more about how Apps Script works, you can visit this link, but that is not the main topic here.
Code.gs
const doPost = (e) => {
  const doc = DocumentApp.openById(<DOC_ID>)
  doc.getBody().appendParagraph(JSON.stringify(e, null, 2))
}
Later I deployed it as a Web App accessible by anyone and noted down the URL for later. This is similar to deploying your Node.js application to the internet.
I select a project in the Cloud Console, as indicated in the Prerequisites of Cloud Pub/Sub.
Inside this project, I create a new topic that I call GmailAPIPush. Afterwards, I click Add Principal (in the right bar of the Topics section) and add gmail-api-push@system.gserviceaccount.com with the Pub/Sub Publisher role. This is a requirement that grants Gmail the privilege to publish notifications.
In the same project, I create a Subscription. I tell it to be of the Push type and add the URL of the Web App that I have previously created.
This is the most critical part, and it determines how your application will work. If you want to know which type of subscription best suits your needs (PUSH or PULL), there is detailed documentation to help you choose between the two.
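(For comparison, if you chose a PULL subscription instead, a minimal Node consumer using the @google-cloud/pubsub package might look like this sketch; the subscription name is an assumption.)
// Hypothetical PULL subscriber sketch (the subscription name is an assumption).
const { PubSub } = require('@google-cloud/pubsub');

const pubsub = new PubSub();
const subscription = pubsub.subscription('gmail-api-pull-sub');

// The client library streams messages to this listener as they arrive.
subscription.on('message', (message) => {
  console.log(message.data.toString()); // Gmail publishes JSON: { emailAddress, historyId }
  message.ack(); // acknowledge so Pub/Sub does not redeliver the message
});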
Finally we are left with the simplest part, configuring the Gmail account to send updates on the mailbox. I am going to do this from Apps Script, but it is exactly the same as with Node.
const watchUserGmail = () => {
  const request = {
    'labelIds': ['INBOX'],
    'topicName': 'projects/my_project_name/topics/GmailAPIPush'
  }
  Gmail.Users.watch(request, 'me')
}
Once the function is executed, I send a test message, and voila, the notification appears in my document.
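For reference, a minimal Express equivalent of that web-hook might look like the following sketch (the route path and port are assumptions):
// Minimal Express web-hook sketch for Pub/Sub push (path and port are assumptions).
const express = require('express');
const app = express();

app.use(express.json());

// Pub/Sub delivers each notification as an HTTP POST to the configured endpoint.
app.post('/gmail-notifications', (req, res) => {
  // The Gmail notification arrives base64-encoded in message.data.
  const message = req.body.message;
  const data = JSON.parse(Buffer.from(message.data, 'base64').toString());
  console.log(data); // e.g. { emailAddress: '...', historyId: '...' }
  // Answer with a success status so Pub/Sub does not resend the message.
  res.sendStatus(204);
});

app.listen(8080);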
Returning to the case you describe, I will try to explain it with a metaphor. Imagine you have a mailbox and you are waiting for a very important letter. Because you are nervous, you go every 5 minutes to check whether the letter has arrived (similar to what you propose with setInterval), which means that most of the time you check your mailbox there is nothing new. However, if you train your dog to bark (push notification) every time the mailman comes, you only check your mailbox when you know you have new letters.
I started using Azure recently and it has been an overwhelming experience. I began experimenting with Event Hubs, and I'm basically following the official tutorials on how to send and receive messages from Event Hubs using Node.js.
Everything worked perfectly so I built a small web app (static frontend app) and I connected it with a node backend, where the communication with eventhubs occurs. So basically my app is built like this:
frontend <----> node server <-----> eventhubs
As you can see, it is very simple. The node server fetches data from Event Hubs and forwards it to the frontend, where the values are shown. It was a cool experience and I was enjoying MS Azure until this error occurred:
azure.eventhub.common.EventHubError: ErrorCodes.ResourceLimitExceeded: Exceeded the maximum number of allowed receivers per partition in a consumer group which is 5. List of connected receivers - nil, nil, nil, nil, nil.
This error is really confusing. I'm using the default consumer group and only one app. I never tried to access this consumer group from another app. It says the limit is 5, and I'm using only one app, so it should be fine, or am I missing something? I don't understand what is happening here.
I spent too much time googling and researching this, but I still don't get it. In the end I thought that maybe every time I deploy the app (my frontend and my node server) to Azure, it is counted as one consumer, and since I have deployed the app more than 5 times, this error shows up. Am I right, or is this nonsense?
Edit
I'm using websockets as the communication protocol between my app (frontend) and my node server (backend). The node server uses the default consumer group (I didn't change anything); I just followed this official example from Microsoft. I'm basically using the code from the MS docs, which is why I didn't post any code from my node server; and since the error happens in the backend, not the frontend, posting frontend code would not help.
So to wrap up: I'm using websockets to connect front and backend. It works perfectly for a day or two, and then this error starts to happen. Sometimes I open more than one client (for example, a client in the browser and a client on my smartphone).
I think I don't understand the concept of the consumer group. Is every client a consumer? So if I open my app (the same app) in 5 different browser tabs, do I then have 5 consumers?
I didn't quite understand the answer below and what is meant by "pooling client", so I will post code examples here to show what I'm trying to do.
Code snippets
Here is the function I'm using on the server side to communicate with Event Hubs and receive/consume a message:
const { EventHubConsumerClient } = require("@azure/event-hubs");

// consumerGroup and io are defined elsewhere in the file
async function receiveEventhubMessage(socket, eventHubName, connectionString) {
  const consumerClient = new EventHubConsumerClient(consumerGroup, connectionString, eventHubName);

  const subscription = consumerClient.subscribe({
    processEvents: async (events, context) => {
      for (const event of events) {
        console.log("[ consumer ] Message received : " + event.body);
        io.emit('msg-received', event.body);
      }
    },
    processError: async (err, context) => {
      console.log(`Error : ${err}`);
    }
  });
}
If you notice, I'm passing the event hub name and connection string as arguments so I can change them. In the frontend I have a list of topics, and each topic has its own event hub name, but they all share the same Event Hubs namespace.
Here is an example of two event hub names that I have:
[
  { "EventHubName": "eh-test-command" },
  { "EventHubName": "eh-test-telemetry" }
]
If the user chooses to send a command (in the frontend I just have a list of buttons the user can click to fire an event over websockets), then the command's event hub name is sent from the frontend to the node server. The server receives that event hub name and switches the consumerClient in the function I posted above.
Here is the code where I'm calling that:
// io is a socket.io object
io.on('connection', socket => {
  socket.on('onUserChoice', choice => {
    // choice is an object sent from the frontend based on what the user chose,
    // e.g. for a command: choice = {"EventHubName": "eh-test-command", "payload": "whatever"}
    receiveEventhubMessage(socket, choice.EventHubName, choice.EventHubNameSpace)
      .catch(err => console.log(`[ consumerClient ] Error while receiving eventhub messages: ${err}`));
  });
});
The app I'm building will be extended in the future to a real use case in the automotive field, which is why this is important to me. So I'm trying to figure out how I can switch between event hubs without creating a new consumerClient each time the event hub name changes.
I must say that I didn't understand the example with the "pooling client". I would appreciate more elaboration or, ideally, a minimal example to put me on the right track.
Based on the conversation in the issue, it would seem that the root cause of this is that your backend is creating a new EventHubConsumerClient for each request coming from your frontend. Because each client will open a dedicated connection to the service, if you have more than 5 requests for the same Event Hub instance using the same consumer group, you'll exceed the quota.
To get around this, you'll want to consider pooling your EventHubConsumerClient instances so that you're starting with one per Event Hub instance. You can safely use the pooled client to handle a request for your frontend by calling subscribe. This will allow you to share the connection amongst multiple frontend requests.
The key idea is that your consumerClient is not created for every request but shared among requests. Using your snippet to illustrate the simplest approach, you'd hoist the client creation out of the receiving function. It might look something like:
const consumerClient = new EventHubConsumerClient(consumerGroup, connectionString, eventHubName);

async function receiveEventhubMessage(socket, eventHubName, connectionString) {
  const subscription = consumerClient.subscribe({
    processEvents: async (events, context) => {
      for (const event of events) {
        console.log("[ consumer ] Message received : " + event.body);
        io.emit('msg-received', event.body);
      }
    },
    processError: async (err, context) => {
      console.log(`Error : ${err}`);
    }
  });
}
That said, the above may not be adequate for your environment depending on the architecture of the application. If whatever is hosting receiveEventHubMessage is created dynamically for each request, nothing changes. In that case, you'd want to consider something like a singleton or dependency injection to help extend the lifespan.
If you end up having issues scaling to meet your requests, you can consider increasing the number of clients for each Event Hub and/or spreading requests out to different consumer groups.
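To make the pooling idea concrete, here is a hypothetical sketch of a small client pool keyed by event hub name (the function and variable names are assumptions, not part of the snippets above):
// Hypothetical client pool keyed by event hub name (names are assumptions).
const { EventHubConsumerClient } = require("@azure/event-hubs");

const clientPool = new Map();

function getConsumerClient(consumerGroup, connectionString, eventHubName) {
  // Reuse the existing client for this event hub; create one only on first use.
  if (!clientPool.has(eventHubName)) {
    clientPool.set(
      eventHubName,
      new EventHubConsumerClient(consumerGroup, connectionString, eventHubName)
    );
  }
  return clientPool.get(eventHubName);
}
Each websocket request would then call getConsumerClient(...) instead of new EventHubConsumerClient(...), so at most one connection per event hub is opened no matter how many frontend clients connect.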
Beginner here. I'm using the Firebase Realtime Database and I need my API to continuously return values as new data is added; see my code below.
apiCalls.get('/api/getallusers', function(req, res){
  userFunc.getAllUsers(function(err, result){
    if (err) return res.status(500).send('internal server error!');
    res.status(200).write(JSON.stringify(result));
    res.end();
    return res;
  })
})
This returns the error:
Error [ERR_STREAM_WRITE_AFTER_END]: write after end
but if I remove res.end() it shows one record and then keeps loading until the page times out.
Is what I'm doing possible, or are there different ways to do it?
Also, I'm using Firebase Cloud Functions for this API.
UPDATE:
I uploaded the API, but it does not return anything...
Here is the link: https://us-central1-testproject-e6819.cloudfunctions.net/api1/api/getUser
I tried axios and EventSource.
Cloud Functions logs the values, but does not return them.
If you're viewing the API response like a web page, your browser buffers the data it receives until there is enough of it to render a fuller page. Your browser is expecting content that ends, not an endless stream of data.
You should remove .end() if you expect to be able to continue to write to the output stream.
Also, I recommend using the Server-Sent Events (SSE) protocol for this. https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events It provides a nice standards-based abstraction that makes it very easy to handle event streams client-side.
const eventSource = new EventSource('https://api.example.com/someApi');
eventSource.addEventListener('userupdate', (e) => {
  console.log(e.data);
});
Server-side, there are a couple Express-based middlewares to make this even easier than it already is.
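For illustration, a minimal hand-rolled SSE endpoint in Express might look like the sketch below (the route, event name, and payload are assumptions, and it requires a runtime that can keep the response open):
// Minimal hand-rolled SSE endpoint sketch (route and payload are assumptions).
const express = require('express');
const app = express();

app.get('/someApi', (req, res) => {
  // SSE requires these headers and a response that stays open.
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  res.flushHeaders();

  // Emit a named event every few seconds; the client receives it via
  // eventSource.addEventListener('userupdate', ...).
  const timer = setInterval(() => {
    res.write('event: userupdate\n');
    res.write(`data: ${JSON.stringify({ name: 'example user' })}\n\n`);
  }, 5000);

  // Stop writing once the client disconnects.
  req.on('close', () => clearInterval(timer));
});

app.listen(3000);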
Operations in Cloud Functions must be relatively short-lived and end deterministically. There is no way to keep a connection open from Cloud Functions to the client.
Typically you should consider what triggers the need to send new data. For example, if it is the registration of a new user, you can trigger your Cloud Function from Firebase Authentication. The function could then write to the Realtime Database (or Cloud Firestore), and your client/app listens to the database for realtime updates. That way you're using the pieces of Firebase the way they're designed: Cloud Functions for short-lived work triggered by events in the system, and the Realtime Database or Cloud Firestore for delivering realtime updates.
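A minimal sketch of that flow, assuming the firebase-functions Auth trigger (the /users path and stored fields are assumptions):
// Sketch: Auth-triggered function writing to the Realtime Database
// (the /users path and stored fields are assumptions).
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Runs whenever a new user registers; clients listening on /users get the update.
exports.onUserCreated = functions.auth.user().onCreate((user) => {
  return admin.database().ref('/users/' + user.uid).set({
    email: user.email,
    createdAt: admin.database.ServerValue.TIMESTAMP
  });
});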
If that doesn't work for your use-case, you'll need a runtime environment that allows you to keep processes alive. Something like App Engine flex, Kubernetes, or many other options come to mind for that.
I've recently started working with Node.js and I have to build an architecture that uses multiple Express.js services. Some of these services will live on one server, others on other machines. I want to build a base service (like an API gateway), but I don't know the proper way for this gateway to communicate with the microservices, or for two microservices to communicate with each other.
Currently I'm working with a solution based on this:
# inside the Gateway server I call another service
# (http.get ends the request itself, so no explicit .end() is needed):
http.get 'http://127.0.0.1:5001/users', (service_res) ->
  data = ''
  service_res.on 'data', (chunk) ->
    data += chunk
  service_res.on 'end', ->
    # some logic on data
I have a strong feeling that this approach is not right. What is the proper way to build the communication logic between an API gateway and microservices?
The logic you have is not incorrect, but what would probably be better is to build a layer of abstraction on top of making requests to another service, e.g. from the API gateway to another microservice. Let's call that microservice B for this example (the API gateway makes a request to B).
B in this case should provide its own client for how other services should interact with it; whether that happens over HTTP or WebSockets, the protocol is up to B, because B understands how one should communicate with it. The argument for implementing the client and the service together is that these two components should have a high level of cohesion, since they are technically bound by a contract: if a request needs to be made to a service, it needs to adhere to the contract that the service requires.
In simple pseudocode with Express:
// implemented elsewhere, ideally next to the service that it communicates with
function BServiceClient() {
  // ...
}

// the API gateway's calling code
app.get('...', function(request, response, next) {
  // create an instance of the service client
  var bServiceClient = new BServiceClient();
  // retrieving the users from an abstracted endpoint
  bServiceClient.GetUsers();
  // do some processing and then render a response or call next
});
To make this more testable, you might have to write your own wrapper around the app to do proper dependency injection, so the client can be injected and the routes become more testable. Alternatively, you could create another function that accepts the client as a parameter, and have the handler create the client and call that function; the new function could then be tested. I prefer the former approach of using the wrapper. Hope this helps!
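For concreteness, a hypothetical implementation of BServiceClient that wraps the raw HTTP details might look like this (the base URL, endpoint, and method name are assumptions):
// Hypothetical BServiceClient sketch (base URL and endpoint are assumptions).
const http = require('http');

function BServiceClient(baseUrl) {
  this.baseUrl = baseUrl; // e.g. 'http://127.0.0.1:5001'
}

// Hides B's transport and URLs from callers; the gateway only sees this method.
BServiceClient.prototype.GetUsers = function(callback) {
  http.get(this.baseUrl + '/users', (res) => {
    let data = '';
    res.on('data', (chunk) => { data += chunk; });
    res.on('end', () => {
      try {
        callback(null, JSON.parse(data));
      } catch (err) {
        callback(err);
      }
    });
  }).on('error', callback);
};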
What I would do is:
Create separate modules for each microservice. Depending on which microservice you want to run, just have a route for it in Express, as sketched below.
Inject the modules you want into an instance of express().
Example + shameless plug - https://github.com/swarajgiri/express-bootstrap/blob/master/core/index.js
Disclaimer - The above solution is a highly opinionated way of solving your problem.
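A minimal sketch of the module-per-service idea (the module paths and mount points are assumptions):
// Each microservice module exports an express.Router with its own routes
// (module paths and mount points are assumptions).
const express = require('express');
const app = express();

app.use('/users', require('./services/users'));
app.use('/billing', require('./services/billing'));

app.listen(3000);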
I need a legacy Java application to pull information from a Meteor collection.
Ideally, I would need a simple service from which my app could download the latest list of item prices. A scenario like this (through an HTTP GET):
www.mystore.com/listOfPrices
would return a JSON document with an array:
[{"item":"beer", price:"2.50"}, {"item":"water":, price:"1"}]
The problem is that I cannot make a Meteor page print the result "as is", because Meteor assumes the client supports JavaScript. Note that I do plan to implement the Java DDP client at a later stage, but here I would like to start with a very simple service.
Idea: I thought of running my own Node.js request alongside the running Meteor service to retrieve a snapshot of the collection. This request would use a server-side JavaScript DDP client to subscribe and filter, and then return the loaded collection as a JSON document (array).
Any idea on how to achieve this?
Looks like you want to provide a REST interface. See the MeteorPedia page on REST for how to expose collection data. It might be as simple as
prices = new Mongo.Collection('prices');

// HTTP.publish is provided by the community http-publish package.
// Add access points for `GET`, `POST`, `PUT`, `DELETE`
HTTP.publish({collection: prices}, function (data) {
  // here you have access to this.userId, this.query, this.params
  return prices.find({});
});