I want to configure my Angular app to work with the ELK stack.
Option 1: send an HTTP POST request from Angular directly to Logstash.
Option 2: send an HTTP POST request from Angular to a Node.js server, and have that server send the messages to Logstash.
Maybe there are other options, but I don't know of them, and I didn't find any tutorials about this.
What is the right way to do that?
Create an API with REST, GraphQL, or something else (gRPC, up to you really) to connect the client with the server; it is the most complete approach for anything substantial.
From there you can add more features to it and have a seed project for the future as well.
Technically, the client sends a request to the API of your choice, the API talks to whatever your logic dictates (this can be Elasticsearch, Express, or anything else), and then it logs the data to Elasticsearch, a database, or any other store.
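As a rough sketch of what option 2 could look like, here is a minimal Express relay endpoint; the /api/logs path and a Logstash HTTP input listening on port 8080 are assumptions you would adapt to your own pipeline:

// Minimal sketch: Angular POSTs log entries to /api/logs, and the server
// forwards them to a Logstash HTTP input (assumed to listen on port 8080).
const express = require('express');
const axios = require('axios');

const app = express();
app.use(express.json());

app.post('/api/logs', async (req, res) => {
  try {
    await axios.post('http://localhost:8080', req.body); // forward to Logstash as-is
    res.sendStatus(204);
  } catch (err) {
    console.error('Could not forward log to Logstash:', err.message);
    res.sendStatus(502);
  }
});

app.listen(3000, () => console.log('Log relay listening on port 3000'));

Keeping this relay on the server also lets you add authentication, rate limiting, or enrichment later without touching the Angular client.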
Related
Is it best to make API calls directly to RabbitMQ from the frontend React Native app, or is it better to make an API call to a backend server endpoint, and bind/queue the messages there, in order to return a JSON response to the frontend once the message is consumed?
My plan is to make a React Native app that uploads large files to Digital Ocean Spaces and then stores other data in Firebase collections. I have a Node.js Express server running on the backend, and I'm wondering whether it's best to queue RabbitMQ messages by going through the Express server first, or whether I should just queue the messages to RabbitMQ directly from the frontend React Native app.
Here's an SO post with an example fetch() API call made to RabbitMQ directly from a frontend React Native app, but I'm wondering how secure this is (because you need to pass user and password credentials in a JSON object), and whether it's best to just send all messages to the backend Express server first. I suppose a lot of this may depend on app architecture, but my thinking is that it's best to queue, produce, and consume messages by first going through a third-party client library on the backend, such as amqplib, especially since most RabbitMQ examples found online do this.
I know it is very much a beginner question, but I am struggling to grasp a few things when it comes to the MERN stack and GraphQL. There is a particular project on GitHub where the web app is developed using MongoDB, Express, React, and Node.js along with GraphQL.
I understand that MongoDB is used for data storage and React for the front end, but I can't wrap my head around why Express and Node.js are used if an API created with GraphQL POSTs and GETs data directly to/from the MongoDB database. What are the roles of Node.js, Express, and GraphQL, and how are they interconnected?
This question might not make sense to you because I am missing basic concepts of web app development and an understanding of web dev stacks such as MERN.
Node.js is a JavaScript runtime environment -- it's what actually executes all your server-side code. Express is a framework that provides basic features for developing a web application in Node.js. While Node.js is already capable of listening to requests on a port, Express makes it simpler to set up a web server by eliminating boilerplate and offering a simpler API for creating endpoints.
GraphQL is a query language. GraphQL.js is the JavaScript implementation of GraphQL. Neither is capable of creating an endpoint or web server. GraphQL itself doesn't listen to requests being made to a particular port. This is what we use Express for -- combined with another library like apollo-server-express or express-graphql, Express sets up our endpoint, listens for incoming requests to the endpoint, parses them and hands them over to GraphQL to execute. It then sends the execution result back to the client that made the request.
Similarly, GraphQL.js is not capable of directly interfacing with a database. GraphQL simply executes the code you provide in response to a request. The actual interaction with the database is typically done through a driver (like mongodb) or an ORM (like mongoose).
So a client (like your React application) makes a request to your Express application, which parses the request and hands it to GraphQL, which executes your query and in the process, calls some code that then gets data from your database. This data is formatted into a proper response and sent back to the client.
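As a concrete illustration of that flow, here is a minimal sketch using express-graphql (newer versions export graphqlHTTP as a named export; older ones export it as the default) and mongoose; the Item model and the items query are invented for the example:

// Sketch only: Express owns the endpoint, GraphQL executes the query,
// and mongoose does the actual talking to MongoDB.
const express = require('express');
const mongoose = require('mongoose');
const { graphqlHTTP } = require('express-graphql');
const { buildSchema } = require('graphql');

// Hypothetical model; your real schema will differ.
const Item = mongoose.model('Item', new mongoose.Schema({ name: String }));

const schema = buildSchema(`
  type Item { id: ID! name: String }
  type Query { items: [Item!]! }
`);

// Resolvers are plain functions; GraphQL calls them, mongoose queries the DB.
const root = {
  items: () => Item.find(),
};

const app = express();
app.use('/graphql', graphqlHTTP({ schema, rootValue: root, graphiql: true }));

mongoose
  .connect('mongodb://localhost:27017/demo')
  .then(() => app.listen(4000, () => console.log('GraphQL at http://localhost:4000/graphql')));

apollo-server-express would play the same role as express-graphql here, just with a different setup API.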
For a beginner, the missing project details you are asking about are as follows:
Node.js creates the environment that actually runs your code and serves the API; GraphQL can't do this alone.
Express provides the body-parsing middleware, the authentication middleware (which authenticates every GraphQL request), and, via express-graphql, the integration of GraphQL with the Express framework (the GraphQL API functions are only called after the authentication middleware calls next()).
GraphQL creates the API itself, which is reached only after the auth middleware calls next().
So the project works like this:
Mongoose connects to MongoDB first.
Node.js starts a server.
When an API call reaches the server, then
a) it is parsed with Express's bodyParser,
b) headers are set on the request,
c) the auth middleware is called,
d) and now it is GraphQL's job to handle the API call.
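Here is a sketch of that order in code; the schema, resolvers, and is-auth modules are assumed to exist elsewhere in the project, and the paths are illustrative:

// Sketch of the request flow a) to d); module paths are illustrative.
const express = require('express');
const mongoose = require('mongoose');
const { graphqlHTTP } = require('express-graphql');
const schema = require('./graphql/schema');       // assumed GraphQL schema
const rootValue = require('./graphql/resolvers'); // assumed resolver map
const isAuth = require('./middleware/is-auth');   // assumed auth middleware that calls next()

const app = express();
app.use(express.json());              // a) body parsing
app.use((req, res, next) => {         // b) set headers (e.g. CORS)
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.setHeader('Access-Control-Allow-Headers', 'Content-Type, Authorization');
  next();
});
app.use(isAuth);                      // c) auth middleware runs on every request
app.use('/graphql', graphqlHTTP({ schema, rootValue, graphiql: true })); // d) GraphQL handles the API

mongoose
  .connect('mongodb://localhost:27017/app') // Mongoose connects first
  .then(() => app.listen(8080));            // then the Node.js server starts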
What I want to achieve is having a React application receive data posted to a Node.js server. Currently, the Node server receives a POST from an external source with a list of items. When the Node server receives the data, I want to send the data to the React application. This data will be used in the React application to display the listed items. How would I proceed to make this possible? Any advice is appreciated!
Thank you!
You can do this using Web Sockets. It would look something like this:
Your app connects to your API and maintains a socket
The external source will POST to your API
API handles the POST request
Then the API passes some data on to the React app over the web socket
Your app consumes said data
Profit?
Socket.IO is a popular JavaScript library for web sockets with fallbacks and stuff.
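A rough server-side sketch of that flow with Express and Socket.IO; the /items route and the 'items:updated' event name are just placeholders:

// The external source POSTs to /items; the server broadcasts the data
// to every connected React client over the socket.
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
app.use(express.json());

const server = http.createServer(app);
const io = new Server(server, { cors: { origin: '*' } });

app.post('/items', (req, res) => {
  io.emit('items:updated', req.body); // push the posted list to all connected clients
  res.sendStatus(200);
});

server.listen(3000);

On the React side you would connect with socket.io-client and update component state in a listener for 'items:updated'.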
I'm currently setting up the following application:
Node backend with Express
Postgres DB with Knex as an interface
React frontend
Everything is working as intended and I am making good progress, my question is more architectural:
What is the preferred/recommended/best way to notify the frontend when database changes occur?
I saw that Postgres has a LISTEN/NOTIFY feature but that is not currently (ever) supported by Knex (https://github.com/tgriesser/knex/issues/285).
My thoughts:
Polling (every x seconds query the DB). This seems wasteful and antiquated but it would be easy to set up.
Sockets. Rewrite all my Express endpoints to use sockets?
?
I'm interested to see how others handle this.
Thanks!
I've had a similar situation before. I have a front end which connects to the API via web sockets. The API emits a message on a successful database commit, naming the API endpoint that matches the update. The front-end components listen for these update messages, and if the update type is relevant to a component, that component queries the API endpoint over HTTPS for the new data. Using the web socket only to advertise that an update is available means you won't have to rewrite the entire API.
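A sketch of that pattern, assuming an existing Express app, a Knex instance, and a Socket.IO server named io; the todos table and event name are invented for the example:

// After a successful write, advertise *what* changed; clients refetch over HTTPS.
app.post('/api/todos', async (req, res) => {
  const [todo] = await knex('todos').insert(req.body).returning('*');

  io.emit('resource:updated', { endpoint: '/api/todos' }); // notify, don't send the data itself

  res.status(201).json(todo);
});

// React side (socket.io-client): refetch only if the update is relevant.
// socket.on('resource:updated', ({ endpoint }) => {
//   if (endpoint === '/api/todos') refetchTodos();
// });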
I have a working PHP application. It allows users to create private projects and invite others into them. Now, using node.js and socket.io, I want to add real-time comments, posts, etc.
What is the best architecture?
I see two solutions now.
The first is:
User sends AJAX query to PHP backend:
http://example.com/comment_add.php?text=...
comment_add.php adds the comment to the database and, via AMQP (or something better?), notifies the node.js server, which broadcasts the comment to the channel's subscribers.
The second is:
User sends AJAX query to node.js server: http://example.com:3000/comment_add
Node.js sends a request to the PHP backend (but how? and what about authorization?), receives the response, and then broadcasts to the channel's subscribers.
What is the best way? Are there other methods? How do I implement this properly?
Once you have decided to use node.js + socket.io to make a real-time web app, you don't need to think about PHP anymore, and you can forget AJAX as well... Socket.io will handle the communication between client and server.
But yes, you can still use AJAX and PHP for building pages quickly and for other functions that don't need real-time updates.
The second way is the better method. You can use HTTP to communicate with PHP from node.js. Authorization can be done in node.js, but that means passing the auth credentials to PHP every time.
In the end, my working solution is #1.
When a user establishes a connection to node.js/socket.io, he just sends a 'subscribe' message to node.js with his PHP session id. Node.js checks authorization with a POST request to the PHP backend and, if everything is OK, allows the user to establish the connection.
The frontend sends all requests to PHP just as it did before node.js.
PHP modifies some object, checks who can access the modified object, and sends a message (via AMQP or Redis pub/sub, etc.) to node.js:
{
id_object: 125342,
users: [5, 23, 9882]
}
node.js then checks which of the listed users have active sockets and, for each of them, sends a GET request to PHP:
{
userId: 5,
id_object: 125342
}
A special PHP controller receives this request, runs a query to fetch the object with the rights of the given user id, and sends a message back to node.js with the resulting answer. Node.js then sends the answer to the user's frontend via the socket.
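A rough node.js sketch of that flow, assuming Redis pub/sub and socket.io; the channel name, PHP endpoints, and payload fields mirror the messages above but are otherwise illustrative:

// node.js side: authorize subscribers via PHP, listen for PHP's pub/sub
// messages, and fan out per-user answers over sockets.
const { createServer } = require('http');
const { Server } = require('socket.io');
const Redis = require('ioredis');
const axios = require('axios');

const httpServer = createServer();
const io = new Server(httpServer);
const sub = new Redis();

const socketsByUser = new Map(); // which users currently have an active socket

io.on('connection', (socket) => {
  socket.on('subscribe', async ({ phpSessionId, userId }) => {
    // Check authorization against the PHP backend (assumed endpoint and response shape).
    const { data } = await axios.post('http://example.com/auth_check.php', { phpSessionId });
    if (data.ok) socketsByUser.set(userId, socket);
  });
  socket.on('disconnect', () => {
    for (const [userId, s] of socketsByUser) if (s === socket) socketsByUser.delete(userId);
  });
});

// PHP publishes { id_object, users } after modifying an object.
sub.subscribe('object-updates');
sub.on('message', async (channel, raw) => {
  const { id_object, users } = JSON.parse(raw);
  for (const userId of users) {
    const socket = socketsByUser.get(userId);
    if (!socket) continue;
    // Ask PHP for the object as this user is allowed to see it (assumed controller).
    const { data } = await axios.get('http://example.com/object_for_user.php', {
      params: { userId, id_object },
    });
    socket.emit('object-updated', data);
  }
});

httpServer.listen(3000);

The AMQP variant has the same shape: replace the Redis subscriber with a queue consumer.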
I faced this same question a year ago when starting my final-year project at university. I realized that my project was much better suited to using Node as a standalone. Node is very good at dealing with I/O; this can be anything from an HTTP request to a database query. Adding a PHP-based web server behind Node is going to add unneeded complexity. If your application needs to perform CPU-intensive tasks, you can quite easily spawn 'child' node processes which perform the needed operation and return the result to your parent node process.
However, out of the two methods you have mentioned, I would choose #2. Node.js can communicate with your PHP server in a number of ways; you could look at creating a unix socket connection between your PHP server and Node. If that is unavailable, you could simply communicate between Node and your PHP back end using HTTP. :)
Take a look here; this is a solution to a question very similar to your own:
http://forum.kohanaframework.org/discussion/comment/57607