Has anyone used the RPC framework inside libevent? - rpc

I have a multi-server, multi-client application, and I would like to keep some common data managed by a single daemon (to avoid a concurrency nightmare), so the servers can just ask it whenever they need to manipulate the shared data.
I am already using libevent in the servers, so I would like to stick with it and use its RPC framework, but I could not find an example of it being used in the real world.

Google Protobuf provides an RPC framework, and it is also used inside Google for RPC and many other things.
Protobuf is a library for data exchange.
It handles data serialization, deserialization, compression, and so on.
It was created and open-sourced by Google.
However, they did not open-source the RPC implementation itself.
It only provides a framework (service and message definitions).
You can integrate Protobuf with your existing libevent program.
I have personally implemented an RPC layer with Protobuf and libev (a project similar to libevent), and they work fine.
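Not a libevent example, but to make the serialization part concrete, here is a small sketch in TypeScript using the protobufjs package; the message definition and field names are invented for illustration. The same idea applies when you generate C++ classes from a .proto file and ship the bytes over a libevent bufferevent:

import * as protobuf from "protobufjs";

// Hypothetical message describing a request to the shared-data daemon.
const root = protobuf.parse(`
  syntax = "proto3";
  message SharedDataRequest {
    string key = 1;
    uint32 version = 2;
  }
`).root;
const SharedDataRequest = root.lookupType("SharedDataRequest");

// Serialize: the resulting bytes can go over whatever transport you already use.
const bytes = SharedDataRequest.encode(
  SharedDataRequest.create({ key: "sessions", version: 42 })
).finish();

// Deserialize on the receiving side.
const decoded = SharedDataRequest.decode(bytes);
console.log(decoded.toJSON());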

Related

Replaying RPC calls for testing purposes

We are using a 3rd-party library (Google Spanner) that uses gRPC in a Node application. One of the pain points we have is the ability to easily mock responses from this library for testing purposes.
If anyone has had similar issues, were you able to solve them? I was thinking of a tool that could record/replay RPC calls (there are many great libraries for recording/replaying HTTP calls) but couldn't find anything similar for RPC. I came across Google's rpcreplay (https://github.com/GoogleCloudPlatform/google-cloud-go/tree/master/rpcreplay) but to my understanding it's intended to be used in Go applications.
At Traffic Parrot we have been working on a solution to your problem in our service virtualization tool, which includes a user interface that can be used to define the mock behaviour.
We have recently added a tutorial on how to mock gRPC responses over the wire given a proto file.
You can also find information on how to record and replay over the wire in the documentation.
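If a full virtualization tool is more than you need, a lighter-weight option is to hide the Spanner client behind a small interface of your own and swap in a replay implementation in tests. A rough TypeScript sketch; the interface and the JSON recording format are our own invention, not part of any library, and the Spanner call shape is only assumed:

import * as fs from "fs";

// Hypothetical seam: the rest of the app only talks to RowReader.
interface RowReader {
  run(sql: string): Promise<unknown[]>;
}

// Production implementation delegates to the real Spanner database handle.
class SpannerRowReader implements RowReader {
  constructor(private database: { run(q: { sql: string }): Promise<[unknown[]]> }) {}
  async run(sql: string): Promise<unknown[]> {
    const [rows] = await this.database.run({ sql });
    return rows;
  }
}

// Test implementation replays responses captured earlier into a JSON file,
// keyed by the SQL text (a made-up format, purely for illustration).
class ReplayRowReader implements RowReader {
  private recordings: Record<string, unknown[]>;
  constructor(file: string) {
    this.recordings = JSON.parse(fs.readFileSync(file, "utf8"));
  }
  async run(sql: string): Promise<unknown[]> {
    if (!(sql in this.recordings)) {
      throw new Error(`no recording for query: ${sql}`);
    }
    return this.recordings[sql];
  }
}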

Do we really need to import Corda's code for RPC? What about in the future?

I know that Corda is in the process of removing its web server module and in the documentation they suggest the use of other frameworks.
In one example ("spring-observable-stream") they use Spring Boot for the server-side APIs and use an RPC call to the actual running Corda node. That's fine and comparable with what I have to do.
In that example, the author imports Corda's specific RPC code, along with the code of the actual flow (and states) needed.
What I want to ask here is whether it's possible to avoid that tangle and keep the web server APIs independent from the actual Corda/CorDapp code by using a general RPC library (any advice?).
If, instead, I must import the Corda-specific code (is there a reason?), I'd like to ask you:
What is the minimum necessary to do so from a Gradle perspective?
Is it possible to implement some sort of plugin on the CorDapp side to reduce that tangle?
To be honest, I'm interested in a more general way of interacting with the CorDapp (e.g. from Python), but I know that because the AMQP integration is not yet ready, we have to stay on the JVM for now. So feel free to answer just about what we need to do as of today from Kotlin (which I have to use for a short-term PoC)…
Thank you in advance!
Currently, your server has to depend on the Corda RPC library to interact with nodes via RPC. Corda doesn't yet expose an independent format for sending and receiving messages via RPC.
Your server also needs to depend on any CorDapps that contain flows that the server will start via RPC, or that contain types that will be returned via RPC. Otherwise, your server will not be able to start the flows or deserialise the returned types.
If you're using Gradle, here's what a minimal dependencies block might look like:
dependencies {
    compile "org.jetbrains.kotlin:kotlin-stdlib-jre8:$kotlin_version"
    cordaCompile "net.corda:corda-rpc:$corda_release_version"
    compile "com.github.corda:cordapp-example:release-V1-SNAPSHOT"
}
Here, we're depending on the corda-rpc library, and also using JitPack to depend on the CorDapp where we define the flows and states that we want to start/return via RPC.
If you want, you can modularise the CorDapp so that all the classes you need to depend on for RPC are included in a separate module, and only depend on that module.

Methods for calling APIs in one Nodejs app from another Nodejs app

Our application will have a website and a mobile app, both communicating with the same API backend. I have one Node.js application serving only APIs and a second Node.js app serving HTML pages for the website. I am using the Express.js web framework for both of these apps.
What are the different methods to call APIs in one Node.js app from another Node.js app? Additional information on when to use each method would be great.
EDIT:
Example,
I have the following applications
NodejsAPI (node & express)
NodejsWebsite (node & express)
MobileApp
NodejsAPI will provide access to APIs for the MobileApp and the NodejsWebsite. MobileApp will access the APIs over HTTP. But I want to know what the options are for NodejsWebsite to call APIs in the NodejsAPI app. From what I understand, this will be inter-process communication between the two processes. For .NET applications such communication could be done using .NET pipes, TCP communication, etc. What are the equivalent methods for Node.js applications on Unix and Linux platforms?
Thinking about it from an IPC perspective, I found the following to be useful:
What's the most efficient node.js inter-process communication library/method?
https://www.npmjs.org/package/node-ipc
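For completeness, here is roughly what local-socket IPC with node-ipc looks like, following its classic examples (TypeScript; the socket id and event names are arbitrary):

const ipc = require("node-ipc");

ipc.config.id = "nodejsapi";
ipc.config.retry = 1500;
ipc.config.silent = true;

// In the NodejsAPI process: serve requests over a local Unix domain socket.
ipc.serve(() => {
  ipc.server.on("getUser", (data: { id: number }, socket: any) => {
    ipc.server.emit(socket, "getUser.reply", { id: data.id, name: "Ada" });
  });
});
ipc.server.start();

// In the NodejsWebsite process: connect by id and exchange messages.
ipc.connectTo("nodejsapi", () => {
  ipc.of.nodejsapi.on("connect", () => {
    ipc.of.nodejsapi.emit("getUser", { id: 1 });
  });
  ipc.of.nodejsapi.on("getUser.reply", (user: any) => console.log(user));
});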
There's Node's vanilla http client, the http-client swiss army knife request, and then there's superagent, which is similar to jQuery.ajax. To make your life easier there's armrest and fementa, both different flavors of the same thing.
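For the plain-HTTP route, the simplest thing that works is to have NodejsWebsite call NodejsAPI like any other HTTP client. A minimal sketch with Node's built-in http module (TypeScript; the port and path are placeholders):

import * as http from "http";

// Call an endpoint on the NodejsAPI app (assumed to listen on localhost:3001)
// and collect the JSON response.
function getJson<T>(path: string): Promise<T> {
  return new Promise((resolve, reject) => {
    http.get({ host: "localhost", port: 3001, path }, (res) => {
      let body = "";
      res.setEncoding("utf8");
      res.on("data", (chunk) => (body += chunk));
      res.on("end", () => {
        try {
          resolve(JSON.parse(body) as T);
        } catch (err) {
          reject(err);
        }
      });
    }).on("error", reject);
  });
}

// Usage from an Express route in NodejsWebsite:
// const users = await getJson<User[]>("/api/users");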
Now, if you want more performance, or want to expose another interface to your application, you can use one of these RPC solutions:
dnode: One of the most popular solutions. It makes things very easy and makes using remote interfaces seamless; phantomjs-node uses dnode. It doesn't perform well with huge objects compared to the others, but for small stuff it's perfect. There are ports for other languages too (a minimal example follows after this list).
zerorpc: Uses zeromq as its socket library, which is famous for being reliable. It also supports connecting to a Python client.
smith: The RPC system used in the Cloud9 editor backend. Basically almost as nice as dnode, but faster. Both smith and zerorpc use msgpack instead of JSON, so they save bytes on the wire.
axon-rpc: A lightweight solution, as nice to use as zerorpc. You can configure it to use msgpack with axon-msgpack.
All of the above work over both TCP (to be used across different machines) and Unix domain sockets (faster than TCP, but only on the same machine).
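Here is the promised dnode sketch, adapted from its README (TypeScript; the method name and payload are illustrative):

const dnode = require("dnode");

// In NodejsAPI (the server side): expose a remote method.
const server = dnode({
  getUser: (id: number, cb: (user: { id: number; name: string }) => void) => {
    cb({ id, name: "Ada" });
  },
});
server.listen(5004);

// In NodejsWebsite (the client side): call it as if it were local.
const d = dnode.connect(5004);
d.on("remote", (remote: any) => {
  remote.getUser(1, (user: { id: number; name: string }) => {
    console.log(user);
    d.end();
  });
});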
If you want even more performance, you can embed your NodejsAPI in your NodejsWebsite by simply requiring its interface module.
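A sketch of that in-process approach (the module path and function are hypothetical):

// In NodejsAPI: keep the business logic in a plain module, independent of Express.
// e.g. nodejsapi/lib/users.ts (hypothetical path)
export async function listUsers(): Promise<{ id: number; name: string }[]> {
  return [{ id: 1, name: "Ada" }];
}

// In NodejsWebsite: import it and call it directly, with no network hop.
// import { listUsers } from "nodejsapi/lib/users";
// const users = await listUsers();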
If you want better answers than this, write a more specific question; as it stands, the question is too broad.

What is node.js based on, under the hood?

What is node.js based on, under the hood? Is it written from scratch, or is it based on another project?
(Also, can anybody pinpoint the web server technology behind the http module?)
...is it based on another project?
Node.js leverages several projects:
v8 (GOOG)
libev
libeio
c-ares (from the authors of curl)
evcom
http-parser
Via: http://blog.zorinaq.com/?e=34
Node.js is an event-driven platform built on top of Chrome's V8 JavaScript engine.
It's based on similar platforms built in other languages, for example Twisted in Python, EventMachine in Ruby, or libevent in C.
It's written from scratch. You can read more about it here: http://nodejs.org/about/. You can also join the Node.js developer mailing lists if you want a slightly deeper answer.
Specifically, about the Node.js HTTP server, here is an extract from the Node about page linked above:
HTTP is a first class protocol in Node. Node's HTTP library has grown out of the author's experiences developing and working with web servers. For example, streaming data through most web frameworks is impossible. Node attempts to correct these problems in its HTTP parser and API. Coupled with Node's purely evented infrastructure, it makes a good foundation for web libraries or frameworks.
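To make the streaming point concrete, here is a small sketch using Node's built-in http module (TypeScript; the file name is a placeholder):

import * as http from "http";
import * as fs from "fs";

// Stream a large file to the client chunk by chunk instead of buffering it
// all in memory first; this is the kind of use Node's HTTP API is built for.
http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "application/octet-stream" });
  fs.createReadStream("big-file.bin").pipe(res); // placeholder file name
}).listen(8080);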

Easiest way to work with WebSockets in Node.js

I want to work with WebSockets in a Node.js web app, and I am looking for the easiest way to do this. I've seen so many GitHub repositories that seemingly provide some ease of use.
But, I'm just looking to see if there's one that stands out as having the most support, or most widely implemented.
I was kind of leaning towards Socket.IO but I'm not entirely sure.
Any advice?
Thanks!
Use now or socket.io.
now is an abstraction built on socket.io which allows you to define methods on a shared object across client and server. This means you don't have to interact with the stream manually and can just call methods seamlessly. Do read their best practices before use, though.
now also has a grouping system built in, which means you can talk to clients in groups rather than one or all.
socket.io itself is recommended because of its excellent browser support and its range of fallbacks. It's also owned/maintained by a Node.js startup, so it's more likely to be maintained in the future. It also has a range of server-side socket.io implementations for platforms other than Node.js, so you can use the same API on multiple platforms.
If you find socket.io too large or bloated, you can go for the lightweight websocket-server. This is just a simple WebSocket implementation and is reasonably stable. I have personally used it when I want a very minimal abstraction and more low-level access to the WebSocket server itself.
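As a starting point, here is a minimal socket.io echo sketch (TypeScript, using the classic require-style API from the era of this answer; the event name is arbitrary):

// Server side (Node.js).
const io = require("socket.io").listen(8080);

io.sockets.on("connection", (socket: any) => {
  socket.on("chat", (msg: string) => {
    // Broadcast the message to every connected client, including the sender.
    io.sockets.emit("chat", msg);
  });
});

// Browser side (loads /socket.io/socket.io.js served by the same server):
// const socket = io.connect("http://localhost:8080");
// socket.on("chat", (msg) => console.log(msg));
// socket.emit("chat", "hello");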
Take a look at this blog post, it's very informative...
