Async access to MongoDB using Aleph/Lamina - node.js

I have been reading about Clojure for some time and I'm considering it as a replacement for Node.js (which I have used for another project). The most promising library seems to be Aleph/Lamina, which unfortunately doesn't have nearly as many examples as Node. My questions are:
How can I process requests with a chain of async operations, such as reading a document from MongoDB, doing some calculations, saving the new document and sending it in the response? I was not able to write it from the examples on the Lamina wiki page. It sounds like a pretty common use case and I was surprised not to find any code showing it. It would be great if you could show me some sample code.
Is this setup adequate for a heavy-load server (say, tens of thousands of requests per second)? I can't afford to create one thread for each new request, so I need something similar to the Node approach.
Is there any example of a medium- or large-sized company out there using any of this?
Is there any better Clojure replacement for Node (besides Aleph/Lamina)? Perhaps ClojureScript targeting Node? My client is not written in JavaScript, so using the same language in both client and server is not an advantage in my case.
Thanks!

A few pointers:
You need to look at Aleph, which builds HTTP abstractions on top of Lamina's channel abstraction.
Reading and writing docs to MongoDB can be async, but the library has to provide this. In Node.js the MongoDB library has to be async, otherwise it would break the Node programming model, whereas this is not the case with Clojure, so most probably the Clojure MongoDB library provides non-async (blocking) functions.
Async operations are only helpful for IO, i.e. reading/writing to MongoDB, sending the response back, etc. General computations are CPU-bound operations and have nothing to do with the async model (a Node.js sketch of the full chain follows these pointers for comparison).
Vert.x is the Java world's Node.js. Clojure support is on its roadmap. I would prefer Aleph, as you can work in both the async and non-async worlds as required.
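Since the question uses Node.js as the point of comparison, here is a minimal sketch of that read/compute/write/respond chain in Node.js with the official mongodb driver; the database, collection and field names are invented for illustration:
const http = require('http');
const { MongoClient } = require('mongodb');

const client = new MongoClient('mongodb://localhost:27017');

const server = http.createServer(async (req, res) => {
  try {
    const scores = client.db('app').collection('scores');
    const doc = await scores.findOne({ user: 'alice' });                 // async read
    const updated = { user: 'alice', total: (doc ? doc.total : 0) + 1 }; // plain CPU-bound calculation
    await scores.insertOne(updated);                                     // async write
    res.setHeader('Content-Type', 'application/json');
    res.end(JSON.stringify(updated));                                    // send the response
  } catch (err) {
    res.statusCode = 500;
    res.end(err.message);
  }
});

client.connect().then(() => server.listen(8080));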

Related

node.js package written in rust to perform db queries - how to make it efficient?

I'm a Rust newbie and want to write a Node.js package related to database querying.
I'm using napi-rs for the package.
In Node.js we have our own async machinery; in Rust there is a similar thing called "tokio" for async work.
I want to create an ORM for Node.js and learn Rust at the same time, so the idea is to construct queries on the Node.js side, execute them on the Rust side, and give the response back to Node.
I see two ways:
Use tokio with tokio-postgres
Not use tokio, and instead write my own database adapter library that relies on Node's async functionality and Node sockets. I'm not even sure I can do this, but I can try. Not sure if that makes sense.
The first way is much simpler, but will it work? Is it efficient to include tokio in a Node.js package?
The execute_tokio_future method of napi works just perfectly! I found details on how to use it here: https://forum.safedev.org/t/adventures-in-rust-node-js-and-safe/2959/10
I was able to get better benchmark results with the Rust addon than with a Node library, so the idea paid off.
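For completeness, the Node-facing side of such an addon is consumed like any other module; napi-rs exposes an async Rust function (for example one driven by execute_tokio_future) to JavaScript as a plain Promise. A minimal sketch, where the addon file name and the exported query function are assumptions:
// Hypothetical usage of the compiled napi-rs addon from Node.js.
// The addon file name and the exported `query` function are assumptions.
const { query } = require('./db-addon.node');

async function main() {
  // The Rust side runs the query on tokio; JavaScript just awaits the Promise.
  const rows = await query('SELECT id, name FROM users LIMIT 10');
  console.log(rows);
}

main().catch(console.error);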

Making a website with Node.js without a framework

I want to create a website based on Node.js and MySQL, but I've read that there is a framework called Express for Node.js, and I'm wondering whether I must use such a framework to create a decent website, or whether it is possible to do without it and just work with pure Node.js.
No framework is required. You can write a full-blown web server using only the http module, or, if you really want to write everything yourself, you can even do it with only the net module.
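For instance, a minimal server built on nothing but the built-in http module might look like this (a sketch, not a production setup):
const http = require('http');

const server = http.createServer((req, res) => {
  if (req.method === 'GET' && req.url === '/') {
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<h1>Hello from plain Node.js</h1>');
  } else {
    res.writeHead(404, { 'Content-Type': 'text/plain' });
    res.end('Not found');
  }
});

server.listen(3000);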
It's really about what is the most effective use of your time and skill as a developer. Except for academic or pure learning experience reasons, if you're just trying to accomplish a task as efficiently as possible and free, pre-existing, pre-tested code exists that makes your job easier, then that's a better way to go.
For example, if I need to do a file upload from a browser to my back-end and the data is coming in as the multipart/form-data content-type from the browser, I have zero interest in reading and learning the multipart/form-data RFC and then writing my own code to parse that content-type. Pre-existing, already tested code exists to do that for me and I'm adding no value to the goals of my project by re-implementing and then testing it all myself. Therefore, I'd like to use a pre-built module that does all that for me. I can just configure the right library on the right route and out plops my uploaded file, in only the amount of time it takes to understand the interface to the 3rd party module and how to use it properly.
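As one possible illustration (the answer doesn't name a specific module, so this choice is an assumption), the widely used multer middleware handles a multipart/form-data upload in a couple of lines:
const express = require('express');
const multer = require('multer');

const app = express();
const upload = multer({ dest: 'uploads/' }); // parsed files are written to ./uploads

// 'file' must match the field name used in the browser's form
app.post('/upload', upload.single('file'), (req, res) => {
  res.json({ savedAs: req.file.path, originalName: req.file.originalname });
});

app.listen(3000);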
This is where Express comes in. Not only does it offer a useful set of features and architecture for configuring routes, installing middleware, sending responses, using template engines, handling errors, etc... but there are also thousands of third party modules that are built to hook into Express and it is easiest to use them if you're using Express as your core framework. Some of these modules could be used outside of Express, some cannot - it really depends upon how they're designed and what Express interfaces they do or don't use.
Also, Express is fairly "un-opinionated" and fairly "lightweight" which means it doesn't force you into a particular methodology. It just offers you easier ways to do things you were already going to have to write code for yourself.
Look at it this way. When you get node.js, there are thousands of APIs that offer lots of already tested things such as a TCP library, a file I/O library, etc... Those are frameworks (in a sense) too. You don't have to use them either. You could rewrite whatever functionality you need from scratch. But, you wouldn't even think about doing that because tested code already exists that solves your problem. So, you happily build on top of things that are already done.
One of the BIG advantages of coding with node.js is getting access to the tens of thousands of pre-built modules on NPM that already solve problems that many people have. Coding in node.js with a mindset that you will never use any outside modules from NPM is throwing away one of the biggest advantages of coding with node.js.
Could you tell me what routes are used for in frameworks?
A route is a URL that you wish for your web server to respond to. So, if you want http://myserver.com/categories to be a URL that your server responds to, then you create a route for /categories so that you can write code for what should happen when that URL is requested. A framework like Express allows you to create that route very simply with just a single statement such as:
app.get('/categories', function(req, res) {
// put code here to handle that request
});
This is just the tip of the iceberg for what Express supports. It allows you to use wildcards in route definitions, identify parameters in urls, create middleware that does prep work on lots of routes (such as check if the user is logged in), etc...
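For example, URL parameters and middleware (both mentioned above) look roughly like this; requireLogin is a made-up middleware and assumes some session handling (e.g. express-session) has already been configured:
// Hypothetical middleware that runs before the route handler
function requireLogin(req, res, next) {
  if (!req.session || !req.session.user) {
    return res.status(401).send('Please log in');
  }
  next(); // hand control to the next handler in the chain
}

// ':id' is a URL parameter, available inside the handler as req.params.id
app.get('/categories/:id', requireLogin, function(req, res) {
  res.send('Showing category ' + req.params.id);
});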
You don't have to use a framework but it is recommended to use one of them since frameworks like Express make your life easier in many ways. Check this: What is Express.js?
Yes you CAN write a Node.js-based backend without any back end implementation framework such as Express. And if you are using Node.js for the first time without any previous experience of asynchronous coding, I'd advise against using Express, KOA or other Node implementation frameworks for your simple learner apps (e.g. those needing things like register/login form processing, logout button, user preference updates to database, etc) because:
(1) Node.js is a core skill for JavaScript back ends.
Stupid analogies between server tasking and restaurant waiters are no use to a real web engineer. You must first know what exactly Node can/cannot do in the server CPU that makes it different to most other back end technologies. Then you have to see how the Node process actually does this. Using Express/KOA/Hapi/etc you are sometimes effectively removing the mental challenges that come with a Node back end. Any time-saving is achieved at the expense of gaining a proper working understanding of what Node is and how it really operates.
(2) Learning Node.js and its asynchronous coding is hard enough without the added complication of coding with an unknown framework like Express/KOA that assumes users' familiarity with JavaScript constructs like callback functions and Promises. It's always better to learn something in isolation so you get the essence of its individual effects, rather than the overall effects if used with other packages/frameworks. So many of these Node.js Express tutorials are the software equivalent of learning to make a cake by watching Momma do it. We can copy it but we don't know how or why it's working. Professional coders can't just be good copycats.
(3) Available learning tutorials using Express often drag in other technologies like MongoDB, Mongoose, Mustache, Handlebars, etc that make learning Node.js even more awkward still.
(4) A share of basic web apps can be written more efficiently with Node.js, custom JS and existing JS modules imported off the npm repository rather than with Express.
(5) Once asynchronous coding and the JavaScript constructs available to assist with it are understood clearly, pure Node.js apps for basic tasks aren't that hard (a small sketch follows at the end of this answer).
(6) After you do get your head around Node.js and can get basic web app functionalities working using server-side JavaScript constructs, you can then judiciously start to explore Express/Hapi/KOA/etc and see what an implementation framework can do for your workflow when doing larger projects needing numerous functionalities. At this point you know what Express code should be doing and why it is done the way it is.
Node.js has become the back-end technology of choice for most small to medium scale web applications over the last 10 years. It is also the major reason why the JavaScript language has evolved from a mere front-end scripting tool with a limited set of Java-aping constructs to the innovative and comprehensive language that it is today. It is also among the most popular languages in use today. Investing time in understanding the Node server platform, and the latest JavaScript constructs used in Node, is time well spent. Implementation frameworks such as Express, KOA, Hapi, Sails, etc have great benefit when writing more elaborate back ends on the Node.js platform. But all these implementation frameworks are predicated on the behaviour patterns of Node.js. So unless Node itself is understood first, the full utility of Express/KOA/Sails/etc will never be enjoyed.
Try here for the pure Node.js.
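To make point (5) concrete, here is a rough sketch of handling a login form POST with nothing but the built-in http and querystring modules; the field names and the credential check are placeholders:
const http = require('http');
const querystring = require('querystring');

http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/login') {
    let body = '';
    req.on('data', chunk => { body += chunk; });          // collect the body chunks
    req.on('end', () => {
      const form = querystring.parse(body);               // e.g. user=alice&pass=secret
      const ok = form.user === 'alice' && form.pass === 'secret'; // placeholder check
      res.writeHead(ok ? 200 : 401, { 'Content-Type': 'text/plain' });
      res.end(ok ? 'Welcome ' + form.user : 'Invalid credentials');
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);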

Vert.x, Node or Play for calling multiple web services asynchronously

I am developing a project, and in this project I need to call multiple concurrent web services (at least 5 web services) asynchronously.
For that, which framework can be used: Vert.x, Node, or Play?
thanks
In Play a lot works out of the box. Node.js will be nice if you can spare some time for writing your own tools.
All of these frameworks can be used.
Disclaimer: I work on the Play framework.
Given Play's Scala heritage, even if you're using Java, we provide Promises so that you can reason about the flow of making async calls without suffering from what is known as "callback hell". You may want to consider using promises for Node also... I believe they are available. I think Vert.x may offer something there too. I'm unsure, but I don't believe Node and Vert.x provide promises out of the box.
You may find this page useful: http://www.playframework.com/documentation/2.2.x/ScalaWS
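In Node.js the same idea (firing several web service calls concurrently and combining the results) can be expressed with Promises. A minimal sketch using only the built-in https module; the service URLs are placeholders:
const https = require('https');

// Wrap a plain https.get call in a Promise so several calls can be combined
function fetchBody(url) {
  return new Promise((resolve, reject) => {
    https.get(url, res => {
      let body = '';
      res.on('data', chunk => { body += chunk; });
      res.on('end', () => resolve(body));
    }).on('error', reject);
  });
}

// Fire all the calls concurrently and wait until every result has arrived
Promise.all([
  fetchBody('https://service-a.example.com/data'),
  fetchBody('https://service-b.example.com/data'),
  fetchBody('https://service-c.example.com/data'),
  // ...two more services
]).then(results => {
  console.log('All responses:', results);
}).catch(err => {
  console.error('One of the calls failed:', err);
});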
Play's documentation is not so good. If you are thinking of implementing in Scala, Play can be a good option, but for Java you may not find it so great, as a few things are not supported in Java (e.g. to write a body parser of your own you need to use Scala).
Node.js can be a good choice; however, utilizing all the CPU cores in Node.js is hard. There is a framework, JXcore, which claims to have a solution for this, but I have not used it.
Vert.x IMO can be a good framework; it makes good use of all CPU cores and provides N event loops. Optionally, you can use the worker thread pool if you really need to do a CPU-intensive operation and stay responsive.
You can use Vert.x with the RxJava module (https://github.com/vert-x/mod-rxvertx). You can combine your async results in any way you want. The rxvertx module supports wrappers for EventBus, HttpServer, HttpClient, NetServer, NetClient and Timer.

Simulate website load on Node.js

I am thinking of creating my own simple load test, where I can hit my website with multiple requests (like 100-1000 concurrent users) to see how it performs. I want to try Node.js out, but I don't know if it is the wrong technology for the job, since Node.js doesn't use threads?
Can I, with the async model that Node.js uses, simulate the many user requests, or would that be more appropriate to do in another language like Ruby/.NET/Python?
Node.js ought to be perfect for the task. I do this at work. The one crucial piece that you will have to change is the HTTP socket pool. The following code snippet will disable pooling entirely, letting you starve your Node.js process if you want to.
var http = require('http');
// Passing `agent: false` in the request options disables connection pooling
var req = http.request({ /* ...your host, path, etc... */ agent: false });
You can read more about this in the http.Agent documentation.
Your concern about threads is astute, but even if you hit that limit (Node is very good at keeping your resources efficient) the solution is simple: start multiple instances (processes) of your load test. As it is, you may have to use multiple machines entirely to correctly simulate load.
In any case, you will not win automatically by using Ruby or Python for this. Asynchronous programming is ideal for I/O and network-bound tasks, and Node excels at this. Similarly, while Ruby and Python have third-party asynchronous frameworks, they're by definition more obscure than the standard asynchronous framework given in Node.
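Putting those pieces together, a bare-bones load generator might look like the sketch below; the target host, port and concurrency level are placeholders, and agent: false disables pooling as described above:
const http = require('http');

const TARGET = { host: 'localhost', port: 3000, path: '/', agent: false };
const CONCURRENCY = 200; // e.g. 200 simultaneous "users"

let completed = 0;

function hit() {
  const req = http.request(TARGET, res => {
    res.resume();                 // drain the response body
    res.on('end', () => {
      completed++;
      hit();                      // immediately issue the next request
    });
  });
  req.on('error', () => hit());   // keep going on connection errors
  req.end();
}

for (let i = 0; i < CONCURRENCY; i++) hit();

setInterval(() => console.log(completed + ' requests completed'), 1000);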
Node can fire off pretty much as many requests as you want it to (though you may have to change the defaults for http.Agent). You're more likely to be limited by what your OS can do than by anything inherent in node (and of course such limitations will apply in any other language you use).
It's simple to create load tests with nodeload.

Is there something like Perl's Catalyst routing for node.js?

Perl's Catalyst framework has an excellent URL dispatching/routing mechanism that allows chaining methods together to modularize routing.
Through rigorous application of the Scientific Method, I have determined that it is 1942.49 times better than Rails-style routing for my current projects. I'm currently writing something using node.js.
Is there a framework for node that uses catalyst-style dispatching (especially "chaining")?
If you are using Express you could potentially use middleware (http://expressjs.com/guide.html#middleware) and Connect (https://github.com/senchalabs/connect) to simulate a similar solution.
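As a rough sketch of what that can look like (the handler names are invented), successive middleware functions on a single route behave much like Catalyst's chained actions, each doing part of the work and passing control along:
const express = require('express');
const app = express();

// Each step does one piece of work, then passes control along the chain
function loadUser(req, res, next) {
  req.user = { id: req.params.userId };       // stand-in for a database lookup
  next();
}

function loadPosts(req, res, next) {
  req.posts = ['first post', 'second post'];  // stand-in for another lookup
  next();
}

app.get('/users/:userId/posts', loadUser, loadPosts, (req, res) => {
  res.json({ user: req.user, posts: req.posts });
});

app.listen(3000);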
I hope this helps a bit with this question :).
