I'm currently in the planning stages of a new project composed of a storefront, a highly reactive user dashboard, and individual products (offered through the storefront) that are themselves highly interactive mini-apps. We're hoping we can get away with building the entire platform as a SPA, designed around a Flux architecture with React for the front-end views.
One issue, as with most SPAs, is SEO. I've prototyped an isomorphic solution based on the este.js dev stack. The complication is that our app consumes pretty much all of its data from a RESTful server, which is separate from the web server serving up the SPA. This means the web server would need to fetch a considerable amount of data from the REST server to isomorphically generate an HTML snapshot.
I've considered having a separate crawler process of my own crawl the entire storefront periodically and isomorphically generate HTML snapshots of the pages that could be served up when the web server encounters a search engine crawler. I'm not sure if this is a good approach though, as it would likely introduce additional maintenance and, frankly, seems a bit fragile. I could just have the web server isomorphically generate the HTML on the fly, but I fear bogging the server down for ordinary users as the server would be pulling considerable data from the REST API...
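For reference, the on-the-fly option would look roughly like this - a simplified sketch rather than our actual prototype, with ProductPage, the route and the internal REST URL as placeholders:

// server.js - sketch of on-the-fly isomorphic rendering (names are placeholders)
var express = require('express');
var React = require('react');
var ReactDOMServer = require('react-dom/server');
var fetch = require('node-fetch');
var ProductPage = require('./components/ProductPage'); // hypothetical component

var app = express();

app.get('/products/:id', function (req, res, next) {
  // The web server must first pull the data from the separate REST server...
  fetch('http://rest.internal/api/products/' + req.params.id)
    .then(function (r) { return r.json(); })
    .then(function (product) {
      // ...before it can render the HTML snapshot for crawlers and first loads.
      var html = ReactDOMServer.renderToString(
        React.createElement(ProductPage, { product: product })
      );
      res.send(
        '<!doctype html><html><body><div id="app">' + html + '</div>' +
        '<script>window.__INITIAL_STATE__ = ' + JSON.stringify(product) +
        '</script></body></html>'
      );
    })
    .catch(next);
});

app.listen(3000);

Every crawler hit (and every first page load) costs one or more round trips to the REST API, which is exactly the load I'm worried about.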
Is there a better way to handle such a case?
Check out Yahoo's Fetchr, an open-source library that allows you to hit your API isomorphically. It ties into Facebook's Flux architecture, so you need to have Stores, but at the very least you can glean some code and concepts from it. Since you're still in the planning phase, you might even consider going with Fluxible itself.
http://fluxible.io/guides/data-services.html
https://github.com/yahoo/fetchr
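If it helps, the core idea behind fetchr is an isomorphic data-access layer: the same read() call works on the server (talking to your REST backend directly) and in the browser (going through an XHR endpoint). Here's a rough sketch of that idea - not the actual fetchr API, so check its README for the real service format; the resource URLs are placeholders:

// productService.js - sketch of the isomorphic data-access idea (not fetchr's real API)
var isServer = typeof window === 'undefined';

function read(resource, params) {
  if (isServer) {
    // On the server: hit the REST backend directly, no extra hop.
    var fetch = require('node-fetch');
    return fetch('http://rest.internal/api/' + resource + '/' + params.id)
      .then(function (r) { return r.json(); });
  }
  // In the browser: go through the web server's proxy endpoint instead
  // (assumes a fetch polyfill for older browsers).
  return window.fetch('/api/' + resource + '/' + params.id)
    .then(function (r) { return r.json(); });
}

module.exports = { read: read };

A Flux action creator can call read() and dispatch the result to a store, and that same action works identically during server-side rendering and in the browser.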
I have built a REST API with Express and Mongoose, and now it's time to connect it to a front end, but I'm a bit confused. I started building the front end of the app with hbs as the templating engine, and that works well, but I was also considering React, for example, which brings me to my question.
What is the best solution here? To build the whole app in one folder, so to speak, with a templating engine taking care of the front end, or to create the API, host it, and then consume it from a separate front-end application? Is it a matter of preference, or is one approach better than the other?
This really is a matter of preference. There are many benefits and trade-offs for each method. Here are some:
API with Single-Page Application
Benefits
Easier to make a more dynamic app
With an API you can allow third-parties to integrate with your app easily
Fast - after the app has been initially downloaded, only data needs to be transferred!
API and frontend separation can help keep business logic in one place (on the backend)
Offline and caching are easy!
Downsides
SEO isn't as easy (but still very much possible)
Slow - if your app is big, the initial download can be slow (there are lots of solutions for this)
Multi-Page Application
Benefits
Fast (page download can be faster)
SEO is slightly easier
More secure by default (SPAs are more exposed to cross-site scripting)
Downsides
Slow - unlike a SPA, you have to download every page
Harder to build and debug
This is by no means a comprehensive list of trade-offs, but hopefully it will help you make an informed decision. Personally, I prefer the SPA approach, both because I have multiple sites/apps using one backend and because of the ease of development.
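To make the difference concrete, here's a minimal sketch (routes and data are made up) of the same Express app serving the same data two ways - rendered server-side with a template engine, and exposed as JSON for a separate front-end app:

// app.js - same data, two delivery styles (routes/data are hypothetical)
var express = require('express');
var app = express();
app.set('view engine', 'hbs'); // requires the hbs package

// Pretend this is your Mongoose query.
function loadArticles() {
  return Promise.resolve([{ title: 'Hello', body: '...' }]);
}

// Multi-page style: the server renders HTML from an hbs template.
app.get('/articles', function (req, res, next) {
  loadArticles()
    .then(function (articles) { res.render('articles', { articles: articles }); })
    .catch(next);
});

// API style: the server returns JSON and a separate SPA renders it.
app.get('/api/articles', function (req, res, next) {
  loadArticles()
    .then(function (articles) { res.json(articles); })
    .catch(next);
});

app.listen(3000);

Nothing stops you from starting with the hbs routes and adding the /api routes later (or vice versa); the real decision is where the rendering and business logic should live.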
I have decided on a stack for a web app project. It's as follows:
Express.js + Knex + PostgreSQL backend as a web API layer.
Vue.js as the front end.
I have completed a rough design of the whole system, but I am stuck on the implementation: do I build the UI first and then flesh out the API, or vice versa?
Usually you build both at the same time - preferably by two different teams, to minimize tight coupling and leaky abstractions. Sometimes the API is built first and then the web, mobile, or other frontends are built on top of it. Sometimes a frontend is built first, as if the API already existed, and it results in a solid specification for the API to be built later. Sometimes the specification is created first and then both the backend and the frontend(s) are built to follow the spec. It all depends on the specific work style and requirements. It's more important how you do it than when.
I'm working alone on a personal project, and my approach has been to work only on the frontend first, mocking the HTTP layer with a realistic mock that emulates real API behaviour, and only at the end moving on to the API development.
I decided on this approach because, in my experience, no matter how clear the model and functional specifications are, they will always be subject to change requests, and you can limit the side effects on your development workflow by testing and interacting with the actual UI.
You will then find that the API development can be completed in a matter of weeks rather than months, with a much clearer understanding of your (or your client's) needs.
Hope this helps.
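For example, the mock can be a tiny throwaway Express server that the frontend talks to during development - a minimal sketch with made-up endpoints and data:

// mock-api.js - throwaway mock API (endpoints and data are made up)
var express = require('express');
var app = express();
app.use(express.json());

var users = [{ id: 1, name: 'Ada' }];

// Emulate real API behaviour: latency, status codes, validation errors.
function respond(res, body, status) {
  setTimeout(function () { res.status(status || 200).json(body); }, 150);
}

app.get('/api/users', function (req, res) { respond(res, users); });

app.post('/api/users', function (req, res) {
  if (!req.body.name) return respond(res, { error: 'name is required' }, 400);
  var user = { id: users.length + 1, name: req.body.name };
  users.push(user);
  respond(res, user, 201);
});

app.listen(3001); // point the Vue dev server's proxy at this port

When the real Express + Knex + PostgreSQL API is ready, you swap the base URL and keep the UI code untouched.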
I have an existing web application. It uses Spring MVC on Tomcat and MySQL at the back end, with transactions (multi-table updates). I am also using Spring Security for role-based authorization. This is a very typical website without any real-time notifications, chat, etc.
Now my client wants to add real-time notifications like Facebook's, a chat module, etc. Typically some action will be taken on the front end, and all (or specific) logged-in users need to be notified. After receiving a notification, I need to update some <div> content. The frequency would be high. (Currently users need to refresh the browser.)
I completed POCs using Node.js/Express, and it looks like it's easy to accomplish these two things with Node.js.
I have 3 options:
Move the front end to Node.js and maybe Angular.js. Manage form validation/security through Node.js/Angular.js, but keep all database calls managed by my old website (in a RESTful manner), since I can reuse my existing stuff. Most of the method calls currently return a tile, but we can easily convert them to return JSON objects.
Keep existing website as it is and plug in Node.js into it just for real time notification / chat etc.
Scrap the existing website and redo everything, including security and transactions, using Node.js. There are many posts saying that Node.js is relatively new and not preferable for enterprise applications, and this might also be too much work.
Approach 2 would be my preferred approach, as we have expertise in Spring and Node would be completely new for us, but I don't know whether it's a practical and doable approach, or what kinds of issues I might face. I am also not sure how to integrate Spring MVC running on Tomcat with Node.js.
Could anybody please help me decide which of the three is the best way to go? Is there any other approach that might be easier?
You already have an existing Spring MVC codebase, and you can reuse it. You can go ahead with AngularJS as your front-end technology: your AngularJS code can talk to your existing Spring MVC backend via its REST API, and to Node.js for the real-time features you plan to develop.
You can use Socket.IO within Node.js to provide the real-time features you are looking for.
One major problem you might face is session handling when talking to two different backend stacks.
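A minimal sketch of what the Node.js side could look like - the event names, ports and /notify hook are hypothetical, not an established integration pattern:

// notifications.js - small Socket.IO service running next to the Spring MVC app
var express = require('express');
var http = require('http');

var app = express();
app.use(express.json());

var server = http.createServer(app);
var io = require('socket.io')(server);

io.on('connection', function (socket) {
  // The Angular client registers itself (in reality, send a token Spring can verify).
  socket.on('register', function (userId) {
    socket.join('user:' + userId);
  });
});

// Internal hook: the Spring app (or a Redis subscriber) calls this when something happens.
app.post('/notify', function (req, res) {
  if (req.body.userId) {
    io.to('user:' + req.body.userId).emit('notification', req.body.payload);
  } else {
    io.emit('notification', req.body.payload); // broadcast to all connected users
  }
  res.sendStatus(204);
});

server.listen(4000);

On the Angular side the client connects with socket.io-client, listens for 'notification' events and updates the <div> content. The hard part, as mentioned, is sharing the authenticated session or token between Tomcat and Node.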
We are going to build a big social web app. We have to implement two big modules:
FrontEnd - single page app (Backbone.js)
CMS - a system to manage the contents of the FrontEnd (daily content, sponsors, banners, links, special offers, media uploads, etc.)
The FrontEnd will use a Node.js-powered REST API, which will use a DB in the cloud (PG or Mongo - we haven't decided yet).
My question is: should the CMS also use the same REST API as the FrontEnd? Or should we make a separate app (not necessarily Node.js) for the CMS that would talk to the cloud DB directly? My question arises because on a previous project we had this issue:
Single REST api for FrontEnd and CMS.
When we wanted new functionality in the CMS, we had to implement it in the REST API - and then we had to restart the whole app (the REST API), which was problematic in production...
So:
Implement two REST APIs - one for the FrontEnd and one for the BackEnd (CMS)?
Implement one REST API for the FrontEnd and implement the CMS as a separate app talking directly to the database?
How do you do it?
Our goal is to implement a super-fast FrontEnd and a big/heavy CMS (it is going to be bigger than the FrontEnd). So we are thinking of completely separating the CMS module from the FrontEnd module. Any eventual need for communication between the modules would be implemented through Redis pub/sub, for example. What do you think?
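The kind of decoupled communication we have in mind is roughly this - a sketch using the classic node_redis callback API, with made-up channel and event names:

// In the CMS process: announce that content changed.
var redis = require('redis');
var pub = redis.createClient();
pub.publish('content:updated', JSON.stringify({ page: 'home', id: 42 }));

// In the FrontEnd REST API process: react without a restart or direct DB coupling.
var sub = redis.createClient();
sub.subscribe('content:updated');
sub.on('message', function (channel, message) {
  var event = JSON.parse(message);
  // e.g. invalidate a cache entry or push the change to connected clients
  console.log('content updated:', event.page, event.id);
});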
Software architecture decisions are always very contextual - the people most qualified to make the call are you and your team, since you know way more than we do. That being said, based on the info you've shared, here are some things to consider:
Content management as a problem space is pretty mature. Unless part of your revenue model involves innovations in how you handle content management, you would be unwise to build your own. There are fantastic CMSes, both open source and commercial, ranging in price from hundreds of dollars to hundreds of thousands. I cannot caution you strongly enough against the common developer fallacy of discounting the value of our own time. Even if you spend an entire engineer's salary worth on a CMS, you'll almost certainly come out ahead.
An architecture that uses a CMS should reflect the reality of #1: CMSes are mature and stable. You want a strong, well-defined interface boundary between the parts of your system that are unique to you and specific to your revenue model, and the parts that are interchangeable with COTS (commercial off-the-shelf) software - even if you do end up building the CMS yourself (which, again, I strongly caution against). If you design something as if it's bespoke when it's not (or vice versa), you'll run into impedance-mismatch problems that are very hard to get out of and that create friction for new feature delivery across the entire system.
I'm currently building a new web application with user registration, profiles, image upload and so on. I was using the MEAN stack (MongoDB, Express.js, Angular, Node.js) for previous projects and now want to try out CouchDB.
CouchDB delivers a REST API for free. I could shift all the logic to the client and make sure the input is valid using CouchDB's validation functions. That way I could make requests from the client directly to the database and would not have to code annoying things like CRUD operations in my Express.js controllers. Authentication, validation and simple CRUD operations - it's all there for free.
Is there a reason not to do so? Otherwise I would have to pass each request to my server and then pass it on to CouchDB from there, which pretty much eliminates all the nice benefits over MongoDB.
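To be clear, by CouchDB's validation functions I mean a validate_doc_update function in a design document, roughly like this (a simplified example; the document fields are made up):

// Stored as a string in a design document, e.g. _design/app
function validate_doc_update(newDoc, oldDoc, userCtx, secObj) {
  // userCtx is filled in by CouchDB from the authenticated session
  if (!userCtx.name) {
    throw({ unauthorized: 'You must be logged in.' });
  }
  // example rules for a made-up "profile" document type
  if (newDoc.type === 'profile' && newDoc.owner !== userCtx.name) {
    throw({ forbidden: 'You can only edit your own profile.' });
  }
  if (newDoc.type === 'profile' && !newDoc.username) {
    throw({ forbidden: 'username is required.' });
  }
}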
greetings,
Michel
I think your proposal is at least theoretically sound, and you might want to go ahead and do it, perhaps forwarding requests from the browser to CouchDB with a reverse proxy like nginx or node-http-proxy. I believe there are products on the market espousing this "no application server" architecture, such as parse.com, which provides some social proof that the idea is at least interesting and worth exploring.
However, I think you will at some point discover why there is such a thing as an application server, and why people use them and write code for them in nearly every application. Debugging problems in your CouchDB data-validation code is probably going to be cumbersome at best. Compare that to the excellent debugging experience you get for Node.js code with node-inspector and the Chrome developer tools debugger.
CouchDB is also probably not going to provide realistically granular authorization capabilities. This means your application will eventually be exposed to malicious users simply doing a PUT with the right document ID and gaining access to data they are not authorized to see or change.
Very few applications are simple enough that UI + DB can handle all of the data transitions and operations that are needed. You could in theory code some of this logic in the browser, but having the Internet between your compound query logic and your database adds so much latency that some features become impossible, especially if you have to do a query, get some results, and then do a secondary query based on each of those results. That is sometimes feasible between a server-side application and its CouchDB, but doing it across the Internet will suffer from the latency.
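Even a very thin application server avoids most of these problems. Something as small as this sketch (the CouchDB URL, the document fields and the header-based auth check are placeholders) keeps authorization and compound queries on the server while CouchDB still does the storage work:

// thin application server in front of CouchDB (URLs, fields and auth are placeholders)
var express = require('express');
var fetch = require('node-fetch');

var COUCH = 'http://localhost:5984/mydb';
var app = express();

// Hypothetical auth middleware - in reality verify a session or token here.
function requireUser(req, res, next) {
  if (!req.headers['x-user-id']) return res.sendStatus(401);
  req.userId = req.headers['x-user-id'];
  next();
}

// The browser never talks to CouchDB directly, so it can't PUT arbitrary doc ids.
app.get('/profiles/:id', requireUser, function (req, res, next) {
  fetch(COUCH + '/' + encodeURIComponent(req.params.id))
    .then(function (r) { return r.json(); })
    .then(function (doc) {
      if (doc.owner !== req.userId) return res.sendStatus(403);
      res.json(doc);
    })
    .catch(next);
});

app.listen(3000);

You also get a natural place to debug with node-inspector, and to run the multi-step queries that are too slow to do from the browser.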