Building Servers For Web Applications [closed] - node.js

Question(s)
I would like to deploy a server for a relatively basic HTML, CSS, JS website. I want it to be able to run all of the time, and for potentially thousands of people to view it each day. Technically I will not need to store any data beyond what is included in the code.
a. Can I use Node.JS to do this by setting up a local server, e.g. by connecting a domain to it?
b. Could I use Node.JS to do this through alternative means, rather than using a local server?
c. Would a service like Firebase solve the problem and remove the need to build any server with Node.JS?
d. If I did need to store a lot of data (e.g. sales orders, email addresses, account info), would that change any of the answers?
I have put a lot of research into the best way to host my website, but these are the questions that remain unresolved. Many thanks!

Firebase Hosting is a good fit for a static website.
Firebase Functions can be used if you need any computation or data manipulation.
Firebase's database can be used for storage and retrieval of data as documents.
As per your questions:
a. Yes, you can serve your files with a static file server such as serve (see the sketch below). You will then need a static IP and a domain pointed at it.
b. You can use Firebase Hosting or GitHub Pages, but you can't run server-side Node.JS code in that case.
c. Yes, it would definitely cover all of this; the free Spark plan is enough for an initial deployment.
d. In that case, go for Firebase's database or MongoDB Atlas.
I hope this will help you.
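For (a), here is a minimal sketch of serving a static site from Node.js, assuming Express is installed and your HTML, CSS and JS live in a public directory (the file names and port are assumptions, not anything prescribed by the question):

// server.js - minimal static file server sketch
// Assumes: npm install express, and site files in ./public
const express = require('express');
const path = require('path');

const app = express();
const PORT = process.env.PORT || 3000; // hypothetical port

// Serve everything in ./public (index.html, styles, scripts)
app.use(express.static(path.join(__dirname, 'public')));

app.listen(PORT, () => {
  console.log('Static site listening on port ' + PORT);
});

Run it with node server.js on the machine that has the static IP, then point your domain's DNS at that IP.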

You can use Heroku (https://www.heroku.com/).
You can run a server on it completely for free for your first few hundred users. If you want to include a DB, you can use MongoDB Atlas for storage; if you don't really need a DB, you can just use Heroku on its own as a free server.
The documentation is at https://devcenter.heroku.com/.
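As a hedged sketch of what a Heroku deployment of a plain Node server looks like (the file name is an assumption): Heroku assigns the port through the PORT environment variable, and a Procfile with a line such as "web: node index.js" tells it how to start the app.

// index.js - minimal server sketch for Heroku
// Heroku injects the port to bind to via process.env.PORT
const http = require('http');

const PORT = process.env.PORT || 5000; // 5000 is just a local fallback

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from Heroku\n');
}).listen(PORT, () => console.log('Listening on ' + PORT));

Deploying is then a matter of pushing the repository to Heroku (git push heroku main, or master on older stacks).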

Related

Nginx as a web server or Node JS with Cloudfront CDN [closed]

I have read in many articles that the main advantage of using Nginx as a web server is performance, especially because it serves static data quickly.
I already use the CloudFront CDN (Amazon Web Services) to speed up delivery of static data.
So, do I have any real reasons to prefer using Nginx over Node JS as a web server?
No advantage in your case. Choosing a CDN to deliver your static content was a great idea. (CDNs not only serve your content, they cache it around their network so most locales see equivalent performance.) This offloads a significant amount of labor from your NodeJS application server.
However, NGINX can be very useful in conjunction with an application server like NodeJS. Most people use NGINX as a reverse proxy; that is, it sits in front of a cluster of application servers and distributes the traffic load evenly.
Other cool tricks include hot-swapping the NGINX configuration for blue-green deployments, so you never have to halt your service for an upgrade.
If you have the money and the time, these are tricks well worth having up your sleeve.
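To illustrate the reverse-proxy setup described above, here is a minimal nginx configuration sketch fronting two hypothetical Node instances (the ports and domain are assumptions):

# Two Node.js app instances behind one nginx front end
upstream node_app {
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
}

server {
    listen 80;
    server_name example.com;  # assumed domain

    location / {
        proxy_pass http://node_app;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

Reloading this configuration (nginx -s reload) swaps upstreams gracefully, which is the basis of the blue-green trick mentioned above.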
It depends on what you consider "real reasons" to be.
I believe the most important reason is security: Nginx is a dedicated web server, whereas Node.js is a JavaScript runtime, so Nginx focuses on serving-related concerns while Node.js's built-in server is just an additional utility and does not receive the same attention. Another benefit of such a deployment is configurability: for example, you can switch your Node runtimes with virtually no downtime (since you can run two Node.js instances at the same time), or even slowly shift traffic to your new server.
You can also take a look at: Using Node.js only vs. using Node.js with Apache/Nginx and http://blog.modulus.io/supercharge-your-nodejs-applications-with-nginx
Perhaps you may be interested in a more general question as well, regarding application servers vs web servers: What is the difference between application server and web server?

Automating Node and Angular frontend/backend Integration, production preparation and deployment? [closed]

I have been experimenting with Node.js for a while and have hacked together simple projects with Express, Hapi and Restify. The frontends on these small projects were very small and limited, as I was primarily focused on the backend, so I never had the need to automate a lot of the details when deploying them to production servers.
I recently started getting into front end development and stumbled across some great automation tools: Yeoman (overall workflow), Yo (for scaffolding), Grunt (automation) and Bower (for dependencies).
I absolutely love the workflow of these tools and they have gotten me really excited about trying to learn front end development, structuring and workflow. However, learning all of this has also raised questions on the proper way to automate other areas of workflow that integrate the front and back ends:
1) Should I be maintaining the frontend and backend in the same repository? They are both written in JavaScript but it seems semi-clunky. I know this is a matter of preference but I would love to see some thoughts on current best practices. Would it be best to have the public directory in my app be a symlink to another?
2) How do I wire my frontend to my backend? I am specifically talking about socket.io calls, and I am not sure of the best way to automate setting this variable. On the backend I would use process.env, but I am unsure how to wire the two together based on test and production settings. I do not want to change this by hand every time I deploy to a server.
1) I like the idea of keeping them in the same repository. It makes your life much easier as you go along. One thing you will be happy about: once your server and web pages are served from the same web server, you don't need to deal with any CORS issues, or run a reverse proxy just to resolve them.
2) We use AngularJS with socket.io calls. Since you use Yeoman, you can read a config file into the Gruntfile.js with grunt.file.readJSON('config.json'). You can reuse your Grunt scripts if you have a CI.
// Shape of the server section in config.json, read via grunt.file.readJSON('config.json')
nodeServer: {
  prod: {
    port: 8080 // other params
  }
}
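For the socket.io wiring in (2), one approach, sketched here with hypothetical file names and a hypothetical socketUrl key, is to have Grunt read the same config.json and emit a small script that the frontend loads before the app code:

// Gruntfile.js (sketch) - emit an environment-specific client config
module.exports = function (grunt) {
  // e.g. config.json: { "dev": { "socketUrl": "http://localhost:8080" },
  //                     "prod": { "socketUrl": "https://example.com" } }
  var config = grunt.file.readJSON('config.json');

  grunt.registerTask('write-client-config', function (env) {
    var settings = config[env || 'dev'];
    // The frontend includes the generated file via <script src="config.js"></script>
    grunt.file.write(
      'app/config.js',
      'window.APP_CONFIG = ' + JSON.stringify({ socketUrl: settings.socketUrl }) + ';'
    );
  });
};

The browser code then connects with io(window.APP_CONFIG.socketUrl), and your CI runs grunt write-client-config:prod before deploying, so nothing has to be changed by hand.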

How do companies like facebook release features slowly to portions of their user base? [closed]

I like how Facebook releases features incrementally and not all at once to their entire user base. I get that this could be replicated with a bunch of if statements scattered throughout your code, but there has to be a better way to do this. Perhaps that really is all they are doing, but that seems rather inelegant. Does anyone know if there is an industry standard for an architecture that can incrementally release features to portions of a user base?
On that same note, I have a feeling that all of their employees see an entirely different, completely beta view of the site. So it seems they are able to deem certain portions of their website beta and others production, and have some sort of access control list to guide what people see? That seems like it would be slow.
Thanks!
Facebook has a lot of servers, so they can apply new features to only some of them. They also have servers where they test new features before committing them to production.
A more elegant solution is if statements combined with feature flags, using a system like Gargoyle (in Python).
Using a system like this you could do something like:
if feature_flag.is_active(MY_FEATURE_NAME, request, user, other_key_objects):
    # do some stuff
In a web interface you would be able to describe users, requests, or any other key object your system has, and deliver your feature to them. In fact, via requests you could do things like direct X% of traffic to the new feature, and thus run A/B tests and gather analytics.
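If you are doing this in Node.js rather than Python, a minimal sketch of the percentage-rollout idea looks like the following (the flag names and percentages are made up, and a real system would keep the flags in a database behind an admin UI):

const crypto = require('crypto');

// Map a user id deterministically into a 0-99 bucket
function bucketFor(userId, featureName) {
  const hash = crypto.createHash('md5').update(featureName + ':' + userId).digest();
  return hash.readUInt32BE(0) % 100;
}

// Hypothetical flag table: feature -> percentage of users who should see it
const FLAGS = { new_feed: 10, dark_mode: 50 };

function isActive(featureName, userId) {
  const rollout = FLAGS[featureName] || 0;
  return bucketFor(userId, featureName) < rollout;
}

// Usage: the same user always lands in the same bucket, so the rollout is sticky
if (isActive('new_feed', 'user-42')) {
  // render the new feed
}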
An approach to this is to have a tiered architecture where the authentication tier hands off to the product tier.
A user enters the product URL and that is resolved to direct them to a cluster of authentication servers. These servers handle authentication and then hand off the session to a cluster of product servers.
Using this approach you can:
Separate out your product servers into 'zones' that run different versions of your application
Run logic on your authentication servers that decides which zone to route the session to
As an example, you could have Zone A running the latest production code and Zone B running beta code. At the point of login the authentication server sends every user with a user name starting with a-m to Zone A and n-z to Zone B. That way roughly half the users are running on the beta product.
Depending on the information you have available at the point of login you could even do something more sophisticated than this. For example you could target a particular user demographic (e.g. age ranges, gender, location, etc).
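As a toy illustration of the login-time routing described above (an Express handler, with the zone URLs and the a-m/n-z split as assumptions):

const express = require('express');
const app = express();

// Hypothetical product zones running different versions of the application
const ZONES = {
  A: 'https://zone-a.example.com', // latest production code
  B: 'https://zone-b.example.com', // beta code
};

app.post('/login', express.urlencoded({ extended: false }), (req, res) => {
  const username = (req.body.username || '').toLowerCase();
  // Usernames starting with a-m go to Zone A, the rest to Zone B
  const zone = /^[a-m]/.test(username) ? 'A' : 'B';

  // ...authenticate here, then hand the session off to the chosen zone
  res.redirect(ZONES[zone]);
});

app.listen(3000);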

Are there any examples of group data-sharing using a replicated database, such as CouchDB? [closed]

Background: I am working on a proposal for a PHP/web-based P2P replication layer for PDO databases. My vision is that someone with a need to crowd-source data sets up this software on a web server, hooks it up to their preferred db platform, and then writes a web app around it to add/edit/delete data locally. Other parties, if they wish, may set up a similar thing - with their own web apps written around it - and set up data-sharing agreements with one or more peers. In the general case, changes made to one database are written to another on a versioned basis, such that they eventually flow around the whole network.
Someone has asked me why I'm not using CouchDB, since it has bi-directional replication and record versioning offered as standard. I wasn't aware of these capabilities, so this turns out to be an excellent question! It occurs to me, if this facility is already available, are there any existing examples of server-to-server replication between separate groups? I've done a great deal of hunting and not found anything.
(I suppose what I am looking for is examples of "group-sourcing": give groups a means to access a shared dataset locally, plus the benefits of critical mass they would be unable to build individually, whilst avoiding the political ownership/control problems associated with the traditional centralised model.)
You might want to check out http://refuge.io/
It is built around couchdb, but more specifically to form peer groups.
Also, here is a Couchbase-sponsored case study of replication between various groups:
http://site.couchio.couchone.com/case-study-assay-depot
This can be achieved on standard CouchDB installs.
Hope that gives you a start.
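For reference, a hedged sketch of starting continuous bi-directional replication between two peers on standard CouchDB installs, using the _replicate endpoint (the hosts, database name, and credentials are assumptions, and authentication is simplified):

// Requires Node 18+ for the built-in fetch
const LOCAL = 'http://localhost:5984';          // assumed local CouchDB
const PEER = 'http://peer.example.com:5984';    // assumed remote peer
const AUTH = 'Basic ' + Buffer.from('admin:password').toString('base64'); // assumed credentials

async function replicate(source, target) {
  const res = await fetch(LOCAL + '/_replicate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', Authorization: AUTH },
    body: JSON.stringify({ source: source, target: target, continuous: true }),
  });
  console.log(await res.json());
}

// Push and pull the shared database so changes eventually flow both ways
replicate(LOCAL + '/shared_data', PEER + '/shared_data');
replicate(PEER + '/shared_data', LOCAL + '/shared_data');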

Can nodejs be installed on a free webhost [closed]

My goal is to create a chatting website. Not so much for the sake of the website itself, but for the experience, so I know how; just something to work towards gradually. I tried long polling, but that always ended up pissing off the webhosts whose servers I was using. I was told to use Node.js instead. I have some idea of what it is, but no idea how to use it.
I'm guessing that the reason I can't find the answer to this question anywhere is because of how obvious it is... to everyone else.
I've been looking around and all I see are tutorials on installing it on your server when you own the server. I know you can install forums on webhost's servers, so can you also install nodejs?
Yes. You can check the full listing at https://github.com/joyent/node/wiki/Node-Hosting, though it does not categorize hosts by free plans.
Some that I know of (I personally use Heroku):
Heroku
Nodester
Most standard LAMP hosting companies don't let you run node.js.
I currently recommend you use the Cloud9 IDE to get up and running with not only your tests and development, but also potential deployment. Cloud9 allows you to run your app from their IDE and provides you with a URL so you can see your app running and get familiar with Node.js development.
A more manual way is to find a Node.js PaaS (Platform as a Service) such as Joyent or Nodester.
Another one is OpenShift. I use them a lot, and they allow you to use your own domain on the free plan. I use Heroku as well and have tried AppFog and Modulus.
What it comes down to for me is whether I can use my own domain and how much they throttle my traffic. AppFog and Modulus don't allow custom domains on their free plans and seriously throttle traffic; they will cut your website off if you get one visitor an hour.
Another issue I was concerned about was the upload of files. In particular, with my website, content is added via Markdown files. Most Node webhosts use a variation on git deploys to update websites, with content supplied by databases. However, if you are trying to run a website without a database, using flat files, then each update must be done by a git deploy. This takes the whole website down and recreates a new one altogether (it just happens to look like the previous one), which normally takes a few minutes. Probably not a problem for a low-volume website, but imagine you publish a blog entry and then notice a spelling mistake: you need to do a deploy all over again.
So, one of the things that attracted me to OpenShift is that they have a reserved area for flat files within your project. You can upload your files there, and when your project is restarted these files are preserved.
AppFog provides a free plan where you can host Node.js and many other technologies.
However, free plans no longer allow custom domain names.
There is also the Node.js Smart Machine service from Joyent.
