Configure my web server to expose the separate folders as separate web servers? - linux

How can I configure my web server to serve 5 separate folders as 5 different web servers?

It depends on which HTTP daemon you are using, but with Apache you probably want to look at Name-Based Virtual Hosts - http://www.tecmint.com/apache-ip-based-and-name-based-virtual-hosting/
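For example, a minimal pair of name-based virtual hosts might look like this (the domains and folder paths are placeholders; it assumes the folders live under /var/www):

    # One VirtualHost per folder; Apache picks the host by ServerName.
    <VirtualHost *:80>
        ServerName site1.example.com
        DocumentRoot /var/www/site1
    </VirtualHost>

    <VirtualHost *:80>
        ServerName site2.example.com
        DocumentRoot /var/www/site2
    </VirtualHost>

Repeat one block per folder (five in your case); on Apache 2.4 you may also need a <Directory> block that grants access to each DocumentRoot.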

Related

Hosting multiple .net core application on Ubuntu using Nginx

I created a sample web API on .NET Core, registered it in the default file in Nginx, and was able to access it from outside.
The API looked like https://<>/api/values.
Now I want to add more configuration to host more web APIs on different port numbers. The problem is how the default file will differentiate between the multiple APIs, since the base URL, i.e. localhost\<>, is the same for all of them.
You need to create server blocks. Each of these server blocks will handle/listen/respond to a different app. You can host as many apps as you want on a single Ubuntu machine using nginx this way.
This guide will be very helpful; it describes the entire process of creating server blocks for your nginx server.
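As a rough sketch, two server blocks might look like this (the domain names and ports 5000/5001 are placeholders; it assumes each .NET Core app runs on its own Kestrel port):

    # One server block per app; nginx picks a block by matching the Host header.
    server {
        listen 80;
        server_name api1.example.com;
        location / {
            proxy_pass http://localhost:5000;
            proxy_http_version 1.1;
            proxy_set_header Host $host;
        }
    }

    server {
        listen 80;
        server_name api2.example.com;
        location / {
            proxy_pass http://localhost:5001;
            proxy_http_version 1.1;
            proxy_set_header Host $host;
        }
    }

If everything has to stay on a single hostname instead, you can use one server block with a different location prefix (e.g. /api1/, /api2/) per app.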

Deploy a MEAN stack application to an existing server

I have an Ubuntu Server on DigitalOcean which hosts a website, and a Windows Server on AWS which hosts another website.
I just built a mean.js stack app on my Mac, and I plan to deploy it to production.
It seems that most of the existing threads discuss using a new dedicated server. For example, this thread is about deploying on a new AWS EC2 instance; this video is about deploying on a new Windows Azure server; this one is about creating a new droplet on DigitalOcean.
My question is, is it possible to use an existing server (which hosts other websites), rather than creating a new server? If yes, will there be any difference in terms of performance?
My question is, is it possible to use an existing server (which hosts other websites), rather than creating a new server?
Yes. Both Windows and Ubuntu allow you to deploy multiple applications on the same instance.
For Ubuntu you can read this post, which will help you serve multiple apps.
That example uses Nginx, but you can follow it and run your apps without any server like Apache or Nginx in front. If you need subdomains, I would suggest using Apache virtual hosts with the reverse proxy module and pm2.
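For instance, a single subdomain could be wired up like this (the domain and port 3000 are placeholders; it assumes mod_proxy and mod_proxy_http are enabled and pm2 keeps the Node app running on that port):

    # Forward all requests for this subdomain to the Node app managed by pm2.
    <VirtualHost *:80>
        ServerName app.example.com
        ProxyPreserveHost On
        ProxyPass / http://127.0.0.1:3000/
        ProxyPassReverse / http://127.0.0.1:3000/
    </VirtualHost>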
For Windows and its IIS I would suggest using iisnode; you can find a lot of articles on Google about how to configure it.
will there be any difference in terms of performance?
It depends on your applications. If you are already serving applications that handle huge traffic and need CPU and memory, I would not suggest running multiple apps on the same instance; but if you are going to run simple web apps, you can easily use the same instance.
Hope this answer will help you!

Setup Node server with multiple websites and have each site on its own thread

I have a laptop that I am running node on, an Ubuntu Server with a quad-core processor.
There is a plan for 2-3 sites on this server. I am not a really good admin and needed help getting this one site going, so I don't want to start from scratch and run a hypervisor. Is there a way to have node host 3 sites and have each of them run on their own thread of the processor? I understand Node is single-threaded, and while I really don't need to do this for performance (because it's just for development), I do like this as an exercise in doing things in node, and it would be cool! There is an entire second laptop for the database, so I'm not worried about resources.
So 3 sites on one instance of Ubuntu Server all on different threads.....
It's not entirely clear what you're trying to accomplish. Here are a few scenarios:
1. Create three separate node.js servers, each listening on its own port, each running its own node.js process independent of the others. Then have each client connect to the appropriate port.
2. Create three separate node.js servers as in option 1, but use NGINX as a proxy in front of the three web servers, letting NGINX direct requests arriving on port 80 for each of the three domains to the appropriate node.js web server. Using NGINX this way, all three web servers can appear to be running on the default port 80 (or 443), and NGINX will separate the requests out and direct them to the appropriate web server process (see the sketch after this list).
3. Create your own master node.js process that receives requests for all three domains, looks at the host header to see which domain the request was actually directed at, and then forwards the request to the appropriate child process. This would be similar to the way clustering works in node.js, but each child process would be one of your different web servers. Personally, I'd use the pre-built functionality in NGINX to do this for you (as described in option 2 above), but you could code it yourself if you didn't want to run NGINX.
4. Instead of NGINX, use some sort of load balancer that your ISP may already have to direct the incoming connections to the right server process.
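For option 2, the NGINX side might look roughly like this (the domains and ports 3001-3003 are placeholders; it assumes each node.js server listens on its own local port):

    # One server block per domain; NGINX routes by the Host header and
    # proxies each domain to its own node.js process.
    server {
        listen 80;
        server_name sitea.example.com;
        location / {
            proxy_pass http://127.0.0.1:3001;
            proxy_set_header Host $host;
        }
    }

    server {
        listen 80;
        server_name siteb.example.com;
        location / {
            proxy_pass http://127.0.0.1:3002;
            proxy_set_header Host $host;
        }
    }

    server {
        listen 80;
        server_name sitec.example.com;
        location / {
            proxy_pass http://127.0.0.1:3003;
            proxy_set_header Host $host;
        }
    }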
If you run 3 different applications, i.e. sites, they will run as different processes on your server, and assuming they all listen on different ports, there should be no problem running them simultaneously. Node being single-threaded applies within a single process, so each process has its own event loop running.

AWS Node.js multiple apps / urls

Is it possible to run multiple node apps on multiple domains on a single AWS EC2 instance?
If so, what kind of stack would you need?
The easiest way is to use nginx: set up virtual hosts and configure your Node.js instances to use different ports.
Once you know the ports of your apps, you can map the different domains to those ports.
https://www.digitalocean.com/community/tutorials/how-to-host-multiple-node-js-applications-on-a-single-vps-with-nginx-forever-and-crontab
Here's how to do that. The guide is for DigitalOcean, but it applies to EC2 as well, since an EC2 instance is like a real machine anyway.

Best practices for shared image folder on a Linux cluster?

I'm building a web app that will scale into a Linux cluster with tomcat and nginx. There will be one nginx web server load balancing multiple tomcat app servers, with a database server behind them. All running on CentOS 6.
The app involves users uploading photos. I plan to keep all the images on the file system of the front nginx box and have pointers to them stored in the database. This way nginx can serve them full speed without involving the app servers.
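Something like this nginx layout is what I have in mind (the paths and backend addresses are just placeholders):

    # Load-balance the tomcat app servers.
    upstream tomcat_app {
        server 10.0.0.11:8080;
        server 10.0.0.12:8080;
    }

    server {
        listen 80;
        server_name example.com;

        # Serve uploaded images straight from the local filesystem.
        location /images/ {
            root /var/www;
        }

        # Everything else goes to the tomcat app servers.
        location / {
            proxy_pass http://tomcat_app;
            proxy_set_header Host $host;
        }
    }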
The app resizes the image in the browser before uploading. So file size will not be too extreme.
What is the most efficient/reliable way of writing the images from the app servers to the nginx front-end server? I can think of several ways I could do it, but I suspect some kind of network file system would be best.
What are current best practices?
Assuming you do not use a CMS (Content Management System), you could use the following options:
If you have only one front-end web server, the suggestion would be to store the images locally on the web server, in a local Unix filesystem.
If you have multiple web servers, you could store the files on a SAN or NAS shared network device. This way you would not need to synchronize the files across the servers. Make sure that the shared resource is redundant; otherwise, if it goes down, your site will be down.
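For example, with a plain NFS share (the hostnames and paths are placeholders), the image directory exported by the front-end box or the NAS could be mounted on every app server:

    # /etc/exports on the nginx front end (or the NAS): export the image dir.
    /var/www/images  10.0.0.0/24(rw,sync,no_subtree_check)

    # /etc/fstab on each tomcat app server: mount the share so uploads written
    # by the app land directly where nginx serves them from.
    nginx-front:/var/www/images  /var/www/images  nfs  defaults,_netdev  0 0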
