I currently have a website where I need to use node.js, I am not able to use node.js however, because the web host does not support it. What is the best way I go about hosting a server without having to completely change hosts?
If your current hosting provider doesn't support Node.js and you want to use it, then you have to change hosting providers. Sorry.
I can recommend Google Compute Engine. You can create a virtual machine, e.g. running Fedora, access it via SSH and install what you need, e.g. Apache, Node.js, etc.
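For reference, here's roughly what that initial setup looks like on a fresh Fedora VM over SSH. This is only a sketch: package names are assumptions and vary by distro, and on Fedora the Apache package is called httpd rather than apache2.

sudo dnf install -y nodejs npm       # Node.js runtime plus the npm package manager
sudo dnf install -y httpd            # Apache
sudo systemctl enable --now httpd    # start Apache now and on every boot
node --version                       # sanity-check the Node.js install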
If you're not comfortable with that, you should go for a managed hosting solution instead. It will probably be a little more expensive, and you'll have less flexibility in what programs you can use (since you share your virtual machine with other customers and can't make changes to the system yourself), but on the upside, most of the setup is done for you. There are many providers to choose from; google "managed hosting with nodejs" if you want an overview. I have used 1and1 before and was mostly happy with it. As you can see here, they have Node.js installed on their servers.
Your question makes hardly any sense... but
Heroku is really great for Node.js app hosting
Is ngrok a safe tool to use? I was reading a tutorial which recommended using ngrok to test API responses, and also for outside services that need to connect to my endpoints.
There is no source code available for Version 2.0, even though it started as an open source project in 2014. I am suspicious of any code that opens a tunnel to my localhost from the cloud. Pretty scary stuff, especially without source code!
It opens up a tunnel to your dev machine, which is partially secured by obscurity (a hard-to-guess subdomain) and can be further secured by requiring a password. But you're still opening yourself up to ngrok itself, and the company is completely opaque (no address, no employees, no business name, no LinkedIn presence; all I can find is that it has 1-10 employees and is private; I'm not even sure what country it's based in). On top of that, the code is not open-sourced. There's no reason to think they're not legit, but there's not a lot of information available to build trust.
You may be able to use ngrok and other local tunnel services with more security by encrypting the traffic. See https://security.stackexchange.com/questions/177280/end-to-end-encryption-for-localtunnel-ngrok-setup/177357#177357 for more information.
I found a good rating, but vacuous information, here:
http://www.scamadviser.com/is-ngrok.com-a-fake-site.html
The kicker for me is
https://developer.atlassian.com/blog/2015/05/secure-localhost-tunnels-with-ngrok/
where the Atlassian folks recommend it highly.
I think I am going to use it.
If anyone is concerned about compromising their development environment, you can use Docker. There are many ngrok/Docker projects, but here is the one I chose: https://github.com/gtriggiano/ngrok-tunnel
For macOS, use "TARGET_HOST=docker.for.mac.localhost".
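For illustration only, a run command might look roughly like this. TARGET_HOST comes from the note above, but TARGET_PORT, the published port and any auth variables are my assumptions, so check the project's README before using it.

docker run -d --name ngrok-tunnel \
  -e TARGET_HOST=docker.for.mac.localhost \
  -e TARGET_PORT=3000 \
  -p 4040:4040 \
  gtriggiano/ngrok-tunnel
# -d runs the container in the background; 4040 is assumed to be ngrok's local inspection UI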
They now offer a service where you run only ssh locally; there is no need to run any of their code on your machine.
You run something like ssh -R 80:localhost:8501 tunnel.us.ngrok.com http. This connects to one of their hosts and forwards connections they receive back to your machine, to the service you run on localhost:8501.
This seems secure to me; the only thing is that you don't know what information they collect or who is connecting to your exposed service. They print all connections, but it's their binary that does this, and someone might well be listening in without you noticing. You can check connections on your end, but you cannot be sure who it is that connects.
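If even that is more trust than you want to extend, the same reverse-tunnel trick works against a VPS you control with nothing but stock OpenSSH on both ends. Hostnames and ports below are placeholders:

# on the VPS: set "GatewayPorts yes" in /etc/ssh/sshd_config and restart sshd,
# so remote forwards may bind to the public interface
# on the dev machine: expose local port 8501 as port 8080 on the VPS
ssh -N -R 0.0.0.0:8080:localhost:8501 user@your-vps.example.com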
Ngrok is a convenient and highly secure utility for creating tunnels to locally hosted applications via a reverse proxy; it is a utility for publishing locally hosted applications on the web. Simply put, it gives any locally hosted application a publicly accessible web URL, e.g. a Spring Boot or Node.js based web application, a webhook for a chat application, etc.
I'm working for a small company on something like a new PHP environment for future projects. I'd like to cram in as much modernization and automation as possible (while I can).
The thing is, I always come across solutions that require Node.js (Grunt, Autoprefixer, ...). None of our customer's hosting providers support Node.js (not even our own managed server). Most of the time I don't even have shell access.
I come across npm this and npm that so often, almost as if it's some always-available quasi-standard. Do I have some misunderstanding here – or is this simply only usable by people hosting their projects on their own servers? Am I just out of luck if I have to support a wide range of (sometimes questionable) shared hosting providers?
Comparing most PHP applications and most Node.js applications is apples and oranges.
Most PHP applications are fairly self-contained and intended to be used with web servers and a mostly stock PHP configuration. Most Node.js applications have a ton of NPM dependencies that need to be installed, and while HTTP is used to connect between the web server and the Node.js application, it isn't always clear what port that will be on. Plus, the Node.js application may require extra configuration, command line parameters, etc. Some hosting for Node.js is smart enough to look at the package.json file (Elastic Beanstalk for example) and figure out how to start your Node.js application.
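As a rough sketch of that convention: such platforms typically install the dependencies and then run the start script declared in package.json, so the local equivalent is something like the following (the script contents and port are just examples):

npm install --production   # install runtime dependencies only
npm start                  # runs the "start" script from package.json, e.g. "node server.js"
# the app chooses its own port, often read from an environment variable:
PORT=8080 npm start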
These days you will find PHP going the same way. A lot of software is built with Composer packages that must be set up and installed, and you won't find many folks getting that working on shared hosting either. Many Node.js applications have nothing to do with the web or web servers; that is increasingly becoming the case with PHP as well, but you won't find shared hosting for those kinds of PHP applications.
Basically, you're looking at two entirely different ecosystems.
I think your company needs to realize that you're sacrificing an awful lot just to stay compatible with cheap, crappy shared hosting. These days you can get a $5/mo. VPS to run whatever you want, and that's often the same price as your shared hosting. Why waste time and resources building a substandard application if you can pay $10 more a year and do what you want/need to do?
Use the technologies that you need to get the job done. If what you can do works fine in a normal PHP web application framework, then use that. If you need to build a persistent server application and feel that Node.js is right for you, use that.
I've been reading up on a few node tutorials but there are a couple of best/common practices that I would like to ask about for those out there that have built real node apps before.
Who do you run the node application as on your linux box? None of the tutorials I've read mention anything about adding a node user and group so I'm curious if it's because they just neglect to mention it or because they do something else.
Where do you keep your projects? '/home/'? '/var/'?
Do you typically put something in front of your node app? Such as nginx or haproxy?
Do you run other resources, such as storage(redis, mongo, mysql, ...), mq, etc..., on the same machine or separate machines?
I am guessing this question is mostly about setting up your online server and not your local development machine.
In the IRC channel somebody answered the same question and said that he uses a separate user for each application, so I am guessing that this is a good common practice.
I mostly use /home/user/apps.
I see a lot of nginx examples, so I am guessing that is what most people use. I have a server with varnish in front of a Node.js application; that works well and was easy to set up. There are some pure Node.js solutions, but for something as important as your reverse proxy I would go for something that is a little more battle-tested.
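For what it's worth, a minimal nginx front end for a Node.js app looks roughly like the sketch below (Debian-style paths, placeholder server name and port; a varnish setup follows the same idea):

sudo tee /etc/nginx/sites-available/myapp >/dev/null <<'EOF'
server {
    listen 80;
    server_name example.com;
    location / {
        proxy_pass http://127.0.0.1:3000;   # the Node.js app
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
EOF
sudo ln -s /etc/nginx/sites-available/myapp /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx   # check the config, then reload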
To answer this correctly you probably have to ask yourself: what are my resources? Can I afford many small servers? How important is my application? Will I lose money if my app goes down?
If you run a full stack per app on, let's say, its own VPS, then a problem with that VPS affects only one of your apps.
In terms of maintenance having for example one database server for multiple apps might seem attractive. You could reason that if you need to update your database to patch a security hole you only need to do it in one place. On the other hand you now have a single point of failure for all the apps depending on that database server.
I personally went for many full-stack servers, and I am learning how to automate deployment and maintenance. Tools like Puppet and Chef seem to be really helpful for this.
I have only owned my own Linux servers for the last 3 months and have been a Linux user for 1.5 years, so before setting up a server park based on these answers, make sure you do some additional research.
Here's what I think:
Using a separate user for each app is the way I'm doing it.
I keep it in /home/user/ to make sure that only that user (and root, of course) has access to the app.
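A rough sketch of that layout, with placeholder names and Debian-style paths:

# dedicated, login-less user per app ("myapp" and "server.js" are placeholders)
sudo useradd --create-home --shell /usr/sbin/nologin myapp
sudo chmod 700 /home/myapp                            # only that user (and root) can get in
sudo mkdir -p /home/myapp/apps/myapp
sudo chown -R myapp:myapp /home/myapp/apps
sudo -u myapp node /home/myapp/apps/myapp/server.js   # run the app as that user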
Some time ago I created my own reverse proxy in Node.js based on the node-http-proxy module. If you don't want to use a reverse proxy then there's no point in putting anything in front of Node. There's more to it, even: it may harm the app, since, for example, nginx can't use HTTP/1.1 when talking to the backend (at least at the moment).
All resources I run on the same machine; only when I actually need to distribute my app between separate machines do I start thinking about it. There's no need to pre-optimize. The app's code is a different thing, though.
Visit the following links:
nettuts
nodetuts
lynda nodejs tutorials
Best practice seems to be to use the same user/group as you would for Apache or a similar web server.
On Debian, that is www-data:www-data
However, that can be problematic with some applications that might require higher permissions. For example, I've been trying to write something similar to Webmin using Node and this requires root permissions (or at least adm group) for a number of tasks.
On Debian, I use /var/nodejs (I use /var/www for "normal" web applications such as PHP)
One of the reasons I'm still reluctant to use Node (apart from the appalling lack of good-quality documentation) is the need to assign multiple IP ports when running multiple applications. I think that for any reasonably sized production environment you would use virtual servers to partition out the Node server processes.
One thing that Node developers seem to often forget is that, in many enterprise environments, IP ports are very tightly controlled. Getting a new port opened through the firewall is a very painful and time-consuming task.
The other thing to remember if you are using a reverse proxy is that web apps often fail when run from behind a proxy - especially when mapping a virtual folder (e.g. https://extdomain/folder -> http://localhost:1234) - so you need to keep testing.
I'm just running a single VPS for my own systems. However, for a production app, you would need to understand the requirements. Production apps would be very likely to need multiple servers if only for resilience and scalability.
Every LAMP or XAMPP writeup or tutorial I see says "Not for production use", so what do I use for production?
You use Apache, PHP and MySQL installed as they should be for production. XAMPP is all those things in one package with basically no security set up: root passwords are empty, users are well known, and so on. But the components are the same as the ones you would get if you downloaded each of them yourself.
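If you do start from one of those bundles anyway, the bare minimum is to close those defaults. A sketch of the usual first steps (the XAMPP helper script path is from older XAMPP-for-Linux releases and may have changed):

# interactive hardening: sets a MySQL root password, removes anonymous users,
# disables remote root login and drops the test database
sudo mysql_secure_installation
# older XAMPP-for-Linux releases ship their own interactive hardening helper
sudo /opt/lampp/lampp security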
The XAMPP philosophy says:
The philosophy behind XAMPP is to build an easy to install distribution for developers to get into the world of Apache. To make it convenient for developers XAMPP is configured with all features turned on.
The default configuration is not good from a security point of view, and it's not secure enough for a production environment - please don't use XAMPP in such an environment.
So it's primarily intended as a development environment and not as a production environment.
Given the right installation options, you can use them as a starting point for a production server. But there are some holes to fill in, mainly with regard to security. The disclaimers you refer to are (wisely) there to make sure you are wary and suspicious of what you start with (and also to make sure no one can claim they supplied you with something dangerous without letting you know, so don't blame them if Bad Things happen).
It's like selling you a car without seatbelts.
But what you learn, and the solutions you develop, are generally fully compatible with a "real" server.
XAMPP installation is easy compared to LAMP. On a development server it makes no difference, but on a production server it is better to build things up from the basics instead of relying on a third-party bundle; that way you get exactly the services you want on your production server, with no extra services that can drag down its performance.
Also try to keep the same setup on the development and production servers; that ensures an application running on the development server will also run in production without any extra configuration.
For production you have to make your own configuration; it depends on the visitor count, the RAM installed on your server, and the scripts you're using. For example, you need only 5 PHP extensions for WordPress, but if you're using WooCommerce you need more PHP memory; for multiple sites you use VirtualHosts, and if you have no domains registered, the UserDir module as well.
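As a sketch on Debian/Ubuntu (the extension list and memory limit here are assumptions; check what your own scripts actually require):

sudo apt-get install php php-mysql php-gd php-xml php-curl php-mbstring php-zip
# raise the PHP memory limit if a heavy plugin stack (e.g. WooCommerce) needs it
sudo sed -i 's/^memory_limit = .*/memory_limit = 256M/' /etc/php/*/apache2/php.ini
sudo systemctl reload apache2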
Everything I've said is only an example; you must know your scripts' requirements, and installing everything is not a good idea.
Is there a detailed guide which explains how to host a website on your own Linux server?
I currently have it hosted with one of the commercial web hosts.
Also, the domain is registered with a different vendor.
Thanks
This guide is probably more info than you really requested, but webserver information is in there. It's Gentoo-specific, but you can apply the same information with minor translations to any other distro.
I would look into installing Apache.
99% of Linux distributions will have a package for it.
On ubuntu you can run:
sudo apt-get install apache2
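Once it's installed, a quick sanity check looks something like this (default paths follow Debian/Ubuntu conventions):

sudo systemctl enable --now apache2   # start Apache now and on every boot
curl -I http://localhost              # should return HTTP 200 for the default page
# your site's files then go under /var/www/html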
Are you considering hosting a web page locally for the internet? Or is this just for development, etc.?
If it's for an internet server, you will need a stable internet connection with a good upstream.
You may also need a static IP address so you can setup DNS to point to the right place.
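A quick way to confirm the pieces line up (the domain below is a placeholder):

curl -s https://ifconfig.me     # the public IP your connection currently has
dig +short www.example.com      # the IP your domain's A record points to
# the two must match (and keep matching) for visitors to reach your box;
# a changing home IP is why people use a static IP or a dynamic-DNS service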
While I don't have a URL for a good tutorial in English, I would just warn you that this is not something you should take lightly. Administering a server involves getting your hands dirty in Linux stuff, and dealing with security can be pretty complex depending on your knowledge and requirements.
So if you know nothing about it, you should be very careful, and if the website you host is of any commercial importance, you are probably better off hiring a server admin.
Just to point out; if this is a personal (home) server, as opposed to one in a corporate environment, then it's better not to bother hosting it - you won't necessarily have the bandwidth, and your ISP may not allow it.
As mentioned above, you will also need a static IP address, and you'll need to set up DNS records to point to the correct location, which your domain vendor may or may not help you with.
I think it depends on how familiar you are with linux. Certainly, many people do this for hobbyist websites.
There are many aspects involved - you should begin with something simple like getting apache running and visible to the outside world.