Redis-commander won't load static files when run in iframe - node.js

I have the following problem:
There's a server with Redis on it.
I want to be able to view and change Redis data from my website, so I installed redis-commander on the server via npm. It listens on port 8081 by default.
In my front-end I created a "Redis manager" button, which shows a view with the following HTML:
<iframe id="frame" src="{{redisUIUrl}}" />
redisUIUrl looks like: https://example.com/redis
There's NGINX between the front-end and the server, acting as a reverse proxy: it does a proxy_pass to oh_so_very_secret_syte.com:8081.
So when we make this GET request, we get the views from redis-commander and everything is fine. But these views contain links to static files of the form /css/default.css.
The problem is that instead of loading the stylesheet from https://example.com/redis/css/default.css, the browser tries https://example.com/css/default.css and fails.
I don't really understand what the problem is or how to deal with it, so I'm asking for your help. Thanks in advance.

/css/default.css may be a relative URL, but it has an absolute path component, which means it resolves against your server root.
If you want a path-relative URL, you need to drop the leading /.
css/default.css
will reference resources in the same directory as the current document.
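To see the difference concretely, here is a small illustration using Node's built-in WHATWG URL class (it assumes the page was loaded from https://example.com/redis/ with a trailing slash, which is what makes the path-relative form resolve under /redis/):
const { URL } = require('url');

const base = 'https://example.com/redis/'; // the document's URL, trailing slash assumed

// Root-relative: the leading "/" discards the /redis/ prefix entirely.
console.log(new URL('/css/default.css', base).href); // https://example.com/css/default.css

// Path-relative: resolves against the directory of the current document.
console.log(new URL('css/default.css', base).href);  // https://example.com/redis/css/default.css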

Related

ngrok does not load NodeJS app from outside (ERR_CONNECTION_TIMED_OUT)

ngrok is a program that creates a local tunnel: it generates a temporary domain for you so you can point people to your local content, and it also gives you https for localhost.
https://ngrok.com/
localtunnel is just an alternative.
I have set up both ngrok and localtunnel, but each shows a white page with only the HTML loading (no CSS or JS) when accessed from outside my network (over a data plan, for example).
The problem is that nothing gives an error; the only thing I can see is ERR_CONNECTION_TIMED_OUT when using a hotspot.
Everything works within my own network.
I have already tried turning off the firewall, but it seems to make no difference. I also tried looking with the Chrome remote debugger, but it just disconnects when I load the URL.
The thing is, when I go to https:// on the ngrok URL I get a bunch of mixed-content errors, but not when I go to http. It seems illogical to me that it would default to http when using an https link... all of my script/style tags use relative paths.
Anyway, so far this is the only thing I can figure out; any ideas on what might cause this?
So it's either
ERR_CONNECTION_TIMED_OUT
or
Blocked loading mixed active content
or both?
So I got it working now by changing the base href.
<base href="http://yoururl.ngrok.io">
I also changed some paths in my config to /app/ or ../ respectively, but most of these were already set correctly, and all I did was revert them after changing the base URL.
As far as I understand now, the problems only really start to occur when connecting to the URL over a data plan and not over wifi.
Some random image paths in CSS/JS will not load, and it also appears to behave differently in Firefox and Chrome for some reason.
The problem is I cannot keep testing this indefinitely, as the data will obviously run out at some point and I have no reliable way of debugging the console errors on mobile...
In conclusion, it works on a "normal" connection now (i.e. wifi/cable) but not on data.

Allow requests to travel up the directory tree (../) in express.js

There might be an answer to this, but I don't know how to phrase it properly, so I'll just ask.
I use a module called build-url. It is designed for the browser and provides an easy way to construct URLs with query parameters.
I want to store my website in a folder called site without having to type it in the URL. I managed to do that easily. However, when the client asks for the build-url.js file, it receives a 404.
Here's what the folder structure looks like:
node_modules
  build-url
    dist
      build-url.js
site
  index.html
In my code, I have:
app.use("/", express.static(__dirname + "/site"));
The problem
When I go to http://localhost:5000/, the index.html file is served and everything is fine. Inside, I include the script:
<script src="../node_modules/build-url/dist/build-url.js"></script>
However, in the browser console, I see:
GET http://localhost:5000/node_modules/build-url/dist/build-url.js 404 (Not Found)
The server looks for a node_modules folder inside the site folder, instead of going up the directory tree. How do I fix that?
It is not a good idea to allow your web server to send files from outside the specified folder, because that would let bad actors download anything from your server, including your confidential data.
You can copy the required file into your static folder, or you can set up gulp or any other build tool to do that automatically; see the sketch below.
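A minimal gulpfile.js sketch for the copy approach (the destination folder site/vendor is an assumption, not something from the question):
// gulpfile.js -- copy the vendor script into the folder served by express.static.
const { src, dest } = require('gulp');

function copyVendor() {
  // site/vendor is an assumed destination inside the served 'site' folder
  return src('node_modules/build-url/dist/build-url.js')
    .pipe(dest('site/vendor'));
}

exports.copyVendor = copyVendor;
With express.static pointed at the site folder as in the question, the page can then reference the copied file as /vendor/build-url.js.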

Changing the localhost server that files are served from when using Node (WebStorm / maybe IntelliJ)

I'm not sure what I'm missing here, so hopefully someone can help me out. I'm working on a project where we're using Node, and in the Run/Edit Configurations I've done the following:
Node interpreter: the path to the node.exe file, which I checked out from Subversion
Working directory: where the "app.js" file is; this is the path from which, on the command line, you would type node app.js to start the server
JavaScript file: app.js, the name of the file that actually creates the server
Now, from the main nav bar, when I do Run / Run my server, the box at the bottom pops up and tells me that the Express server is listening on port 3000. Cool.
I can navigate to localhost:3000/myPage.html and I can get to the page just fine.
I added a JSON file to the same directory on my hard drive that myPage.html is in, and I can navigate to that as well at localhost:3000/largeTestData.json.
So the server is up and running and serving files as it should. My problem is that, in my WebStorm project, I want to make an AJAX request to that largeTestData file. I do so using jQuery, like this:
var data = $.get('localhost:3000/largeTestData.json');
data.done(function (data) {
  console.log('here is your data');
  console.log(data);
});
When I do that I get the error (in Chrome)
XMLHttpRequest cannot load localhost:3000/largeTestData.json. Cross origin requests are only supported for HTTP.
and so I look at the URL and I'm seeing:
http://localhost:63342/
Obviously WebStorm has started the server correctly, but when I view an HTML file, it's not using that server (which, of course, is why I'm getting the CORS error).
There's some fundamental stuff here which I'm obviously not getting. I need my IDE to deploy to the web server that it started up, but it's not doing that. Please, someone give me a once-over on all the technologies I'm missing out on here.
WebStorm isn't serving your pages through your Node.js server; it serves static pages via its own internal HTTP server, which doesn't know anything about Node.js and Express.
The main problem:
When you start your Node.js server, it serves JSON files on port 3000. If you open an HTML page with the little menu in WebStorm (where you can choose the browser), WebStorm opens the browser with a URL pointing to its own internal web server running on a different port (e.g. 63342). JavaScript security prohibits loading data from a different host/port (the same-origin policy).
It's not WebStorm's fault, and you need a solution for this problem in production anyway or you can't go live.
General Solution:
Either you have to ensure that HTML pages and JSON data come from the same host+port, or you can work around it by (a) setting server-side headers ('Access-Control-Allow-Origin: *') as @lena suggested, or (b) using JSONP. Below you will find some thoughts on using nginx as a reverse proxy so that, from the browser's point of view, all requests go to the same host+port. It's a very common solution, but as mentioned above, there are other options.
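For option (a), a minimal sketch of the Express side (the static folder and port are assumptions based on the question; the wildcard origin is only reasonable for development):
var express = require('express');
var app = express();

// Send the CORS header on every response, so a page loaded from WebStorm's
// internal server (port 63342) is allowed to read data from this server on port 3000.
app.use(function (req, res, next) {
  res.setHeader('Access-Control-Allow-Origin', '*');
  next();
});

app.use(express.static(__dirname)); // serves myPage.html and largeTestData.json as before
app.listen(3000);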
Primitive solution:
Don't use WebStorm to open your browser. Load the page from http://localhost:3000/ and change the URL of the REST resource to $.get('/largeTestData.json'). You'll miss some comfort from your IDE, but you can immediately see that your program is working.
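Concretely, the only change on the client side is the URL; the rest is the jQuery code from the question (it assumes jQuery is loaded and the page itself comes from http://localhost:3000/):
// With the page loaded from http://localhost:3000/myPage.html,
// a path-relative request stays on the same origin:
var data = $.get('/largeTestData.json');
data.done(function (data) {
  console.log('here is your data');
  console.log(data);
});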
Comfortable solution:
As @lena suggested, there is a way to configure your Express/Node.js server as a server known to WebStorm. I haven't tried it, but I suppose you can then just press the Run button, and maybe the Node.js plugin in WebStorm is intelligent enough to know the static mappings in Express, map an HTML file to a web-application URL, and open the page in the browser with the URL served by your Node.js application. (I'd be surprised once again if this really works magically, but maybe you can configure a mapping from files to URLs manually; I don't know.)
Dirty solution
With some options you can disable security checks, at least in Google Chrome. Then it's possible to load JSON data from a different port than your HTML page. I wouldn't recommend using these options (just my opinion).
Additional Hints
If you do more than just play around with Node.js and some UI fun, and you have to serve your application "production-ready", then have a look at nginx to serve your static files and reverse-proxy Node.js requests from there. I'm using this setup even for development and it works like a charm.
Of course Node.js / Express is able to serve static files as well, but IMO placing something like nginx in front of Node.js (clustered) brings a bunch of advantages for production sites, e.g. load balancing, SSL offloading, avoiding JSONP, in many cases better performance, easier deployment of updates, and availability.
To get your code working, just change the URL in $.get() to a full URL (including the protocol):
var data = $.get('http://localhost:3000/phones.json');
In WebStorm 2016.3 (and probably earlier) there is now another option: under the configuration settings for Node.js runs, one can manually set the page and port to be loaded via WebStorm's "Browser / Live Edit" settings.
See the screenshot below for settings one can change.

Direct a URL directly to a GlassFish application in a virtual server

We have a domain name with a DNS management facility. We also have a web application deployed on a GlassFish server hosted in a virtual server, at the path
http://198.98.103.233:8080/pemis/
I want the domain name to go directly to the home page of that application when someone types it. After navigating through the pages, we must be able to see
http://www.pemis.lk/faces/public.xhtml
in the browser rather than
http://198.98.103.233:8080/pemis/faces/public.xhtml
How can we configure that?
Thanks in advance.
You need to install your application as the root application in Glassfish, as explained here. But it's not hard:
asadmin deploy --contextroot "/" your-webapp.war
or set the context-root property in the sun-web.xml or glassfish-web.xml depending on the version of Glassfish you use.
To change the port GlassFish listens on, you need to modify the HTTP listener configuration. On default installations you'll want to change http-listener-1's port. You can do so using the console, but you can also directly edit the domain's domain.xml:
<network-listeners>
  <network-listener port="80" protocol="http-listener-1" transport="tcp" name="http-listener-1" thread-pool="http-thread-pool"></network-listener>
  ...
</network-listeners>
Last, to make www.pemis.lk point to that server you need a DNS entry that points to the address the server is attached to. The details of how to do that depend on the company that sold you the domain; quite often they have online tools that let you enter or modify the name-to-address mapping. In case of doubt, it's best to contact them by phone or mail.
I'm on the same path and, since you didn't post the solution you found (if you found one), I'll add some notes here as future reference for anyone facing this problem.
I'll break the question into two parts: eliminating host:port, and changing how the URL behaves.
I don't have a complete answer to the first part; however, if you choose to listen on port 80, the default HTTP port, the port will be omitted from the URL, which gets you half of the solution you want.
The second part, changing the URL's behavior and/or shortening it, can be achieved either with mod_rewrite in Apache or with Tuckey's URL Rewrite Filter (http://www.tuckey.org/urlrewrite/). A Google search for "URL rewrite" will give you a more in-depth explanation, and there's a guide on the website.
You should, however, update your question with an answer if you found one.

[NodeJS] Is my backend code secure?

I'd like to create a simple site on Node.js. Say it has two files: app.js (the main application file) and router.js (a URL routing file). I'd like to know: is it possible for anyone to just access mydomain.com/router.js and get the source code of my application? I'm asking because, for example, in PHP you can't just access the .php file directly; the server gives you the result of running the PHP file, not the file itself. So, how do I make my Node.js app's source invisible to public access? Thanks!
I make sure that all files for Node.js are never in a path that is served by another web server such as Apache. That way, there is little danger of the source ever being served by accident.
My Node programs and files go in /var/nodejs, with a sub-folder for each Node application. By default, of course, Node will not serve ANYTHING unless you tell it to.
At the root of my Apache configuration, I make sure that ALL folders are secured, so that I explicitly have to enable serving on any folder structure, even under the /var/www folders that I use for all Apache sites.
So you are pretty safe with a default setup of Node and Apache as long as you keep the folders separate. Also, Node will not serve source code accidentally; you would have to set up a Node server that reads the file as text and writes it to the HTTP stream.
That depends on how you are using Node.js and what you are using as a web server in front of it. Unlike PHP running as CGI or as a module in Apache, Node and the Node application itself are the server.
If you have a web server with your Node source directory exposed, then the URL you provided in the question will most likely result in your source code being served. But even if you were using Apache and proxying to Node, there is usually no output filter involved; instead, requests are passed to the backend Node server, which interprets them.
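As an illustration of the point above (a sketch with assumed file names, not the asker's actual code), a typical Express app exposes only an explicit public folder, so app.js and router.js themselves are never reachable over HTTP:
// app.js -- only files under 'public' are served; app.js and router.js stay private.
const express = require('express');
const path = require('path');
const app = express();

app.use(express.static(path.join(__dirname, 'public'))); // static assets only
app.listen(3000);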
