I have a web project developed with the MEAN stack, running on a server with Docker.
I have 2 containers running: one runs the API and the application service, the other runs the mongodb service. Everything seems right, but I have one big problem: at certain periods of time, the system drops my databases.
The last issue was on 11/03/2018 and the previous one was 2 weeks before that.
These are the mongodb container's last logs:
Is there any configuration that I could edit, or have you run into this problem before?
If you need more information please do ask. Thanks.
Your database is being accessed from an external (public) IP. Your MongoDB server is publicly accessible and you have no authentication enabled.
This official doc describes how you can enable authentication: https://docs.mongodb.com/manual/tutorial/enable-authentication/
Furthermore: you can restrict access to the server. If your app server and your mongodb server are in the same network, you don't need to expose your mongodb to the public.
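For example, a minimal sketch in the mongo shell (the user name, password, and database name here are placeholders, not taken from your setup):

use admin
db.createUser({
  user: "appUser",                                  // placeholder user name
  pwd: "choose-a-strong-password",                  // placeholder password
  roles: [ { role: "readWrite", db: "myAppDb" } ]   // placeholder application database
})

After creating the user, restart mongod with --auth (or security.authorization: enabled in the config file) and have the API connect with those credentials. With Docker you can also avoid publishing port 27017 on the host at all and let the two containers talk over a shared Docker network instead.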
Problem:
I have an AWS EC2 instance running FreeBSD. In there, I'm running a NodeJS TLS/TCP server. I'd like to create a set of rules (in my NodeJS application) to be able to individually block IP addresses programmatically based on a few logical conditions.
I'd like to run an external (not on the same machine/instance) firewall or load-balancer, that I can control from NodeJS programmatically, such that when certain conditions are given, I can block a specific remote-address(IP) before it reaches the NodeJS instance.
Things I've tried:
I have initially looked into nginx as an option, running it on a second instance and placing my NodeJS server behind it, but after skimming through the NGINX Cookbook: Advanced Recipes for High Performance Load Balancing, I've learned that only NGINX Plus (the paid version) allows for remote/API control & customization. While I believe that paying $3,500/license is not too much (considering all of NGINX Plus' features), I simply cannot afford to buy it at this point in time; in addition, the only feature I'd be using (at this point) would be the remote API control and the IP address blocking.
My second thought was to go with the AWS ELB (Elastic Load Balancer) by integrating AWS' SDK into my project. That sounded feasible; unfortunately, after reading a few forum threads and part of their documentation (unless I'm mistaken), it seems the two features I need are not available on the AWS ELB. AWS seems to offer an entirely different service called WAF that I honestly don't understand very well (both as a service and from a feature standpoint).
I have also (briefly) looked into CloudFlare, as it was recommended in one of the posts here on Stack Overflow, though I can't really tell if their firewall would allow this level of (remote) control.
Question:
What are my options? What would you guys recommend I do?
I think Nginx provides this kind of functionality; please refer to the link.
If you want to block an IP in front of your Node TCP server, you can just edit an nginx config file and deny that IP address.
Frankly speaking, if I were you, I would use AWS WAF, but if you don't want to use it, you can simply do this in Node.js.
In Node.js, you could keep a global array variable where you store all blocked IP addresses and, upon each connection, check whether the connecting host's IP is in that blocked-IP list. However, a problem occurs when the machine or application is restarted: you lose all information about the blocked IPs. As a solution, you can set up a Redis database (it is a key-value store, but it also offers other data types) and store the blocked IPs there. Since Redis keeps its data in RAM, all interaction with the DB is practically instant, and whenever the machine or node is restarted, Redis restores its state from the backup it keeps on the hard drive and continues to work in RAM with the old data.
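A minimal sketch of that approach, assuming the node-redis (v4) client and a plain net TCP server (with TLS you would use tls.createServer the same way); the port number and the blocked_ips key name are arbitrary placeholders:

const net = require('net');
const { createClient } = require('redis');

async function main() {
  const redis = createClient();                      // assumes Redis running locally on the default port
  await redis.connect();

  // Persist a blocked address so it survives restarts.
  const blockIp = (ip) => redis.sAdd('blocked_ips', ip);

  const server = net.createServer(async (socket) => {
    // Drop the connection immediately if the remote address is on the blocklist.
    if (await redis.sIsMember('blocked_ips', socket.remoteAddress)) {
      socket.destroy();
      return;
    }
    socket.on('data', (chunk) => {
      // ... your protocol logic; call blockIp(socket.remoteAddress) when your conditions are met
    });
  });

  server.listen(8000);                               // placeholder port
}

main().catch(console.error);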
I have a question regarding the Node-RED dashboard. I've got my dashboard all set up and working. Now I want to be able to access the dashboard from outside my local network. Right now I do this through a VNC server. What needs to happen next is that clients need to be able to access the dashboard, but of course without getting access to my VNC server. I have done my fair amount of Google work. I (somewhat) understand that a service like ngrok (ngrok.com) or dataplicity (dataplicity.com) is what I am looking for. What would be the best way of setting this up safely?
It might be useful to clarify: I'm using a Raspberry Pi!
Thanks in advance!
If you want to give the outside world access to your dashboard, you can also consider hosting your node-red application in the cloud. See the links at the bottom left of the page https://nodered.org/docs/getting-started/
Most of those services have a free tier, so it might cost you nothing.
If you cannot deploy your complete node-red application in the cloud (e.g. because it is reading local sensors), then you can split it into 2 node-red applications: one running locally and one (with the dashboard) running in the cloud. Of course, the 2 node-red applications then need to exchange messages; the cloud services mentioned on that page also provide a secure way to send and receive events from the node-red cloud application, which you can use for this.
I want to have three different Neo4j instances running (each with a different database). Then I need three different Neo4j Browser visualizers (think project1.domainname.com / project2.domainname.com / project3.domainname.com), with each one mapping to a specific database instance.
I've managed to get the three different database instances running on a single Azure VM - so far so good.
But I'm not sure how to create and map those browsers. I'd like to run each of them as an Azure website, as that would help with some other problems I've foreseen.
1) Where is this browser HTML etc., so I can load it into the Azure website?
2) Where in that code would I specify the IP address and port that the browser should be talking to?
I've also heard some people talking about an Azure for Neo4j project, but that is nearly five years old and the Neo4j guys said to put the database instances on VMs. Were they right?
The Neo4j Browser is an Angular app that you can find in the Neo4j source code at https://github.com/neo4j/neo4j/tree/2.3/community/browser
You can set the host that the browser app can talk to:
:config host:"http://host:port"
This is an undocumented feature and might be removed.
For 3.0 the intent is to decouple the browser from Neo4j anyway.
I have a web application (using a MongoDB database, AngularJS on the front end and NodeJS on the back end) that is deployed in 2 places. The first is on a static IP so that it can be accessed from anywhere, and the second is on a local machine so that users can use it when an internet connection is not available. So in both places, data can be inserted by users. My requirement is to sync the two databases when an internet connection is available on the local machine, i.e. from the local system's database to the remote system's database and vice versa, without losing any data in either place.
One way I am thinking about is to provide a sync button in the application and sync the databases using insert/update queries. I am not sure whether there is a better, more automated way to do this so that the databases sync automatically, like data copied in a replica set.
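For reference, this is the kind of insert/update sync I have in mind behind that button (a rough sketch, assuming every document carries an updatedAt timestamp and keeps the same _id on both sides; URLs, names, and the last-sync bookkeeping are placeholders):

const { MongoClient } = require('mongodb');

// One-way sync (source -> target); run it in both directions for two-way sync.
async function syncCollection(sourceUrl, targetUrl, dbName, collName, lastSyncTime) {
  const source = await MongoClient.connect(sourceUrl);
  const target = await MongoClient.connect(targetUrl);
  try {
    // Pull everything changed on the source since the last sync.
    const changed = await source.db(dbName).collection(collName)
      .find({ updatedAt: { $gt: lastSyncTime } }).toArray();

    const targetColl = target.db(dbName).collection(collName);
    for (const doc of changed) {
      // Upsert: insert new documents, overwrite existing ones.
      await targetColl.replaceOne({ _id: doc._id }, doc, { upsert: true });
    }
  } finally {
    await source.close();
    await target.close();
  }
}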
Please provide the best solution to do this task. Thanks in advance.
I have a scenario with two nodejs apps deployed on two Dokku droplets. One of my apps is the admin app, which stores data in a mongodb database. The other app is the main application, which reads data from that database.
How can I make the main app communicate with the database?
You need to link the database to the dokku container via environment variables. You basically need to follow this methodology: http://12factor.net/
The database needs to be accessible via an IP and port combination on one of your two servers. If you need both servers to communicate with the database then you will need to make sure it is externally accessible and properly secured (for example via a VPN).
You can then set an environment variable like so:
dokku config:set DB_URL='mongodb://10.0.0.1:4192/database_name'
obviously changing the above to match your setup.
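In the main app you would then read that variable and connect, e.g. (a rough sketch using the official mongodb driver; DB_URL matches the variable name set above):

const { MongoClient } = require('mongodb');

const dbUrl = process.env.DB_URL;          // injected by the dokku config:set command above
if (!dbUrl) throw new Error('DB_URL is not set');

async function getDb() {
  const client = await MongoClient.connect(dbUrl);
  return client.db();                      // the database name is taken from the URL
}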
Another potentially easier way of doing the above is to use a dokku plugin which will basically automate those steps.
A list of plugins is available at: http://progrium.viewdocs.io/dokku/plugins
There is a mongo plugin which may suit your needs; I've used some of the others and they work well.