Accessing local database using Heroku - node.js

So I've developed a Node.js app that accesses an Oracle database (one that can be accessed by all connected clients) and sends connected users push notifications based on certain query results. I used the web-push API (details at https://www.npmjs.com/package/web-push) to send the push notifications: each client sends the server a subscription containing an endpoint, and receives notifications based on that data.
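For reference, the sending side looks roughly like this (the VAPID keys and the stored subscription object are placeholders, not my real values):

const webpush = require('web-push');

// VAPID keys identify the application server to the push service
webpush.setVapidDetails(
    'mailto:admin@example.com',          // placeholder contact address
    process.env.VAPID_PUBLIC_KEY,
    process.env.VAPID_PRIVATE_KEY
);

// "subscription" is the object a client posts after calling
// pushManager.subscribe() in its service worker
async function notify(subscription, payload) {
    await webpush.sendNotification(subscription, JSON.stringify(payload));
}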
1) I can't host this app over a simple LAN using my IP address, because without an SSL certificate (no HTTPS) the service workers won't register.
2) I deployed the app to Heroku and can now send push notifications to various clients, but I'm unable to access the database (it throws an error).
How do I connect to the Oracle database when the app is deployed on Heroku? Is there a workaround?
I use the following simple code to connect to my database:
const oracledb = require('oracledb');

// inside an async function
connection = await oracledb.getConnection({
    user          : <username>,
    password      : <password>,
    connectString : <connectionString>
});

You can use a service like ngrok, which lets you expose a local port publicly.
The process for setting up ngrok is described here:
https://ngrok.com/docs
For example, if I were running a PostgreSQL server on my local machine and wanted to expose it on its default port, I would run the following ngrok command:
ngrok tcp 5432
The TCP URL and port that ngrok displays once the tunnel is up become the database hostname and port respectively.
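The same idea applies to the Oracle database in the question: tunnel the Oracle listener port (1521 by default) and use the forwarding address ngrok prints as the host and port in the node-oracledb connect string. A minimal sketch, where the 0.tcp.ngrok.io address, port and service name are placeholders for whatever your tunnel and database actually use:

const oracledb = require('oracledb');

// the host and port come from the "Forwarding" line ngrok prints,
// e.g. tcp://0.tcp.ngrok.io:12345
connection = await oracledb.getConnection({
    user          : process.env.DB_USER,
    password      : process.env.DB_PASSWORD,
    connectString : '0.tcp.ngrok.io:12345/ORCLPDB1'   // host:port/service_name
});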

It's better to use Oracle Cloud services; an ATP (Autonomous Transaction Processing) database is included in the Always Free tier.
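A rough sketch of what connecting to an ATP instance from node-oracledb can look like, assuming Oracle Instant Client is installed and the instance wallet has been downloaded and unzipped (the wallet path and the mydb_high alias are placeholders taken from your own wallet's tnsnames.ora):

const oracledb = require('oracledb');

// point the driver at the unzipped wallet (tnsnames.ora, sqlnet.ora, ...)
oracledb.initOracleClient({ configDir: '/opt/wallet' });   // placeholder path

connection = await oracledb.getConnection({
    user          : process.env.ATP_USER,
    password      : process.env.ATP_PASSWORD,
    connectString : 'mydb_high'   // alias from the wallet's tnsnames.ora
});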

Related

Test Connection from nodejs application to Oracle DB

I have developed a desktop application with Electron and Node.js. I have a requirement to do a test connection to an Oracle database. I have all the database source details in my application. How can I do a test connection from my application?
I have attached a screenshot of the application below.
The hostname should be just the hostname, not hostname:port/service_name. Use "localhost" if the database is on the same server as the application, or the actual remote hostname or IP address if it is running somewhere else.
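A minimal test-connection sketch with node-oracledb, assuming the form supplies hostname, port, service name, user and password separately:

const oracledb = require('oracledb');

// resolves to true if a connection can be opened with the supplied details
async function testConnection({ hostname, port, serviceName, user, password }) {
    let connection;
    try {
        connection = await oracledb.getConnection({
            user,
            password,
            connectString: `${hostname}:${port}/${serviceName}`   // EZConnect syntax
        });
        await connection.execute('SELECT 1 FROM DUAL');           // full round trip
        return true;
    } catch (err) {
        console.error('Test connection failed:', err.message);
        return false;
    } finally {
        if (connection) await connection.close();
    }
}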

How to restrict a server so it only allows SSH, a connection to an external RabbitMQ server, and a connection to an external MongoDB server?

I have an Ubuntu server which needs to only do the following:
Allow SSH access from me. I'm thinking of using a combination of private key + password for this. I would mainly be pulling code from GitHub and updating software on the server. The solution for this seems to be AuthenticationMethods.
Allow connections to an external RabbitMQ server. (There would be a local node app connecting to the RabbitMQ server).
Allow connections to an external MongoDB database. (There would be a local node app connecting to the MongoDB database).
All other connections should be blocked.
How can I accomplish this?
Thank you.
Set up a firewall and allow only:
rabbitmq-ip:rabbitmq-port
mongodb-ip:mongodb-port
all-ips:ssh-port
And use a private key for SSH. That way only people who have that key can SSH into the server.
You will also have to use SSH for git pulls, since this configuration does not allow HTTPS connections from the server.
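For reference, the two outbound connections those first two rules need to permit are the ones the local Node app opens, roughly like this (the amqplib and mongodb packages, the credentials, and the default ports 5672/27017 are assumptions, not something stated in the question):

const amqp = require('amqplib');
const { MongoClient } = require('mongodb');

// outbound connection to the external RabbitMQ server (rabbitmq-ip:rabbitmq-port)
const mq = await amqp.connect('amqp://user:pass@rabbitmq-ip:5672');

// outbound connection to the external MongoDB server (mongodb-ip:mongodb-port)
const mongo = new MongoClient('mongodb://user:pass@mongodb-ip:27017/mydb');
await mongo.connect();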

Connect nodeJs app to on-premise server using Secure Gateway

I'm trying to connect from a Node.js web app to a REST API hosted on premise. I bound a Secure Gateway instance and created a destination on port 80 pointing to the machine where the SG client for RHEL 6 is running.
The request still throws a timeout exception.
Do I have to modify the Node.js application code in any way, or should the SG let me access the REST API transparently?
Your Node.js app needs to talk to the Secure Gateway service and not the API directly. Where you establish a connection to your on-premise API, replace the host name and port number with the cloud host name and port number that you were given when you created the destination.
There is an npm module to help your app obtain that host name and port - https://www.npmjs.com/package/bluemix-secure-gateway
And an example - https://www.ibm.com/blogs/bluemix/2015/04/reaching-enterprise-backend-bluemix-secure-gateway-via-sdk-api/
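For example, where the app currently requests the on-premise host directly, swap in the cloud host and port shown on the destination details panel. A sketch with Node's built-in http module, where the cap-sg-prd host and port are placeholders for whatever your destination shows:

const http = require('http');

// before: { host: 'onprem-host', port: 80, path: '/api/resource' }
// after:  the cloud host/port from the Secure Gateway destination
const options = {
    host: 'cap-sg-prd-1.integration.ibmcloud.com',   // placeholder destination host
    port: 15000,                                     // placeholder destination port
    path: '/api/resource'
};

http.get(options, (res) => {
    let body = '';
    res.on('data', (chunk) => { body += chunk; });
    res.on('end', () => console.log(res.statusCode, body));
});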

IBM API Connect apps published to Bluemix inaccessible

I followed the API Connect getting started guide to create a local LoopBack API app and tested it successfully. Then I tried to follow Publish Your API to Bluemix. The publish succeeds and the app is running, but clicking the app yields this Chrome error:
This site can’t provide a secure connection
ddd.abbr-dev2.apic.mybluemix.net sent an invalid response.
I suspect the problem is an incorrect port. According to the Cloud Foundry Node.js tips, the port should come from process.env.PORT, but LoopBack defaults to 3000. Following this clue, I tried adding config.local.js:
module.exports = {
    port: process.env.PORT
};
But the service endpoint is still inaccessible.
Please help. Thanks.
This is actually by design. Since your API's implementation is on the public internet, it is secured via Mutual TLS. The only way to access it is via the API Connect gateway, thus ensuring the API is managed.
If you want to make it accessible publicly, open the app in the Bluemix console and add an additional route to the app, using the mybluemix.net domain.

How to connect Mongodb Admin GUI to Cloud Foundry?

I am looking for a way to browse my Cloud Foundry MongoDB services. It looks like there are two options:
Tunneling to a Cloud Foundry service with Caldecott: http://docs.cloudfoundry.com/tools/vmc/caldecott.html. I have never tried this, but I guess it may work.
Connecting directly from a MongoDB admin GUI such as mViewer or MongoVue. My question is: is this possible, and if so, how do I find out the username/password in process.env.VCAP_SERVICES['mongodb-1.8'][0]['credentials']?
https://github.com/Imaginea/mViewer
http://www.mongovue.com/2011/08/04/mongovue-connection-to-remote-server-over-ssh/
To use a GUI client you have to open a tunnel to the service. Once you open it from a CLI console, the connection info will be generated and displayed, including the host address (usually 127.0.0.1), port number, username and password. You cannot connect using the values from VCAP_SERVICES from outside the environment, because those are local values behind the CF router.
You need to create a tunnel using Caldecott.
See http://docs.cloudfoundry.com/tools/vmc/caldecott.html.
When you open the tunnel, it should provide you with either a command line client or the credentials to use.
In case it does not, create a piece of code that returns a dump of process.env.VCAP_SERVICES when you visit a certain URL on your server.
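A quick sketch of that last suggestion, assuming an Express app (the /vcap route name is arbitrary, and the route should be removed again afterwards since it exposes credentials):

const express = require('express');
const app = express();

// temporary route that dumps the bound service credentials;
// remove it once you have copied the values you need
app.get('/vcap', (req, res) => {
    res.json(JSON.parse(process.env.VCAP_SERVICES || '{}'));
});

app.listen(process.env.PORT || 3000);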
