Connect to AWS RDS Aurora for Postgres from Node JS - node.js

I have a Node JS application hosted on EC2 instances that I am trying to connect to an RDS Aurora Postgres cluster. The cluster consists of 1 writer and 2 readers. I have added the security group attached to the EC2 instances as an ingress rule on the security group associated with the database cluster, which allows the EC2 instances to communicate with the databases in the cluster.
I am having some issues connecting the application on the EC2 instance to the database. I have read this link, but it discusses using a JDBC driver to connect to the database. I'm not sure whether a JDBC driver can be used to let a Node JS application connect to this database cluster, and I'm not finding any useful examples of connecting a Node app to Aurora Postgres.
Any advice on connecting Node JS to an Aurora Postgres DB cluster would be helpful.
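For reference, Aurora PostgreSQL speaks the standard PostgreSQL wire protocol, so an ordinary Node driver such as node-postgres (pg) works; JDBC is only relevant for Java applications. Below is a minimal sketch, assuming the pg package and placeholder endpoint/credential values:

const { Pool } = require('pg');

// Placeholders; use the cluster's writer endpoint for writes and the
// reader endpoint for read-only traffic
const pool = new Pool({
  host: '{CLUSTER-WRITER-ENDPOINT}',
  port: 5432,
  user: '{DB-USER}',
  password: '{DB-PASSWORD}',
  database: '{DB-NAME}',
});

async function main() {
  const { rows } = await pool.query('SELECT NOW()');
  console.log(rows[0]);
  await pool.end();
}

main().catch(console.error);

The same sketch pointed at the cluster's reader endpoint would cover read-only queries.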

Related

Imperva Data Security Fabric - RDS diagnostic logs in AWS

I am installing Data Security Fabric at my company. I work with Postgres on Heroku, with an API connecting to AWS. I need to collect RDS logs for an Imperva data security audit. Is it possible to use Heroku for this? Which logs are sent from RDS in AWS? Is it possible to work with what AWS sends?
I am connected to AWS, but the installation is not moving forward.

How to migrate a Docker containerized MongoDB to Azure Cosmos DB with the Database Migration Service provided by Azure?

I have my Mongo database in a Docker container, which is hosted on an Amazon EC2 instance. I wish to migrate the database to Azure Cosmos DB with the help of the Database Migration Service (DMS) provided by Azure. While setting up the DMS project, I can't find a way to specify the source database, which resides on my EC2 instance. I gave the public IP of my instance in DMS as the source host. The MongoDB Docker container is published on port 27107 of the instance, and I have opened that port of the instance to public access as well. Still, the migration service is not able to find the source database. If anyone can shed some light on this issue it would be very helpful.
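As a sanity check before configuring DMS, it may help to confirm the MongoDB container is actually reachable from outside the instance. A minimal sketch using the Node.js mongodb driver, where the public IP and port are placeholders:

const { MongoClient } = require('mongodb');

// Placeholders: the EC2 instance's public IP and the port published by the container
const uri = 'mongodb://{EC2-PUBLIC-IP}:27107';

async function checkSource() {
  const client = await MongoClient.connect(uri, {
    useUnifiedTopology: true,
    serverSelectionTimeoutMS: 5000, // fail fast if the port is not reachable
  });
  const result = await client.db().admin().listDatabases();
  console.log('Connected, databases:', result.databases.map((d) => d.name));
  await client.close();
}

checkSource().catch(console.error);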

How to connect to AWS ElastiCache Cluster from AWS CloudFront using Node.js?

I am new to AWS CloudFront and AWS in general. I have a Next.js (React SSR framework) website which I deployed onto AWS using serverless-nextjs (https://github.com/serverless-nextjs/serverless-next.js). However, I also need some sort of caching for my web app. I decided to use Redis ElastiCache from AWS and created a Redis ElastiCache cluster in the AWS console.
My attempt:
I setup the code for connecting to the redis ElastiCache like this:
import redis from 'redis';
...
export async function getServerSideProps() { // Server side function for Next.js
  // node_redis v3 signature: createClient(port, host)
  const cache = redis.createClient(6379, "{PRIMARY-ENDPOINT-URL-AWS}");
  // ...use the cache for reads/writes here...
  return { props: {} }; // getServerSideProps must return a props object
}
and I run the website locally on my PC. However, I get a timeout error from redis: Error: connect ETIMEDOUT.
How would I be able to connect to the redis ElastiCache Cluster from CloudFront and on my local PC?
Screenshot of the Redis ElastiCache cluster window: [redis ElastiCache screenshot]
You can't connect to ElastiCache from outside AWS (i.e. from your local workstation) directly. ElastiCache clusters are designed to be accessible only from resources (e.g. EC2 instances) in the same VPC as the cluster. From the docs:
Elasticache is a service designed to be used internally to your VPC. External access is discouraged due to the latency of Internet traffic and security concerns. However, if external access to Elasticache is required for test or development purposes, it can be done through a VPN.
The only way to enable connections from outside AWS to your ElastiCache cluster is to establish a VPN connection between your home/work network and the VPC, or to use Direct Connect, as explained in the AWS docs:
This scenario is also supported providing there is connectivity between the customers’ VPC and the data center either through VPN or Direct Connect.
However, for a quick, ad-hoc setup you can use an SSH tunnel between your local workstation and the ElastiCache endpoint. This requires a jump/bastion EC2 instance for the tunnel to go through.
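A minimal sketch of what that could look like on the Node side, assuming an SSH tunnel through a bastion instance is already forwarding local port 6379 to the cluster's primary endpoint (the ssh command in the comment and all host names are placeholders):

// Tunnel, run separately from a shell:
//   ssh -N -L 6379:{PRIMARY-ENDPOINT-URL-AWS}:6379 ec2-user@{BASTION-PUBLIC-IP}
import redis from 'redis';

// Connect to the local end of the tunnel instead of the ElastiCache endpoint
const cache = redis.createClient(6379, '127.0.0.1');
cache.on('error', (err) => console.error('Redis error:', err));
cache.set('greeting', 'hello', (err) => {
  if (err) throw err;
  cache.get('greeting', (err, value) => console.log(value)); // prints "hello"
});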

AWS Elastic Beanstalk Node.js app is not connecting to MongoDB Atlas

I am trying to connect to MongoDB Atlas from Elastic Beanstalk using a Node.js app. On MongoDB Atlas, I opened the connection publicly for testing reasons (added 0.0.0.0/0 to the whitelist), and the AWS security group allows all traffic.
I can still connect to MongoDB Atlas from my localhost but not from AWS EBS.
I have faced the same issue, and it was solved by restarting the AWS Elastic Beanstalk instance.
Actually, you do MongoClient.connect once when your app boots up
and reuse the db object. It's not a singleton connection pool; each
.connect creates a new connection pool.
So for that purpose, we have to restart the instance and it will work, but for security purposes you can also try VPC peering with MongoDB Atlas.
Hope this helps someone else!
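For illustration, a minimal sketch of the connect-once-and-reuse pattern described above, using the Node.js mongodb driver; the connection string, database, and collection names are placeholders:

const { MongoClient } = require('mongodb');

// Placeholder Atlas connection string
const uri = 'mongodb+srv://{USER}:{PASSWORD}@{CLUSTER}.mongodb.net';

let db; // reused across the app instead of calling connect() per request

async function init() {
  // Connect once when the app boots up...
  const client = await MongoClient.connect(uri, { useUnifiedTopology: true });
  db = client.db('{DB-NAME}');
}

// ...then reuse the same db object everywhere
async function getUsers() {
  return db.collection('users').find({}).toArray();
}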

What approach should I use for external access to Cassandra running inside Kubernetes?

I have a StatefulSet Cassandra deployment that works great for services deployed to Kubernetes with namespace access, but I also have an ETL job that runs in EMR and needs to load data into that Cassandra cluster.
What would be the main approach/Kubernetes way of doing this?
I can think of two options.
The simple one is to expose the service with type: NodePort; with this you can connect to the server via a node's IP address and the node port number.
The second option is to put the Cassandra cluster behind a load balancer (a Service of type LoadBalancer) and connect to it through that.
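As a rough sketch of the NodePort option (shown with the Node.js cassandra-driver to keep with the rest of this page; the worker-node IP, the NodePort 30042, the data center, and the keyspace are all assumptions):

const cassandra = require('cassandra-driver');

// Placeholder worker-node IP and an assumed NodePort mapped to Cassandra's 9042
const client = new cassandra.Client({
  contactPoints: ['{K8S-NODE-IP}:30042'],
  localDataCenter: 'datacenter1', // must match the cluster's data center name
  keyspace: 'my_keyspace',
});

async function main() {
  const result = await client.execute('SELECT release_version FROM system.local');
  console.log(result.rows[0]);
  await client.shutdown();
}

main().catch(console.error);

Note that Cassandra drivers also discover peer addresses from the cluster's own metadata, so for a multi-node cluster the discovered pod IPs may not be reachable from outside; that usually has to be addressed with per-pod services or host networking.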
