package.json for postgres+node+AWS elastic beanstalk - node.js

The Postgres interface (to a remote DB) isn't working in my Node app using Sequelize. MySQL works. Postgres did need some kicking on my development machine, so I'm not surprised it didn't just magically work. Now, when Sequelize attempts to use the postgres dialect, it fails.
I have SSH'd onto my instance, and the pg and pg-hstore packages are there, but my server just crashes on startup (I added an error message just prior to the Sequelize init, which gets printed about 16 times per restart, but I have no other information).
I wanted to manually run some npm install --save commands, just to get the lights on, but it seems I can't just do that (no node or npm).
Either way, I don't really want to be SSHing about; I just want to know how to install the Postgres Node packages on Elastic Beanstalk. Is there some magic I need to add to package.json? If I want this to be scalable, I assume all the info needs to go in there.
EDIT:
My problems had nothing to do with Postgres at all, and everything to do with the INTERVAL data type I added to data-types.js in Sequelize.
postgres installs just fine with
"pg":"latest"
"pg-hstore":"latest"

Elastic Beanstalk is mainly designed to serve as a web server, so installing databases is outside its comfort zone. I'd personally recommend you use AWS RDS with Postgres, which comes pre-configured and ready to use with Elastic Beanstalk. A very simple example can be found here.
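For example, once an RDS Postgres instance is attached to the Beanstalk environment, wiring Sequelize to it could look roughly like this (a sketch; the RDS_* variables are the ones Beanstalk normally injects for an attached database, so double-check the names in your environment):

// Sketch: connect Sequelize to an attached RDS Postgres instance.
// The RDS_* variables are set by Elastic Beanstalk when the DB is
// attached to the environment; adjust if yours differ.
var Sequelize = require('sequelize');

var sequelize = new Sequelize(
  process.env.RDS_DB_NAME,
  process.env.RDS_USERNAME,
  process.env.RDS_PASSWORD,
  {
    host: process.env.RDS_HOSTNAME,
    port: process.env.RDS_PORT,
    dialect: 'postgres'
  }
);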
You tried to install PG manually via SSH, but that won't help, as Elastic Beanstalk loses all changes upon restart, and it will restart of its own free will, outside your control. So please don't count on any changes you make over SSH.
Now, if you really need to install PG yourself, I'd go with the Docker approach, as you can just grab a ready-made Postgres installation from here.
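For instance, something along these lines brings up the official image (a sketch; the container name, password, and port mapping are placeholders):

# Sketch: run the official postgres image with placeholder values.
docker run -d --name my-postgres -e POSTGRES_PASSWORD=secret -p 5432:5432 postgres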
Lastly, if you need help resolving any error you get when you install PG manually, you'll need to share the errors in this thread.

Related

How to properly install nvm/node/npm in AWS EC2 for multiple developers?

Trying to understand how node is supposed to be installed for multiple developers in AWS EC2 as an administrator. (I am also one of the devs).
I have an EC2 server with nginx running on port 80. Should I now go to the webroot and install nvm/node/npm as ec2-user? Or my own user, and then all the other users after? (No one can actually use the ec2-user account except server admins.)
How about other developers who need to use Node? I was hoping to install nvm/node/npm in advance for everyone who needs it, so that they could use it immediately after getting access to the server, but maybe everyone should install nvm/node/npm themselves?
Or would it be nicer if there were a way to install it as ec2-user and then share it with all the users properly/securely? What's the right way to set this up?
(When I ran through this myself as my own user and installed nvm for the first time on the Amazon Linux 2 AMI, I noticed that when I switched to another user or root, the "node -v" command didn't work for those accounts - basically I'm trying to do an install that covers all the users.)
In fact, on AWS EC2 you need only one user and one Node.js installation running. I would suggest the set-up below for development and deployment.
Let all developers have their dev environment set up on their local machines.
Let developers check in their code to GitHub or a similar repository.
Use a CI/CD pipeline to integrate the code, build it, and deploy it to EC2.
Instead of EC2, I would recommend using AWS Elastic Beanstalk.
If this makes sense for you, we can elaborate this into a solution and implement it.

Node can't connect to Vagrant box

I am not sure if this is the correct place to ask my question, but really I am out of ideas, and my clock is ticking.
In short, I got a new machine that I need to make development ready.
This project is based on rather old program versions, and updating them is a task in itself.
In short, I have set up Vagrant (1.8.1) in VirtualBox (5.0.14). Chef (0.10.0) created all dependencies successfully, and I can SSH into the machine and see that all is fine; all services are running as set in the Vagrantfile.
The Vagrant box is the latest ubuntu/trusty64. My host machine is macOS High Sierra (10.13.3).
Now, I open for example a MySQL editor (MySQL Workbench) and it connects to the box; I can see the DB and manipulate it.
My problem is with Node.js (I think). When I run my tests, it simply refuses to connect to the box. More precisely, it attempts to connect to 127.0.0.1:3306 (MySQL) and errors out, while MySQL Workbench makes the same connection without problems.
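For reference, the tests set up the connection roughly like this (a simplified sketch assuming the mysql package; the credentials and database name are placeholders):

// Simplified sketch of the connection the tests attempt
// (assumes the mysql package; user/password/database are placeholders,
// host/port match the Vagrant forwarded MySQL port).
var mysql = require('mysql');

var connection = mysql.createConnection({
  host: '127.0.0.1',
  port: 3306,
  user: 'dev',
  password: 'dev',
  database: 'mydb'
});

connection.connect(function (err) {
  if (err) {
    console.error('connect failed:', err.code); // e.g. ECONNREFUSED
    return;
  }
  console.log('connected');
});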
It seems the port forwarding in Vagrant works fine, as MySQL Workbench is being forwarded to the box. Node.js is not being forwarded, or something.
Is it Node doing it? Something else that I need to allow?
I have tried more different things than I can count, and it's always the same issue.
Is there something I can do to Node so it behaves like MySQL Workbench? Any idea is appreciated.
This identical setup used to work before, but not now.

How to dockerize my Node.js Express application hosted on an Amazon Linux AMI?

1. My technology stack for the above application is Express.js, Node.js, MongoDB, Redis, and S3 (storage).
2. The API is hosted on an Amazon Linux AMI.
3. I need to create a Docker container image for my application.
First of all, you will need to decide whether to keep everything inside a single container (monolithic; I can't really recommend it) or to separate the concerns and run a separate Express/Node.js container, a MongoDB container, and a Redis container. S3 is a managed service you cannot run yourself.
If you choose the latter approach, there are already officially supported images on Docker Hub for Redis and Mongo. For the actual app server (Node), you need to declare Express as a dependency in your package.json and start the official node image with an npm install command (which pulls Express in), followed by npm start (or whatever command you use). Don't forget to mount your code as a volume for this to work.
Now, bear in mind that if your app uses any reference data inside MongoDB, you should make sure to insert it when the MongoDB container starts, or create an image based on the official mongo image that already has that data in it!
Another valuable note: pass all connection settings into your Express app as environment variables. That way you can change them when deploying your app container (useful when you distribute your system across several hosts).
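A rough sketch of what that looks like inside the app (MONGO_URL and REDIS_URL are just example names, not required ones):

// Sketch: read connection settings from env vars, with fallbacks for local dev.
// MONGO_URL / REDIS_URL are example names, not a convention you must follow.
var mongoUrl = process.env.MONGO_URL || 'mongodb://localhost:27017/myapp';
var redisUrl = process.env.REDIS_URL || 'redis://localhost:6379';

// ...then pass mongoUrl / redisUrl to your Mongo and Redis clients as usual.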
At the end of the day you would then start the containers in this order: MongoDB, Redis, and Node/Express. The connection to S3 should already be handled inside your Node app, so it is irrelevant in this context; just make sure the Node app can reach the bucket!
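Roughly, the start-up could then look like this (a sketch; the container names, volume path, and env values are placeholders, and --link is the old-style way of letting the containers see each other):

# Sketch: start order and wiring; names, paths and values are placeholders.
docker run -d --name mongo mongo
docker run -d --name redis redis
docker run -d --name api --link mongo:mongo --link redis:redis \
  -v /path/to/app:/usr/src/app -w /usr/src/app \
  -e MONGO_URL=mongodb://mongo:27017/myapp -e REDIS_URL=redis://redis:6379 \
  -p 3000:3000 node sh -c "npm install && npm start"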
If you just want to build a monolithic container, start with a Debian Jessie image, get a shell inside the container, install everything as you would on a server, get your code running, and commit the image to your repo, then use it to run your app. Still, I can't recommend this approach at all!
BR,

Different database for production and development in nodejs

I know that Ruby on Rails has this feature, and the Rails tutorial specifically encourages it. However, I have not found such a thing for Node.js. If I want to run SQLite3 on my machine so I can have easy-to-use database access, but Postgres in production on Heroku, how would I do this in Node.js? I can't seem to find any tutorials on it.
Thank you!
EDIT: I meant to include Node.JS + Express.
It's possible of course, but be aware that this is probably a bad idea: http://12factor.net/dev-prod-parity
If you don't want to go through the hassle of setting up postgres locally, you could instead use a free postgres plan on Heroku and connect to it from your local machine:
DATABASE_URL=url node server.js
A .env file can make this easier:
https://devcenter.heroku.com/articles/heroku-local#copy-heroku-config-vars-to-your-local-env-file
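For example, with the dotenv package (one option, not the only one) the app can pick the URL up like this:

// Load .env into process.env before anything reads DATABASE_URL.
// dotenv is just one way to do this; keep the .env file out of version control.
require('dotenv').config();

var databaseUrl = process.env.DATABASE_URL;
// ...hand databaseUrl to your Postgres client / ORM of choice.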
To switch between the production and development DB, you can use the different ports your application runs on locally and on Heroku.
Since Heroku by default runs the application on port 80, you'll be on some other port while running your app locally.
This lets you figure out at run time whether your application is running locally or in production, and you can switch databases accordingly.
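A rough sketch of that kind of run-time switch (NODE_ENV is one common signal; the connection details are placeholders):

// Pick a database depending on where the app is running.
// NODE_ENV / DATABASE_URL are common conventions, not requirements.
var isProduction = process.env.NODE_ENV === 'production';

var dbConfig = isProduction
  ? { client: 'pg',      url: process.env.DATABASE_URL }
  : { client: 'sqlite3', file: './dev.sqlite3' };

// ...pass dbConfig to whatever DB layer you use.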
You could use something like jugglingdb to do this:
JugglingDB(3) is cross-db ORM for nodejs, providing common interface to access most popular database formats. Currently supported are: mysql, sqlite3, postgres, couchdb, mongodb, redis, neo4j and js-memory-storage (yep, self-written engine for test-usage only). You can add your favorite database adapter, checkout one of the existing adapters to learn how, it's super-easy, I guarantee.
Jugglingdb also works on client-side (using WebService and Memory adapters), which allows to write rich client-side apps talking to server using JSON API.
I personally haven't used it, but having a common API to access all your database instances would make it super simple to use one locally and one in production - you could wire up some location detection without too much trouble as well and have it automatically select the target db depending on the environment it's in.

is there any reason not to install a second instance of MongoDB?

I already have MongoDB on my Mac (OS X Mavericks) because it comes packaged with Meteor. I'm learning some pure, non-Meteor Node.js right now. I'd like to work with MongoDB, but I'm afraid to change any of the configuration I've already got on my machine, as I don't want to screw up the Mongo that comes packaged with Meteor.
Is this something I should be concerned about? How do I protect my other mongo instance?
I assume that by the MongoDB that comes with Meteor you mean the MongoDB database Meteor uses internally when you type "meteor", which resides in .meteor inside your app folder. In that case it's no problem adding a MongoDB installation to the OS; they won't conflict.
In fact, I recommend installing MongoDB separately for several reasons: when you are running a production app it's easier to scale, multiple apps can use the same database, etc.
First install MongoDB, for example with Homebrew. Then you just run your app with
MONGO_URL=mongodb://127.0.0.1/<db> meteor
According to MongoDB's documentation:
...In many cases running multiple instances of mongod on a single system is not recommended but for testing purposes of course possible.
I don't think Meteor makes intensive changes to MongoDB's out-of-the-box configuration (except, of course, if you've already made configuration amendments for special sharding, oplog tailing strategies, etc.).
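If you do want a second mongod running alongside Meteor's, the usual trick is to give it its own port and data directory, for example:

# Sketch: run a second mongod on its own port with its own data directory.
mkdir -p ~/data/db2 && mongod --port 27018 --dbpath ~/data/db2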
