Change db instances in ForestAdmin and Lumber - node.js

For the admin part I want to use ForestAdmin.
I am also going to use an already existing database (Mattermost's, actually).
Everything is deployed to AWS (both ForestAdmin and Mattermost).
According to the tutorial I can install ForestAdmin via Lumber and connect it to an existing database. I have already done this and played with the code a bit to extend some functionality.
The main question is: when I install ForestAdmin locally via Lumber, I point it at my local database instance and Lumber generates the code (models and CRUD). I could also install ForestAdmin locally and point it at Mattermost's database on AWS. But can I somehow switch which database I want to use, the local one or the one on AWS?
Maybe I am explaining this badly, so I will rephrase it with the desired flow:
Mattermost with its database is already installed on AWS.
ForestAdmin is also installed via Lumber and connected to Mattermost's database.
I have local instances of ForestAdmin and Mattermost, write some code, and test it in the local environment.
I push to the repo, connect to AWS via SSH, and pull the changes that were made.
The AWS instances work like the local ones, but with separate database instances.
Or maybe there is a better way to push changes from the local ForestAdmin and pull them into the AWS instance of ForestAdmin?
Thanks for any help in advance!

Yes, you definitely can. You can use Forest Admin environments for that:
https://docs.forestadmin.com/documentation/reference-guide/how-it-works/environments
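In practice this comes down to making the generated project read its connection string from the environment rather than hard-coding it, so the same code can point at the local database in development and at the Mattermost database on AWS in production. A minimal sketch, assuming the Lumber-generated Sequelize setup reads a DATABASE_URL variable from a .env file (the variable name and file layout here are illustrative; check what your Lumber version actually generated):

// config/databases.js (sketch)
require('dotenv').config();
const Sequelize = require('sequelize');

// Locally, .env holds the local connection string;
// on the AWS instance it holds the Mattermost database URL.
const connectionString =
  process.env.DATABASE_URL || 'postgres://localhost:5432/mattermost_dev';

module.exports = new Sequelize(connectionString, { logging: false });

With that in place, each Forest Admin environment (development, production, ...) just uses its own .env, so pushing the code to the repo and pulling it on the AWS instance never changes which database is used.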

Related

Mongo db replication using node js

I am new to MongoDB and Node.js. I am running an EC2 instance on AWS and I want to sync data from AWS to my local system and from local to AWS. I don't have any idea how to do it. I have tried some configuration in the database files, but I am not able to connect my local system to the remote server.
I found this repository on GitHub, https://github.com/sheharyarn/mongo-sync, but I don't know how to use it in my code.
Kindly help me.
Thank you
MongoDB Atlas provides 512 MB of space for free. You can create a cluster in MongoDB Atlas and use the hosted database from anywhere; it will work on your local machine and also on the AWS server.
Just open the MongoDB website; you can find MongoDB Atlas easily after signing up.
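Once the cluster exists, both the EC2 instance and the local machine connect to the same database, so there is nothing to sync manually. A minimal sketch of such a connection from Node.js, assuming the official mongodb driver and a connection string copied from the Atlas UI (the URI, database, and collection names below are placeholders):

// atlas-connect.js (sketch)
const { MongoClient } = require('mongodb');

// e.g. mongodb+srv://user:password@cluster0.xxxxx.mongodb.net/mydb
const uri = process.env.MONGODB_URI;

async function main() {
  const client = new MongoClient(uri);
  await client.connect();
  const docs = await client.db('mydb').collection('items').find().toArray();
  console.log(docs);
  await client.close();
}

main().catch(console.error);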

Heroku won't work with my Keyv and SQLite database

I've made a SQLite database and uploaded it with my GitHub repo to Heroku, but it only reads data from the database and never changes it. No errors, it just doesn't work. When I test it on my PC it works fine.
I'm not sure why you're only able to get data and not change it. If you can share an example of how you're getting and setting, I might be able to help you get it going temporarily.
I learned, however, that SQLite only offers temporary storage on Heroku:
Why is SQLite a bad fit for running on Heroku?
Disk backed storage
SQLite runs in memory, and backs up its data store in files on disk. While this strategy works well for development, Heroku’s Cedar stack has an ephemeral filesystem. You can write to it, and you can read from it, but the contents will be cleared periodically. If you were to use SQLite on Heroku, you would lose your entire database at least once every 24 hours.
Even if Heroku’s disks were persistent running SQLite would still not be a good fit. Since SQLite does not run as a service, each dyno would run a separate running copy. Each of these copies need their own disk backed store. This would mean that each dyno powering your app would have a different set of data since the disks are not synchronized.
Instead of using SQLite on Heroku you can configure your app to run on Postgres.
I then followed their instructions for setting up Postgres. It's worth reading through the instructions, but the gist of it is to use the Heroku CLI:
From the section Provisioning Heroku Postgres:
"Use the heroku addons command to determine whether your app already has Heroku Postgres provisioned"
If heroku-postgresql doesn’t appear in your app’s list of add-ons, you can provision it with the following CLI command: heroku addons:create heroku-postgresql:hobby-dev
As of writing, the hobby tier is free. Read about plans here.
This command adds an environment variable to your project named DATABASE_URL.
I'm using keyv by Luke Childs. I installed its companion adapter @keyv/postgres. (I also uninstalled my SQLite packages.)
I used the newly added DATABASE_URL environment variable to wire it into the Keyv setup linked above:
const Keyv = require('keyv');

// In-memory store while developing locally, Heroku Postgres in production.
const keyv =
  process.env.NODE_ENV !== 'production'
    ? new Keyv()
    : new Keyv(process.env.DATABASE_URL);
I haven't found the best solution yet for developing and testing against Postgres locally. Heroku Postgres requires SSL for remote connections (i.e. when your app is running locally). In the code block above, you'll see that I'm initializing Keyv without a database while developing locally (new Keyv()).
From here, if I need to verify the DB storage, I can set up a Postgres database for local development, but I also imagine it's possible to connect to Heroku Postgres over SSL. If you or anyone has a solution they like for this step, please let me know.
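One possible local setup along those lines, sketched here as an assumption rather than something from the original answer: run Postgres locally and point Keyv at it in development, so reads and writes can be verified against a real database before deploying (the local connection string below is a placeholder):

const Keyv = require('keyv');

const keyv =
  process.env.NODE_ENV !== 'production'
    ? new Keyv('postgresql://postgres:postgres@localhost:5432/myapp_dev') // local Postgres, requires @keyv/postgres
    : new Keyv(process.env.DATABASE_URL); // Heroku Postgres

keyv.on('error', (err) => console.error('Keyv connection error:', err));

// quick check that reads and writes both work
(async () => {
  await keyv.set('greeting', 'hello');
  console.log(await keyv.get('greeting')); // -> 'hello'
})();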
@T. Rotzooi, I'm three months late to your question, but perhaps this explanation can help future readers. I haven't found any other resources discussing this issue that you and I both encountered.

How to run and start mongodb from within nodejs

Basically I don't want to use an existing hosted MongoDB service like the official MongoDB Cloud or whatever; how can I do what they do, but myself? Do I just include the database folder, along with all of the MongoDB executables, in my Node.js folder and call require("child_process").spawn("mongodb.exe", /* insert params here */), or is there some kind of way to do this with the mongo module?
Also, do I need my own virtual machine to be able to do this, or can this work on a standard Heroku Node.js application, for example?
Anyone?
Heroku's hosting solution has only ephemeral volumes, so you can't use it for a database. Any files you create are temporary and will be purged on a regular basis.
For example, when your application is idle Heroku will de-provision that resource and clear out any data you've left there.
You can't use Heroku like this; you must use an external database service or one of their many add-on offerings.
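For local development (not on Heroku), spawning mongod from Node the way the question suggests is feasible. A rough sketch, assuming mongod is installed and on the PATH and the data directory already exists (paths and options are illustrative):

const { spawn } = require('child_process');
const path = require('path');

// Start a local mongod pointed at a project-level data directory.
const dataDir = path.join(__dirname, 'mongo-data');
const mongod = spawn('mongod', ['--dbpath', dataDir, '--port', '27017']);

mongod.stdout.on('data', (chunk) => process.stdout.write(chunk));
mongod.stderr.on('data', (chunk) => process.stderr.write(chunk));
mongod.on('exit', (code) => console.log(`mongod exited with code ${code}`));

On Heroku this approach will not survive, for the filesystem reasons described above, so a hosted database remains the practical option there.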

package.json for postgres+node+AWS elastic beanstalk

The Postgres interface (to a remote DB) isn't working in my Node app using Sequelize. MySQL works. Postgres did need some kicking on my development machine, so I am not surprised it didn't just magically work. Now when Sequelize attempts to use the postgres dialect it fails.
I have SSH'd onto my instance, and the pg and pg-hstore packages are there, but my server just crashes on startup (I stuck an error message in just prior to the Sequelize init, which gets printed about 16 times per restart, but I have no other information).
I wanted to manually do some npm install --save stuff, just to get the lights on, but it seems I can't just do that (no node or npm available).
Either way, I don't really want to be SSHing about; I just want to know how to install the Postgres Node packages on Elastic Beanstalk. Is there some magic I need to add to package.json? If I want this to be scalable then I assume everything needs to go in there.
EDIT:
My problems were nothing to do with Postgres at all, and everything to do with the INTERVAL data type I added to data-types.js in Sequelize.
Postgres installs just fine with the following in package.json dependencies:
"pg": "latest",
"pg-hstore": "latest"
Elastic Beanstalk is mainly designed to serve as a web server, so installing databases is outside of its comfort zone. I'd personally recommend you use AWS RDS with Postgres, which comes pre-installed and ready to use with Elastic Beanstalk. A very simple example can be found here.
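If the RDS instance is coupled to the Elastic Beanstalk environment, Beanstalk exposes the connection details as environment variables (RDS_HOSTNAME, RDS_PORT, RDS_DB_NAME, RDS_USERNAME, RDS_PASSWORD), so Sequelize can be wired up without hard-coding anything. A sketch under that assumption:

// db.js (sketch)
const Sequelize = require('sequelize');

const sequelize = new Sequelize(
  process.env.RDS_DB_NAME,
  process.env.RDS_USERNAME,
  process.env.RDS_PASSWORD,
  {
    host: process.env.RDS_HOSTNAME,
    port: process.env.RDS_PORT,
    dialect: 'postgres', // needs "pg" and "pg-hstore" in package.json dependencies
  }
);

module.exports = sequelize;

With "pg" and "pg-hstore" listed in package.json, Elastic Beanstalk installs them during deployment, so no manual npm install over SSH is needed.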
You tried to install PG manually via SSH, but that won't help, as Elastic Beanstalk loses all changes upon restart, and it will restart of its own free will, outside your control. So please don't count on any change you make over SSH.
Now if you really need to install PG yourself, I'd go with the Docker approach, as you can just grab a ready-made Postgres installation from here.
Lastly, if you need help resolving any error you get when you install PG manually, you'll need to share the errors in this thread.

Are Amazon Machine Images (AMIs) static, or can their code be modified and rebuilt?

I have a customer who wants me to do some customisations of the ERP system opentaps, which they use via the opentaps Amazon Elastic Compute Cloud (EC2) images. I've only worked with it on a normal server and don't know anything about images in the cloud. When I SSH in with the details the client gave me, there is no sign of the ERP installation directory I'd expect to see. I originally expected that the image wouldn't be accessible at all, but the client assured me it was. I suppose they could be confused.
Would one have to create a new image and swap it out, or is there a way to alter the source and rebuild, like on a normal server?
Something is not quite clear to me here. First of all, EC2 images running in the cloud are just like normal virtual servers, so if you have access to the running instance there is no difference between an instance in the cloud and an instance on another PC in your home, for example.
You have to find out how opentaps is installed on the provided AMIs, then make your modifications, create an image from the modified instance, and save it to S3 for backup if necessary.
If you want to start with a fresh instance, you can start up any Linux/Windows distro on EC2, install opentaps yourself, your way, and you are done.
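Creating the image from the modified instance can be done from the EC2 console, the aws CLI, or programmatically. A sketch using the AWS SDK for JavaScript (v2), with a placeholder instance ID and name, offered as an illustration rather than part of the original answer:

// create-ami.js (sketch)
const AWS = require('aws-sdk');
const ec2 = new AWS.EC2({ region: 'us-east-1' });

const params = {
  InstanceId: 'i-0123456789abcdef0', // the modified opentaps instance (placeholder)
  Name: 'opentaps-customised',
  NoReboot: false, // rebooting gives a more consistent filesystem snapshot
};

ec2.createImage(params, (err, data) => {
  if (err) return console.error(err);
  console.log('New AMI:', data.ImageId);
});

Once the new AMI exists, you can launch fresh instances from it or swap it into whatever launch configuration the client is using.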
