How to ensure that I can only access a database while testing? - node.js

I have built an API in Node.js that communicates with MongoDB as its database. I am also using Vitest as my testing library.
I have two databases: prod and test. I want to restrict access to the test database so that it is only used during test calls made from test files run by Vitest. Is there any way to do this?
Proposed Solution (Maybe?)
1. Configure Vitest to set an ENV flag (or a NODE_ENVIRONMENT flag) to TESTING whenever npm run test is called, and revert it back to DEVELOPMENT after the command completes.
2. Add a check while connecting to the database: connect to the test database only if the environment is TESTING, otherwise connect to the usual database.
The problem I am facing with this approach is configuring step 1. Any guides on this? (Or other possible solutions to the problem?)
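One way to wire this up, sketched under the assumption that you use Vitest's test.env config option; the variable names APP_ENV, MONGO_URI and MONGO_TEST_URI below are illustrative, not from the question:

// vitest.config.ts -- expose a flag only for runs started by Vitest
import { defineConfig } from 'vitest/config'

export default defineConfig({
  test: {
    // process.env.APP_ENV will be 'TESTING' inside every test file
    env: { APP_ENV: 'TESTING' },
  },
})

// db.ts -- pick the connection string based on that flag
export const mongoUri =
  process.env.APP_ENV === 'TESTING'
    ? process.env.MONGO_TEST_URI! // test database
    : process.env.MONGO_URI!      // usual database

Alternatively, the npm script itself can set the variable, for example with cross-env: "test": "cross-env APP_ENV=TESTING vitest run". Either way the flag only exists in the process started by the test command, so nothing has to be reverted afterwards and the regular server never sees it.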

Related

Create a sqlite database to use on tests

I have a server implemented in Adonis.js with tests that perform operations against the main database. I want to use @adonisjs/vow and create an SQLite database dedicated only to the tests. The server runs in Docker, and I use a docker-compose file to build the complete solution (3 servers and 2 other PostgreSQL databases). How can I create an SQLite database in Docker, run it, and connect to it in my tests?
As always, the first step is to install the package from npm by running the following command.
npm i @adonisjs/lucid@alpha
Once done, run the following command to set up the package.
node ace invoke @adonisjs/lucid
You can choose SQLite from the available databases.
Right after the setup command is completed, we need to copy the code for validating the environment variables to the env.ts file. Since environment variables are injected from the outside, AdonisJS recommends you validate them and ensure that your app is always running with the correct set of configuration values.
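For reference, the validation block that gets copied into env.ts looks roughly like this for the variables shown below (a sketch based on the AdonisJS Env API; adjust the rules to the driver you selected):

// env.ts -- rough shape of the environment validation (sketch, not verbatim setup output)
import Env from '@ioc:Adonis/Core/Env'

export default Env.rules({
  PG_HOST: Env.schema.string({ format: 'host' }),
  PG_PORT: Env.schema.number(),
  PG_USER: Env.schema.string(),
  PG_PASSWORD: Env.schema.string.optional(),
  PG_DB_NAME: Env.schema.string(),
})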
The config/database.ts file holds all the configuration related to the database.
Finally, the config file relies on the environment variables and you can update them inside the .env file.
PG_HOST=localhost
PG_PORT=5432
PG_USER=root
PG_PASSWORD=
PG_DB_NAME=tests
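Since the question asks for an SQLite database for the tests, the corresponding connection entry in config/database.ts would look roughly like this (the connection name, file path, and default connection are placeholders):

// config/database.ts -- sqlite connection for the test suite (sketch)
import Env from '@ioc:Adonis/Core/Env'

const databaseConfig = {
  // switch this to 'sqlite' (e.g. via .env.testing) when the tests run
  connection: Env.get('DB_CONNECTION', 'pg'),
  connections: {
    sqlite: {
      client: 'sqlite3',
      connection: { filename: './tmp/test.sqlite3' },
      useNullAsDefault: true,
    },
    // ...the existing pg connection stays alongside it
  },
}

export default databaseConfig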
Learn more: database introduction

typeorm synchronize in production

In TypeORM there is a feature called synchronize. You can synchronize entities with the database, so there is no need for migrations. But as you know, synchronize is dangerous in production.
Here is the question: when should I use the synchronize feature? Imagine that at first (in the development environment) I started using the synchronize feature. If I disable it in production while I have no migrations, how will my production database be created?
Also, I'm going to deliver the project in several milestones. Should I disable it at the first milestone or at the end? And for long-term maintenance, should I keep synchronize disabled and use migrations after the first production release?
Any idea would be appreciated.
Migrations in TypeORM
Even though synchronization is a good option to keep your entities in sync with the database, it is unsafe for production databases. Therefore, migrations are a safer alternative for production databases.
When doing a migration, you should follow the steps below.
1. Update the TypeORM config file and the package.json file
As the first step, change the synchronize attribute to false in the TypeORM config file to prevent schema synchronization.
Then add the following command to the scripts section of the package.json file.
"typeorm": "ts-node ./node_modules/typeorm/cli -f ./ormconfig.json"
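If you keep the configuration in the ormconfig.json referenced by that script, the relevant keys look roughly like this (a sketch assuming a Postgres connection; the connection values and paths are placeholders for whatever you already have):

{
  "type": "postgres",
  "host": "localhost",
  "port": 5432,
  "username": "root",
  "password": "",
  "database": "myapp",
  "synchronize": false,
  "entities": ["src/entity/**/*.ts"],
  "migrations": ["src/migration/**/*.ts"],
  "cli": { "migrationsDir": "src/migration" }
}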
2. Generate the migration
npm run typeorm migration:generate -- -n <migration-name>
Here you can give a name to your migration. After you run the command, you will find a migration file under the migrations directory with the name you provided, prefixed with a timestamp.
The migration file contains two functions, up and down: up is responsible for applying the migration and down for reverting it.
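A generated migration file has roughly this shape (the class name, timestamp, and SQL below are purely illustrative):

// <timestamp>-AddUserEmail.ts -- example of a generated migration (illustrative)
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddUserEmail1650000000000 implements MigrationInterface {
  // executed by migration:run
  public async up(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`ALTER TABLE "user" ADD "email" character varying`)
  }

  // executed by migration:revert
  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`ALTER TABLE "user" DROP COLUMN "email"`)
  }
}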
3. Run the migration
npm run typeorm migration:run
This command runs the migration you created in the previous step. When you run it, it executes the up function in the migration file.
4. Revert the migration
npm run typeorm migration:revert
This command reverts the most recently executed migration. When you run it, it executes the down function of that migration file; run it again to revert earlier migrations.
Synchronize is a great option to get up and running, but in my opinion you should always default to creating migrations. This forces you to run your development environment like production, which is always key. You want your dev environment to run like production.
migration:generate is a great middle ground for building your migration files from your entities.
This was my question too, so I searched for it. I found that the documentation says:
Once you get into production you'll need to synchronize model changes into the database. Typically, it is unsafe to use synchronize: true for schema synchronization on production once you get data in your database. Here is where migrations come to help.
We can see that once you have valuable data in your production database, you should turn synchronization off for good and start using migrations in both development and production.
source
I also have to decide on this issue now, and I will use the synchronize option only for my e2e test DB. As Roger King already mentioned, you want your dev and prod environments to use the same tools to change the database. This way you can prevent different behavior between them.

Node JS db-migrate for different database

I'm already using the db-migrate package to run MySQL database migration scripts and it works well; I've even set up the DATABASE_URL variable in the server environment.
Now I have a requirement to store a few details in SQLite within the same service. I checked the db-migrate package for this feature, but apparently nothing is mentioned about executing SQL scripts against multiple different databases in one go. Is this possible with db-migrate, or do I have to write my own service for it?

Is there a way to set test coverage for tests (f.e. 95%) using cypress.io in TeamCity?

We are using TeamCity to run cypress.io for our NodeJs application, and some of the tests are failing due to timeouts. These timeouts seem to be based on latency to the database (AWS RDS) and vary from build to build.
What we would like to do is to try setting test coverage to a 95% success rate and see if this allows the build to continue.
There is an option in TeamCity to have build steps run regardless of whether previous steps failed, but we would prefer our tests not to run in this fashion.
Any advice would be appreciated. Thanks!
We ended up modifying the tests so that they would behave as expected in the new environment. We also decided to run the tests as they were built to run with a local Postgres database.
The significant issue we were dealing with was our Cypress tests were extremely fragile when moving to an RDS database. The tests were configured for a local dev environment with a local Postgres database and moving to RDS in the CI environment broke them.
My recommendation for anyone setting up automated tests is to make sure the tests run in your CI environment as they do in development, not to configure/edit your tests just so they pass in CI.
In other words, if your tests break in your CI environment, then they need to be fixed in the dev environment.

What is the recommended way to initialise a MongoDB database and user for a Sails.js application?

I am exploring writing a Sails.js application that talks to a MongoDB data store. I've got it all working just fine, but I'd like to be able to add some sort of initialisation process that runs at the npm install point, analogous to rake admin:create in the Rails world.
I'm guessing that this could be done via a grunt task somehow, but I figured, before I roll my own solution, I'd ask other Sails / MongoDB users what they actually do in the real world and see if a consistent or best-practice solution has emerged.
Database initialization for Sails apps is typically done in the config/bootstrap.js file. A simple one would be something like:
module.exports.bootstrap = function (cb) {
  User.findOrCreate(
    // Search for a user with the "admin" flag
    { admin: true },
    // Create one if no such user is found
    { admin: true, name: 'Admin User' /* ...other default attributes */ }
  ).exec(cb);
};
The bootstrap will run only when Sails is lifted, and the server won't start until the callback is executed. If you send anything as the first argument to the callback, it'll be considered an error (like most standard Node callbacks) which will halt lifting of the server.
Note that you can't create the actual database with Waterline. A Grunt task would be a fine place for that, but you have to consider that Grunt doesn't have access to the Sails config, so you'd have to hard-code the name of the database to use, or find some other way to make it configurable. This could have implications for different environments (i.e. development vs. production). In our projects at Balderdash, we typically leave database creation as a manual step for the developer after they check out the code, and then use bootstrap.js to initialize the data. This allows the developer to maintain their own database settings using config/local.js.
All that being said, I can see the value in setting up a Grunt task for creating a db for production, to make it easier to deploy your Sails app to its final destination.
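Whether that ends up as a Grunt task or a plain npm script, the core of such a one-off setup step might look roughly like this (a sketch, not part of the original answer; the database name, user, and environment variables are placeholders, and MongoDB only materialises a database once data or users are written to it):

// scripts/create-db-user.ts -- hypothetical one-off setup script
import { MongoClient } from 'mongodb'

async function main() {
  const client = new MongoClient(process.env.MONGO_URL ?? 'mongodb://localhost:27017')
  await client.connect()
  try {
    // createUser is a standard MongoDB database command
    await client.db('myapp').command({
      createUser: 'myapp_user',
      pwd: process.env.MONGO_PASSWORD ?? 'changeme',
      roles: [{ role: 'readWrite', db: 'myapp' }],
    })
  } finally {
    await client.close()
  }
}

main().catch((err) => {
  console.error(err)
  process.exit(1)
})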
I personally think it was a wise decision by the sails.js team to use Grunt. Grunt provides a reliable and flexible solution to whatever automation you need.
I use grunt for everything in my sails.js projects including....
Importing data from legacy databases
Compiling sass (even though sails uses less by default)
Compiling coffeescript
Assembling knockout templates
Loading indexes into elasticsearch
You won't be disappointed using Grunt. It's pretty easy to get started with, so get to it!
