Load .env in globalTeardown in Jest - node.js

.env variables are not visible inside the globalTeardown function in Jest.
I'm currently loading them in setupFiles (or in setupFilesAfterEnv), but when the teardown function is called they are still not there, unless I reload them with dotenv.config().
I would like to know if it is fine to re-load them, and why this is happening.
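Re-loading them is fine: dotenv.config() just reads the file again and, by default, does not overwrite variables that are already set. As for why it happens: setupFiles (and setupFilesAfterEnv) run inside each test worker process, while globalTeardown runs in the parent Jest process, so variables loaded in the workers never reach it. A minimal teardown sketch (the file name and variable are assumptions):
// teardown.js — referenced from globalTeardown in jest.config.js
require('dotenv').config();

module.exports = async () => {
  // DB_URL is a hypothetical variable defined in the .env file
  console.log('tearing down with', process.env.DB_URL);
};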

Related

Jest prevent clearMocks for test scenario or file

I use Jest with the following configuration, to clear all mocks automatically after each test.
clearMocks: true
Is it possible to prevent this behavior for a particular describe block or a file, so that the mocks will not be cleared there?

How to get environment variables defined in serverless.yml in tests

I am using the serverless framework for running lambda functions on AWS.
In my serverless.yml there are environment variables that are fetched from SSM.
When I write integration tests for the code, I need the code to have the environment variables and I can't find a good way to do this.
I don't want to duplicate all the variable definitions just for the tests; they are already defined in serverless.yml. Also, some are secrets and I can't commit them to source control, so I would have to repeat them in the CI environment as well.
I tried the serverless-jest-plugin, but it is not working and not well maintained.
Ideas I had for solutions:
Make the tests exec sls invoke - this will work but would mean that the code cannot be debugged, I won't know the test coverage, and it will be slow.
Parse the serverless.yml myself and export the env variables - possible but rewriting the logic of pulling the SSM variables just for tests seems wrong.
Any ideas?
The solution we ended up using is a serverless plugin called serverless-export-env.
After adding this plugin you can run serverless export-env to export all the resolved environment variables to a .env file. This resolves SSM parameters correctly and made integration testing much simpler for us.
BTW, to get the environment variables set from the .env file, use the dotenv npm package.
Credit to grishezz for finding the solution
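For example, a minimal Jest setup file could look like this (the file name is an assumption; run serverless export-env first so the .env file exists):
// jest.env.js — listed under setupFiles in jest.config.js,
// e.g. setupFiles: ['<rootDir>/jest.env.js']
require('dotenv').config();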
You can run node with the --require option to inject a .env file into a serverless command.
Create a .env file at the project root (next to package.json), and list your variables in it.
Install serverless and dotenv in the project by yarn add -D serverless dotenv.
Run a command like node -r dotenv/config ./node_modules/.bin/sls invoke.
Then you can get the environment variables in the handler via process.env.XXX.
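A rough illustration of the pieces (the variable name is an assumption):
# .env at the project root
MY_TABLE_NAME=my-table-dev

// handler.js — dotenv/config has already run by the time the
// serverless command starts, so the variable is in process.env
module.exports.handler = async (event) => {
  console.log(process.env.MY_TABLE_NAME);
  return { statusCode: 200 };
};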
Are you looking to do mocked unit tests, or something more like integration tests?
In the first case, you don't need real values for the environment variables. Mock your database, or whatever requires environment variables set. This is actually the preferable way because the tests will run super quickly with proper mocks.
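As a rough sketch of the mocked approach (the ./db module and its return value are assumptions):
// The test never needs a real connection string, because the module
// that would read it is replaced entirely.
jest.mock('./db', () => ({
  query: jest.fn().mockResolvedValue([{ id: 1 }]),
}));

const db = require('./db');

test('reads rows without a real database', async () => {
  const rows = await db.query('SELECT * FROM users');
  expect(rows).toHaveLength(1);
});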
If you are actually looking for an end-to-end/integration kind of approach, then you would do something like sls invoke, but from Jest using JavaScript. So, like regular network calls to your deployed API.
Also, I would recommend not storing keys in serverless.yml. Use the secret: ${env:MY_SECRET} syntax (https://serverless.com/framework/docs/providers/aws/guide/variables#referencing-environment-variables) and supply the values through environment variables. If you have a CI/CD build server, you can store your secrets there.
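For instance (a minimal sketch; the variable name is an assumption):
# serverless.yml
provider:
  environment:
    MY_SECRET: ${env:MY_SECRET}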
After searching, I ended up with my own custom solution:
// 'data' is the object that holds the Serverless environment variables
import * as data from './secrets.[stage].json';

if (process.env.NODE_ENV === 'test') {
  // copy into a fresh object so real environment variables take precedence
  process.env = Object.assign({}, data, process.env);
}
In my case the SLS environment variables are in the file secrets.[stage].json.
serverless.yml has:
custom:
  secrets: ${file(secrets.[stage].json)}

Node - Override function in all files except one

I've been looking around at somehow disabling console.log in my application while running unit tests, and I found answers that say you can override the console.log like this:
console.log = function(){};
I tried putting this in app.js, and it overrides console.log when I'm running the app, but not when running unit tests, so I tried adding it to the test file, but then it overrides mocha / chai's console.log and I get a blank screen.
Is there a way to override the console.log in all files except the one running?
What you would probably want to do instead is use a logging library like Loggly or Bunyan. With these you pass the message you want to log to the client, and then you can output those logs based on the environment you are in. In your case you want to log during production but not during testing (kind of odd, but whatever). So you would set process.env.NODE_ENV to dev or prod accordingly, and the logger would take care of the logging for you. Here's an overview of some loggers.
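A minimal sketch with Bunyan, silencing output when NODE_ENV is 'test' (the logger name is an assumption):
// logger.js
var bunyan = require('bunyan');

module.exports = bunyan.createLogger({
  name: 'app',
  // a numeric level above FATAL suppresses all output
  level: process.env.NODE_ENV === 'test' ? bunyan.FATAL + 1 : 'info',
});
Requiring this logger everywhere instead of console.log gives you one place to decide, per environment, what gets printed.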

Gruntfile to run app and mock test from single grunt command

I have a Node.js Express REST API app that works. Good.
I have a Mocha/Chai/Supertest mock that tests the API app above. Good.
But I have to start the app and then independently run the mock test.
How can I run a single grunt command that starts the API app, lets it get up and running, and then runs the mock test?
Or do I need to run the API app in some kind of test mode (via env var) and have test-only logic somehow invoke the mock test?
I can try some things and get something to work, but what is the good way? (Avoiding the overused phrase 'best practice'.)
You can do that with grunt-express-server and grunt-mocha-test; you will just have to set up your task like below:
grunt.registerTask('test', ['express:test', 'mochaTest']);
This will run your express server with the config you have set for the test environment, then run mocha, when you run grunt test.
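A minimal Gruntfile sketch putting those pieces together (paths and the test file pattern are assumptions):
// Gruntfile.js
module.exports = function (grunt) {
  grunt.initConfig({
    express: {
      test: {
        options: {
          script: 'app.js',  // entry point of the API app
          node_env: 'test'   // start the server in test mode
        }
      }
    },
    mochaTest: {
      test: {
        src: ['test/**/*.js']
      }
    }
  });

  grunt.loadNpmTasks('grunt-express-server');
  grunt.loadNpmTasks('grunt-mocha-test');

  // `grunt test` starts the server, then runs the mocha suite
  grunt.registerTask('test', ['express:test', 'mochaTest']);
};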
Since you are using supertest, I suppose you are doing functional testing, which means you would otherwise be using the same database for development and testing (if you are not mocking something). That can waste time and make your tests fail because of bad data. Using two different environments makes sure of the state of your data when you are running the tests.
You can still use the grunt watch plugins to relaunch your tests on file change if you don't want to do it manually.
Hope this helps

Is require.js the right tool (for the job) for "normal" web sites?

I have started building a web site and have used Require.js for selectively loading scripts which I require to implement functionality. What I am experiencing is:
the "main" script has not finished executing (or even downloading) before some of my code uses "require" to load dependencies. What this means is that the require.js config has not run and does not know the locations of my scripts.
because the require.js config has not run by the time my code needs to use it, the "shim" mechanism has not been initialised and cannot be used.
The Common Errors page along with a lot of the issues I seem to be reading about online while trying to solve my own problems seem to suggest that this is not the right tool for the job.
This seems to be useful for single page applications or node.js applications, but not traditional sites where other scripts could be running before require.js has been initialised.
If require.js is not the right tool for the job, is there a right tool for this job? If so then what is?
Are you loading the require.js script asynchronously (with async='async')? You want RequireJS to load synchronously. Once it's loaded, it will load further scripts, like your main.js file, asynchronously. They may all load out of order, but the code will actually get executed in the right order (respecting the declared dependencies).
So in your page template, you would have this:
<script src="/Scripts/require.js" type="text/javascript" data-main="main.js"></script>
That will load RequireJS, and once it's loaded it starts loading your main.js asynchronously. Typically main.js does not define any modules; it just makes use of modules defined in other files. These dependencies are listed in the require() call:
require(["moduleA", "moduleB"], function(A, B){
// Do something with A and B
A.someFunction();
B.someOtherFunction();
});
The files moduleA.js and moduleB.js must wrap their contents inside a define(). In moduleA.js (which depends on module C):
define(["moduleC"], function () {
// Build up an A
var A = ....;
return A;
});
I wonder now if you're wrapping your modules in a define call. Not doing that could explain the out-of-order execution you're experiencing.
RequireJS is a perfectly valid tool on a traditional site, not just on a single-page site.
