I have created three env files: .env, .env.test, .env.prod in my root directory.
.env file has
PORT=4000
DATABASE_URL="postgresql://postgres:1234@localhost:5432/postgres?schema=public"
JWT_SECRET="THIS_IS_A_SECRET_KEY_DEV_ENVIRONMENT"
JWT_EXPIRATION_TIME=3600
And my .env.test has
PORT=4001
DATABASE_URL="postgresql://postgres:1234@localhost:5432/postgres?schema=public"
JWT_SECRET="THIS_IS_A_SECRET_KEY_TEST_ENVIRONMENT"
JWT_EXPIRATION_TIME=2400
My app.module.ts looks like
const ENV = process.env.NODE_ENV;
console.log(ENV);
@Module({
imports: [
ConfigModule.forRoot({
envFilePath: !ENV ? '.env' : `.env.${ENV}`,
load: [configuration],
ignoreEnvFile: true,
}),
providers: [AppService],
controllers: [AppController],
})
export class AppModule {
constructor(private connection: Connection) {}
}
configuration.ts :
export default () => ({
port: parseInt(process.env.PORT, 10) || 3000,
jwt:{
secret:process.env.JWT_SECRET,
expiresIn:process.env.JWT_EXPIRATION_TIME
}
});
And my scripts are :
"start:dev": "nest build && nest start --watch",
"start:test": "NODE_ENV=test nest start --watch",
Controller code :
Logger.log(this.configService.get<string>('jwt.secret'));
When I inject ConfigService in any controller and try to log the JWT key, it always fetches the key from the .env file, even when I run the app using the "start:test" script.
It always gets the value from .env irrespective of the script/environment, and logs "THIS_IS_A_SECRET_KEY_DEV_ENVIRONMENT" for all scripts.
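A likely culprit, assuming the intent is for the env files to be read at all: ignoreEnvFile: true tells @nestjs/config to skip envFilePath entirely, so values come only from process.env and load(). A minimal sketch of the forRoot call without that flag, not a confirmed fix:

```javascript
// Sketch: with ignoreEnvFile removed, ConfigModule actually reads
// .env (or .env.test when NODE_ENV=test) from envFilePath.
ConfigModule.forRoot({
  envFilePath: !process.env.NODE_ENV ? '.env' : `.env.${process.env.NODE_ENV}`,
  load: [configuration],
  // ignoreEnvFile: true  <- this line made ConfigModule ignore the files above
})
```

Note also that "start:dev" runs nest build while "start:test" does not, so stale output in dist/ can mask env changes as well.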
I have a barebones NestJS app where all I have done is add a .env file with PORT=3001 as the content and modify my main.ts according to the NestJS docs:
import { ConfigService } from '@nestjs/config';
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
async function bootstrap() {
const app = await NestFactory.create(AppModule, {bufferLogs: true});
const configService = app.get(ConfigService);
const PORT = configService.get('PORT');
await app.listen(PORT);
}
bootstrap();
My AppModule:
@Module({
imports: [
ConfigModule.forRoot({isGlobal: true}),
UsersModule
],
controllers: [AppController],
providers: [AppService],
})
export class AppModule {}
When I run the app, it always runs on port 3000. It never runs on port 3001. What is going on???
So it turns out that npm run start:dev (aka "start:dev": "nest start --watch") doesn't actually rebuild! I had to kill the process, run npm run build to update my dist folder, and then run npm run start:dev again. What a headache.
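The workaround described above, spelled out as commands (assuming the standard Nest CLI scripts):

```shell
# kill the running watcher first, then:
npm run build        # refresh dist/ so the new code is actually served
npm run start:dev    # restart the watcher
```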
I have legacy code for my Express app that reads all route files in a specific dir and requires them in a loop. Note that this code can't be changed:
app.js
const normalizedRoutes = fs.readdirSync(__dirname + '/src/routes/')
.map(routeFile => `/src/routes/${routeFile}`);
normalizedRoutes.forEach((normalizedRouteDir: string) => {
require(normalizedRouteDir)(app);
})
Now I want to combine a server-side rendered application with the code above, using some JSX in the route files.
My problem is that, because the route files are loaded at run time, webpack does not recognize them when creating the bundle.js file.
Therefore the route files are not present at /src/routes/${routeFile}, and when I run bundle.js I get this error:
Error: ENOENT: no such file or directory, scandir '/Users/******/build/src/routes/'
(the stars are for hiding full path)
webpack configs:
webpack.base.js
const MiniCssExtractPlugin = require("mini-css-extract-plugin");
module.exports = {
plugins: [new MiniCssExtractPlugin()],
module: { //remain
rules: [
{
test: /\.(ts|js)x?$/,
loader:'babel-loader',
exclude: /node_modules/,
options:{
presets:[
'@babel/react',
['@babel/env',{targets:{browsers:['last 2 versions']}}]
]
}
},
{
test: /\.css$/i,
use: [MiniCssExtractPlugin.loader, "css-loader"],
},
],
}
};
webpack.server.js
const path = require('path')
const {merge} = require('webpack-merge')
const baseConfig = require('./webpack.base.js');
const webpackNodeexternals = require('webpack-node-externals');
const CopyWebpackPlugin = require('copy-webpack-plugin');
const config = {
mode: "development",
entry: {
main:"./app.ts",
},
resolve: {
extensions: [".js", ".jsx", ".json", ".ts"],
},
node: {
__dirname: true
},
output: {
libraryTarget: "commonjs",
path: path.join(__dirname, "build"),
filename: "bundle.js",
},
target: "node",
//Keep node_modules out of the server bundle
externals: [webpackNodeexternals()]
}
module.exports = merge(baseConfig,config)
scripts from package.json:
"dev:server": "nodemon --watch build --exec \"node build/bundle.js\" ",
"dev:build-server": "webpack --config webpack.server.js --watch",
When I copy the route files (JS files) to the build directory it works, of course, but that means webpack never runs on these files, so I can't use JSX/ES6 features inside them.
So my question is:
Is there any way to make these requires identifiable by webpack/babel, so that they are added to bundle.js and there is no need for separate files (bundle.js plus route files)?
If we can't do that, how can I run webpack on a folder separately from the bundle.js output and create a routes folder in the correct path, but after it has been processed by Babel?
Thanks!
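For completeness, one mechanism webpack itself provides for this kind of dynamic loading is require.context, which marks a whole folder for bundling at build time. This is a sketch under the assumption that a new webpack-processed entry module is acceptable (since app.js itself can't be changed, it would have to call this module instead of its fs.readdirSync loop); mountRoutes is a name introduced here for illustration:

```javascript
// Hypothetical routes/index.js, processed by webpack.
// require.context is resolved statically by webpack, so every matching
// file ends up inside bundle.js instead of being looked up on disk.
const routesContext = require.context('./src/routes', false, /\.(js|ts)x?$/);

module.exports = function mountRoutes(app) {
  routesContext.keys().forEach((key) => routesContext(key)(app));
};
```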
Instead of using webpack, you can try Babel's programmatic API and transpile the files before requiring them.
Here is the link: https://babeljs.io/docs/en/babel-core
I developed a full-stack app using Angular as the frontend and NestJS as the backend. The project is organized as a monorepo with Nx. The project works fine on my local machine, including authentication (the Passport library with NestJS).
So I decided to dockerize the app, but when it runs in the Docker container the protected routes of the NestJS backend are not accessible and I get a 401 error, despite using the JWT token that I got from the same instance running in the container.
I migrated the project from bcrypt to bcryptjs because I was getting an error when building the Docker image. What confuses me most is that on my local machine everything works fine, yet in the container the protected routes are not accessible.
Dockerfile
FROM node:14
ENV PORT=3333
WORKDIR /app
COPY ["package.json", "package-lock.json*", "npm-shrinkwrap.json*","nx.json", "./"]
RUN npm install
COPY ./apps .
EXPOSE 3333
CMD npm start
docker-compose.yml
version: '3.4'
services:
myapp:
image: myapp
build:
context: .
dockerfile: ./Dockerfile
volumes:
- .:/app
depends_on:
- postgres
environment:
API_PORT: 3333
JWT_SECRET: verystrongsecret
JWT_EXPIRES_IN: 3600
DB_TYPE: postgres
DB_PORT: 5432
DB_HOST: postgres
DB_USERNAME: user
DB_PASSWORD: password
DB_NAME: db
NODE_ENV: development
TYPEORM_SYNC: 'true'
ports:
- 3333:3333
postgres:
image: postgres:10.4
ports:
- 5432:5432
environment:
POSTGRES_USER: user
POSTGRES_PASSWORD: password
POSTGRES_DB: db
How can I solve the authentication issue? What could cause this strange behavior in Docker?
JwtStrategy:
export class JwtStrategy extends PassportStrategy(Strategy) {
constructor(
@InjectRepository(UserRepository) private userRepository: UserRepository
) {
super({
jwtFromRequest: ExtractJwt.fromAuthHeaderAsBearerToken(),
secretOrKey: process.env.JWT_SECRET || config.get('jwt.secret'),
});
}
async validate(payload: JwtPayload): Promise<User> {
const { email } = payload;
const user = await this.userRepository.findOne({ email });
if (!user) {
throw new UnauthorizedException();
}
return user;
}
}
To protect the routes I am using @UseGuards(AuthGuard()), where AuthGuard comes from the Passport library.
The AuthModule:
@Module({
imports: [
PassportModule.register({ defaultStrategy: 'jwt' }),
JwtModule.register({
secret: process.env.JWT_SECRET || jwtConfig.secret,
signOptions: {
expiresIn: process.env.JWT_EXPIRES_IN || jwtConfig.expiresIn,
},
}),
TypeOrmModule.forFeature([UserRepository]),
],
controllers: [AuthController],
providers: [AuthService, JwtStrategy],
exports: [AuthService, JwtStrategy, PassportModule],
})
export class AuthModule {}
Well, I have 3 types of environments (development, test, production) and I am using Node.js with Express. My problem is that my development and production scripts don't run because they can't access the .env variables. I have searched online but can't find anything helpful.
I created a .env file and added both the development and production database URLs, but whichever environment I run, it doesn't work. I also tried using the export command (export key=value), but it works for a while and then fails again.
//my config
require('dotenv').config();
module.exports ={
development :{
use_env_variable: process.env.DEVELOPMENT_URL,
dialect: 'postgres'
},
production :{
use_env_variable:process.env.PRODUCTION_URL,
dialect: 'postgres',
}
}
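One detail worth double-checking in the config above (based on my reading of the sequelize-cli docs): use_env_variable expects the name of the environment variable, which sequelize-cli then looks up itself via process.env[name], not the resolved value:

```javascript
// Config fragment, sketch only: pass the variable names, not their values.
require('dotenv').config();
module.exports = {
  development: { use_env_variable: 'DEVELOPMENT_URL', dialect: 'postgres' },
  production: { use_env_variable: 'PRODUCTION_URL', dialect: 'postgres' },
};
```

Passing the value instead would make sequelize-cli look up process.env['postgres://...'], which is undefined, and that matches the "Parameter \"url\" must be a string" error below.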
//my package.json scripts
{
"name": "report_deck",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"start": "export NODE_ENV=production && sequelize db:migrate && node ./build/index.js",
"dev": "nodemon --exec babel-node ./api/index.js",
"test": "export NODE_ENV=test && sequelize db:migrate:undo:all && sequelize db:migrate && nyc --require @babel/register mocha ./api/test/test.js --timeout 20000 --exit",
"build": "rm -rf ./build && babel -d ./build ./api -s",
"generate-lcov": "nyc report --reporter=text-lcov > lcov.info",
"coveralls-coverage": "coveralls < lcov.info",
"codeclimate-coverage": "codeclimate-test-reporter < lcov.info",
"coverage": "nyc npm test && npm run generate-lcov && npm run coveralls-coverage && npm run codeclimate-coverage"
},
}
//.env
DEVELOPMENT_URL=postgres://example1:pass@example:5432/dbname
PRODUCTION_URL=postgres://example2:pass@example:5432/dbname
//my index.js
import express from 'express';
import bodyParser from 'body-parser';
import classRoutes from './server/routes/classRouter';
// all routes
import cors from 'cors';
const app = express();
app.use(bodyParser.json());
app.use(cors());
//use all routes
app.use(bodyParser.urlencoded({ extended: false }));
const port = process.env.PORT || 8003;
app.get('*', (req, res) => res.status(200).send({
message: "Entrance"
}));
app.listen(port, () => {
console.log("Entrance done, We are running at port " + port);
});
export default app;
Expectations:
It should log "Entrance done, we are running at port 8003" for npm run dev.
It should log "Entrance done, we are running at port 5000" for heroku local web.
Actual:
throw new TypeError('Parameter "url" must be a string, not ' + typeof url);
You should add -r dotenv/config to your start script to preload dotenv:
"start": "export NODE_ENV=production && sequelize db:migrate && node -r dotenv/config ./build/index.js",
Check docs
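For reference, preloading runs dotenv before any application code (including the config file) executes; the file location can be overridden with dotenv's documented dotenv_config_path option:

```shell
node -r dotenv/config ./build/index.js
# custom location for the .env file, if it is not in the working directory:
node -r dotenv/config ./build/index.js dotenv_config_path=/custom/path/.env
```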
Here is my project structure:
package.json:
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"dev": "nodemon $NODE_DEBUG_OPTION server/boot.js --exec babel-node",
"start": "nodemon server/boot.js --exec babel-node",
"build": "babel server -d dist/server",
"serve": "node dist/server/boot.js"
},
The main file is server/boot.js:
import dotenv from 'dotenv';
import path from 'path';
dotenv.load({path: path.join(__dirname, '.env')});
import _ from 'underscore';
import configs from './config/index';
The server/config/index.js is only a barrel file that imports the other config files:
import app from './app';
import database from './database';
export default Object.assign({}, app, database);
In each of the config files I am not able to access any properties of the process.env object that are defined in the .env file.
Here is one of the config files for reference:
export default {
app: {
host: process.env.HOST || 'localhost',
port: process.env.PORT || 9000,
}
}
Here process.env.HOST is undefined, but the key is present in the .env file.
What am I doing wrong?
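Separately from the file format, one assumption worth verifying: path.join(__dirname, '.env') in server/boot.js resolves to server/.env, not the project root. If the .env file lives in the root, the load silently finds nothing. A sketch of the adjusted path (it depends on the actual layout):

```javascript
// boot.js lives in server/, so step one directory up to reach a root-level .env
dotenv.load({ path: path.join(__dirname, '..', '.env') });
```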
"any properties of the process.env object that are defined in the .env file"
Can you please be more specific about your .env file?
As per https://www.npmjs.com/package/dotenv#rules the file should be in the format:
VAR1=value1
VAR2=value2
and not in
export default {
VAR1: 'value1',
VAR2: 'value2'
}
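To illustrate the KEY=value format, here is a tiny parser in the spirit of what dotenv does (dotenv's real parser handles more edge cases; parseEnv is a name introduced here):

```javascript
// Parse simple KEY=value lines; quoted values have their quotes stripped,
// and lines that do not match the format (comments, blanks) are skipped.
function parseEnv(text) {
  const out = {};
  for (const line of text.split('\n')) {
    const m = line.match(/^\s*([\w.]+)\s*=\s*(.*?)\s*$/);
    if (m) out[m[1]] = m[2].replace(/^["']|["']$/g, '');
  }
  return out;
}

console.log(parseEnv('VAR1=value1\nVAR2="value2"'));
// { VAR1: 'value1', VAR2: 'value2' }
```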