TypeORM configuration - node.js

I'm writing a TypeScript Node.js server.
I use TypeORM, which needs a config with two arrays of paths (or functions) pointing to entities and migrations.
Right now it looks like:
{
  subscribers: ['build/subscriber/*.js'],
  migrations: ['build/migration/*.js'],
}
When I start my app, it's transpiled with tsc, which creates a build folder of .js files. In this case everything works fine.
But TypeORM has a CLI tool, and I want to use it to create migrations. I don't want to transpile the whole project just to create a migration, so I'd rather run the CLI command with ts-node against the .ts files. But without transpilation, "build/subscriber/*.js" doesn't exist.
Can I use the TypeORM CLI without transpiling the whole project?
P.S. If I change the config paths to
{
  subscribers: ['src/subscriber/*.ts'],
  migrations: ['src/migration/*.ts'],
}
the project stops running.
Maybe there is a way to detect in code whether the files have been transpiled, so I could implement something like optional paths:
{
  subscribers: isTranspiled ? ['build/subscriber/*.js'] : ['src/subscriber/*.ts'],
  migrations: isTranspiled ? ['build/migration/*.js'] : ['src/migration/*.ts'],
}

We have the same problem and do the following:
// Hack for webpack
const migrations_path = __dirname.trim() === '/usr/src/app/dist' ? __dirname.trim() : path.join(__dirname, '..');
{
  ...
  migrations: [migrations_path + '/typeorm/migrations/*.{js,ts}'],
}
You'll need to adjust the paths for your own project, but this is how we support both the dist folder for the production webpack build and the src folder during hot reloading.

Related

How to import a node module inside an angular web worker?

I'm trying to import a node module inside an Angular 8 web worker, but I get a compile error 'Cannot find module'. Does anyone know how to solve this?
I created a new worker inside my Electron project with ng generate web-worker app, as described in the Angular documentation.
Everything works fine until I add an import such as path or fs-extra, e.g.:
/// <reference lib="webworker" />
import * as path from 'path';

addEventListener('message', ({ data }) => {
  console.log(path.resolve('/'));
  const response = `worker response to ${data}`;
  postMessage(response);
});
This import works fine in any other TS component, but inside the web worker I get a compile error with this message:
Error: app/app.worker.ts:3:23 - error TS2307: Cannot find module 'path'.
How can I fix this? Maybe I need some additional parameter in the generated tsconfig.worker.json?
To reproduce the error, run:
$ git clone https://github.com/hoefling/stackoverflow-57774039
$ cd stackoverflow-57774039
$ yarn build
Or check out the project's build log on Travis.
Note:
1) I found only one similar problem, but its answer handles only custom modules.
2) I tested the same import with a minimal Electron seed that uses web workers, and it worked; however, that example uses plain JavaScript without Angular.
1. TypeScript error
As you've noticed, the first error is a TypeScript error. Looking at tsconfig.worker.json, I found that it sets types to an empty array:
{
  "compilerOptions": {
    "types": [],
    // ...
  }
  // ...
}
Specifying types turns off the automatic inclusion of @types packages, which is a problem in this case because path has its type definitions in @types/node.
So let's fix that by explicitly adding node to the types array:
{
  "compilerOptions": {
    "types": [
      "node"
    ],
    // ...
  }
  // ...
}
This fixes the TypeScript error; however, trying to build again, we're greeted with a very similar error, this time from Webpack directly.
2. Webpack error
ERROR in ./src/app/app.worker.ts (./node_modules/worker-plugin/dist/loader.js!./src/app/app.worker.ts)
Module build failed (from ./node_modules/worker-plugin/dist/loader.js):
ModuleNotFoundError: Module not found: Error: Can't resolve 'path' in './src/app'
To figure this one out we need to dig quite a lot deeper...
Why it works everywhere else
First, it's important to understand why importing path works in all the other modules. Webpack has the concept of targets (web, node, etc.) and uses the target to decide which default options and plugins to apply.
Ordinarily, the target of an Angular application using @angular-devkit/build-angular:browser would be web. In your case, however, the postinstall:electron script actually patches node_modules to change that:
postinstall.js (parts omitted for brevity)
const f_angular = 'node_modules/@angular-devkit/build-angular/src/angular-cli-files/models/webpack-configs/browser.js';

fs.readFile(f_angular, 'utf8', function (err, data) {
  var result = data.replace(/target: "electron-renderer",/g, '');
  result = result.replace(/target: "web",/g, '');
  result = result.replace(/return \{/g, 'return {target: "electron-renderer",');
  fs.writeFile(f_angular, result, 'utf8');
});
The target electron-renderer is treated by Webpack similarly to node. Especially interesting for us: it adds the NodeTargetPlugin by default.
What does that plugin do, you wonder? It adds all known built-in Node.js modules as externals. When building the application, Webpack will not attempt to bundle externals; instead, they are resolved using require at runtime. This is what makes importing path work, even though it's not installed as a module known to Webpack.
Why it doesn't work for the worker
The worker is compiled separately using the WorkerPlugin. In their documentation they state:
By default, WorkerPlugin doesn't run any of your configured Webpack plugins when bundling worker code - this avoids running things like html-webpack-plugin twice. For cases where it's necessary to apply a plugin to Worker code, use the plugins option.
Looking at the usage of WorkerPlugin deep within @angular-devkit we see the following:
@angular-devkit/src/angular-cli-files/models/webpack-configs/worker.js (simplified)
new WorkerPlugin({
  globalObject: false,
  plugins: [
    getTypescriptWorkerPlugin(wco, workerTsConfigPath)
  ],
})
As we can see, it uses the plugins option, but only for a single plugin, which is responsible for the TypeScript compilation. This way the default plugins configured by Webpack, including NodeTargetPlugin, are lost and not applied to the worker.
Solution
To fix this we have to modify the Webpack config, and to do that we'll use @angular-builders/custom-webpack. Go ahead and install that package.
Next, open angular.json and update projects > angular-electron > architect > build:
"build": {
  "builder": "@angular-builders/custom-webpack:browser",
  "options": {
    "customWebpackConfig": {
      "path": "./extra-webpack.config.js"
    }
    // existing options
  }
}
Repeat the same for serve.
Now, create extra-webpack.config.js in the same directory as angular.json:
const WorkerPlugin = require('worker-plugin');
const NodeTargetPlugin = require('webpack/lib/node/NodeTargetPlugin');

module.exports = (config, options) => {
  const workerPlugin = config.plugins.find(p => p instanceof WorkerPlugin);
  if (workerPlugin) {
    workerPlugin.options.plugins.push(new NodeTargetPlugin());
  }
  return config;
};
The file exports a function which will be called by @angular-builders/custom-webpack with the existing Webpack config object. We can then search all plugins for an instance of WorkerPlugin and patch its options, adding the NodeTargetPlugin.

How to do Node.js path aliases?

I am using Webpack to build my Node.js backend project.
In my Webpack config I have aliases, like this:
alias: {
  '@': resolvePath('src'),
  '@app': resolvePath('src/app'),
},
When I build the app for production, those aliases work, since Webpack converts them to their real paths in the created bundle.
However, I run nodemon in order to develop the API, and nodemon knows nothing about the webpack bundle or the dev server.
Therefore this (obviously) throws an error:
let myService = require('@services/my-service');
How can I make those aliases work using the Webpack config, OR support some kind of aliases under the nodemon server before a bundle is created, i.e. in dev mode?

How to make build for react app for different stages?

I have a single-page application built with React, using webpack. I'm having trouble configuring the server API URL for each stage, like test, beta, and prod.
Is there a standard way of doing this?
Create a .env file and add your variables there, ensuring they are prefixed with REACT_APP, e.g. REACT_APP_SERVER_URL=https://example.com
You can create multiple env files, one each for dev, prod, test, etc., like .env.local and .env.prod.
The env files are injected by your npm commands:
npm start: .env.development.local, .env.development, .env.local, .env
npm run build: .env.production.local, .env.production, .env.local, .env
Use the variable in your code like:
if (process.env.NODE_ENV !== 'production') {
  analytics.disable();
}
OR
<b>{process.env.NODE_ENV}</b>
Refer https://github.com/facebook/create-react-app/blob/master/packages/react-scripts/template/README.md#adding-development-environment-variables-in-env
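For instance (a minimal sketch, not from the original answer; the fallback URL is illustrative), a module could read the injected variable with a local default:

```javascript
// REACT_APP_SERVER_URL is inlined by Create React App at build time from
// the matching .env file; fall back to a local URL when it isn't set.
const serverUrl = process.env.REACT_APP_SERVER_URL || 'http://localhost:3000';
console.log(serverUrl);
```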
Do that based on NODE_ENV.
Declare a URL file in your application for its base path:
const baseURL = (__DEV__) ? url1 : url2;
or a switch statement, it doesn't matter.
To be able to access these variables, you have to use DefinePlugin from webpack:
new webpack.DefinePlugin({
  envType: JSON.stringify(process.env.NODE_ENV)
})
or something like that...

Jest not picking up configs in multi project mode

We recently migrated two different repos into a monorepo. Each uses jest with its own custom configurations, defined in their own package.json files.
I'd like to use the --projects flag to run Jest across both projects from the root of the monorepo. I've added a jest.config.js file to the root of the monorepo:
module.exports = {
  projects: ['<rootDir>/projectA', '<rootDir>/projectB']
};
The runner successfully picks up the tests for both projects, but it doesn't appear to be using each project's custom configuration. For example, in "projectA", I'm using babel-plugin-module-resolver. When I run jest in that project alone, babel-jest successfully picks up that plugin and it works fine, but when I run it from the root in multi-project mode, I get "Cannot find module..." errors that indicate the plugin isn't being used.
Similarly, in "projectB" I'm using a custom setupTestFrameworkScriptFile. Running jest in this project runs that file just fine, but it's ignored when running from the root.
My understanding of the multi-project mode was that each individual project should keep its own settings/configs intact. Did I miss something? Do I need to configure these in the root as well?
I think there are some bugs in Jest's multi-project runner; we need to provide failing examples so the Jest team can fix them. There are almost no docs about it.
I made this work by providing a custom babel transformer instead of using babel-jest directly.
Check this link: https://twitter.com/sseraphini/status/1061779382669316098
Use this for your transformer inside packages:
const config = require('../shared/babel.config.js');
const { createTransformer } = require('babel-jest');

module.exports = createTransformer({
  ...config,
});
and use this for your root transformer:
const { join, resolve } = require('path');
const { createTransformer } = require('babel-jest');

const packagePath = resolve('../');
const packageGlob = join(packagePath, '*');

module.exports = createTransformer({
  babelrcRoots: packageGlob,
});
Use it like this in jest.config.js:
transform: {
  '^.+\\.(js|ts|tsx)?$': '<rootDir>/test/babel-transformer',
},

How to set rootDir as process.cwd() + '/..'

My application is set up as a monorepo. This is the structure:
\mono-repo
|--core
|--app1
|--app2
I run jest in app2, but I also want to run all tests from core. So here is my config:
{
  rootDir: process.cwd() + '/..',
  roots: [ 'core', 'app2' ],
}
But it didn't work.
Could you give me any ideas for this case?
Using process.cwd() works for me. It looks like you're using a JSON file rather than a JS file, so the expression isn't being evaluated. You can also add --showConfig to the jest command for further debugging.
