NestJS Monorepo Microservices - Cannot find module '@work/contracts/dist/adresses' or its corresponding type declarations - node.js

I'm trying to build a NestJS monorepo service using a microservices architecture, with TypeScript as the primary language.
The monorepo project consists of two microservice apps - 'work' and 'workers'.
The 'work' service app is meant to power the 'workers' app that lives in the same NestJS monorepo project.
The problem is that when I start the 'work' app and then try to run the 'workers' app, the 'workers' app doesn't recognize the existence of the 'work' app or its path, and thus fails to run.
I get the error below.
src/application/work/load-address.ts:4:41 - error TS2307: Cannot find module '@work/contracts/dist/adresses' or its corresponding type declarations.
I'm using 'npm run start' to run the 'workers' app.
Thanks in advance!
The 'work' package.json file looks like this:
{
  "name": "@work/create",
  "version": "0.0.1",
  ...
  ...
}
The 'workers' package.json file looks like this:
{
  "name": "@workers/contracts",
  "version": "0.0.1",
  ...
  ...
}
And I import 'work' into 'workers' like this:
import { ADDRESS } from '@work/contracts/dist/adresses';

Related

Newly made NPM package is found, but deeper import references are all unfound

This is my first go at publishing an NPM package and I feel like a newb. My basic imports aren't working, both within the module itself and when trying to reference specific files within the package from outside. The whole npm publish -> npm install part works as expected.
The file structure is a ./lib directory, with a ./lib/data-types directory. The main files with the objects getting exported live in the lib, and some other "helper" files live in the data-types.
- index.js, etc
- /lib
-- connection.js
-- session.js
-- /data-types
--- point.js, etc
I have an index.js file that's just a passthrough for some other objects:
import Connection from "./lib/connection.js"
import Session from "./lib/session.js"
export default {
  Connection,
  Session,
}
And I've defined the main export and the data-types in package.json:
{
  "name": "ef-vue-crust",
  "type": "module",
  "main": "index.js",
  "exports": {
    ".": "./index.js",
    "./data-types/": "./lib/data-type/*.js"
  },
  ...
}
The basic import from my application seems to work, i.e. import {Connection} from 'ef-vue-crust', except for the aforementioned inner disconnect. index.js is unable to find the following files:
import Connection from "./lib/connection.js"
import Session from "./lib/session.js"
Module not found: Error: Can't resolve './lib/session.js' in 'C:\Projects\my-app\node_modules\ef-vue-crust'
Directly importing a file from the ./lib/data-type/ directory has the same issue in my application:
import Point from '@ef-vue-crust/data-types/point.js';
Does anyone see the disconnect?
Part 1: changed export default {} to export {} in index.js.
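For reference, the named-export version of index.js would look roughly like this (a sketch based on the two imports shown above):
// index.js - re-export Connection and Session as named exports
import Connection from "./lib/connection.js"
import Session from "./lib/session.js"

export { Connection, Session }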
Part 2: looks like I was missing an * in the exports:
{
  "name": "ef-vue-crust",
  "type": "module",
  "main": "index.js",
  "exports": {
    ".": "./index.js",
    "./data-types/*": "./lib/data-type/*.js"
  },
  ...
}
And finally: I had some strings flat out not matching in the imports, which became obvious once the above was fixed.
So I suppose the answer is "attention to detail"
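For reference, once a wildcard exports entry like that is in place, a deep import from a consuming app is resolved through the map, and because the target pattern already appends .js, the specifier is written without the extension - a sketch, assuming the right-hand side points at the directory that actually exists (data-types in the structure above):
// "./data-types/*" with * = "point" maps to "./lib/data-types/point.js"
import Point from 'ef-vue-crust/data-types/point'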

Unit testing AWS Lambda that uses Layers - Node JS app

I have a SAM application with a bunch of Lambda functions and layers, using Mocha/Chai to run unit tests on the individual functions.
The issue is that I am also using Layers for shared local modules.
The SAM project structure is like this:
functions/
  function-one/
    app.js
    package.json
  function-two/
    app.js
    package.json
layers/
  layer-one/
    moduleA.js
    moduleB.js
    package.json
  layer-two/
    moduleC.js
    package.json
According to AWS, once the function and layers are deployed, you require a local layer module from a function using this path...
const moduleA = require('/opt/nodejs/moduleA');
However, when running locally as a unit test, that path won't resolve to anything.
Any idea on how to resolve the paths to the layer modules when running unit tests?
I could set an ENV var, and then set a base path for the layers based on that, but I was wondering if there was a more elegant solution I was missing...
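Something like this rough sketch is what I have in mind (LAYER_PATH is a made-up env var, not something SAM provides):
// resolve layer modules from LAYER_PATH locally, or from /opt/nodejs when deployed
const path = require('path');
const layerBase = process.env.LAYER_PATH || '/opt/nodejs';
const moduleA = require(path.join(layerBase, 'moduleA'));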
Is there any way to alias the paths when running Mocha?
Another option is to use SAM invoke, but that has massive overheads and is more like integration testing...
I swapped over to using Jest, which does support module mappings.
In the package.json...
...
"scripts": {
  "test": "jest"
},
"jest": {
  "moduleNameMapper": {
    "^/opt/nodejs/(.*)$": "<rootDir>/layers/common/$1"
  }
}
...
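With that mapping in place, a test can require the deployed layer path directly - a minimal sketch, assuming the mapped directory actually contains moduleA.js:
// functions/function-one/app.test.js
// Jest rewrites '/opt/nodejs/moduleA' to the local layer directory via moduleNameMapper
const moduleA = require('/opt/nodejs/moduleA');

test('layer module resolves through the mapping', () => {
  expect(moduleA).toBeDefined();
});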

What is the proper way to store code/functions that are used by both the frontend and backend?

My frontend Reactjs app is stored in one repository.
My backend Node.js app is stored in another repository.
There are some functions used by both. Where should I store those functions so that both repositories can access them?
You can create a library that exports all of the functions you'll be needing, then publish it to NPM and add it to the dependencies of both projects' package.json. With NPM you can set your packages as private, too, in case you don't want your code/package to be publicly available.
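For example, a scoped package can be kept restricted when publishing - a sketch with a made-up scope; publishing restricted packages requires an npm account or organization with private packages enabled:
{
  "name": "@myorg/shared-functions",
  "version": "1.0.0",
  "publishConfig": {
    "access": "restricted"
  }
}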
The starting point would be to create a directory with all the functions you need, export them all in an index.js, and run npm init to create a package.json for your new project. You'll be guided through naming and assigning a version number, then you can publish with npm publish (you may need to create an account and run npm login first). Then in your frontend and backend projects you simply npm install <your-package> like any other npm package.
Your project directory may be as simple as...
myFunctions.js
index.js
package.json
myFunctions.js:
export const functionA = () => {
  return "a"
}
export const functionB = () => {
  return "b"
}
index.js:
export * from './myFunctions.js'
package.json (can be created with npm init):
{
  "name": "my-functions",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}
Then in the directory run npm publish, and in your other projects you can run npm install my-functions.
And finally, in your other projects:
import { functionA } from 'my-functions';
// ...
functionA() // returns "a"
Creating a separate NPM package for your helper functions can certainly be a good solution, but I find them somewhat annoying to maintain across different repositories. I tend to try and avoid them.
There are certainly some functions in your application that do have a purpose on both the front- and backend, but I would encourage you to look at these carefully to see if that logic can be the responsibility of one or the other (backend or frontend).
For example, if you have a function to parse a date and format it in a very specific way for your app, you can have that function live solely in the backend and pass the already converted value back to the frontend - avoiding the burden of maintaining it in 2 places, or in a separate package that then needs to be updated in 2 repositories.
Sometimes there's just no getting around it though, but I found that in most cases I can split them accordingly.

Yoga server deployment to now.sh shows a directory listing instead of the application

I can run the app locally without any issue with the yarn start command. Here I have provided screenshots which represent my problem. I googled and noticed several people face the same problem, but their context is different.
By default, Now publishes your files as a static directory. You can add a builder to your now.json file to tell Now how to build and deploy your site.
In a case where app.js contains a web server application, your now.json might look like this:
{
  "version": 2,
  "name": "my-project",
  "builds": [
    { "src": "app.js", "use": "@now/node" }
  ]
}
This tells Now to use the @now/node builder to generate a lambda that runs app.js to respond to requests.
If your app is purely js+html to be run on the client machine, you wouldn't need the lambda, but you can still build the source before deploying it as static files with @now/static-build.
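In that case now.json points the builder at package.json instead - a rough sketch, assuming your build runs via a now-build script and outputs to a dist directory, which is the convention @now/static-build expects:
{
  "version": 2,
  "name": "my-project",
  "builds": [
    { "src": "package.json", "use": "@now/static-build", "config": { "distDir": "dist" } }
  ]
}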
Check out the Now docs for more info: https://zeit.co/docs/v2/deployments/basics/#introducing-a-build-step

Sharing code between Firebase Functions and React

I'm using Firebase functions with a React application. I have some non-trivial code that I don't want to duplicate, so I want to share it between the deployed functions and my React client. I've got this working locally in my React client (though I haven't tried deploying) - but I can't deploy my functions.
The first thing I tried was npm link. This worked locally, but the functions won't deploy (which makes sense, since this leaves no dependency in your package.json). Then I tried npm install ../shared/ - this looked promising because it did leave a dependency in package.json with a file: prefix - but Firebase still won't deploy with this (error below).
My project directory structure looks like this:
/ProjectDir
  firebase.json
  package.json (for the react app)
  /src
    * (react source files)
  /functions
    package.json (for firebase functions)
    index.js
  /shared
    package.json (for the shared module)
    index.js
My shared module package.json (extraneous details omitted):
{
  "name": "myshared",
  "scripts": {
  },
  "dependencies": {
  },
  "devDependencies": {
  },
  "engines": {
    "node": "8"
  },
  "private": true,
  "version": "0.0.1"
}
My firebase functions package.json (extraneous details omitted):
{
  "name": "functions",
  "scripts": {
  },
  "dependencies": {
    "myshared": "file:../shared"
  },
  "devDependencies": {
  },
  "engines": {
    "node": "8"
  },
  "private": true
}
When I try to deploy with:
firebase deploy --only functions
It's telling me it can't load the module:
Function failed on loading user code. Error message: Code in file index.js can't be loaded.
Did you list all required modules in the package.json dependencies?
And I don't think the issue is how I export/import my code - but just in case:
The export:
exports.myFunc = () => { some code };
The import (functions/index.js)
const myFunc = require('myshared');
And in my react code:
import { myFunc } from 'myshared';
So far the searching I've done hasn't yielded anything that works. Someone did mention entering the shared module path in firebase.json, but I couldn't find any details (including in the firebase docs) that show what that would look like. Thanks for any tips to get this going.
I found a solution. I'm not sure if it's the only or even the best solution, but it seems to work for this scenario, and it is easy. As Doug noted above, Firebase doesn't want to upload anything outside the functions directory. The solution was to simply make my shared module a subdirectory under functions (i.e. ./functions/shared/index.js). I can then import it into my functions like a normal js file. However, my shared folder also has a package.json, for use as a dependency in the react app. I install it using:
npm install ./functions/shared
This creates a dependency in my react app, which seems to resolve correctly. I've created a production build without errors. I haven't deployed the react app yet, but I don't think this would be an issue.
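To illustrate the result (a sketch of what the restructuring above is assumed to produce, not the exact code from my project):
functions/index.js:
// the shared module now lives inside the functions directory
const { myFunc } = require('./shared');
And in the React app's package.json, npm install ./functions/shared adds a file: dependency:
"dependencies": {
  "myshared": "file:functions/shared"
}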
Another solution is to create a symlink. In terminal, under /ProjectDir, execute:
ln -s shared functions/shared
cd functions
npm i ./shared
