How to import a simple bit-src component into a project's /src folder?

I'm trying bit-src (or should I call it bit.dev?) for the first time...
I have two files, a .ts with a single dependency and a .md with its docs, that I use in many projects and want to turn into a bit component.
So I take these two files in the /src folder of one of my projects, add them as a component, tag it, and export it to my collection. So far so good.
Then I go to another project that needs this component and try to import it (where the files already existed, I deleted them first):
$ bit import <user>.<colx>/<component> --path src
I get an error message stating that the directory is not empty (which is true).
If I do install it into its own directory, it also creates a bunch of overhead I don't want, but I guess that's not the point: the component should live in my /src folder along with the rest of my code, just as it did in the project I created it from.
Is there some way to do what I want? Or should I be using a different tool entirely?

If you want to change the default location of imported components, edit the bit configuration in package.json, from this:
"bit": {
  "env": {},
  "componentsDefaultDirectory": "components/{name}",
  "packageManager": "npm"
}
to this:
"bit": {
  "env": {},
  "componentsDefaultDirectory": "src/{name}",
  "packageManager": "npm"
}
If you want to use the --path flag, you need to specify the folder, like this:
bit import <user>.<colx>/<component> --path src/<component>
I hope it will help you 😃

It seems that using the --path <foldPath> flag places the component files directly at <foldPath>.
This includes the /dist folder, node_modules, package.json, etc.
So it makes sense to do this in a new, empty folder.
Maybe use import like this?
bit import ... --path ./src/componentName

Related

How to copy files/folders from the src folder during a bin execution with Node.js?

I would like to copy the content of my 'three-experience' folder into the current "bin execution environment".
I'm using TypeScript, but there is no issue with it at the moment (and I'm mapping to the compiled .js files in dist).
I have the following root:
My package.json maps into the dist folder like so:
"bin": {
  "ts-cli": "dist/main.js",
  "vite-config": "dist/bin/vite-config.js"
},
During the execution I'm calling the "vite-config" bin (using shelljs):
shell.exec('vite-config');
It may not be clear, but the purpose is to make an npm package that installs a template on your local machine.
I can do it with git, but that's not what I want to do.
I thought about setting a process.env.FOLDER global variable, but I'm quite sure that's a bad idea.
Any clue would be appreciated !

Pointing the main field in package.json conditionally

I have a Monorepo under Lerna and Yarn Workspaces. The repo has packages which are published to npm and consumed outside the monorepo as well as within the monorepo. While developing in the monorepo we would like the main field of package.json for all such packages to point to the src directory, while when a package is used outside the monorepo, we would like the consumer to use the transpiled code in the dist folder.
I want this to be consistent across all uses of the packages. My current solution is to have the main field point to the dist folder. Then, for each of the tools within the monorepo, namely jest, tsc, webpack, and parcel, I've had to come up with a different tool-specific solution for aliasing the src directory instead of the dist directory. But I don't like the fact that I've had to do this work for each of these tools. It just doesn't seem scalable.
Has anybody come up with a lower level solution, where a module resolves to a different folder based on the environment?
Thank you.
If your internal code base is always transpiling the sources, why not just import { thing } from "my-package/src/main.js"?
Then you can just leave the main field as dist for the consumers who ideally shouldn't have to keep track of additional paths when importing your packages.
There are a lot of details left out of your question, but I'll assume you're using a single webpack (or similar) instance to compile all your packages.
Another approach: since you've already coupled all your packages via the same compilation step, why not just use relative paths between the packages? That way you never have to act as an external consumer with slightly different needs.
And finally the third approach, which I think sounds a bit convoluted but should do exactly what you're asking for. Create a script that uses globby or some other npm package to grab all package.json files in your repository (excluding node_modules!). require() / iterate through these package.json manifest files and set the main field to an input value (say "dist"). Then, create two bin js files (hint: bin field) called set-main-dist and set-main-src, and possibly a third called unset-main.
Next, no matter what scripts you run in your package.json files at the root (or using lerna run), make sure the script looks either like this:
"prebuild": "set-main-src"
or like this
"build": "set-main-src && build etc"
Hope one of these options work out for you. Remember that it's rarely worth going against the stream of usual patterns in tooling and what not. Make this one worth it.
I had exactly the same dilemma, but with Yarn 3.
The solution of always importing from source didn't work in my case, as the package itself might get published to npm too.
So after digging around I luckily found the package.json property publishConfig.main https://yarnpkg.com/configuration/manifest#publishConfig
With it I can change the main field from source to dist on npm publish.
Basically, we use a modified package.json only for publishing.
Implemented in my package.json it looks like this:
{
  "main": "./src/index.ts",
  "publishConfig": {
    "main": "./dist/index.js"
  }
}
When I run yarn npm publish or yarn pack, the main field is temporarily replaced in the resulting archive, while all my tooling (jest and ts) can still rely on the main field pointing to the source.

Setting the "root" of a published npm project

I'm publishing an npm package named foo to the npm registry.
I wrote the package using a compile-to-js language.
For sanity, I put the compiled output into the dist/ folder of the project directory.
My package.json lists the entrypoint as dist/entry.js:
{
  "name": "foo",
  "main": "dist/entry.js"
}
Sometimes, I want to use files within the package that are not part of the entry point. For example, there is a very useful export called whatever inside of dist/util.js:
import { whatever } from "foo/dist/util";
This works, but forcing the users of my package to type dist/ in all import statements is inconvenient.
Furthermore, re-exporting every possible util function is not DRY. I do not want to re-export from the entrypoint.
Ideally, I would like to import files from dist/ using the following syntax:
import { whatever } from "foo/util"
How do I configure my package.json to search for files in the dist/ folder of my project?
This cannot be done.
This is the reason why some packages have an entry-point file that re-exports all public exports (not everything that resides in dist is intended for the end user), e.g. @angular/core.
And it is the reason why some packages publish a flat file structure to the npm registry that favours short import paths, e.g. rxjs.

multi-repository graphql types with local npm link

Given 2 repositories user-repo and project-repo, I define UserType in the user-repo, and in package.json of project-repo I do this:
"dependencies": {
  "user-repo": "git+ssh://git@github.com/me/user-repo"
}
and everything works (UserType loads in project-repo).
However if I link the repo locally like so:
"dependencies": {
  "user-repo": "file:../../user-repo"
}
UserType instanceof GraphQLObjectType returns false. And it's only the graphql types that are acting up. Everything else like models is getting loaded just fine.
I've tried npm linking (npm link user-repo), doing it both the require and import ways, and it didn't help.
Any ideas?
OK, here I go. I'm not entirely sure, but I'm betting on the fact that instanceof is tricky here, and I'm imagining the following scenario:
You build your schema inside user-repo and you run the check in project-repo. Both repos have their own graphql module dependency installed, meaning that the GraphQLObjectType constructor in user-repo is different from the one in project-repo; that's why instanceof returns false. You can confirm the type is actually fine by checking UserType.constructor.name === 'GraphQLObjectType', which should be true.
A solution would be to import GraphQLObjectType from the same module. See if that works.
I edited this answer: previously I provided a solution with webpack aliasing, but that doesn't apply on the backend.
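The duplicate-module effect described above can be reproduced without graphql at all: two distinct class objects with the same name stand in for the two installed copies.

```javascript
// Simulate graphql being installed twice in node_modules: each
// "install" yields a distinct class object, so instanceof fails across
// copies while the constructor-name check from the answer still passes.
function loadGraphqlCopy() {
  // stands in for require("graphql") resolving to a different install
  return class GraphQLObjectType {};
}
const GraphQLObjectTypeA = loadGraphqlCopy(); // project-repo's copy
const GraphQLObjectTypeB = loadGraphqlCopy(); // user-repo's copy

const userType = new GraphQLObjectTypeB();
// instanceof compares against a different class object, so it is false:
const viaInstanceof = userType instanceof GraphQLObjectTypeA;
// comparing constructor names sidesteps the duplicate-module problem:
const viaName = userType.constructor.name === "GraphQLObjectType";
```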
This problem was also discussed in graphql-js issue tracker:
https://github.com/graphql/graphql-js/issues/1091#issuecomment-349874772
Answer by FB developer @leebyron:
Unfortunately npm link is pretty broken, though I'm surprised using a file path as a dependency didn't work for you.
This issue occurs when there are multiple instances of graphql-js in your node_modules directory. npm-link works by creating symlink to another directory, if that other directory also has a node_modules folder with graphql-js within it, then you'll have multiple copies.
When I develop in this way, I usually set up a script which cp's my sub-project's built source directly into the main project's node_modules directory, avoiding any filesystem symlinking hijinks.
We ended up doing this: common code is no longer an npm package; we use a symlink to map the common-code directory into each application directory and import/require it the same way as any application module.
Hint for those using Docker: you will not be able to put common code into the container this way, since Docker does not follow symlinks that point outside the build context. To work around this, copy the common code into a temporary directory in the app, build the container, and remove the temporary directory afterwards.

require.context for node modules does not work with linked package

I created kind of a plug-in web site. Each plug-in follows a certain name standard and a contract. Inside my main.js I load all packages "dynamically" following the name standard:
const context = require.context("../../node_modules", true, /plugin\-\w+\/dist\/(index\.js|styles\.css)$/)
This creates a context over the whole node_modules folder and matches all modules named "plugin-X". Within those modules it looks for "dist/index.js" and "dist/styles.css". Those later get imported:
context.keys().forEach(path => {/* do stuff */ })
This works super nicely as long as the packages are installed using npm install path/to/tgz. However, this does not make for a pleasant development experience. So in my plugin-X folder I use "npm link", and in my web site I use "npm link plugin-X".
When I start webpack now, the whole thing explodes after creating 15K file handles. If I remove the node_modules folder inside "plugin-X", it works. However, I need node_modules for building with babel and the other tooling that watches the src folder, rebuilds, and puts new scripts into dist.
Is there any way I could do this? Or maybe another option I missed during my research on how this could be done?
Regards
