I'm using npm link as described here
http://npmjs.org/doc/link.html
Locally everything works perfectly. When I deploy to Heroku I get the error message
Error: Cannot find module '...'
How can I get this working with Heroku?
I wish there were an elegant solution to this (it would make my life a hell of a lot easier). Your custom package is symlinked into node_modules by npm link, but git doesn't follow symbolic links. So when you git push to Heroku, there's no way for your custom packages to go along for the ride.
Note, however, that from my experiments, Heroku will honor any node_modules you do push in, instead of trying to install them from the network. It just runs npm install --production, essentially. Perhaps a hard link directly to the development source of your package would do the trick, but I'm not sure whether Git would play nicely with that. Use at your own risk!
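If you go that route, one workaround is to swap the symlink for a real copy of the package before committing, so it rides along with the git push. A rough sketch only; the package name my-lib and the relative path are placeholders for whatever your setup looks like:

rm node_modules/my-lib                  # remove the symlink created by npm link
cp -R ../my-lib node_modules/my-lib     # copy the actual package source in its place
git add -f node_modules/my-lib          # force-add in case node_modules is gitignored
git commit -m "vendor my-lib for Heroku"
git push heroku master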
EDIT: If you want to know exactly what Heroku does, it's all open source.
The ideal situation would be to get the packages, if they're open source, onto NPM itself. Which is pretty painless and automatic.
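If that's an option for you, the whole flow is roughly this (my-package is a placeholder; you need an npm account and a package name that isn't taken):

cd my-package
npm login            # authenticate against the public registry (one time)
npm version patch    # bump the version in package.json
npm publish          # upload to npm

and then your app depends on it like any other registry package.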
If you are hosting your private module on GitHub (or BitBucket), you can add the git repo as a dependency in your package.json.
"dependencies": {
// ... all your deps
"my_private_module": "git+ssh://git#github.com:my-username/my-private-module.git"
}
You will, however, need to grant privileges to Heroku to read your repo (assuming it's private -- which is the crux of the issue). Check out this answer for a detailed set of instructions showing how to do so with GitHub. This might help for Bitbucket.
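Another common workaround, if you'd rather not deal with SSH keys on Heroku, is an HTTPS git URL with a GitHub personal access token baked in. Rough sketch only; the token placeholder below is something you generate in GitHub's settings, and be aware it sits in plain text in your package.json:

"dependencies": {
  "my_private_module": "git+https://<your-token>:x-oauth-basic@github.com/my-username/my-private-module.git"
}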
I've found that the build time increases when doing this. Worth it for my needs.
Related
Pardon the title, it's terrible.
Situation:
I pull a github repo that represents a website. I build it locally for development. It's running on localhost.
It uses webpack, and has a series of node modules loaded.
I need to edit one of those modules and see the results on that website.
I then need to push those changes to the repo that builds to the npm module.
Ideally I do the edits on the npm module as a local pull of that repo, and the local website is referencing those changes and webpack watch is rebuilding as I go along, then when all good, I can do a pull request to that npm module.
What do I use to do this? Is there a common workflow or term for this so I can figure out how?
Thank you.
Note: I attempted to use npm link,
I.e. in the repo "custom-npm" I typed "npm link"
Then in the website repo, in the node_modules/custom-npm folder I typed "npm link custom-npm"
error is: "custom-npm" is not in this registry
"custom-npm" is a github private package. Not sure if that does something here unintended...
I have a repo that has the backend and frontend (create-react-app) in two separate folders. For the build command, I have something like cd frontend && npm run build and for the publish directory, I have something like frontend/build, but this is not working.
disclaimer: I work for Netlify.
If you were to clone a new copy of your project (no node_modules installed, for instance) on a fresh laptop with nothing else except node and npm installed, how would you build it? Imagine Netlify's build process like that. So you're missing at least an "npm install" step in there :)
Anything else missing, like globally installed npm packages? Need to specify them in package.json so that Netlify's build network knows to grab them for you. Ruby gems? Better have a Gemfile in your repo!
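For example, if your build relies on a globally installed tool (webpack is just an example here), declare it in the project instead and call it from an npm script, so our build network installs it along with everything else:

npm install --save-dev webpack webpack-cli

and in package.json, something like "build": "webpack --mode production".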
Netlify tries to npm install (and bundle install) automatically for you, assuming there is a package.json either in the root of your repository (I'm guessing yours is in frontend/?) or in the directory you set as the "base" parameter, so that we start our build there. This is probably a good pattern for you: set "base" to frontend, and then set your publish directory to build.
You can specify that base parameter in netlify.toml something like this:
[build]
base = "frontend"
Note that netlify.toml must reside in the root of your repository.
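A slightly fuller sketch of that file, assuming the create-react-app lives in frontend/ and builds into frontend/build:

[build]
  base    = "frontend"
  command = "npm run build"
  publish = "build"

(the publish path here is given relative to base; if the deploy picks up the wrong folder, try "frontend/build" instead).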
For more details on how Netlify builds, check out the following articles:
Overview of how our build network works. This article also shows how you can download our build image to test locally.
Settings that affect our build environment. Useful for telling us about what node version to use, for instance.
Some frequently experienced problems
If after some reading and experimenting, you still can't figure things out, ping the helpdesk.
The top answer is correct ^. For anyone looking to simply change the base directory (let's say there is only one npm install/start), you need to change the BASE DIRECTORY, which you will find in the build settings. Simply go to: site settings -> build & deploy, and you will see it where I pointed in the picture attached. Hopefully that helps someone in need of this. see here
I'm trying to write an npm package that will be published and used as a framework in other projects. The problem is -- I can't figure out a solid workflow for working on it at the same time as working on projects that depend on it.
I know this seems super basic and that npm link solves the issue, but this is a bigger one than just being able to import one local package from another.
I have my framework package scaffolded out; let's call it gumby. It exports a function that does console.log('hello from gumby'). That's all that matters for right now.
Now I'm ready to create a project that will use gumby. Let's call this one client. I set that up too and npm link gumby so client can import from it, etc. OK cool, it's working as expected.
So now it's time to publish gumby. I run npm publish and it goes out to npm as version 0.0.1.
At this point, how do I get the published, npm-hosted version of gumby into the package.json for client? I mean, I could just delete the symlinked copy from my node_modules and then yarn add gumby, but what if I want to go back and work on it locally again? And then run it against the npm version again? And then work on it some more? And then...
You get the point, I imagine. There's no obvious way to switch between the npm copy of a package that you're working on and the local one. There's the additional problem of how to do that without messing with your package.json too much, e.g. what if I accidentally commit it to version control with some weird file:// dependency path. Any suggestions would be much appreciated.
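Concretely, every switch is some variation of this (gumby as above):

npm link gumby              # develop against the local checkout

npm unlink gumby            # drop the symlink when I'm done
npm install gumby@latest    # back to the published copy from the registry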
For local development, having the package symlinked is definitely the way to go, the idea of constantly publishing / re-installing the package sounds like a total pain.
The real issue sounds more like you’re concerned about committing a dev configuration to prod - you could address that problem with something as simple as a pre-commit hook on your VCS e.g. block if it detects any local file references in the package.json.
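A minimal sketch of such a hook, saved as .git/hooks/pre-commit and made executable; it just refuses the commit if the staged package.json adds a local file: or link: dependency:

#!/bin/sh
# block commits that stage a local file:/link: dependency in package.json
if git diff --cached -- package.json | grep -E '"(file|link):' >/dev/null; then
  echo "package.json references a local file:/link: dependency; aborting commit." >&2
  exit 1
fi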
I have a custom npm module that I am working on, and it has a GitHub repo. I'm also working on a project that uses the custom module. When working on the larger project, it is nice to use npm link so I can make changes to the module and see them right away in the main project.
To deploy to staging or production, I use shrinkwrap and shrinkpack so I can do an npm install after every deploy (some of the dependencies need binaries, and dev systems aren't the same as production systems, so they do need to be installed and not just kept in source control). Edit: I'm crossing this out as the answer below technically solves my issue, even though it doesn't solve for this particular point, but that wasn't as important as the rest of it.
Of course, since the module is linked to my main project and not listed in package.json, a deploy and install misses it entirely. I can go ahead and list it in package.json and have it point to the appropriate GitHub repo, but then every time I need to test a change in the main project I would have to commit and push those changes, then update the main project, kill and restart the app...that would get tiresome pretty quickly.
I guess I need something like the opposite of "devDependencies"; something where I can have it not install the module on dev, but do install it from GitHub when doing npm install on staging or production. Other than remembering to manually change package.json every time I need to go back and forth, is there a better way to do this?
you can specify a github repository as your package to install, in your package.json file:
{
  "dependencies": {
    "my-library": "githubusername/my-library"
  }
}
this will work in your production environment.
in your development environment, use "npm link".
from within the "my-library" folder, run npm link directly. that will tell npm on your local box that "my-library" is available as a link.
now, in your project that uses "my-library", run npm link my-library. this will create a symlink to your local development version of "my-library", allowing you to change code in that repository and have it work in your other project that needs it.
once you are ready to push to production, push "my-library" to your github repository, and then you can npm install on your servers, like normal.
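if you want production installs to be reproducible, you can also pin the github dependency to a tag or commit (the tag below is just a placeholder):

{
  "dependencies": {
    "my-library": "githubusername/my-library#v1.2.0"
  }
}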
Checking in node_modules was the community standard, but now we also have the option to use shrinkwrap. The latter makes more sense to me, but there is always the chance that someone did a "force publish" and introduced a bug. Are there any additional drawbacks?
My favorite post/philosophy on this subject goes all the way back (a long time in node.js land) to 2011:
https://web.archive.org/web/20150116024411/http://www.futurealoof.com/posts/nodemodules-in-git.html
To quote directly:
If you have an application, that you deploy, check in all your dependencies in to node_modules. If you use npm to deploy, only define bundleDependencies for those modules. If you have dependencies that need to be compiled you should still check in the code and just run $ npm rebuild on deploy.
Everyone I've told this to tells me I'm an idiot and then a few weeks later tells me I was right and checking node_modules in to git has been a blessing to deployment and development. It's objectively better, but here are some of the questions/complaints I seem to get.
I think this is still the best advice.
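For the "if you use npm to deploy" case, bundleDependencies is just an extra field in package.json naming the deps that get packed into your tarball; a sketch with placeholder packages and versions:

{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.17.0",
    "my-private-lib": "1.0.0"
  },
  "bundleDependencies": ["express", "my-private-lib"]
}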
The force-publish scenario is rare and npm shrinkwrap would probably work for most people. But if you're deploying to a production environment, nothing gives you peace of mind like checking in the entire node_modules directory.
Alternately, if you really, really don't want to check in the node_modules directory but want a better guarantee there hasn't been a forced push, I'd follow the advice in npm help shrinkwrap:
If you want to avoid any risk that a byzantine author replaces a package you're using with code that breaks your application, you could modify the shrinkwrap file to use git URL references rather than version numbers so that npm always fetches all packages from git.
Of course, someone could run a weird git rebase or something and modify a git commit hash... but now we're just getting crazy.
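Concretely, that means hand-editing the entries in npm-shrinkwrap.json so they resolve to a git URL pinned to a specific commit rather than a registry version; a rough sketch with made-up names (the commit SHA is a placeholder):

{
  "dependencies": {
    "some-lib": {
      "version": "1.2.3",
      "from": "git://github.com/someuser/some-lib.git#<commit-sha>",
      "resolved": "git://github.com/someuser/some-lib.git#<commit-sha>"
    }
  }
}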
npm FAQ directly answers this:
Check node_modules into git for things you deploy, such as websites and apps.
Do not check node_modules into git for libraries and modules intended to be reused.
Use npm to manage dependencies in your dev environment, but not in your deployment scripts.
cited from npm FAQ