How to update Jest testing framework in create-react-app?

I already have an application created with create-react-app package.
I found a bug in the version of Jest it ships with, 15.1.1, but I realized that in version 16 the bug is gone.
How do I update Jest?
My problem is that there is no Jest package in the application's package.json.
Jest lives in another folder: node_modules/react-scripts.

Create React App updates its dependencies once they are stable enough. This usually means waiting a week or two after the new release.
We don't recommend updating anything by yourself unless this is absolutely critical. If you choose to eject to update something we recommend making it a single commit so that you can revert it later once Create React App uses that version internally.

The following commands are going to get the job done:
npm run eject
npm install --save-dev jest@16.0.0
But be careful here! The eject command irreversibly removes the abstraction layer of create-react-app, exposing all of the dependencies and configuration to you. Your app will keep working just like before, though; you'll simply have full control over it, including the ability to update dependencies.
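For completeness, a rough sketch of the full sequence (the commit message and exact Jest version are just placeholders for whatever fits your situation):
git commit -am "Before eject"        # keep the eject as a single commit you can revert later
npm run eject
npm install --save-dev jest@16.0.0
./node_modules/.bin/jest --version   # confirm the project now picks up the new version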

Related

React unable to find modules

I run into this problem pretty frequently while developing react applications. The latest is hwid. I am using yarn to manage dependencies.
I added the module using
yarn add hwid
It added it to the package.json file and gave me no errors. When I run the application, it says it is unable to find the module. The module is there in node_modules and everything seems to be correct and in place. So I tried deleting node_modules and running yarn install. I've done this several times. I tried force cleaning the npm cache. I have run yarn remove and yarn add several times.
I am using the WebStorm IDE. It gives me no errors, and in fact, if I let it resolve the import, it finds it just fine. This seems to only happen to me in react projects. I think, but I'm not sure, that it is usually typescript modules that give me problems.
Is there a magic bullet for this? The module is a pretty critical part of my app, so if I can't resolve it using node and react's import system, I'm going to have to just copy the files into my project. I would really rather not do that for obvious reasons.
Any help is appreciated.
If it's about TypeScript modules, have you tried also installing the types for that module?
E.g. yarn add @types/hwid
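If a @types package doesn't exist or doesn't help, two quick checks can show whether the module is really resolvable outside WebStorm (hwid is just the module from the question):
yarn why hwid                                    # shows whether and why yarn installed it
node -e "console.log(require.resolve('hwid'))"   # throws MODULE_NOT_FOUND if Node can't resolve it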

Best practice for nodejs deployment - Directly moving node_modules to server or run npm install command

What is the best practice for deploying a nodejs application?
1) Directly moving the node_modules folders from the development server to the production server, so that our local environment can be recreated in production as well. Any changes made upstream to the node modules will not affect our code.
2) Running the npm install command on the production server with the help of package.json. Here the problem is that any changes in the node modules will affect our code. I have faced some issues with the loopback module (issue link).
Can anyone help me?
Running npm install on the production server cannot be done in certain scenarios (lack of compiling tools, restricted internet access, etc.), and if you have to deploy the same project on multiple machines, it can be a waste of CPU, memory and bandwidth.
You should run npm install --production on a machine with the same libraries and Node version as the production server, compress node_modules and deploy it to the production server. You should also keep the package-lock.json file to pin versions.
This approach also allows you to build/test your code using development packages and then prune node_modules before the actual deploy.
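A minimal sketch of that flow (host and paths are placeholders):
npm install --production                      # on a machine matching the production OS and Node version
tar czf node_modules.tar.gz node_modules
scp node_modules.tar.gz user@prod-host:/srv/app/
ssh user@prod-host 'cd /srv/app && tar xzf node_modules.tar.gz'
For the build-with-dev-packages variant, npm prune --production strips devDependencies out of node_modules before you compress it.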
Moving the node_modules folder is overkill.
Running npm install might break the version dependencies.
The best approach is npm ci. It uses the package-lock.json file and installs the required dependencies without modifying the versions.
npm ci is meant for continuous integration workflows. LINK
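In a deploy script that boils down to a single command, assuming an up-to-date package-lock.json is committed to the repo:
npm ci --production    # removes any existing node_modules and installs exactly what the lockfile says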
I am an ASP.NET Core developer, but I recently started working with Node.js apps, and moving the node_modules folder to production was one of the challenges you mention. Instead of moving the whole folder to production or only running the npm install command on the production server, I tried bundling my Node.js app with Webpack into one or more bundles, and I got rid of the mess of managing the node_modules folder. It only picks up the node_modules packages that are actually used/referenced in my app, bundles them into a single file along with my app code, and I deploy that single file to production without moving the entire node_modules folder.
I found this approach useful in my case, but please tell me if this is not the correct way with regard to the performance of the app, or if there are any cons to this approach.
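For anyone curious, the rough shape of that approach, assuming a webpack.config.js with target: 'node' that emits dist/server.js (all names here are just examples):
npm install --save-dev webpack webpack-cli
npx webpack --mode production    # bundles the app code plus only the node_modules it actually imports
node dist/server.js              # the single file that needs to reach production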
Definitely npm install. But you shouldn't do this by hand when it comes to deploying your app.
Use a tool for this, like PM2.
As for your concern about changes in packages, the short answer is package-lock.json.
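A minimal sketch of the PM2 side (app.js and my-app are placeholders):
npm install -g pm2
pm2 start app.js --name my-app
pm2 save    # save the current process list so it can be restored later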
My guess is that by asking this question you don't really understand the point of the package.json file.
The package.json file is explicitly intended for this purpose (that, and publishing to the npm registry): transferring a node package without having to transfer the sizeable tree of dependencies along with it.
I would go as far as to say that one should never manually move the node_modules directory at all.
Definitely use the npm install command on your production server, this is the proper way of doing it. To avoid any changes to the node_modules directory as compared to your local environment, use the package lock file. That should help with minimising changes to the source code in node_modules.
I mean no bad intent by saying this.

How to work on two npm packages at the same time?

I'm trying to write an npm package that will be published and used as a framework in other projects. The problem is -- I can't figure out a solid workflow for working on it at the same time as working on projects that depend on it.
I know this seems super basic and that npm link solves the issue, but this is a bigger one than just being able to import one local package from another.
I have my framework package scaffolded out; let's call it gumby. It exports a function that does console.log('hello from gumby'). That's all that matters for right now.
Now I'm ready to create a project that will use gumby. Let's call this one client. I set that up too and npm link gumby so client can import from it, etc. OK cool, it's working as expected.
So now it's time to publish gumby. I run npm publish and it goes out to npm as version 0.0.1.
At this point, how do I get the published, npm-hosted version of gumby into the package.json for client? I mean, I could just delete the symlinked copy from my node_modules and then yarn add gumby, but what if I want to go back and work on it locally again? And then run it against the npm version again? And then work on it some more? And then...
You get the point, I imagine. There's no obvious way to switch between the npm copy of a package that you're working on and the local one. There's the additional problem of how to do that without messing with your package.json too much, e.g. what if I accidentally commit it to version control with some weird file:// dependency path. Any suggestions would be much appreciated.
For local development, having the package symlinked is definitely the way to go, the idea of constantly publishing / re-installing the package sounds like a total pain.
The real issue sounds more like you're concerned about committing a dev configuration to prod - you could address that problem with something as simple as a pre-commit hook in your VCS, e.g. block the commit if it detects any local file references in the package.json.
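For reference, switching between the two copies is just a couple of npm commands, and the package.json check can be a one-line hook (gumby is the example package from the question):
npm link                                # in the gumby repo: register the local copy
npm link gumby                          # in the client repo: use the local copy
npm unlink gumby && npm install gumby   # in the client repo: go back to the published copy
grep -q '"file:' package.json && { echo "local file dependency in package.json"; exit 1; }   # pre-commit hook sketch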

What are the main differences of demeteorizer and meteor bundle?

Having gone through and used demeteorizer, I wonder what the main differences are between setting up Meteor versus demeteorizer and running it via Node on my own server?
meteor only
hot swappable code?
problems keeping packages the same between production and dev
same meteor versions running on prod and dev
hardcoded environment settings (i.e. mongo)
demeteorizer
platform independent, as this auto-bundles dependencies and uses pure Node.js
organise and maintain MongoDB how you like (backup scripts etc.)
I have been using demeteorizer (packaging -> upload -> running with forever), but I wonder if there are any performance problems or other issues in the long run.
I have seen packages such as "authentication" running okay locally but very slowly on the test server (it hangs on submit, indicating sync problems?).
thanks in advance.
ref: https://twitter.com/SachaGreif/status/424908644590030848
Demeteorizer builds on top of meteor bundle with one small difference: Demeteorizer builds a package.json for you and deletes the node_modules directories.
Without demeteorizer you would have a bit of trouble deploying your app, particularly if it was on a different platform to the one you built your app on.
If you look at Meteor's own docs, you have to remove fibers and manage your npm modules yourself, manually. With a package.json you can run npm install and have them all installed for you, including ones from packages.
Why is this useful? For services like Modulus it means you can upload an app and have it install all your dependencies for you without you having to think about it, as if it were an ordinary Node.js app.
Everything that applies to meteor bundle will also apply to demeteorizer, as it is still the same Meteor-bundled app, just with the package.json. So you can use forever, hard-coded/environment-based settings, etc. the same way.
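Very roughly, the demeteorizer route looks like the sketch below; the output-directory flag and entry point may differ between versions, so check demeteorizer --help and the generated package.json:
demeteorizer -o ./converted                   # assumption: -o sets the output directory
cd ./converted && npm install --production
MONGO_URL=mongodb://localhost/myapp ROOT_URL=http://example.com PORT=3000 node main.js   # env settings are placeholders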

How to automate testing the user version of an npm package instead of running the development version on continuous integration?

It happens occasionally that the development version of a module works in my development workspace and passes on Travis-CI, but after publishing to npm it turns out the end-user package is broken.
For example, if you use a submodule that should be in dependencies but have it in devDependencies, then CI will pass (but there are plenty of other possible breakages).
How do you automate testing this? Do you use external rigging? Is there a secret module? Do you have a user acceptance test suite?
I use Github with Travis-CI but the standard setup uses the development install.
Once upon a time I discovered that npm would let me publish packages that are uninstallable. So I've added a target to my Gruntfile that does this:
1) Issue npm pack to create a package from my source.
2) Into a directory created just for testing (automatically by my Gruntfile), install the new package using npm install <path to the package created in the previous step>.
I have a target for publishing a new version that will publish only if the steps above are successful.
The steps above would not catch the dependency problem you mentioned in the question but they could easily be extended to catch it. To do this, I'd add one or more tests that cause the package installed in step 2 above to call require with all that it depends on.
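A bare-bones version of those steps, outside any Gruntfile (the package name and paths are placeholders):
npm pack                                   # produces my-package-1.2.3.tgz from the source tree
mkdir -p /tmp/install-test && cd /tmp/install-test
npm init -y
npm install /path/to/my-package-1.2.3.tgz
node -e "require('my-package')"            # fails here if a runtime dependency was left in devDependencies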
I would suggest to set up your own CI server that does essentially one thing, npm install package ; cd node_modules/package ; npm test. This would ensure that your package is installable at least on your server.
I heard that Jenkins is good for this (at least, that's what the Node.js core team seems to be using), but I don't have any first-hand experience yet. We're just planning to set it up in a couple of weeks.
Also, having some external module that depends on you and testing it helps a bit. :)
