I have a Node.js project that checks its own code for consistency, according to rules specified in .eslintrc, using gulp and gulp-eslint.
Now, I would like to have it throw custom deprecation warnings when it encounters a certain require:
const someModule = require('myDeprecatedModule');
// Warning: myDeprecatedModule is deprecated. Use myNewModule instead.
Is this possible in a simple way that will be picked up by IDEs too?
Using .eslintrc only
No custom plugin to be published and installed via npm
Local code only that can be pushed to the repository, nothing global
No custom code in node_modules
The rule no-restricted-modules does exactly this: it disallows requiring certain modules.
The names of the deprecated modules must be coded in the configuration. So in order to disallow the deprecated myDeprecatedModule you would add this setting to your .eslintrc file under the "rules" section.
"no-restricted-modules": ["error", "myDeprecatedModule"]
In older ESLint versions it wasn't possible to customize the error message without a custom plugin, though recent versions accept an object form of the rule that includes a custom message.
Related
I am following the directions in the documentation (https://www.gatsbyjs.org/docs/eslint/) and would like to override one of the rules without affecting the others. What I did was create an .eslintrc.js file.
This is the content of the file
module.exports = {
  globals: {
    __PATH_PREFIX__: true,
  },
  extends: `react-app`,
  rules: {
    'jsx-a11y/no-static-element-interactions': [
      'error',
      {
        handlers: [
          'onClick',
          'onMouseDown',
          'onMouseUp',
          'onKeyPress',
          'onKeyDown',
          'onKeyUp',
        ],
      },
    ],
  },
}
but the rest of the rules are now ignored, as if it were not an extension at all.
While the answer above is correct, it is a bit incomplete. The thing is that ESLint can be integrated both into builds and into editors.
When you start using a custom .eslintrc.js, you lose the build integration and the terminal output based on those rules. That's because Gatsby's built-in eslint-loader is disabled when you use a custom file. The documentation page actually says so, but it is a bit unclear.
To get that back, you need to integrate ESLint into the webpack build. The easiest way is to use the plugin mentioned on the docs page: gatsby-plugin-eslint.
I filed an issue to make custom integrations easier.
From the Gatsby docs you linked to:
When you include a custom .eslintrc file, Gatsby gives you full control over the ESLint configuration. This means that it will override the built-in eslint-loader and you need to enable any and all rules yourself. One way to do this is to use the Community plugin gatsby-eslint-plugin. This also means that the default ESLint config Gatsby ships with will be entirely overwritten. If you would still like to take advantage of those rules, you’ll need to copy them to your local file.
So it looks like as soon as you create a .eslintrc.js file, you need to build your rules up from scratch again. It overwrites; it doesn't extend.
We have a library that is traditionally client-side only. It uses HTTP requests (via one of several dependency libraries) to make REST calls. When using the library, the user initializes it with a particular request provider and off they go.
We use webpack in our examples to utilize our library.
It has now been extended to use node-fetch, so using it from Node.js is supported too.
For people using webpack, webpack now attempts to bundle node-fetch, and the require call fails in the browser. We can get around this by setting an external:
"externals": {
  "node-fetch": "{}"
}
Is there a way to define our library so that if the consumer is using webpack target: web, it'd skip the require check for node-fetch? And similarly, if the consumer is using webpack target: nodejs - it needs to include the node-fetch component.
The project in question is https://github.com/OfficeDev/PnP-JS-Core
Thanks for reporting this. According to this commit and the conversation linked to it, the automatic module resolution field (also known as a described-resolve to the webpack resolver instance) changes based on what your target is.
By default, when the target is node in your webpack build, resolution via package.json defaults to the main field; otherwise, the browser field takes priority.
More reference https://github.com/webpack/webpack/issues/151
The links provided in the accepted answer and comment show how to do this, so +1 to those, but just to surface it directly here:
Is there a way to define our library so that if the consumer is using webpack target: web, it'd skip the require check for node-fetch
Yes. In the library's package.json, add a browser field with the following:
"browser": {
  "node-fetch": false
}
This provides a hint to webpack and other bundlers that the node-fetch module should be ignored - i.e. not even included in the bundle - when the target is web. When the target is node, it will be included.
Note that the above relies on the code in the client bundle never actually using node-fetch. In that sense it can be considered unsafe, because there is no compile-time guarantee of this; if node-fetch does get called, it will simply error and probably crash your client. If you're absolutely sure it can never be used client-side, though, this is the simplest way to get it done.
For more safety - i.e. if you want the client code only to warn when node-fetch is used - you also have the option of providing a shim module for the client bundle to include instead, and, for instance, logging a warning in the shim implementation if it gets used. You do this in the same way, just by providing a path to the shim module instead of false:
"browser": {
  "node-fetch": "./shims/node-fetch.js"
}
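The shim itself can stay tiny. A sketch (the file name and behavior here are illustrative, not part of the library):

```javascript
// shims/node-fetch.js - a hypothetical browser stand-in for node-fetch.
// Instead of crashing at require time, it warns and returns a rejected
// promise if anything actually calls it in the client bundle.
function fetchShim(url, options) {
  console.warn("node-fetch was called in a browser bundle: " + url);
  return Promise.reject(new Error("node-fetch is unavailable in the browser"));
}

module.exports = fetchShim;
```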
I'm working on a project that wraps ESLint output, and would like to access the content of the detailed markdown documentation for each warning. These files live in the ESLint repository at docs/rules.
However, it looks like the docs directory might not get included in the packaged module, and so those docs might not be accessible from the standard product once installed.
The npm package.json docs also make it seem like docs may not typically be available:
directories.doc
Put markdown files in here. Eventually, these will be displayed
nicely, maybe, someday.
I'm new to working with node packages, so may be missing something obvious. Thanks for any ideas!
OP here. In the end, I worked with a friend on an ESLint fork that adds this functionality: https://github.com/codeclimate/eslint/commit/fb68870708b1b3bbad80a955eec085a8dfd5f13b
If anyone would like to use it, you can use our fork of ESLint and access the docs for each rule through:
var docs = require("eslint").docs;
where docs is an object with { rule_name: "content string....", ... }
complexity_readup = docs.get("complexity")
1) In node on the backend to link one javascript file to another we use the require statement and module.exports.
This allows us to create modules of code and link them together.
How do I do the same thing in Meteor?
2) On the front end in Meteor, if I want to access code from another front-end JavaScript file, I have to use globals. Is there a better way to do this, so I can require one JavaScript file in another? I think something like Browserify does this, but I am not sure how to integrate it with Meteor.
Basically if on the client I have one file
browserifyTest.coffee
test = () ->
  alert 'Hello'
I want to be able to access this test function in another file
test.coffee
Template.profileEdit.rendered = ->
  $ ->
    setPaddingIfMenuOpen()
    test()
How can I do this in Meteor without using globals?
Meteor wraps the code of each file we create in a module: (function(){ ...your code... })(). If you want to export something out of your JS file (module), make it a global, i.e. don't use var with the variable you want to export, and it'll be accessible in all files that are included after this module. Keep in mind the order in which Meteor includes JS files: http://docs.meteor.com/#structuringyourapp
I don't think you can do this without using globals. Meteor wraps the code in each JS file in an SEF (self-executing function) expression, and the exports API is available for packages only. What problem do you have with globals, exactly? I've worked on fairly large Meteor projects, and while using a global object to namespace my global helpers, I never had any issues with this approach of accessing functions/data from one file in other files.
You can use a local package, which is just like a normal Meteor package but used only in your app.
If the package proves to be useful in other apps, you may even publish it on atmosphere.
I suggest you read the WIP section "Writing Packages" of the Meteor docs, but expect breaking changes in the coming weeks, as Meteor 0.9 will include the final Package API, which is going to be slightly different.
http://docs.meteor.com/#writingpackages
Basically, you need to create a package directory (my-package) and put it under /packages.
Then you need a package description file which needs to be named package.js at the root of your package.
/packages/my-package/package.js
Package.describe({
  summary: "Provides test"
});

Package.on_use(function (api) {
  api.use(["underscore", "jquery"], "client");
  api.add_files("client/lib/test.js", "client");
  // api.export is what you've been looking for all along !
  api.export("Test", "client");
});
Usually I try to mimic the Meteor application structure in my packages, which is why I'd put test.js under my-package/client/lib/test.js: it's a utility function residing on the client.
/packages/my-package/client/lib/test.js
Test = {
  test: function () {
    alert("Hello !");
  }
};
Another package convention is to declare a package-global object containing everything public and then exporting this single object so the app can access it.
The variables you export NEED to be package-global, so don't forget to remove the var keyword when declaring them: package scope is just like regular Meteor app scope.
Last but not least, don't forget to meteor add your package:
meteor add my-package
And you will be able to use Test.test in the client without polluting the global namespace.
EDIT due to second question posted in the comments.
Suppose now you want to use NPM modules in your package.
I'll use momentjs as an example because it's simple yet interesting enough.
First you need to call Npm.depends in package.js; we'll depend on version 2.7.0 of momentjs:
/packages/my-moment-package/package.js
Package.describe({
  summary: "Yet another moment packaged for Meteor"
});

Npm.depends({
  "moment": "2.7.0"
});

Package.on_use(function (api) {
  api.add_files("server/lib/moment.js", "server");
  api.export("moment", "server");
});
Then you can use Npm.require in your server-side code just like this:
/packages/my-moment-package/server/lib/moment.js
moment = Npm.require("moment");
A real moment package would also export moment on the client by loading the client-side version of momentjs.
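A sketch of what that extra client wiring in package.js might look like (the vendored file name is hypothetical; since Npm.require is server-only, the client needs a plain copy of the library):

```javascript
// package.js additions - also ship a vendored client build of momentjs
Package.on_use(function (api) {
  api.add_files("client/lib/moment.min.js", "client");
  api.export("moment", "client");
});
```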
You can use the atmosphere npm package (http://atmospherejs.com/package/npm), which lets you use NPM packages directly in your server code without needing to wrap them in a Meteor package first.
Of course, if a specific NPM package has already been converted to Meteor and is well supported on Atmosphere, you should use that instead.
I'm developing a web app using Require.js for AMD and amplify.request to abstract away my AJAX calls. The other advantage to amplify.request is that I've defined an alternative module containing mocked versions of my requests that I can use for testing purposes. Currently, I'm switching between the two versions of my request module by simply commenting/un-commenting the module reference in my main.js file.
What I'd love to do is use Grunt to create different builds of my app depending on which module I wanted included. I could also use it to do things like turn my debug mode on or off. I'm picturing something similar to usemin, only for references inside JavaScript, not HTML.
Anyone know of a plugin that does this, or have a suggestion about how I could do it with Grunt?
On our current project we have a few different environments. For each of them, we can specify different configuration settings for the requirejs build.
To distinguish between these different environments, I've used a parameter target.
You can simply pass this to grunt by appending it to your call, like:
grunt --target=debug
And you can access this parameter in the Gruntfile by using grunt.option, like:
var target = (grunt.option('target') || 'debug').toLowerCase();
The line above will default to debug. You could then make use of the paths configuration setting of requirejs to point the build at the correct module. Example code below.
requirejs: {
  compile: {
    options: {
      paths: {
        "your/path/to/amplify/request": target === "debug" ? "path/to/mock" : "path/to/real"
      }
    }
  }
}
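The target-to-path decision can also be factored into a small helper so it is testable on its own; a sketch using the same placeholder paths:

```javascript
// Decide which amplify.request module a build should use.
// Mirrors the ternary in the requirejs paths config; defaults to the mock.
function amplifyRequestPath(target) {
  var t = (target || "debug").toLowerCase();
  return t === "debug" ? "path/to/mock" : "path/to/real";
}
```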