Gatsby extend ESLint rules overwrites original ESLint - eslint

I am following the directions in the documentation (https://www.gatsbyjs.org/docs/eslint/) and would like to override one of the rules without affecting the others. What I did is create an .eslintrc.js file.
This is the content of the file:
module.exports = {
  globals: {
    __PATH_PREFIX__: true,
  },
  extends: `react-app`,
  rules: {
    'jsx-a11y/no-static-element-interactions': [
      'error',
      {
        handlers: [
          'onClick',
          'onMouseDown',
          'onMouseUp',
          'onKeyPress',
          'onKeyDown',
          'onKeyUp',
        ],
      },
    ],
  },
}
but now the rest of the rules are ignored, as if it were not an extension at all.

While the other answer is correct, it is a bit incomplete. The thing is that ESLint can be integrated both in builds and in editors.
When you start using a custom .eslintrc.js, you lose the build-time integration and the terminal output based on those rules. That's because the built-in eslint-loader is disabled when you use a custom file. The documentation page actually says so, but it is a bit unclear.
To get that back, you will need to integrate it into the webpack build. The easiest way is to use the plugin mentioned on the docs page: gatsby-plugin-eslint.
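For reference, wiring the plugin in usually amounts to an entry in gatsby-config.js along these lines (a minimal sketch; the option names follow the plugin's README and can differ between plugin versions, so verify them against the version you install):
// gatsby-config.js — minimal sketch, option names assumed from the plugin's README
module.exports = {
  plugins: [
    {
      resolve: "gatsby-plugin-eslint",
      options: {
        test: /\.js$|\.jsx$/,
        exclude: /(node_modules|\.cache|public)/,
        stages: ["develop"],
        options: {
          emitWarning: true,
          failOnError: false,
        },
      },
    },
  ],
}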
I filed an issue to make custom integrations easier.

From the Gatsby docs you linked to:
When you include a custom .eslintrc file, Gatsby gives you full control over the ESLint configuration. This means that it will override the built-in eslint-loader and you need to enable any and all rules yourself. One way to do this is to use the community plugin gatsby-plugin-eslint. This also means that the default ESLint config Gatsby ships with will be entirely overwritten. If you would still like to take advantage of those rules, you'll need to copy them to your local file.
So it looks like as soon as you create a .eslintrc.js file, you need to build your rules up from the bottom again. It overwrites, it doesn't extend.
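For example, a local .eslintrc.js that rebuilds things might look roughly like the following. The extends list here is an assumption (it is not Gatsby's exact built-in rule set, and plugin:jsx-a11y/recommended requires eslint-plugin-jsx-a11y to be installed); the point is that everything you still want has to be declared locally:
// .eslintrc.js — rough sketch; the extended configs are assumptions, not Gatsby's exact defaults
module.exports = {
  globals: {
    __PATH_PREFIX__: true,
  },
  extends: ["react-app", "plugin:jsx-a11y/recommended"],
  rules: {
    // everything from the extended configs stays active; only this rule is overridden
    "jsx-a11y/no-static-element-interactions": [
      "error",
      {
        handlers: ["onClick", "onMouseDown", "onMouseUp", "onKeyPress", "onKeyDown", "onKeyUp"],
      },
    ],
  },
}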

How to setup StencilJS components on S3 and CloudFront

I have a few components and I want to deploy them into S3 and make them reachable with CloudFront.
My problem is that I don't know which file(s) I need to upload to S3 and which file CloudFront needs to point to as the entry point.
Here's my stencil.config.tsx:
import { Config } from '@stencil/core';

export const config: Config = {
  namespace: 'stencil-test',
  taskQueue: 'async',
  outputTargets: [
    {
      type: 'dist',
      esmLoaderPath: '../loader',
      dir: './build/dist'
    },
    {
      type: 'www',
      serviceWorker: null // disable service workers
    }
  ]
};
I tried executing npm run build, which generated a couple of folders: build/loader and build/dist. There's a lot of stuff within each folder, but I have no idea which folder and files are supposed to do what.
I was hoping the build command would generate a minified file containing everything needed (is this how it works?), so that I could eventually do something like the following wherever I want to use my components:
<script type="module" src='https://cdn.jsdelivr.net/npm/my-name@0.0.1/dist/myname.js'></script>
Can anyone offer some guidance or point me towards any resources?
The www output target is meant for generating apps and isn't really relevant for component libraries. To host your components, you should upload the whole generated dist folder. Only the files a client actually needs are downloaded, which depends on the browser and on which components the page uses (lazy loading), so you don't need to worry about the number of files. See https://stenciljs.com/docs/distribution.
To start, Stencil was designed to lazy-load itself only when the component was actually used on a page. There are many benefits to this approach, such as simply adding a script tag to any page and the entire library is available for use, yet only the components actually used are downloaded.
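In practice, consuming the uploaded dist folder from CloudFront is typically just a script tag. The distribution domain and the <namespace>/<namespace>.esm.js layout below are assumptions based on the namespace ('stencil-test') in the config above and on uploading the whole build/dist folder to the bucket root:
<!-- sketch only: replace the CloudFront domain with your own distribution -->
<script type="module" src="https://dxxxxxxxxxxxx.cloudfront.net/stencil-test/stencil-test.esm.js"></script>
<script nomodule src="https://dxxxxxxxxxxxx.cloudfront.net/stencil-test/stencil-test.js"></script>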
If you want to generate a single bundle containing all your components, there's an output target called dist-custom-elements-bundle. For the differences to dist you can have a look at the same docs link above.
One of the main differences is that loading the script doesn't automatically register the components for you; you'll have to either do it manually per component (using customElements.define()) or define them all using the defineCustomElements() export. The official documentation for that output target is https://stenciljs.com/docs/custom-elements.
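A rough usage sketch based on that description (the import path and component names are hypothetical and depend on where the dist-custom-elements-bundle output is placed):
// sketch only — import path and component names are hypothetical
import { defineCustomElements } from './build/custom-elements';

// register every component exported by the bundle
defineCustomElements();

// or register a single component manually:
// import { MyComponent } from './build/custom-elements';
// customElements.define('my-component', MyComponent);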

Gatsby Failed Build - error "window" is not available during server side rendering

I have been trying to build my gatsby (react) site recently using an external package.
The link to this package is "https://github.com/transitive-bullshit/react-particle-animation".
As I only have the option to change the component's props, I cannot read or edit the package file where it all comes together in the end, as it is not included in the public folder produced by gatsby build.
What I have tried:
Editing the package file locally, which worked only on my machine. But when I push to Netlify, which only receives the public folder and the corresponding package.json files (not the node_modules folder), I cannot make Netlify use the file I changed myself, because it requests the package directly from the GitHub page.
As a solution, found in a comment to this question, we can use patch-package to save our fixes to the node module and then use them wherever we want.
This actually worked for me!
To explain how I fixed it (most of it is already written in the patch-package docs), here are the main points:
I first made changes to the local package files that were giving the error (for me they were in my node_modules folder).
Then I used the patch-package documentation to guide myself through the rest.
It worked after I pushed my changes to GitHub, and now patch-package always gives me my edited version of the module.
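For anyone following the same route, the workflow roughly looks like this (the commands come from the patch-package README; the package name is the one from the question):
# 1. Edit the offending files directly under node_modules/react-particle-animation
# 2. Generate a patch file (it ends up in a patches/ folder you commit to the repo):
npx patch-package react-particle-animation
# 3. Re-apply the patch on every install by adding to package.json "scripts":
#    "postinstall": "patch-package"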
When dealing with third-party modules that use window in Gatsby, you need to add a null loader to Gatsby's webpack configuration so the module is not bundled and evaluated during SSR (Server-Side Rendering). This is because the code from gatsby develop runs in the browser (where there is a window), while gatsby build runs on a Node server where there is no window or other browser globals (because they are not even defined there).
exports.onCreateWebpackConfig = ({ stage, loaders, actions }) => {
  if (stage === "build-html") {
    actions.setWebpackConfig({
      module: {
        rules: [
          {
            test: /react-particle-animation/,
            use: loaders.null(),
          },
        ],
      },
    })
  }
}
Keep in mind that the test value is a regular expression that will match a folder under node_modules, so ensure that /react-particle-animation/ is the right name.
Using patch-package may work, but keep in mind that you are adding an extra package and another bundled file that could potentially affect the performance of the site. The proposed snippet is a built-in solution that runs when you build your application.

Can I have a project-specific custom code deprecation message using eslint?

I have a node.js project that checks itself for code consistency according to rules specified in .eslintrc using gulp and gulp-eslint.
Now, I would like to have it throw custom deprecation warnings when it encounters a certain require:
const someModule = require('myDeprecatedModule');
// Warning: myDeprecatedModule is deprecated. Use myNewModule instead.
Is this possible in a simple way that will be picked up by IDEs too?
Using .eslintrc
No custom plugin to be published and installed using npm
Local code only that can be pushed to the repository, nothing global
No custom code in node_modules
The rule no-restricted-modules does exactly this: it disallows requiring certain modules.
The names of the deprecated modules must be coded in the configuration. So in order to disallow the deprecated myDeprecatedModule you would add this setting to your .eslintrc file under the "rules" section.
"no-restricted-modules": ["error", "myDeprecatedModule"]
I don't think it's possible to customize the error message though. That would be possible with a custom plugin.
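(For what it's worth, more recent ESLint versions do accept an object form with a per-path message for this rule; the exact shape below is worth verifying against the rule's documentation for your ESLint version.)
// .eslintrc.js — sketch for newer ESLint versions; verify the option shape for your version
module.exports = {
  rules: {
    "no-restricted-modules": ["error", {
      paths: [{
        name: "myDeprecatedModule",
        message: "myDeprecatedModule is deprecated. Use myNewModule instead.",
      }],
    }],
  },
};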

How to access node module documentation - eslint

I'm working on a project that wraps ESLint output, and would like to access the content of the detailed markdown documentation for each warning. These files live in the ESLint repository at docs/rules.
However, it looks like the docs directory might not get included in the packaged module, and so those docs might not be accessible from the standard product once installed.
The npm package.json docs also make it seem like docs may not typically be available:
directories.doc
Put markdown files in here. Eventually, these will be displayed
nicely, maybe, someday.
I'm new to working with node packages, so may be missing something obvious. Thanks for any ideas!
OP here. In the end, I worked with a friend on an ESLint fork that adds this functionality: https://github.com/codeclimate/eslint/commit/fb68870708b1b3bbad80a955eec085a8dfd5f13b
If anyone would like to use it, you can use our fork of ESLint and access the docs for each rule through:
var docs = require("eslint").docs;
where docs is an object with { rule_name: "content string....", ... }
complexity_readup = docs.get("complexity")

Grunt task to optionally include an AMD module for different environment

I'm developing a web app using Require.js for AMD and amplify.request to abstract away my AJAX calls. The other advantage to amplify.request is that I've defined an alternative module containing mocked versions of my requests that I can use for testing purposes. Currently, I'm switching between the two versions of my request module by simply commenting/un-commenting the module reference in my main.js file.
What I'd love to do is use Grunt to create different builds of my app depending on which module I wanted included. I could also use it to do things like turn my debug mode on or off. I'm picturing something similar to usemin, only for references inside JavaScript, not HTML.
Anyone know of a plugin that does this, or have a suggestion about how I could do it with Grunt?
On our current project we have a few different environments. For each of them, we can specify different configuration settings for the requirejs build.
To distinguish between these different environments, I've used a parameter target.
You can simply pass this to grunt by appending it to your call like
grunt --target=debug
And you can access this parameter in the Gruntfile, by using grunt.option, like
var target = (grunt.option('target') || 'debug').toLowerCase();
The line above will default to debug. You could then make use of the paths configuration setting of requirejs to point the build to the correct module. Example code below.
requirejs: {
  compile: {
    options: {
      paths: {
        "your/path/to/amplify/request": target === "debug" ? "path/to/mock" : "path/to/real",
      }
    }
  }
}
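Putting it together, a minimal Gruntfile sketch might look like the following; the module paths and names are placeholders, and it assumes grunt-contrib-requirejs is installed:
// Gruntfile.js — minimal sketch; paths and module names are placeholders
module.exports = function (grunt) {
  var target = (grunt.option('target') || 'debug').toLowerCase();

  grunt.initConfig({
    requirejs: {
      compile: {
        options: {
          baseUrl: 'src',
          name: 'main',
          out: 'dist/main.js',
          paths: {
            // swap the real amplify.request wrapper for the mocked one in debug builds
            'app/request': target === 'debug' ? 'app/request.mock' : 'app/request.real'
          }
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-requirejs');
  grunt.registerTask('default', ['requirejs']);
};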
