When I try to create a JSCS config file:
C:\Blog\BlogWeb>jscs --auto-configure "C:\Blog\BlogWeb\temp.jscs"
I get the following error:
safeContextKeyword option requires string or array value
What parameter am I supposed to pass? What is a safeContextKeyword?
I'm new to npm and JSCS, so please excuse my ignorance.
JSCS was complaining that I didn't have a config file, so I was trying to figure out how to create one.
JSCS looks for a config file in these places, unless you manually specify it with the --config option:
JSCS will first look for a jscsConfig option in package.json, then for a .jscsrc file (which is just JSON with comments) and a .jscs.json file, starting in the current working directory and then moving up through each ancestor directory until it hits the system root.
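For reference, the package.json route looks like this; a minimal sketch (the name and version are just placeholders, and the preset is only an example):
{
  "name": "blogweb",
  "version": "1.0.0",
  "jscsConfig": {
    "preset": "jquery"
  }
}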
I fixed this by:
Create a new file named .jscsrc. Windows Explorer may not let you do this, so you may need to use the command line.
Copy the following into it. It doesn't matter whether this is the preset you actually want to use; the command will overwrite it.
{
"preset": "jquery",
"requireCurlyBraces": null // or false
}
Verify that it works by running the command:
jscs --auto-configure .jscsrc
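That command should walk you through choosing a preset and write the result back into .jscsrc. Once that's done, you can lint against the new config with something like (the path is just a placeholder):
jscs path/to/file.js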
Related
I'm working on a large TypeScript-based library. When I build the application, it creates a lot of .d.ts files, most of which are for internal use only and make no sense to export or ship to the end user. Usually I've used a .npmignore file to keep these out, but I recently learned that certain tools really prefer that information to be included via the "files" field of package.json, so here I am trying to convert.
Now, I have a directory structure that looks somewhat like this:
dist/
--bundle.js
--...
--components/
----componentA.d.ts
----componentB.d.ts
----common/
----...
--hooks/
----...
--util/
----...
The idea is that I want all top-level files and all files directly under dist/components/, but no child directories. In my .npmignore, I'd do it like this:
# blacklist all
**
# include whitelist
!dist/*
!dist/components/*
However, when I do the same under "files" in my package.json, all that crap still comes along. The single wildcard is not respected.
Edit:
"files": [
"dist/*",
"dist/components/*",
...
],
Reproducing what you show of your file system, this works for me:
"files": ["dist/*.js", "dist/components/*.ts"]
Omitting the file extensions indeed included all the subdirectory cruft. I tested with npm 7 and npm 6.
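If the top level of dist/ also carries .d.ts files you want to ship, the same idea extends naturally; a sketch, assuming those extensions match your build output:
"files": [
  "dist/*.js",
  "dist/*.d.ts",
  "dist/components/*.d.ts"
]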
I am trying to create a custom file watcher in WebStorm that will auto fix ESLint errors on save. In Settings > Tools > File Watchers I created a new file watcher with the following settings:
File type: Any
Scope: All places
Program: /home/user/Projects/todo-app/eslint-autofix.sh
Arguments: blank
Output paths to refresh: blank
Other options > Working directory: /home/user/Projects/todo-app
eslint-autofix.sh:
#!/usr/bin/env bash
./node_modules/.bin/eslint --fix
Then I made an ESLint error and pressed Ctrl + S to save. The following error pops up:
/home/user/Projects/todo-app/eslint-autofix.sh
/usr/bin/env: ‘node’: No such file or directory
How can I fix this error?
According to this article, the settings should be as follows:
File type: Any (or JavaScript)
Scope: Project files
Program: $ProjectFileDir$/node_modules/.bin/eslint
Arguments: --fix $FilePath$
Output paths to refresh: $FileDir$
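For reference, with those macros expanded the watcher effectively runs something along these lines (the source file path here is just a hypothetical example):
/home/user/Projects/todo-app/node_modules/.bin/eslint --fix /home/user/Projects/todo-app/src/App.js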
On WebStorm 2020.1.1, there is a checkbox called Run eslint --fix on save.
Also see:
https://www.jetbrains.com/help/webstorm/eslint.html#ws_js_eslint_activate
Just to extend on jstice4all's & gotjosh's solutions:
I was able to get the File Watcher to run ESLint for some projects, but it wasn't working with the plugin extends: '@react-native-community', which failed with:
@react-native-community/eslint-config#overrides[2]:
Environment key "jest/globals" is unknown
It turns out that the @react-native-community plugin needs to be run from the project folder itself in order to load the environment variables, whereas the file watcher runs from the node_modules/eslint path. To get it to work I had to add the following config:
Working Directory: $ProjectFileDir$
I am trying to add a line to an existing file, /etc/fuse.conf. Here is what I tried.
First I created two folders under the modules directory:
sudo mkdir /etc/puppet/modules/test
sudo mkdir /etc/puppet/modules/test/manifests
Then I created a test.pp file and added the following lines:
sudo vim /etc/puppet/modules/test/manifests/test.pp
file { '/etc/fuse.conf':
  ensure => present,
}->
file_line { 'Append a line to /etc/fuse.conf':
  path => '/etc/fuse.conf',
  line => 'Want to add this line as a test',
}
After that I ran this command
puppet apply /etc/puppet/modules/test/manifests/test.pp
Then I opened /etc/fuse.conf and there was no change: the line was not added to the file. I don't understand what I am missing here. How can I do this?
Interesting. I ran the same test you did without an issue, and as long as you have stdlib installed in your environment you should be fine.
https://forge.puppet.com/puppetlabs/stdlib
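If it isn't installed yet, it can usually be pulled in with:
puppet module install puppetlabs-stdlib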
The results of running the same steps you outlined were successful for me:
[root@foreman-staging tmp]# puppet apply /etc/puppet/modules/test/manifests/test.pp
Notice: Compiled catalog for foreman-staging.kapsch.local in environment production in 0.18 seconds
Notice: /Stage[main]/Main/File[/etc/fuse.conf]/ensure: created
Notice: /Stage[main]/Main/File_line[Append a line to /etc/fuse.conf]/ensure: created
Notice: Finished catalog run in 0.24 seconds
What did your puppet run output?
You should use templates (ERB) to handle file configuration. It's easier and cleaner.
Check the Puppet docs for it at:
https://docs.puppetlabs.com/puppet/latest/reference/lang_template.html
There are other options, though, e.g. Augeas, which is an API for file configuration and integrates very well with Puppet. http://augeas.net/index.html
There are a few ways to handle this. If it's an INI file you can use ini_setting. If it's supported by Augeas you can use that. Otherwise, try specifying the after parameter to file_line.
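For example, a rough sketch of the original resource with an after anchor added (the regexp is only an illustration; point it at a line that actually exists in /etc/fuse.conf):
file_line { 'Append a line to /etc/fuse.conf':
  path  => '/etc/fuse.conf',
  line  => 'Want to add this line as a test',
  after => '^#user_allow_other',  # hypothetical anchor line
}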
I have a python file as part of my grunt workflow. I have defined two build tasks:
build:dev
build:release
When I compile 'build:dev', I want to add this line to my python file:
...
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///' + dbpath
...
When I compile 'build:release', I want to add this line to my python file:
...
app.config['SQLALCHEMY_DATABASE_URI'] = os.environ['POSTGRESQL_COLORFUL_URL']
...
edit: fixed typo in code and title
You can use grunt-sed.
It's a really useful 'find and replace' system that plugs into Grunt.
From the docs:
npm install grunt-sed
Add this line to your project's Gruntfile.js:
grunt.loadNpmTasks('grunt-sed');
Then, in your build:dev and build:release tasks, have something like the following:
sed: {
  database_uri: {
    path: 'path_to_your_python.py',
    pattern: '%PATTERN_IN_YOUR_PYTHON_FILE%',
    replacement: '\'sqlite:///\' + dbpath',
  }
}
In the Python file you want changed, you must also have %PATTERN_IN_YOUR_PYTHON_FILE% at the spot to be replaced; a fuller sketch wiring this into both build tasks follows below.
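Putting it together, a rough sketch of a Gruntfile.js with one sed target per build flavour (the Python file name is a hypothetical stand-in, and the remaining build steps are omitted):
module.exports = function (grunt) {
  grunt.initConfig({
    sed: {
      // one replacement per build flavour
      database_uri_dev: {
        path: 'app.py',
        pattern: '%PATTERN_IN_YOUR_PYTHON_FILE%',
        replacement: "'sqlite:///' + dbpath"
      },
      database_uri_release: {
        path: 'app.py',
        pattern: '%PATTERN_IN_YOUR_PYTHON_FILE%',
        replacement: "os.environ['POSTGRESQL_COLORFUL_URL']"
      }
    }
  });

  grunt.loadNpmTasks('grunt-sed');

  // 'grunt build:dev' and 'grunt build:release' pick the matching sed target.
  grunt.registerTask('build', function (target) {
    grunt.task.run(['sed:database_uri_' + target /* , other build steps */]);
  });
};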
I've used a plugin called grunt-string-replace that worked very well for what I needed to do. Also, I added some custom code in my Gruntfile.js to read different environment configurations and customize the build output based on that.
I detailed the full deployment script in this post: http://dev-blog.cloud-spinners.com/2014/02/complete-grunt-deployment-workflow-for.html
I hope you find that useful.
I'm writing a groovy script that I want to be controlled via a properties file stored in the same folder. However, I want to be able to call this script from anywhere. When I run the script it always looks for the properties file based on where it is run from, not where the script is.
How can I access the path of the script file from within the script?
You are correct that new File(".").getCanonicalPath() does not work. That returns the working directory.
To get the script directory
scriptDir = new File(getClass().protectionDomain.codeSource.location.path).parent
To get the script file path
scriptFile = getClass().protectionDomain.codeSource.location.path
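Building on that, a minimal sketch of what the question describes: loading a properties file that sits next to the script (config.properties and the key name are hypothetical):
def scriptDir = new File(getClass().protectionDomain.codeSource.location.path).parent
def props = new Properties()
new File(scriptDir, 'config.properties').withInputStream { props.load(it) }
println props.getProperty('someSetting')  // hypothetical key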
As of Groovy 2.3.0 the @SourceURI annotation can be used to populate a variable with the URI of the script's location. This URI can then be used to get the path to the script:
import groovy.transform.SourceURI
import java.nio.file.Path
import java.nio.file.Paths
@SourceURI
URI sourceUri
Path scriptLocation = Paths.get(sourceUri)
Note that this will only work if the URI is a file: URI (or another URI scheme type with an installed FileSystemProvider), otherwise a FileSystemNotFoundException will be thrown by the Paths.get(URI) call. In particular, certain Groovy runtimes such as groovyshell and nextflow return a data: URI, which will not typically match an installed FileSystemProvider.
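From there the script's own directory falls out directly, for example:
Path scriptDir = scriptLocation.parent  // directory containing the running script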
This makes sense if you are running the Groovy code as a script, otherwise the whole idea gets a little confusing, IMO. The workaround is here: https://issues.apache.org/jira/browse/GROOVY-1642
Basically this involves changing startGroovy.sh to pass in the location of the Groovy script as an environment variable.
As long as this information is not provided directly by Groovy, it's possible to modify the groovy.(sh|bat) starter script to make this property available as system property:
For unix boxes just change $GROOVY_HOME/bin/groovy (the sh script) to do
export JAVA_OPTS="$JAVA_OPTS -Dscript.name=$0"
before calling startGroovy
For Windows:
In startGroovy.bat add the following 2 lines right after the line with the :init label (just before the parameter slurping starts):
#rem get name of script to launch with full path
set GROOVY_SCRIPT_NAME=%~f1
A bit further down in the batch file, after the line that says set JAVA_OPTS=%JAVA_OPTS% -Dgroovy.starter.conf="%STARTER_CONF%", add the line:
set JAVA_OPTS=%JAVA_OPTS% -Dscript.name="%GROOVY_SCRIPT_NAME%"
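With the starter script patched like that, a Groovy script can read its own location back from the system property, e.g.:
// script.name is the system property injected by the modified starter script
def scriptPath = System.getProperty('script.name')
println new File(scriptPath).parent  // the script's directory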
For Gradle users:
I had the same issue when I started working with Gradle. I wanted to compile my Thrift files with a remote Thrift compiler (a custom one built by my company).
Below is how I solved my issue:
task compileThrift {
    doLast {
        def projectLocation = projectDir.getAbsolutePath() // HERE is what you've been looking for.
        ssh.run {
            session(remotes.compilerServer) {
                // Delete existing thrift files.
                cleanGeneratedFiles()
                new File("$projectLocation/thrift/").eachFile { f ->
                    def fileName = f.getName()
                    if (f.absolutePath.endsWith(".thrift")) {
                        put from: f, into: "$compilerLocation/$fileName"
                    }
                }
                execute "mkdir -p $compilerLocation/gen-java"
                def compileResult = execute "bash $compilerLocation/genjar $serviceName", logging: 'stdout', pty: true
                assert compileResult.contains('SUCCESSFUL')
                get from: "$compilerLocation/$serviceName" + '.jar', into: "$projectLocation/libs/"
            }
        }
    }
}
One more solution. It works perfectly even if you run the script from the GroovyConsole:
File getScriptFile() {
    new File(this.class.classLoader.getResourceLoader().loadGroovySource(this.class.name).toURI())
}

println getScriptFile()
Workaround: in our case the script was running in an ANT environment, so we stored a known ancestor directory (whose subpath we knew) in the Java system properties (System.setProperty("dirAncestor", "/foo")) and could then access that ancestor from Groovy via properties.get('dirAncestor').
Maybe this will help for some of the scenarios mentioned here.