Has anyone used eclim? I wanted to try it out, and since I use vim as my primary editor I want to run it as a headless instance. Anyway, I installed it via the unattended (automated) install:
$ java \
-Dvim.files=$HOME/.vim \
-Declipse.home=/opt/eclipse \
-jar eclim_2.4.0.jar install
I had already downloaded Eclipse Luna and I have JDK 7 installed (though I don't know whether it is set in my environment variables), and I ended up with:
2014-08-30 10:37:40,569 INFO [ANT] [eclim:unattended] Finished analyzing your eclipse installation.
2014-08-30 10:37:40,572 ERROR [ANT]
jar:file:/home/jim/Downloads/eclim_2.4.0.jar!/installer.xml:119: java.lang.NullPointerException
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:116)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:390)
at org.apache.tools.ant.Target.performTasks(Target.java:411)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
at org.apache.tools.ant.Project.executeTarget(Project.java:1368)
at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
at org.apache.tools.ant.Project.executeTargets(Project.java:1251)
at org.formic.ant.Main.runBuild(Main.java:232)
at org.formic.ant.Main.startAnt(Main.java:81)
at org.formic.ant.Main.main(Main.java:63)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.simontuffs.onejar.Boot.run(Boot.java:306)
at com.simontuffs.onejar.Boot.main(Boot.java:159)
Caused by: java.lang.NullPointerException
at org.formic.Installer.getString(Installer.java:201)
at org.eclim.installer.step.FeatureProvider.getFeatures(FeatureProvider.java:99)
at org.eclim.installer.ant.UnattendedInstallTask.execute(UnattendedInstallTask.java:73)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
... 16 more
2014-08-30 10:37:40,582 DEBUG [ANT]
BUILD SUCCESSFUL
Total time: 19 seconds
java.lang.NullPointerException
So I have no idea what happened, but I cannot find eclimd anywhere on my system.
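For reference, a quick way to check which Java the installer will actually pick up (nothing here is eclim-specific, and JAVA_HOME may simply be unset on your machine):
$ java -version
$ echo $JAVA_HOME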
/opt is owned by root by default. My guess is that it is in your setup too, and since eclim needs to write to /opt/eclipse during installation, that results in an error. Try changing the ownership of /opt/eclipse recursively (chown -R) or run the installation as root. Note, though, that if you run it as root, $HOME will probably not point to where you want the vim files to go.
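For example, something along these lines (adjust the user and group to your own), then rerun the installer command from the question:
$ sudo chown -R $USER:$USER /opt/eclipse
$ java -Dvim.files=$HOME/.vim -Declipse.home=/opt/eclipse -jar eclim_2.4.0.jar install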
I had the same issue. I followed the instructions to build from the source code and that worked for me.
I checked out the master branch from the Git repository and used ant to build and install eclim. At the time of this writing that resulted in version 2.4.0.11-ge560abe getting installed without errors. Running eclimd and then :PingEclim and :EclimValidate from vim reported that everything is fine.
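Roughly what I ran is below; the repository URL and the build properties are from memory, so treat them as assumptions and check the eclim developer docs for the exact build instructions:
$ git clone https://github.com/ervandew/eclim.git
$ cd eclim
$ ant -Declipse.home=/opt/eclipse -Dvim.files=$HOME/.vim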
Note that eclimd dumped an exception on startup:
java.lang.RuntimeException: Unable to aquire PluginConverter service during generation for: /home/pappmar/dev/eclipse/plugins/org.eclim.installer_2.4.0.11-ge560abe.jar
I don't know if that's a problem or not. It seems to be running all the same.
Related
I'm trying to follow a big data online course and came across a problem while installing Apache Spark.
I've done everything correctly, but when I try to run spark-submit it seems that there is an issue with Java, I guess.
When I run this:
(base) C:\SparkCourse>spark-submit ratings-counter.py
i get this error:
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
at org.apache.spark.internal.config.package$.<init>(package.scala:1095)
at org.apache.spark.internal.config.package$.<clinit>(package.scala)
at org.apache.spark.deploy.SparkSubmitArguments.$anonfun$loadEnvironmentArguments$3(SparkSubmitArguments.scala:157)
at scala.Option.orElse(Option.scala:447)
at org.apache.spark.deploy.SparkSubmitArguments.loadEnvironmentArguments(SparkSubmitArguments.scala:157)
at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:115)
at org.apache.spark.deploy.SparkSubmit$$anon$2$$anon$3.<init>(SparkSubmit.scala:1022)
at org.apache.spark.deploy.SparkSubmit$$anon$2.parseArguments(SparkSubmit.scala:1022)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:85)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module #5b94b04d
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:354)
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188)
at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181)
at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56)
... 13 more
Any ideas?
Cheers!
I reinstalled Windows and started everything from scratch.
I installed JDK 8 and this version of Spark: spark-3.0.3-bin-hadoop2.7.tgz. I set all the paths correctly, and it worked: I can open the pyspark shell and run spark-submit, for example, but there is still a lot of text in the cmd window that I can't get rid of.
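For completeness, these are the kinds of path settings I mean; the install locations below are placeholders, so adjust them to wherever you actually unpacked the JDK, Spark and winutils:
rem values are examples only - point them at your own install directories
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_301"
setx SPARK_HOME "C:\spark-3.0.3-bin-hadoop2.7"
setx HADOOP_HOME "C:\winutils"
setx PATH "%PATH%;%JAVA_HOME%\bin;%SPARK_HOME%\bin;%HADOOP_HOME%\bin"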
I am simply trying to launch the Spark shell on my local Windows 8 machine, and here's the error message that I get:
java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are:
rw-rw-rw-
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:9)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
... 56 more
Somehow the REPL is there, but I can't use the sqlContext.
Has anyone faced this problem before? Any answer will be helpful, thanks.
RESOLVED: I downloaded the correct winutils version and the issue was resolved. Ideally it should be compiled locally, but if you download a compiled version, make sure it is 32- or 64-bit as applicable.
I tried on Windows 7 64-bit with Spark 1.6, downloaded winutils.exe from https://www.barik.net/archive/2015/01/19/172716/, and it worked!
The complete steps are at: http://letstalkspark.blogspot.com/2016/02/getting-started-with-spark-on-window-64.html
First you need to download the winutils.exe that is compatible with your Spark version and operating system. Place it in some folder, under a bin directory, for example:
D:\winutils\bin\winutils.exe
Now, if /tmp/hive is present on your D: drive, run the following command:
D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
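You will usually also want HADOOP_HOME pointing at the folder above bin (the D:\ paths here are just the example from above), and, if I recall correctly, winutils has an ls subcommand you can use to verify the new permissions:
setx HADOOP_HOME D:\winutils
D:\winutils\bin\winutils.exe ls D:\tmp\hive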
For more details, refer to these posts:
Frequent Issues occurred during Spark Development
https://issues.apache.org/jira/browse/SPARK-10528
This might be helpful in this case :
https://issues.apache.org/jira/browse/SPARK-10528
I've downloaded a prebuilt version of Spark on my Mac (OS X Mavericks), but when I try to open an interactive shell by typing bin/pyspark, I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/Main
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.Main
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
I have googled every part of the error and checked out some other Stack Overflow threads, but I can't find anything that addresses this error. Any idea what's going on or how to fix it?
One idea I have is that Scala is a dependency that I need to download separately... but I really don't know.
I had the same issue before, and it turned out to be a permission problem: I was logged in as a user who had no access to the Spark files (root had downloaded Spark).
Another possibility is that you downloaded the source code and did not build the project from it :P
Hope it helps.
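A sketch of both checks, assuming Spark was unpacked to ~/spark (adjust the path to your actual download; the mvn step only applies if you grabbed a source tarball rather than a prebuilt one):
$ sudo chown -R "$USER" ~/spark
$ cd ~/spark && ./build/mvn -DskipTests clean package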
After installing the CRF++ toolkit, I am trying to run the program test.java under the CRF++-0.54/java folder. For this, I type:
java -cp /home/amira/CRF++-0.54/java/org/chasen/crfpp test
But, I have the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: test
Caused by: java.lang.ClassNotFoundException: test
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
Could not find the main class: test. Program will exit.
In the README file, there is the command java -classpath CRFPP.jar test -d ../dic. But the problem is that I can't find the classpath of CRFPP.jar. Moreover, I don't understand what ../dic in the command refers to.
Make changes in the Makefile of the java directory as per your machine settings.
Give the correct Java path and the compiler you are using.
Run make java in the swig directory.
Run make all in the java directory.
Before running make test in the java directory, ensure that you have the model file in the proper location, otherwise it won't be able to open the model file.
Run make test in the java directory (a sketch of the resulting command line follows these steps).
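Once the build succeeds, running the example by hand looks roughly like this; it assumes CRFPP.jar, the compiled test class, and the native CRFPP library all ended up in the java directory (which is what the Makefile should produce), and the -Djava.library.path flag is only needed if the JNI library is not already on the default library path:
$ cd CRF++-0.54/java
$ java -cp CRFPP.jar:. -Djava.library.path=. test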
I am a Perl/Python guy and new to Java and Groovy.
I am getting this error while running groovyConsole.
groovy itself is working fine.
myhome:~/gscripts # groovyConsole
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.groovy.tools.GroovyStarter.rootLoader(GroovyStarter.java:108)
at org.codehaus.groovy.tools.GroovyStarter.main(GroovyStarter.java:130)
Caused by: java.awt.HeadlessException:
No X11 DISPLAY variable was set, but this program performed an operation which requires it.
I have added this
DISPLAY=:0.0
export DISPLAY
to /home/me/.bash_profile and /home/me/.bashrc as well, but all in vain.
Please help me understand why this error occurs.
The Groovy console is a GUI app, and it looks like you're trying to run it in an environment that doesn't support graphics, e.g. connecting to a remote machine via telnet/SSH.
A possible workaround is to use the Groovy shell instead of the Groovy console. The Groovy shell is functionally similar to the Groovy console, but it is a command-line app rather than a GUI app.
Assuming the Groovy bin directory is on your PATH, you should be able to run it by typing groovysh.
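For example (the remote host name is a placeholder):
$ groovysh
groovy:000> println 'hello from the shell'

Or, if you really want the GUI console, forward X11 over SSH instead of setting DISPLAY by hand:
$ ssh -X user@remotehost
$ groovyConsole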