They suggested that I update Scala, so I did:
$ scala -version
Scala code runner version 2.11.4 -- Copyright 2002-2013, LAMP/EPFL
But this error remains:
my_project $ sbt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Loading project definition from /home/alex/Documents/projects/android/my_app123/sub_project1/project
[info] Compiling 1 Scala source to /home/alex/Documents/projects/android/my_app123/sub_project1/project/target/scala-2.9.2/sbt-0.12/classes...
[error] error while loading CharSequence, class file '/usr/local/java/jdk1.8.0_05/jre/lib/rt.jar(java/lang/CharSequence.class)' is broken
[error] (bad constant pool tag 15 at byte 1501)
[error] one error found
[error] (compile:compile) Compilation failed
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q
So the error is the same. Also, for some reason the path contains ...target/scala-2.9.2/sbt-0.12/classes. Why is it using Scala 2.9.2? I tried deleting the target directory, but it reappeared, still with scala-2.9.2 in the path.
The version of Scala installed on your system is irrelevant if you use sbt. What matters is that the following setting be present in your build.sbt file:
scalaVersion := "2.11.4"
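That setting controls the Scala version used to compile your own sources. The scala-2.9.2 in your path is something else: sbt compiles the build definition under project/ with the Scala version it was built against, and sbt 0.12.x uses Scala 2.9.2 for that. Scala 2.9.2 cannot read Java 8 class files; "bad constant pool tag 15" is the CONSTANT_MethodHandle entry that newer class files contain and the old compiler does not recognize. So the other half of the fix is upgrading sbt itself. A minimal sketch, assuming any sbt 0.13.x launcher is acceptable for your project:

# project/build.properties (hypothetical version; any 0.13.x should do)
# sbt 0.13.x compiles the build definition with Scala 2.10.x,
# which can read Java 8 class files
sbt.version=0.13.7

With that in place, target/scala-2.9.2/sbt-0.12 should be replaced by a scala-2.10/sbt-0.13 directory on the next run.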
I get this error from groovy -v:
Error: Could not find or load main class org.codehaus.groovy.tools.GroovyStarter
Caused by: java.lang.ClassNotFoundException: org.codehaus.groovy.tools.GroovyStarter
Java version: openjdk version "17.0.6" 2023-01-17 LTS (installed from Microsoft: https://learn.microsoft.com/en-us/java/openjdk/install)
Groovy version: 4.0.8 (apache-groovy-binary-4.0.8.zip from jfrog.io, 22-01-23 01:00:24)
I have set JAVA_HOME and GROOVY_HOME; both are appended to PATH.
C:\Users\<Myusername>\dev\languages\apache-groovy-sdk-4.0.8\groovy-4.0.8\bin
Groovy was unzipped into this folder.
I was just getting started with Groovy by checking its version number.
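One thing worth double-checking (an assumption, since the exact variable values aren't shown): GROOVY_HOME must point at the SDK root, not at the bin folder, because the startup script uses it to locate lib\groovy-*.jar, which is where GroovyStarter lives. A quick check in cmd, with hypothetical values based on the path above:

:: hypothetical values; adjust <Myusername> to your account
set GROOVY_HOME=C:\Users\<Myusername>\dev\languages\apache-groovy-sdk-4.0.8\groovy-4.0.8
set PATH=%PATH%;%GROOVY_HOME%\bin
groovy -v

If GROOVY_HOME ends in \bin, the launcher cannot build its classpath, which produces this kind of ClassNotFoundException.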
I've followed all the steps in the official guide, except that I built it using:
$ bazel build -c opt --copt=-mavx --copt=-mavx2 --copt=-mfma --copt=- msse4.1 --copt=-msse4.2 --config=opt -k //tensorflow/tools/pip_package:build_pip_package
And during ./configure I set the right paths and disabled Google Cloud Platform, Hadoop, XLA, VERBS, OpenCL, CUDA, and MPI support.
Hardware:
Macbook Pro 13 inch (mid 2014)
CPU: Intel Core i5 (4278U)
RAM: 8GB
Software:
High Sierra (10.13.2)
Clang Version: clang-900.0.39.2
Bazel Version: 0.9.0
Conda Version: 4.4.3
Python: 3.6.3
All the packages are up to date. This worked perfectly fine two months ago on this machine; for some strange reason it doesn't build anymore. I'm just posting part of the error list here:
WARNING: Config values are not defined in any .rc file: opt
ERROR: Skipping 'msse4.1': no such target '//:msse4.1': target 'msse4.1' not declared in package '' defined by /Users/rakshithgb/Documents/Tensorflow/tensorflow/BUILD
WARNING: Target pattern parsing failed.
ERROR: /private/var/tmp/_bazel_rakshithgb/fde7bc60972656b0c2db4fd0b79e24fb/external/com_googlesource_code_re2/BUILD:96:1: First argument of 'load' must be a label and start with either '//', ':', or '@'. Use --incompatible_load_argument_is_label=false to temporarily disable this check.
ERROR: /private/var/tmp/_bazel_rakshithgb/fde7bc60972656b0c2db4fd0b79e24fb/external/com_googlesource_code_re2/BUILD:98:1: name 're2_test' is not defined (did you mean 'ios_test'?)
ERROR: /private/var/tmp/_bazel_rakshithgb/fde7bc60972656b0c2db4fd0b79e24fb/external/com_googlesource_code_re2/BUILD:100:1: name 're2_test' is not defined (did you mean 'ios_test'?)
And it ends like this:
ERROR: /Users/rakshithgb/Documents/Tensorflow/tensorflow/tensorflow/core/kernels/BUILD:550:1: Target '@local_config_sycl//sycl:using_sycl' contains an error and its package is in error and referenced by '//tensorflow/core/kernels:debug_ops'
WARNING: errors encountered while analyzing target '//tensorflow/tools/pip_package:build_pip_package': it will not be built
INFO: Analysed target //tensorflow/tools/pip_package:build_pip_package (203 packages loaded).
INFO: Found 0 targets...
ERROR: command succeeded, but there were errors parsing the target pattern
INFO: Elapsed time: 12.763s, Critical Path: 0.02s
FAILED: Build did NOT complete successfully
Has anyone else had this issue? How do I fix it? I've uploaded the entire error log to the TensorFlow GitHub issues page: #15622.
OK, it looks like the new Bazel version isn't compatible with the current TensorFlow release, and a fix is expected in the next release, according to this thread on GitHub: #15492. (Incidentally, the stray space in --copt=- msse4.1 is what made Bazel try to parse msse4.1 as a target, which explains the target-pattern errors in your log.)
The temporary fix that worked for me was to add --incompatible_load_argument_is_label=false to the bazel command, so my build command now looks like this:
$ bazel build --config=opt --incompatible_load_argument_is_label=false //tensorflow/tools/pip_package:build_pip_package
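If the build then completes, the usual follow-up steps (from the TensorFlow build guide of that era; the /tmp path is just a convention) are to generate the wheel and install it:

$ bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
$ pip install /tmp/tensorflow_pkg/tensorflow-*.whl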
Can somebody tell me how to build the Spark-Cassandra Connector assembly? I've tried following the instructions on the GitHub page (https://github.com/datastax/spark-cassandra-connector), but I just get hundreds of "deduplicate" errors.
I'm using Scala 2.11.7 with Spark 1.5.1 (which I built for Scala 2.11) and sbt 0.13.8.
I did the following:
git clone https://github.com/datastax/spark-cassandra-connector.git
cd spark-cassandra-connector/
sbt -Dscala-2.11=true assembly
The build process runs for a while, but then starts spitting out hundreds of "deduplicate" errors and fails. I have no idea where to start fixing this; as far as I can tell, the assembly build process for this project just doesn't work.
Any tips on how I can fix this?
It appears the build for Scala 2.11 is broken and you should report it to the project. I don't know how to fix it right now.
➜ spark-cassandra-connector git:(master) sbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/dev/oss/spark-cassandra-connector/project
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots
Scala: 2.10.5 [To build against Scala 2.11 use '-Dscala-2.11=true']
Scala Binary: 2.10
Java: target=1.7 user=1.8.0_66
[info] Set current project to root (in build file:/Users/jacek/dev/oss/spark-cassandra-connector/)
[root]> update
...
[info] Done updating.
[info] Done updating.
[success] Total time: 314 s, completed Dec 2, 2015 10:26:01 AM
[root]>
➜ spark-cassandra-connector git:(master) sbt -Dscala-2.11=true
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/dev/oss/spark-cassandra-connector/project
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots
Scala: 2.11.7
Scala Binary: 2.11
Java: target=1.7 user=1.8.0_66
[info] Set current project to root (in build file:/Users/jacek/dev/oss/spark-cassandra-connector/)
[root]> update
...
[error] impossible to get artifacts when data has not been loaded. IvyNode = org.slf4j#slf4j-log4j12;1.7.6
...
[trace] Stack trace suppressed: run last spark-cassandra-connector-embedded/*:update for the full output.
[error] (spark-cassandra-connector-embedded/*:update) java.lang.IllegalStateException: impossible to get artifacts when data has not been loaded. IvyNode = org.slf4j#slf4j-log4j12;1.7.6
[error] Total time: 9 s, completed Dec 2, 2015 10:27:19 AM
I filed an issue: https://datastax-oss.atlassian.net/browse/SPARKC-295.
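As a general note on the "deduplicate" errors themselves: sbt-assembly raises them when two dependency jars ship the same file path, and they are normally resolved with a merge strategy in the build. A minimal sketch, assuming a newer sbt-assembly with the assemblyMergeStrategy key (the 0.11.x-era plugin spelled it mergeStrategy in assembly):

// build.sbt sketch: drop colliding META-INF metadata, keep the first
// copy of any other duplicate path (a rough but common default)
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", _*) => MergeStrategy.discard
  case _                        => MergeStrategy.first
}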
I am using Groovy/Gradle with the Cucumber framework. Following are the versions:
Groovy Version: 2.4.4
------------------------------------------------------------
Gradle 2.5
------------------------------------------------------------
Build time: 2015-07-08 07:38:37 UTC
Build number: none
Revision: 093765bccd3ee722ed5310583e5ed140688a8c2b
Groovy: 2.3.10
Ant: Apache Ant(TM) version 1.9.3 compiled on December 23 2013
JVM: 1.7.0_17 (Oracle Corporation 23.7-b01)
OS: Windows 8 6.2 amd64
I am getting this error when I try to run the following:
$ gradle clean idea
FAILURE: Build failed with an exception.
* Where:
Build file 'C:\Documents and Settings\Sudheerah\Documents\Sudheera\KBase\APDM\APIAutomation\build.gradle' line: 76
* What went wrong:
A problem occurred evaluating root project 'APIAutomation'.
> Could not find method groovy() for arguments [org.codehaus.groovy:groovy-all:2.4.4] on root project 'APIAutomation'.
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Total time: 35.237 secs
Below is the error line (line 76) from the build.gradle file:
groovy 'org.codehaus.groovy:groovy-all:2.4.4'
Stacktrace:
Caused by: org.gradle.api.internal.MissingMethodException: Could not find method groovy() for arguments [org.codehaus.groovy:groovy-all:2.4.4] on root project 'APIAutomation'.
at org.gradle.api.internal.AbstractDynamicObject.methodMissingException(AbstractDynamicObject.java:68)
Assuming you've applied the groovy plugin, you need to change your groovy dependency to a compile dependency.
The groovy configuration was the old way of setting up the groovy plugin and is no longer supported in Gradle 2.x.
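A minimal sketch of the change (assuming the standard groovy plugin setup; versions are the ones from the question):

// build.gradle
apply plugin: 'groovy'

dependencies {
    // old, pre-Gradle-2.0 style that no longer works:
    // groovy 'org.codehaus.groovy:groovy-all:2.4.4'
    compile 'org.codehaus.groovy:groovy-all:2.4.4'
}

The groovy plugin now infers the Groovy compiler classpath from the compile dependencies, so the separate groovy configuration is unnecessary.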
I am assembling the "Cassandra-Spark-Connector". I just followed the steps below:
Git clone connector code
Run "sbt assembly"
During the assembly phase I am getting the following error:
[info] Done updating.
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.eed3si9n:sbt-assembly:0.11.2 -> 0.13.0
[warn] Run 'evicted' to see detailed eviction warnings
[info] Compiling 5 Scala sources to /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/target/scala-2.10/sbt-0.13/classes...
[error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:23: object Plugin is not a member of package sbtassembly
[error] import sbtassembly.Plugin._
[error] ^
[error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:24: not found: object AssemblyKeys
[error] import AssemblyKeys._
[error] ^
[error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:217: not found: value assemblySettings
[error] lazy val sbtAssemblySettings = assemblySettings ++ Seq(
[error] ^
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
I am running sbt 0.13.6
You can always use the packaged sbt by running:
./sbt/sbt assembly
This will automatically download and use a valid version of sbt.
Building the Spark-Cassandra connector requires sbt-assembly version 0.11.2, as defined in project/plugins.sbt. It is likely you have a newer sbt-assembly version (0.13.0) installed in your global plugins folder (~/.sbt/0.13/plugins), which is causing this issue.
Rename the plugins folder in ~/.sbt/0.13 and try to build again.
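For reference, the pin lives in the repo's project/plugins.sbt and looks roughly like this (a sketch, not the repo's exact file):

// project/plugins.sbt (approximate)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

With the global 0.13.0 plugin out of the way, the 0.11.x API that project/Settings.scala imports (sbtassembly.Plugin._, AssemblyKeys._) resolves again.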