Selenium-Maven-Gauge template not running - getgauge

I am currently building off of the gauge-maven-selenium template and have added my own spec and Java step-implementation class to test it. Unfortunately, when I run 'mvn clean test' it builds just fine, but execution stops after:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Any help would be appreciated. Thanks!
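One thing worth checking first: the SLF4J lines above are only a warning, not necessarily the failure itself. With no SLF4J binding on the classpath, all log output is silently discarded, which can hide whatever error actually stops the run. A minimal sketch of one way to surface the logs in a Maven project (the binding and version shown are illustrative, not part of the original template):
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <version>1.7.36</version>
    <scope>test</scope>
</dependency>
With a binding present, the output after the NOP-logger message should show what the run is actually doing.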

Related

spark error - getting installation error

After installing Spark and running
C:\spark-2.3.1-bin-hadoop2.7\bin>spark-shell
I am getting the following error - any advice?
C:\spark-2.3.1-bin-hadoop2.7\bin>spark-shell
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/C:/spark-2.3.1-bin-hadoop2.7/jars/hadoop-auth-2.7.3.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2018-08-05 01:29:36 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Exception in thread "main" java.lang.NullPointerException
I think you don't have the right Java or Scala version; the "illegal reflective access" warnings in your output typically appear on Java 9 or newer, while Spark 2.3.1 expects Java 8.
Please note that Spark 2.3.1 runs on
Java 8+,
Python 2.7+/3.4+ and
R 3.1+.
For the Scala API, Spark 2.3.1 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
Please check the following two things (Windows equivalents are noted after the list):
1. Check the Java version installed on the machine where you are submitting your Spark application:
sudo update-alternatives --config java
sudo update-alternatives --config javac
2. Check the Scala version:
scala -version
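Since the question's output comes from a Windows shell, the equivalent checks there (standard commands, nothing Spark-specific) are:
java -version
scala -version
echo %JAVA_HOME%
If java -version reports anything newer than 1.8, pointing JAVA_HOME at a Java 8 installation is the usual fix for Spark 2.3.1.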

Spark SQL errors

I am trying to work with spark-sql, but I get the following errors:
error: missing or invalid dependency detected while loading class file
'package.class'. Could not access term annotation in package
org.apache.spark, because it (or its dependencies) are missing. Check
your build definition for missing or conflicting dependencies. (Re-run
with -Ylog-classpath to see the problematic classpath.) A full
rebuild may help if 'package.class' was compiled against an
incompatible version of org.apache.spark. warning: Class
org.apache.spark.annotation.InterfaceStability not found - continuing
with a stub. error: missing or invalid dependency detected while
loading class file 'SparkSession.class'. Could not access term
annotation in package org.apache.spark, because it (or its
dependencies) are missing. Check your build definition for missing or
conflicting dependencies. (Re-run with -Ylog-classpath to see the
problematic classpath.) A full rebuild may help if
'SparkSession.class' was compiled against an incompatible version of
org.apache.spark.
My configuration :
Scala 2.11.8
Spark-core_2.11-2.1.0
Spark-sql_2.11-2.1.0
Note: I use SparkSession.
After digging into the error message, I worked out how to solve this kind of error.
For example:
Error - Symbol 'term org.apache.spark.annotation' is missing... A full rebuild may help if 'SparkSession.class' was compiled against an incompatible version of org.apache.spark
Open SparkSession.class and search for "import org.apache.spark.annotation."; you will find import org.apache.spark.annotation.{DeveloperApi, Experimental, InterfaceStability}. These classes are clearly missing from the classpath, so you need to find the artifact that contains them.
So open https://search.maven.org and search with c:"DeveloperApi" AND g:"org.apache.spark"; you will find that the missing artifact is spark-tags, as @Prakash answered.
In my situation, just adding the spark-catalyst and spark-tags dependencies to pom.xml worked.
But it's weird: why doesn't Maven resolve the transitive dependencies automatically here?
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
    <scope>provided</scope>
</dependency>
If I use the above dependency, only spark-core_2.11-2.2.0.jar shows up among the Maven dependencies; if I change the version to 2.1.0 or 2.3.0, all the transitive dependencies appear.
You need to include the following artifacts to avoid the dependency issues (a sketch of the explicit declarations follows the list):
spark-unsafe_2.11-2.1.1
spark-tags_2.11-2.1.1
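For reference, a minimal sketch of declaring them explicitly in pom.xml, with versions matching the 2.1.1 artifacts listed above (adjust to your Spark version):
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-unsafe_2.11</artifactId>
    <version>2.1.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-tags_2.11</artifactId>
    <version>2.1.1</version>
</dependency>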

'Cucumber' symbol not recognized in runner class

When trying to run my Cucumber test in Android Studio I am getting the following error about my runner class's JUnit annotation @RunWith(Cucumber.class):
Error:(9, 10) error: cannot find symbol class Cucumber
I have the libraries that I need installed. Why am I getting this message?
Here is a screenshot of my module's build.gradle file:
[screenshot: build.gradle]
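A likely cause, hedged since the build.gradle contents are not reproduced above: the cucumber-junit artifact is missing from the compile classpath, so the Cucumber class behind @RunWith cannot be resolved. A minimal runner sketch using the older info.cukes coordinates (newer releases live under io.cucumber; the feature path and class name are hypothetical):
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import org.junit.runner.RunWith;

// JUnit delegates execution to Cucumber via @RunWith; the Cucumber symbol
// resolves only if cucumber-junit is on the classpath, e.g. in build.gradle:
//   androidTestCompile 'info.cukes:cucumber-junit:1.2.5'
@RunWith(Cucumber.class)
@CucumberOptions(features = "features") // hypothetical path to the .feature files
public class RunnerTest {
}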

Ant LibusbJava compile error: "jni.h: No such file"...fixed, now a memset error

There appears to be an Ant / jni.h problem (for my setup) with LibusbJava, which is a Java wrapper for the libusb library. I get the following error when setting it up by running
ant linux
in CentOS 6.3 as root (a quick and dirty test; thanks to those concerned about running at user level 0 - I will redo it with proper restrictions, as shown in a libusbjava reference, once the first install works).
The output starts out like this:
[root@somebox LibusbJava]# ant linux -lib $JAVA_HOME/include -lib $JAVA_HOME/include/linux
Buildfile: build.xml
clean:
Build LibusbJava Test Linux:
Build LibusbJava Test:
[echo] Building Library for unit tests:
[exec] /.../libusbjava/trunk/LibusbJava/LibusbJava.cpp:27:17: error: jni.h: No such file or directory
[exec] /.../libusbjava/trunk/LibusbJava/LibusbJava.cpp:34:26: error: test/CuTest.h: No such file or directory
...
It appears that jni.h is a header needed by the C++ code; since the include path was not set on my system, I added it to the LibusbJava.cpp file as an absolute include, /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/include/jni.h. I then get an error from jni.h not being able to find jni_md.h, which jni.h pulls in with #include "jni_md.h".
Clearly this is not the right approach, so perhaps I need a correct Ant reference, but I cannot seem to do it with a -lib switch. Besides, this just creates thousands of jni.h file errors during the Ant build attempt.
How can I fix this problem?
Notes: I've set $JAVA_HOME up like JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64, and javac and java work fine.
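As an aside on the mechanics here: Ant's -lib switch only extends Ant's own Java classpath and has no effect on the C++ compiler; jni.h is found through the compiler's include path. So the compile step inside build.xml needs the JNI directories passed as -I flags, along these lines (a sketch of the flags only, not the project's actual compile line):
g++ -c -I"$JAVA_HOME/include" -I"$JAVA_HOME/include/linux" LibusbJava.cpp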
New discovery... after helpful questions which guided me, I am much closer to successful compilation.
When I install LibusbJava and manually add all the library references in build.xml, there is still an error compiling LibusbJava, caused by memset. I see memset patches for libusb that appeared in 2007, but it is unclear how to use them or whether they are related. Investigating... hints, comments and questions are welcome. My most sincere thanks for the help thus far.
[exec] /.../libu/libusbjava/trunk/LibusbJava/objects/Usb_Device.cpp: In function ‘void Usb_Device_disconnect(JNIEnv*)’:
[exec] /.../libu/libusbjava/trunk/LibusbJava/objects/Usb_Device.cpp:88: error: ‘memset’ was not declared in this scope
[antcall] Exiting /.../libu/libusbjava/trunk/LibusbJava/build.xml.
BUILD FAILED
but I find no reference to an include of string.h or cstring. The header of Usb_Device.cpp mentions that it is a C++ stub for the Java class ch.ntb.inf.libusbJava.Usb_Device, and its only include is #include "Usb_Device.h", which also does not appear to pull in a string header. Perhaps just inserting #include <cstring>?
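That is very likely the fix: memset is declared in <cstring> (or <string.h>), and newer GCC releases stopped pulling it in transitively through other headers, which is why older code like this suddenly fails to compile. A minimal sketch of the change at the top of Usb_Device.cpp (the second include is the file's existing one):
#include <cstring>      // declares memset, no longer dragged in transitively by newer GCC
#include "Usb_Device.h" // the file's existing include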

Groovy console in OSGi

I managed to embed the standard Groovy console into Felix and expose a number of variables (e.g. BundleContext).
However, on "first" start up of my blueprint bundle, I got this error:
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)[:1.6.0_24]
... ... ...
at groovy.util.FactoryBuilderSupport.callAutoRegisterMethods(FactoryBuilderSupport.java:202)[groovy-all-1.7.8.jar:1.7.8]
... 42 more
Caused by: java.lang.NoClassDefFoundError: sun/reflect/ConstructorAccessorImpl
at sun.misc.Unsafe.defineClass(Native Method)[:1.6.0_24]
... ... ...
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)[:1.6.0_24]
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)[:1.6.0_24]
... ... ...
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:153)[groovy-all-1.7.8.jar:1.7.8]
at groovy.swing.SwingBuilder.registerActionButtonWidgets(SwingBuilder.groovy:94)[groovy-all-1.7.8.jar:1.7.8]
... 47 more
Caused by: java.lang.ClassNotFoundException: *** Package 'sun.reflect' is not imported by bundle groovy-all [18], nor is there any bundle that exports package 'sun.reflect'. However, the class 'sun.reflect.ConstructorAccessorImpl' is available from the system class loader. There are two fixes: 1) Add package 'sun.reflect' to the 'org.osgi.framework.system.packages.extra' property and modify bundle groovy-all [18] to import this package; this causes the system bundle to export class path packages. 2) Add package 'sun.reflect' to the 'org.osgi.framework.bootdelegation' property; a library or VM bug can cause classes to be loaded by the wrong class loader. The first approach is preferable for preserving modularity. ***
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.loadClass(ModuleImpl.java:1782)[org.apache.felix.framework-3.0.9.jar:]
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)[:1.6.0_24]
at org.codehaus.groovy.runtime.callsite.CallSiteClassLoader.loadClass(CallSiteClassLoader.java:51)[groovy-all-1.7.8.jar:1.7.8]
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)[:1.6.0_24]
at org.codehaus.groovy.reflection.ClassLoaderForClassArtifacts.loadClass(ClassLoaderForClassArtifacts.java:58)[groovy-all-1.7.8.jar:1.7.8]
... 65 more
Caused by: java.lang.ClassNotFoundException: sun.reflect.ConstructorAccessorImpl not found by groovy-all [18]
at org.apache.felix.framework.ModuleImpl.findClassOrResourceByDelegation(ModuleImpl.java:787)[org.apache.felix.framework-3.0.9.jar:]
at org.apache.felix.framework.ModuleImpl.access$400(ModuleImpl.java:71)[org.apache.felix.framework-3.0.9.jar:]
at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.loadClass(ModuleImpl.java:1768)[org.apache.felix.framework-3.0.9.jar:]
... 69 more
With the exception thrown, my bundle is now active; however, the console does not show. If I stop and start the bundle again, the error no longer appears and I am able to see and use my Swing Groovy console.
The stack trace indicates the following options to fix this problem:
1. Add package 'sun.reflect' to the 'org.osgi.framework.system.packages.extra' property and modify bundle groovy-all [18] to import this package; this causes the system bundle to export class path packages.
2. Add package 'sun.reflect' to the 'org.osgi.framework.bootdelegation' property; a library or VM bug can cause classes to be loaded by the wrong class loader. The first approach is preferable for preserving modularity.
I am quite certain that with option 2, the error will go away.
However, my question is: if sun.reflect is not imported, why does the Groovy console show after I restart the bundle? I would appreciate advice from anyone with such experience.
Option 1 is the correct solution to the problem. Your bundle depends on package sun.reflect, so you should make that explicit in the Import-Package statement.
I have no idea why the console might show after restarting. It may do its own internal "fail-over" in the event it can't load the sun.reflect package. I wouldn't worry too much about this aspect, just get it to work by adding the import.
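In concrete terms, option 1 means two changes. A minimal sketch for Felix, assuming the framework is configured through conf/config.properties and the bundle is built with the maven-bundle-plugin (file names and build setup vary by deployment):
# conf/config.properties: have the system bundle export sun.reflect
org.osgi.framework.system.packages.extra=sun.reflect
<!-- maven-bundle-plugin instruction so the bundle's manifest imports it -->
<Import-Package>sun.reflect,*</Import-Package>
Since groovy-all is a third-party bundle, the second change usually means rebuilding or wrapping it so that its manifest gains the sun.reflect import.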
