Running tests with Sikuli 2.x on Linux gives runtime error: `GLIBC_2.27' not found

I have Selenium tests that use Sikuli.
With Sikuli 1.x everything worked fine; the issue started when I moved to Sikuli 2.x.
I import Sikuli via Maven:
<dependency>
    <groupId>com.sikulix</groupId>
    <artifactId>sikulixapi</artifactId>
    <version>2.0.5</version>
</dependency>
I tried to run this with Jenkins on CentOS 7 and Amazon Linux.
On both I got a runtime error:
GLIBC_2.27' not found (required by /root/.Sikulix/SikulixLibs/libopencv_java430.so)
Has anyone seen and solved this issue?
Thanks in advance for any help
Lior
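
A quick way to confirm the mismatch: the native library was linked against glibc 2.27, so the loader refuses it on older hosts (CentOS 7 ships glibc 2.17, which would explain the failure there). A diagnostic sketch, assuming the library path from the error message:

```shell
# Show the glibc version the host actually provides:
ldd --version | head -n 1
# List the GLIBC symbol versions the Sikuli native lib requires
# (path taken from the error message; skipped silently if absent):
objdump -T /root/.Sikulix/SikulixLibs/libopencv_java430.so 2>/dev/null \
    | grep -o 'GLIBC_2[.0-9]*' | sort -u | tail -n 3 || true
```

If the first command reports anything below 2.27, the library cannot load on that host.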

Related

Getting "No compiler is provided in this environment. Perhaps you are running on a JRE rather than a JDK?" error while running code using Azure DevOps

I am using the pom.xml below; it runs perfectly fine on my local machine.
I am using the following dependency:
<dependency>
    <groupId>com.microsoft.sqlserver</groupId>
    <artifactId>mssql-jdbc</artifactId>
    <version>6.1.0.jre8</version>
</dependency>
Error:
Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.1:compile (default-compile) on project : Compilation failure [ERROR] No compiler is provided in this environment.
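
The message itself points at the usual cause: the Maven process found a JRE, not a JDK, so javac is missing on the build agent. A sanity check along these lines (the commands are a generic sketch, not the pipeline's actual configuration) can confirm it:

```shell
# "No compiler is provided" means Maven is running on a JRE (no javac).
echo "JAVA_HOME=${JAVA_HOME:-unset}"
# A JDK provides javac; a bare JRE does not:
if command -v javac >/dev/null 2>&1; then
    javac -version
else
    echo "javac not found: Maven is likely running on a JRE"
fi
```

If javac is missing, point JAVA_HOME (or the pipeline's Java tool setting) at a JDK installation rather than a JRE.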

OWASP security issue with jackson-databind-2.9.8 jar

I have a Maven web project (RESTful, Spring REST/Data) running on Java 8 (Tomcat 8.5.5) and using jackson-databind-2.9.8.jar. When the Dependency Check tool (which checks for vulnerable jar versions and generates a report) is run against the libraries the project uses, it flags jackson-databind-2.9.8.jar as vulnerable (reference: https://nvd.nist.gov/vuln/search/results?form_type=Advanced&results_type=overview&search_type=all&cpe_vendor=cpe%3A%2F%3Afasterxml&cpe_product=cpe%3A%2F%3Afasterxml%3Ajackson-databind&cpe_version=cpe%3A%2F%3Afasterxml%3Ajackson-databind%3A2.9.8).
Problem: changing to jackson-databind-2.10.0.jar fixes the OWASP security issue (per the Dependency Check tool), but when the project is built and run it throws errors, since 2.10.0 uses JDK 9+ compliant classes (reference: https://github.com/FasterXML/jackson/wiki/Jackson-Release-2.10).
What should be done to resolve this? Can we make the project compile on Java 8 and run on JDK 11 (since JDK 9 is out of support), or should something else be done? Please suggest.
Thanks in advance!
CVE-2019-12086 is fixed in jackson-databind-2.9.9.jar.
See the report: https://nvd.nist.gov/vuln/detail/CVE-2019-12086
Maven repo for 2.9.9: https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind/2.9.9
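
Following that answer, picking up the CVE-2019-12086 fix while staying on the Java 8-compatible 2.9 line is just a patch-version bump in the pom:

```xml
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.9.9</version>
</dependency>
```

Note that later CVEs may still be reported against the 2.9 line by the Dependency Check tool, so re-running the scan after the bump is worthwhile.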

Boost.Python Quickstart Failing

I am trying to run the quickstart example found here
I am able to follow the instructions all the way through 3.1.4, where bjam is invoked, but the tests all fail even though it IS successfully finding Python.
I am on Ubuntu 16.04
boost version 1.66.0
Python 3.5
gcc 5.4.0
bjam 2014.03
Any hints as to what the problem is would be greatly appreciated!
Update: @sehe was correct. It was using Python 2.7 instead of 3.5.
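
For anyone hitting the same mismatch: Boost.Build decides which Python to use from user-config.jam, so pinning the version there is the usual fix (the interpreter path below is an assumption; adjust it to your install):

```jam
# In ~/user-config.jam: tell Boost.Build to use Python 3.5 explicitly
using python : 3.5 : /usr/bin/python3.5 ;
```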

Writing a simple program in Spark 1.4.0

I'm new to Spark. I installed JDK 8 and Eclipse (Mars) on Debian 8, then installed Spark 1.4.0 and used the sbt/sbt assembly command to get everything required. Could anyone tell me how to write a simple hello-world program in Spark, coded in Java, using the Eclipse IDE, or point me to a URL for the same? I need step-by-step help.
Thank you in advance
You can create a Maven project and add the Spark 1.4 dependency as follows.
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.0</version>
</dependency>
Then start coding in the Eclipse IDE.
You can follow this or this, and here is the Java word-count-in-Spark example code.
The link is for Scala, but the same goes for Java. Hope it helps.
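
The word-count logic that the Spark example implements can be sketched in plain Java; here a java.util.stream pipeline stands in for the JavaRDD you would get from sc.textFile(...) in a real Spark 1.4 job (the class and method names below are illustrative, not Spark's API):

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

// Plain-Java sketch of the classic word count: in a real Spark job the stream
// below would be a JavaRDD<String>, and the grouping step would be expressed
// as mapToPair(...).reduceByKey(...); the shape of the computation is the same.
public class WordCountSketch {
    public static Map<String, Long> count(String text) {
        return Arrays.stream(text.toLowerCase().split("\\s+"))
                     .filter(w -> !w.isEmpty())
                     .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = count("hello spark hello world");
        System.out.println(counts.get("hello")); // prints 2
    }
}
```

Once this compiles in Eclipse with the spark-core dependency on the classpath, the same pipeline can be ported to Spark's JavaRDD transformations step by step.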

pom.xml Java version spec for Maven

I am a new user of Maven, trying to use it to build Apache Spark on Amazon EC2 VMs. I have manually installed Java 1.7.0 on the VMs. However, when I run Maven, the following error occurs:
Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) on project spark-core_2.10: Execution scala-test-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:testCompile failed. CompileFailed
I think a Java version mismatch is the likely cause of the compile problem. I opened the pom file Spark provides for Maven; it declares a Java-related version in two separate places:
<java.version>1.6</java.version>
and
<aws.java.sdk.version>1.8.3</aws.java.sdk.version>
What is the difference between these two versions?
Which one should be edited to solve the Java version mismatch?
They're two different things:
<java.version>1.6</java.version>
is the java version used and
<aws.java.sdk.version>1.8.3</aws.java.sdk.version>
is the AWS SDK for Java version used.
The minimum requirement of AWS SDK 1.9 is Java 1.6+, so there are no compatibility issues.
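
To connect the two: <java.version> is a plain Maven property that the pom feeds into the compiler plugin, so that property (not the AWS SDK one) is the knob for Java version mismatches. Schematically, the wiring looks like this (a sketch of the pattern, not Spark's exact pom):

```xml
<properties>
    <!-- Source/target level handed to javac; edit this for Java version issues -->
    <java.version>1.7</java.version>
    <!-- Version of the AWS SDK for Java dependency; unrelated to javac -->
    <aws.java.sdk.version>1.8.3</aws.java.sdk.version>
</properties>
...
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <source>${java.version}</source>
        <target>${java.version}</target>
    </configuration>
</plugin>
```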
