No sources to compile with maven-compiler-plugin:3.5.1:compile on a Linux system - linux

I am trying to run my Selenium tests on a Linux system where Jenkins is configured. But when running the job I get a "No sources to compile" message in the maven-compiler-plugin:3.5.1:compile section.
Note: it is a Maven project.
pom.xml content:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>Aspire_TA_Aspire_Generic_Framework_Java</groupId>
<artifactId>GenericFramework_Java</artifactId>
<version>1.0.0</version>
<packaging>jar</packaging>
<name>Aspire TA Aspire Generic Framework - Java</name>
<url>http://repo.maven.apache.org/maven2</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>2.53.1</version>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.9.4</version>
</dependency>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-clean-plugin</artifactId>
<version>3.0.0</version>
<type>maven-plugin</type>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.19.1</version>
<inherited>true</inherited>
<configuration>
<suiteXmlFiles>
<suiteXmlFile>testng.xml</suiteXmlFile>
</suiteXmlFiles>
<!-- <systemPropertyVariables>
<url>${project.url}</url>
<site>${site}</site>
</systemPropertyVariables> -->
</configuration>
</plugin>
</plugins>
</build>
<repositories>
<!-- To directly pointing the saucelabs repository -->
<repository>
<id>saucelabs-repository</id>
<url>https://repository-saucelabs.forge.cloudbees.com/release</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
<!-- To directly pointing the soapui maven2 repository -->
<repository>
<id>smartbear-sweden-plugin-repository</id>
<url>http://www.soapui.org/repository/maven2/</url>
</repository>
</repositories>
</project>
Below is my directory structure:

Linux is a case-sensitive OS by default, so you should rename the 'src/test/Java' folder to 'src/test/java'. Also, I don't see a 'src/main/java' folder, so there is nothing to compile. Alternatively, declare the directories explicitly in your pom.xml:
<build>
...
<sourceDirectory>src/main/java</sourceDirectory>
<testSourceDirectory>src/test/Java</testSourceDirectory>
...
</build>
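For reference, this is the standard Maven source layout that the compiler plugin picks up when no source directories are configured (a plain sketch; your package folders go underneath):

src/
  main/
    java/        <- compiled by maven-compiler-plugin:compile
    resources/
  test/
    java/        <- compiled by maven-compiler-plugin:testCompile (note the lowercase 'java')
    resources/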

Related

Vaadin app deployed successfully on Heroku but not loading frontend

I have a problem deploying a Vaadin app on Heroku: it builds successfully and I can visit the site, but the frontend is not loaded at all. There is no error, it just doesn't load. I honestly have no idea what the issue is and cannot find anything on the internet that could solve it. I tried many things but nothing helped. Locally everything works fine.
pom.xml
<?xml version="1.0" encoding="UTF-8"?><project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.jakpop.stepsdictionary</groupId>
<artifactId>steps-dictionary</artifactId>
<name>steps-dictionary</name>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<properties>
<maven.compiler.source>15</maven.compiler.source>
<maven.compiler.target>15</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<vaadin.version>17.0.6</vaadin.version>
</properties>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.3.2.RELEASE</version>
</parent>
<repositories>
<!-- The order of definitions matters. Explicitly defining central here to make sure it has the highest priority. -->
<!-- Main Maven repository -->
<repository>
<id>central</id>
<url>https://repo.maven.apache.org/maven2</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
<!-- Repository used by many Vaadin add-ons -->
<repository>
<id>Vaadin Directory</id>
<url>https://maven.vaadin.com/vaadin-addons</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
<pluginRepositories>
<!-- Main Maven repository -->
<pluginRepository>
<id>central</id>
<url>https://repo.maven.apache.org/maven2</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
</pluginRepository>
</pluginRepositories>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.vaadin</groupId>
<artifactId>vaadin-bom</artifactId>
<version>${vaadin.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>com.vaadin</groupId>
<!-- Replace artifactId with vaadin-core to use only free components -->
<artifactId>vaadin</artifactId>
</dependency>
<dependency>
<groupId>com.vaadin</groupId>
<artifactId>vaadin-spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.vaadin.artur</groupId>
<artifactId>a-vaadin-helper</artifactId>
<version>1.5.0</version>
</dependency>
<dependency>
<groupId>org.vaadin.artur.exampledata</groupId>
<artifactId>exampledata</artifactId>
<version>1.2.0</version>
</dependency>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
</dependency>
<!-- <dependency>-->
<!-- <groupId>org.springframework.boot</groupId>-->
<!-- <artifactId>spring-boot-starter-data-jpa</artifactId>-->
<!-- </dependency>-->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<version>2.3.5.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.vaadin</groupId>
<artifactId>vaadin-testbench</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.github.bonigarcia</groupId>
<artifactId>webdrivermanager</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.14</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-mongodb</artifactId>
<version>2.4.0</version>
</dependency>
</dependencies>
<build>
<defaultGoal>spring-boot:run</defaultGoal>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<!-- Clean build and startup time for Vaadin apps sometimes may exceed
the default Spring Boot's 30sec timeout. -->
<configuration>
<wait>500</wait>
<maxAttempts>240</maxAttempts>
</configuration>
</plugin>
<!--
Take care of synchronizing java dependencies and imports in
package.json and main.js files.
It also creates webpack.config.js if not exists yet.
-->
<plugin>
<groupId>com.vaadin</groupId>
<artifactId>vaadin-maven-plugin</artifactId>
<version>${vaadin.version}</version>
<executions>
<execution>
<goals>
<goal>prepare-frontend</goal>
</goals>
</execution>
</executions>
<configuration>
<pnpmEnable>false</pnpmEnable>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>15</source>
<target>15</target>
<compilerArgs>--enable-preview</compilerArgs>
</configuration>
</plugin>
</plugins>
</build>
<profiles>
<profile>
<!-- Production mode is activated using -Pproduction -->
<id>production</id>
<properties>
<vaadin.productionMode>true</vaadin.productionMode>
</properties>
<dependencies>
<dependency>
<groupId>com.vaadin</groupId>
<artifactId>flow-server-production-mode</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<jvmArguments>-Dvaadin.productionMode</jvmArguments>
</configuration>
</plugin>
<plugin>
<groupId>com.vaadin</groupId>
<artifactId>vaadin-maven-plugin</artifactId>
<version>${vaadin.version}</version>
<executions>
<execution>
<goals>
<goal>build-frontend</goal>
</goals>
<phase>compile</phase>
</execution>
</executions>
<configuration>
<pnpmEnable>false</pnpmEnable>
</configuration>
</plugin>
</plugins>
</build>
</profile>
<profile>
<id>it</id>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<id>start-spring-boot</id>
<phase>pre-integration-test</phase>
<goals>
<goal>start</goal>
</goals>
</execution>
<execution>
<id>stop-spring-boot</id>
<phase>post-integration-test</phase>
<goals>
<goal>stop</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- Runs the integration tests (*IT) after the server is started -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
<configuration>
<trimStackTrace>false</trimStackTrace>
<enableAssertions>true</enableAssertions>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
</project>
Procfile
web: java --enable-preview -jar target/steps-dictionary-1.0-SNAPSHOT.jar $PORT
system.properties
java.runtime.version=15
vaadin.pnpm.enable=false
heroku-settings.xml
<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
<!-- activate by setting the MAVEN_SETTINGS_PATH config var to heroku-settings.xml in Heroku project settings tab.
See https://devcenter.heroku.com/articles/using-a-custom-maven-settings-xml for more details.
-->
<activeProfiles>
<activeProfile>production</activeProfile>
<activeProfile>npm</activeProfile>
</activeProfiles>
</settings>
I have the Java and Node.js buildpacks added on Heroku as well.
I have basically lost all hope at this point. If anyone has any idea, I would be very grateful for the help.
I switched back to Java 13 and it started working.
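For anyone trying the same workaround, the switch back to Java 13 touches a few of the files above; this is only a sketch of the obvious edits, assuming no preview-only language features are used once --enable-preview is dropped:

pom.xml (properties and maven-compiler-plugin configuration):
<maven.compiler.source>13</maven.compiler.source>
<maven.compiler.target>13</maven.compiler.target>
<source>13</source>
<target>13</target>
<!-- and remove <compilerArgs>--enable-preview</compilerArgs> -->

system.properties:
java.runtime.version=13

Procfile (without the --enable-preview flag):
web: java -jar target/steps-dictionary-1.0-SNAPSHOT.jar $PORT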

Apache Beam Spark/Flink runner not getting executed in EMR (accessing files from GCS)

I have an Apache Beam pipeline that indexes some data to Elasticsearch. I was trying to use the Spark or Flink runner to run the job in AWS EMR. When I ran the job on a stand-alone Spark in a local setup, the pipeline worked with source files on the local disk; however, when I read the files from GCS it does not work. The same happens when I run it in the EMR cluster.
The configuration that I set in the Hadoop core-site.xml, passed as EMR config:
{
"Classification": "core-site",
"Properties": {
"fs.gs.impl": "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem",
"fs.AbstractFileSystem.gs.impl": "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS",
"fs.gs.project.id": "data-warehouse",
"google.cloud.auth.service.account.enable": "true",
"fs.gs.auth.service.account.json.keyfile": "/home/hadoop/utils/key.json"
}
}
Also, the GCS connector jar is on the Spark jar path and the Hadoop jar path.
The Maven pom file for the pipeline:
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.company.beam</groupId>
<artifactId>IndexToEs</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<beam.version>2.22.0</beam.version>
<maven-compiler-plugin.version>3.7.0</maven-compiler-plugin.version>
<maven-exec-plugin.version>1.6.0</maven-exec-plugin.version>
<slf4j.version>1.7.25</slf4j.version>
</properties>
<repositories>
<repository>
<id>apache.snapshots</id>
<name>Apache Development Snapshot Repository</name>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven-compiler-plugin.version}</version>
<configuration>
<source>8</source>
<target>8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.3</version>
<executions>
<execution>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.company.beam.IndexToEs</mainClass>
</transformer>
</transformers>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>${maven-exec-plugin.version}</version>
<configuration>
<cleanupDaemonThreads>false</cleanupDaemonThreads>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
<dependencies>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-core</artifactId>
<version>${beam.version}</version>
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-io-elasticsearch</artifactId>
<version>${beam.version}</version>
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-io-google-cloud-platform</artifactId>
<version>${beam.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-java-extensions-google-cloud-platform-core -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-extensions-google-cloud-platform-core</artifactId>
<version>${beam.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-runners-google-cloud-dataflow-java -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-google-cloud-dataflow-java</artifactId>
<version>${beam.version}</version>
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-direct-java</artifactId>
<version>${beam.version}</version>
<!-- <scope>runtime</scope>-->
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-spark</artifactId>
<version>${beam.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-runners-flink -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-flink_2.11</artifactId>
<version>2.16.0</version>
</dependency>
<dependency>
<groupId>com.google.cloud.bigdataoss</groupId>
<artifactId>gcs-connector</artifactId>
<version>hadoop2-1.9.17</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.3</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.1.3</version>
</dependency>
<!-- slf4j API frontend binding with JUL backend -->
<!-- <dependency>-->
<!-- <groupId>org.slf4j</groupId>-->
<!-- <artifactId>slf4j-api</artifactId>-->
<!-- <version>${slf4j.version}</version>-->
<!-- </dependency>-->
<!-- <dependency>-->
<!-- <groupId>org.slf4j</groupId>-->
<!-- <artifactId>slf4j-jdk14</artifactId>-->
<!-- <version>${slf4j.version}</version>-->
<!-- </dependency>-->
</dependencies>
</project>
There is no error, but EMR shows the task as completed while the pipeline has not actually run.
I could not figure out whether it is an Apache Beam problem or a cluster configuration problem.
I figured out the issue. The Apache Beam SDK uses gsutil to access the GCS files. According to the Flink documentation, Hadoop connectors are responsible for access to other file systems, but when Apache Beam runs on the Flink runner the data is read using gsutil and fed downstream. So I installed the Google Cloud SDK and activated the service account.
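For completeness, activating a service account with the Google Cloud SDK typically looks like the commands below; the key-file path and project id are taken from the core-site configuration above, and this is a sketch rather than the exact commands the author ran:

gcloud auth activate-service-account --key-file=/home/hadoop/utils/key.json
gcloud config set project data-warehouse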

java.io.InvalidClassException: com.pingcap.tikv.region.TiRegion - Local class incompatible

I want to connect my Spark cluster to TiDB via TiSpark, but I have a problem when I run my Spark application; an error occurs:
java.io.InvalidClassException: com.pingcap.tikv.region.TiRegion; local class incompatible: stream classdesc serialVersionUID = -3091715739322916126, local class serialVersionUID = -3556238418089320368
I set up a TiDB cluster following the guide at https://pingcap.com/docs/v3.0/how-to/get-started/deploy-tidb-from-binary/
After that I followed the guide at https://pingcap.com/docs/v3.0/reference/tispark/ to download tispark-core-2.2.0-SNAPSHOT-jar-with-dependencies.jar and copied it into my Spark jars folder.
I also configured:
spark.tispark.pd.addresses 127.0.0.1:2379
spark.sql.extensions org.apache.spark.sql.TiExtensions
Here is my pom file:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.vng</groupId>
<artifactId>testlan102</artifactId>
<version>1.0-SNAPSHOT</version>
<inceptionYear>2019</inceptionYear>
<properties>
<scala.version>2.11.12</scala.version>
<spark.version>2.4.3</spark.version>
</properties>
<repositories>
<repository>
<id>scala-tools.org</id>
<name>Scala-Tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>scala-tools.org</id>
<name>Scala-Tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</pluginRepository>
</pluginRepositories>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.4.3</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.4.3</version>
</dependency>
<dependency>
<groupId>org.scala-lang.modules</groupId>
<artifactId>scala-xml_2.11</artifactId>
<version>1.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.4.3</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-streams</artifactId>
<version>2.3.0</version>
</dependency>
<dependency>
<groupId>com.pingcap.tispark</groupId>
<artifactId>tispark-core</artifactId>
<version>2.1.1-spark_2.4</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.37</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.12</version>
</dependency>
</dependencies>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
<args>
<arg>-target:jvm-1.5</arg>
</args>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-eclipse-plugin</artifactId>
<configuration>
<downloadSources>true</downloadSources>
<buildcommands>
<buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
</buildcommands>
<additionalProjectnatures>
<projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
</additionalProjectnatures>
<classpathContainers>
<classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
<classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
</classpathContainers>
</configuration>
</plugin>
</plugins>
</build>
<reporting>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<mainClass>fully.qualified.MainClass</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
</plugins>
</reporting>
</project>
My Spark Session is:
val _spark = SparkSession.builder()
.master("spark://127.0.0.1:7077")
.config("spark.tispark.pd.addresses", "127.0.0.1:2379")
.config("spark.sql.extensions","org.apache.spark.sql.TiExtensions")
.appName("SparkApp")
.getOrCreate()
When I run a simple query against the database:
_spark.sql("use locdb")
val df = _spark.sql("select * from bang")
df.show()
I got an error:
java.io.InvalidClassException: com.pingcap.tikv.region.TiRegion; local class incompatible: stream classdesc serialVersionUID = -3091715739322916126, local class serialVersionUID = -3556238418089320368
My full log is here:
https://gist.github.com/lploc94/bb6bf9db14c030ee123630f6362f6160
I think the reason is that I am using TiSpark 2.1.1-spark_2.4 in the Maven pom file while the TiSpark jar I downloaded and copied to the jars folder is 2.2.0. But I can't find any other version of TiSpark, such as tispark-core-2.1.1-SNAPSHOT-jar-with-dependencies.jar.
I'm a dev of TiSpark.
Yes, your educated guess is correct :). There are two different versions of the TiSpark jars in your run, which caused the problem. Version 2.2 (in your cluster environment) has not been officially released to the Maven repository yet.
Since you already have the TiSpark jar deployed in your cluster, you can simply remove the TiSpark dependency from your pom. In most cases you don't need any special API from TiSpark unless you are using an older version (< 2.0), and you can still query TiDB directly.
Or you might remove the jars from your cluster environment and rely on the TiSpark declared in your pom (if so, please package it with dependencies).
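A related middle ground (my suggestion, not something the answer prescribes) is to keep the dependency for compilation only and mark it provided, so the jar already deployed on the cluster is the only copy used at runtime:

<dependency>
<groupId>com.pingcap.tispark</groupId>
<artifactId>tispark-core</artifactId>
<version>2.1.1-spark_2.4</version>
<!-- provided: compile against the TiSpark API, but do not bundle a second copy; the cluster's jar is used at runtime -->
<scope>provided</scope>
</dependency>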

Build Failure in Bitbucket and Azure

My build is failing in Azure and Bitbucket using Java/Maven with this message:
Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile
BUT it is passing in Jenkins and also locally.
Please have a look at my pom.xml.
Please also check my yml file:
# This is a sample build configuration for Java (Maven).
# Check our guides at https://confluence.atlassian.com/x/zd-5Mw for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: maven:3.3.9
pipelines:
  default:
    - step:
        caches:
          - maven
        script: # Modify the commands below to build your repository.
          - mvn -B verify # -B batch mode makes Maven less verbose
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>WebDriverTest1</groupId>
<artifactId>WebDriverTest1</artifactId>
<version>0.0.1-SNAPSHOT</version>
<!-- https://mvnrepository.com/artifact/org.testng/testng -->
<dependencies>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.14.3</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>2.45.0</version>
</dependency>
<dependency>
<groupId>io.rest-assured</groupId>
<artifactId>rest-assured</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>3.141.59</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
</dependency>
<dependency>
<groupId>com.intuit.karate</groupId>
<artifactId>karate-junit4</artifactId>
<version>0.6.1</version>
</dependency>
<dependency>
<groupId>com.intuit.karate</groupId>
<artifactId>karate-apache</artifactId>
<version>0.6.1</version>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-java</artifactId>
<version>1.2.5</version>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>2.3.1</version>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.0</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.21.0</version>
<inherited>true</inherited>
<configuration>
<includes>
<include>SystemTest.java</include>
</includes>
</configuration>
</plugin>
</plugins>
<defaultGoal>install</defaultGoal>
</build>
<properties>
<java.version>1.8.0_171</java.version>
</properties>
</project>
I had the error:
Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin
and it was happening because Maven couldn't download all the dependencies:
Failed to collect dependencies at org.apache.maven.plugins:maven-compiler-plugin
For me the solution in Azure was to edit the Maven task and, in the Advanced section, deselect 'Authenticate built-in Maven feeds'.
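If the pipeline is defined in YAML rather than the classic editor, the same switch can be disabled on the Maven task; a minimal sketch, assuming the Maven@3 task and that mavenAuthenticateFeed is the input behind the 'Authenticate built-in Maven feeds' checkbox (verify the input name against the task reference for your version):

- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'verify'
    mavenAuthenticateFeed: false  # assumed input name for 'Authenticate built-in Maven feeds'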

Allure @Step annotation does not work with Groovy/Spock code

I'm using the Spock framework and Groovy for my tests. I'm also using allure-spock-1.0-adaptor for generating Allure reports. The reports look fine, but steps are not shown in the results. All Groovy methods are annotated with @Step but are still not logged in the report.
How do I fix that?
Check the FAQ. I suppose you need to add a javaagent; AspectJ is used to process steps, attachments and parameters.
Solved by adding AspectJ and allure-junit-adaptor to pom.xml. Now @Step is handled correctly with Spock tests.
See the example pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>allure.spock.demo</groupId>
<artifactId>allure-spock</artifactId>
<version>0.0.1-SNAPSHOT</version>
<properties>
<allure.version>1.4.23.HOTFIX1</allure.version>
<aspectj.version>1.8.9</aspectj.version>
<compiler.version>1.7</compiler.version>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.gmavenplus</groupId>
<artifactId>gmavenplus-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.19.1</version>
<configuration>
<argLine>
-javaagent:"${settings.localRepository}/org/aspectj/aspectjweaver/${aspectj.version}/aspectjweaver-${aspectj.version}.jar"
</argLine>
<includes>
<include>**/*Spec.*</include>
</includes>
</configuration>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>${aspectj.version}</version>
</dependency>
</dependencies>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.spockframework</groupId>
<artifactId>spock-core</artifactId>
<version>1.1-groovy-2.4-rc-3</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>ru.yandex.qatools.allure</groupId>
<artifactId>allure-junit-adaptor</artifactId>
<version>${allure.version}</version>
</dependency>
<dependency>
<groupId>org.codehaus.groovy</groupId>
<artifactId>groovy-all</artifactId>
<version>2.4.6</version>
</dependency>
<dependency>
<groupId>ru.yandex.qatools.allure</groupId>
<artifactId>allure-spock-1.0-adaptor</artifactId>
<version>1.0</version>
</dependency>
</dependencies>
<reporting>
<excludeDefaults>true</excludeDefaults>
<plugins>
<plugin>
<groupId>ru.yandex.qatools.allure</groupId>
<artifactId>allure-maven-plugin</artifactId>
</plugin>
</plugins>
</reporting>
</project>
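For reference, the @Step annotation discussed here comes from the Allure 1.x API (ru.yandex.qatools.allure.annotations.Step). Below is a minimal Java sketch of a step method, with made-up class and method names; the same annotation is applied to Groovy methods in the same way:

import ru.yandex.qatools.allure.annotations.Step;

public class LoginSteps {

    // With the AspectJ weaver attached via -javaagent (see the surefire argLine above),
    // calls to this method are intercepted and recorded as a step in the Allure report.
    @Step("Log in with the given user name")
    public void loginAs(String userName) {
        // ... test actions go here ...
    }
}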

Resources