How to test code dependent on environment variables using JUnit and Mockito?

I am trying to add the import org.junit.contrib.java.lang.system.EnvironmentVariables; but I get the error "The import org.junit.cont.... cannot be resolved". I added the dependency below:
<dependency>
    <groupId>com.github.stefanbirkner</groupId>
    <artifactId>system-rules</artifactId>
    <version>1.2.0</version>
    <scope>test</scope>
</dependency>
Is there any other way to set an environment variable using JUnit/Mockito?
Can I set the env variable using PowerMock?

Use a newer version, like 1.19.0.
I am not sure when the class was introduced, but it was later than 1.2.0.
Edit:
GitHub lists a change date of May 18, 2018, so it should be at least 1.18.0.
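For reference, here is a minimal sketch of how the rule is used once the dependency resolves, assuming JUnit 4 (the rule restores the original environment after each test, so PowerMock is not needed just to set variables):

import static org.junit.Assert.assertEquals;

import org.junit.Rule;
import org.junit.Test;
import org.junit.contrib.java.lang.system.EnvironmentVariables;

public class EnvironmentDependentTest {

    // Sets variables for the duration of a test and restores the environment afterwards.
    @Rule
    public final EnvironmentVariables environmentVariables = new EnvironmentVariables();

    @Test
    public void readsVariableSetByRule() {
        environmentVariables.set("MY_VAR", "some value");
        assertEquals("some value", System.getenv("MY_VAR"));
    }
}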

Related

java.lang.NoClassDefFoundError: org/apache/log4j/spi/Filter in SparkSubmit

I've been trying to submit applications to a Kubernetes cluster. I have followed the tutorial at https://spark.apache.org/docs/latest/running-on-kubernetes.html, such as building the Spark image and so on.
But whenever I try to run the spark-submit command, the pod always throws an error. These are the logs from the command kubectl logs <spark-driver-pods>:
Error: Unable to initialize main class org.apache.spark.deploy.SparkSubmit
Caused by: java.lang.NoClassDefFoundError: org/apache/log4j/spi/Filter
I have tried to use something like:
spark-submit
...
--jars $(echo /opt/homebrew/Caskroom/miniforge/base/lib/python3.9/site-packages/pyspark/jars/*.jar | tr ' ' ',')
...
But that still throws an error.
Some notes related to my development environment:
I use the Kubernetes built into Docker Desktop.
I use pyspark in a conda environment, and yes, I have activated the environment; that's why I can use pyspark in the terminal.
Anything else I should do, or anything I forgot to do?
I'm using Maven. I encountered this error while migrating from log4j 1.x to log4j 2.x and realized I still had some code that only worked with 1.x. Instead of refactoring that code, I added this dependency to my pom.xml to maintain compatibility:
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-1.2-api</artifactId>
    <version>2.17.1</version>
</dependency>
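To illustrate what the bridge buys you: legacy code written against the Log4j 1.x API keeps compiling and is routed to the Log4j 2 backend at runtime. A minimal sketch (the class name is illustrative):

// Legacy Log4j 1.x API; resolved via the log4j-1.2-api bridge.
import org.apache.log4j.Logger;

public class LegacyLogging {

    private static final Logger LOG = Logger.getLogger(LegacyLogging.class);

    public static void main(String[] args) {
        // Handled by the Log4j 2 backend even though it uses the 1.x API.
        LOG.info("Hello from the 1.x API");
    }
}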
The sbt line below worked for me:
libraryDependencies += "log4j" % "log4j" % "1.2.17"

Test Runner file - Hover over error does not prompt user to import api

I created a new Maven project in Eclipse and then created a new "Feature file", "Step Definition" & "Test Runner". However, I see quite a few error messages when I hover over the different annotations in the Test Runner class; see {1 to 5} below.
I did a lot of research on the internet and added a few dependencies to the pom.xml file, but nothing seems to work.
I currently have the following dependencies in my pom.xml file:
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-java -->
<dependency>
    <groupId>io.cucumber</groupId>
    <artifactId>cucumber-java</artifactId>
    <version>6.8.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.cucumber/cucumber-junit -->
<dependency>
    <groupId>io.cucumber</groupId>
    <artifactId>cucumber-junit</artifactId>
    <version>6.8.2</version>
    <scope>test</scope>
</dependency>
When I hover my mouse over Cucumber I get the following message:
Cucumber cannot be resolved to a variable
When I hover my mouse over @RunWith I get the following:
RunWith cannot be resolved to a type
When I hover my mouse over @CucumberOptions I get the following:
CucumberOptions cannot be resolved to a type
When I hover my mouse over import cucumber.api.junit.Cucumber; I get the following:
The import cucumber.api.junit cannot be resolved
When I hover my mouse over import cucumber.api.CucumberOptions; I get the following:
The import cucumber.api.CucumberOptions cannot be resolved
It seems like it's got something to do with the JRE. Here are the solutions I have tried, based on responses I could find on Stack Overflow:
A] Some solutions on the internet suggested that I add the following dependency to the pom.xml file, but it did not resolve the issue:
<dependency>
    <groupId>info.cukes</groupId>
    <artifactId>cucumber-junit</artifactId>
    <version>1.2.6</version>
    <type>pom</type>
    <scope>test</scope>
</dependency>
NOTE: Even after adding these to the pom.xml file, I do not see them under the Maven Dependencies list.
B] Added import cucumber.junit.Cucumber; but even this did not help.
C] Commented out @RunWith(Cucumber.class) and added the following dependency to the pom.xml file:
<dependency>
    <groupId>info.cukes</groupId>
    <artifactId>cucumber-junit</artifactId>
    <version>1.2.5</version>
    <type>pom</type>
    <scope>test</scope>
</dependency>
NOTE: The error message against import cucumber.api.CucumberOptions; disappeared, but the error message against import cucumber.api.junit.Cucumber; still shows.
Regards,
Rohit
I was able to resolve this issue.
Solution: Verified that the Eclipse environment was using JUnit 4.
Steps:
Clicked on Project --> Properties --> Java Build Path --> Libraries
Then clicked on Add Library
On the "Select the library type to add" window, selected JUnit, then clicked Next and Finish
JUnit showed up on the Java Build Path window
Clicked Apply and Close.
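Also worth noting: with cucumber-junit 6.8.2 the annotations live in io.cucumber.junit, not the old cucumber.api packages, so the imports from the question will never resolve against that version. A minimal runner sketch, assuming JUnit 4 on the classpath (the feature path and glue package below are placeholders):

import org.junit.runner.RunWith;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;

// Runs every feature file under the features directory with the step
// definitions found in the glue package.
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/test/resources/features", // placeholder feature directory
        glue = "com.example.stepdefs")            // placeholder step-definition package
public class TestRunner {
}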

How can I implement a maven dependency into my nodejs code?

My backend server is implemented in Node.js.
Now I need to add a new feature, and according to the related document a Maven dependency is required.
The document just says:
//Add dependency
<dependency>
    <groupId>com.COMPANY.A</groupId>
    <artifactId>SOME-NAME</artifactId>
    <version>1.0.0</version>
</dependency>
But my code is written in Node.js, so I wonder how I can bring this dependency setting into my project.
Any comment or link that I can refer to is appreciated.
AFAIK there is no way to import a Maven dependency into Node.js code; Node.js cannot run Java code.
If your dependency has a main class, you can download the jar from JCenter and run it as follows:
// Spawn a JVM synchronously and capture the jar's stdout.
const execSync = require('child_process').execSync;
const output = execSync('java -cp ./SOME-NAME-1.0.0.jar com.foo.MyMainClass');
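For completeness, a hypothetical sketch of what that main class could look like on the Java side (only the name com.foo.MyMainClass comes from the snippet above; the body is invented for illustration). Whatever it prints to stdout is what execSync returns to the Node.js process:

package com.foo;

// Hypothetical entry point inside SOME-NAME-1.0.0.jar; anything written to
// stdout becomes the `output` buffer captured by execSync on the Node.js side.
public class MyMainClass {
    public static void main(String[] args) {
        System.out.println("result the Node.js process can read");
    }
}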

OpenAPI java generated client Class not found error

I have an issue with the OpenAPI generator for a Java client.
I am using the Node.js OpenAPI generator to generate a Java API client with:
npx openapi-generator generate -i .\swagger.yaml -g java -o ./output -c config.yaml
Then I use mvn package to package the output and integrate it into my spring-boot application.
My problem appears every time I try to create an instance of my API. I get this error:
Caused by: java.lang.NoClassDefFoundError: io/gsonfire/GsonFireBuilder
I checked the generated pom and I can see that the dependencies are there:
<dependency>
    <groupId>com.google.code.gson</groupId>
    <artifactId>gson</artifactId>
    <version>${gson-version}</version>
</dependency>
<dependency>
    <groupId>io.gsonfire</groupId>
    <artifactId>gson-fire</artifactId>
    <version>${gson-fire-version}</version>
</dependency>
Thanks

Run spark program locally with intellij

I tried to run a simple piece of test code in IntelliJ IDEA. Here is my code:
import org.apache.spark.sql.functions._
import org.apache.spark.SparkConf
import org.apache.spark.sql.{DataFrame, SparkSession}

object hbasetest {
  val spconf = new SparkConf()
  val spark = SparkSession.builder().master("local").config(spconf).getOrCreate()
  import spark.implicits._

  def main(args: Array[String]): Unit = {
    val df = spark.read.parquet("file:///Users/cy/Documents/temp")
    df.show()
    spark.close()
  }
}
My dependencies list:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.1.0</version>
    <!--<scope>provided</scope>-->
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
    <!--<scope>provided</scope>-->
</dependency>
When I click the Run button, it throws an exception:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapreduce.TaskID.<init>(Lorg/apache/hadoop/mapreduce/JobID;Lorg/apache/hadoop/mapreduce/TaskType;I)V
I checked this post, but the situation didn't change after making the modifications. Can I get some help with running a local Spark application in IDEA? Thanks.
Update: I can run this code with spark-submit. I would like to run it directly with the Run button in IDEA.
Are you using the Cloudera sandbox to run this application? In your pom.xml I can see the CDH dependency '2.6.0-mr1-cdh5.5.0'.
If you are using Cloudera, please use the dependencies below for your Spark Scala project, because the 'spark-core_2.10' artifact version changes.
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.10.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.0.0-cdh5.1.0</version>
    </dependency>
</dependencies>
I used the reference below to run my Spark application.
Reference: http://blog.cloudera.com/blog/2014/04/how-to-run-a-simple-apache-spark-app-in-cdh-5/
Here are the settings I use for Run/Debug configuration in IntelliJ:
Main class:
org.apache.spark.deploy.SparkSubmit
VM Options:
-cp <spark_dir>/conf/:<spark_dir>/jars/* -Xmx6g
Program arguments:
--master
local[*]
--conf
spark.driver.memory=6G
--class
com.company.MyAppMainClass
--num-executors
8
--executor-memory
6G
<project_dir>/target/scala-2.11/my-spark-app.jar
<my_spark_app_args_if_any>
The spark-core and spark-sql jars are referenced in my build.sbt as "provided" dependencies, and their versions must match the Spark installation in <spark_dir>. I use Spark 2.0.2 at the moment, with hadoop-aws jar version 2.7.2.
It may be late for a reply, but I just had the same issue. You can run with spark-submit, so you probably already have the related dependencies. My solution is:
Change the related dependencies in the IntelliJ module settings for your project from "provided" to "compile". You may only need to change some of them, but you have to experiment; the brute-force solution is to change all of them.
If you get a further exception after this step, such as some dependencies being "too old", change the order of the related dependencies in the module settings.
I ran into this issue as well, and I also had an old Cloudera Hadoop reference in my code. (You have to click the 'edited' link in the original poster's link to see his original pom settings.)
I could leave that reference in as long as I put this at the top of my dependencies (order matters!). You should match it against your own Hadoop cluster settings:
<dependency>
    <!-- THIS IS REQUIRED FOR LOCAL RUNNING IN INTELLIJ -->
    <!-- IT MUST REMAIN AT TOP OF DEPENDENCY LIST TO 'WIN' AGAINST OLD HADOOP CODE BROUGHT IN -->
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.6.0-cdh5.12.0</version>
    <scope>provided</scope>
</dependency>
Note that in IntelliJ 2018.1 you can check "Include dependencies with 'Provided' scope", which is a simple way to keep your pom scopes clean.
