BeforeStep and AfterStep aren't called - Cucumber

I created a hook class that uses @Before, @After, @BeforeStep, and @AfterStep.
The dependency I set in my pom is as below:
<dependency>
    <groupId>io.cucumber</groupId>
    <artifactId>cucumber-java</artifactId>
    <version>4.2.0</version>
</dependency>
With this setting, only @Before and @After work; @BeforeStep and @AfterStep are never called. How do I fix it?
If I change cucumber-java to the latest version, 6.9.1, the following imports become invalid:
import cucumber.api.java.AfterStep;
import cucumber.api.java.BeforeStep;
Which package should I import these from instead?
Is anyone able to help me fix this?

Try with the io.cucumber.java package.
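For example, a minimal hook class with cucumber-java 6.9.1 could look like the sketch below (the Hooks class name and the printed messages are just illustrations; in 6.x the hook annotations and the Scenario type live under io.cucumber.java):
import io.cucumber.java.After;
import io.cucumber.java.AfterStep;
import io.cucumber.java.Before;
import io.cucumber.java.BeforeStep;
import io.cucumber.java.Scenario;

public class Hooks {

    @Before
    public void beforeScenario(Scenario scenario) {
        // Runs once before each scenario
        System.out.println("Starting scenario: " + scenario.getName());
    }

    @BeforeStep
    public void beforeStep() {
        // Runs before every step of the scenario
        System.out.println("Before step");
    }

    @AfterStep
    public void afterStep() {
        // Runs after every step of the scenario
        System.out.println("After step");
    }

    @After
    public void afterScenario(Scenario scenario) {
        // Runs once after each scenario
        System.out.println("Finished scenario: " + scenario.getName());
    }
}
It is also worth checking that all io.cucumber artifacts in the pom (for example cucumber-java and cucumber-junit) are on the same version.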

Related

IntelliJ: Running a Python script errors - Fatal Python error: Py_Initialize: unable to load the file system codec

This is for IntelliJ on Windows.
I have added the org.codehaus.mojo plugin in my Maven pom file to set the PYTHONPATH environment variable.
I installed the Python plugin.
I added the Python interpreter to the project.
And when I run a simple Python program with the 2 lines below, it errors:
#!/usr/bin/python
print("testing")
Errors:
C:\Users\name\AppData\Local\Programs\Python\Python36\python.exe C:/Users/name/IdeaProjects/projectName/src/main/resources/pythonlib/test.py
Fatal Python error: Py_Initialize: unable to load the file system codec
Traceback (most recent call last):
File "<frozen importlib._bootstrap_external>", line 1096, in _path_importer_cache
KeyError: 'C:\\Users\\name\\.m2\\repository\\info\\cukes\\cucumber-java\\1.2.2\\cucumber-java-1.2.2.jar'
I have set the PYTHONHOME and PYTHONPATH environment variables and added them to the path.
This is a Maven project, and the Python file was created under src/main/resources/pythonlib.
Any help, please?
Please try this plugin configuration:
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>exec-maven-plugin</artifactId>
            <executions>
                <execution>
                    <id>python-build</id>
                    <phase>prepare-package</phase>
                    <goals>
                        <goal>exec</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <executable>C:\Users\name\AppData\Local\Programs\Python\Python36\python.exe</executable>
                <workingDirectory>src/main/resources/pythonlib/</workingDirectory>
                <arguments>
                    <argument>test.py</argument>
                </arguments>
            </configuration>
        </plugin>
    </plugins>
</build>

spark-submit fails to detect the modules installed with pip

I have Python code which has the following 3rd-party dependencies:
import boto3
from warcio.archiveiterator import ArchiveIterator
from warcio.recordloader import ArchiveLoadFailed
import requests
import botocore
from requests_file import FileAdapter
....
I have installed the dependencies using pip, and made sure they were correctly installed with the command pip list. Then, when I tried to submit the job to Spark, I received the following errors:
ImportError: No module named 'boto3'
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:193)
at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:234)
at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:152)
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.api.python.PairwiseRDD.compute(PythonRDD.scala:395)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
The "no module named" problem happens not only with 'boto3' but also with other modules.
I tried the following things:
Added SparkContext.addPyFile(".zip files")
Used spark-submit --py-files
Reinstalled pip
Made sure the path environment variable was set (export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH) and installed py4j (pip install py4j)
Used python instead of spark-submit
Software information:
Python version: 3.4.3
Spark version: 2.2.0
Running on EMR-AWS: Linux version 2017.09
Before doing spark-submit, go to the Python shell and try importing the modules.
Also check which Python shell (check the Python path) opens up by default.
If you are able to successfully import these modules in the Python shell (the same Python version you are trying to use with spark-submit), please check the following:
In which mode are you submitting the application? Try standalone, or if on YARN, try client mode.
Also try adding export PYSPARK_PYTHON=(your python path)
All of the checks mentioned above were OK, but setting PYSPARK_PYTHON solved the issue for me.

Not able to create an object of SparkBundleContext in MLeap

I have imported the required packages. I am even able to import SparkBundleContext:
import org.apache.spark.ml.bundle.SparkBundleContext
But then when I do
val sbc = SparkBundleContext()
I get this error:
java.lang.NoClassDefFoundError: org/apache/spark/ml/clustering/GaussianMixtureModel
If you are using Maven, add the Apache Spark ML dependency (spark-mllib, which is the module that contains org.apache.spark.ml.clustering) as:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>2.1.1</version>
</dependency>
If you are using SBT, then add the dependency as:
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.1"
Use the right version of the dependency so that it matches your Scala version.
Hope this helps!

Tensorboard - pkg_resources has no attribute 'declare_namespace'

I am trying to run TensorBoard but I am getting the following import error:
AttributeError: module 'pkg_resources' has no attribute 'declare_namespace'
I have tried reinstalling setuptools and distribute. This is for Python 3.5.
This stack trace goes into the Protobuf codebase, so this doesn't look like a bug in TensorFlow. It looks like something is wrong with the setuptools on your system. Maybe try reinstalling it? You can also try installing TensorFlow inside a virtualenv.
See also: https://github.com/tensorflow/tensorflow/issues/6863

Using Apache POI with Ant in Eclipse

I have installed "EE developers" software and also "XML editors and tools" on my eclipse and I want to use apache poi to read some data from excel files but when eclipse compiles my code, it gives me an error.
In fact it cannot support the following imports. why do I get this problem?
import org.apache.poi.hssf.usermodel.HSSFCell;
import org.apache.poi.hssf.usermodel.HSSFRow;
import org.apache.poi.hssf.usermodel.HSSFSheet;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;
Cheers
The Ant tutorial explains how external libraries are managed by Ant:
<project name="HelloWorld" basedir="." default="main">
    ...
    <property name="lib.dir" value="lib"/>
    <path id="classpath">
        <fileset dir="${lib.dir}" includes="**/*.jar"/>
    </path>
    ...
    <target name="compile">
        <mkdir dir="${classes.dir}"/>
        <javac srcdir="${src.dir}" destdir="${classes.dir}" classpathref="classpath"/>
    </target>
    ...
</project>
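Once the POI jars sit in ${lib.dir} and are on that classpath, a minimal read of the first cell could look like the sketch below (the file name data.xls, the class name ExcelReader, and the cell position are assumptions for illustration):
import java.io.FileInputStream;
import java.io.IOException;

import org.apache.poi.hssf.usermodel.HSSFCell;
import org.apache.poi.hssf.usermodel.HSSFRow;
import org.apache.poi.hssf.usermodel.HSSFSheet;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;

public class ExcelReader {
    public static void main(String[] args) throws IOException {
        // HSSF is the POI component for the binary .xls format
        FileInputStream in = new FileInputStream("data.xls");
        HSSFWorkbook workbook = new HSSFWorkbook(in);
        HSSFSheet sheet = workbook.getSheetAt(0);  // first sheet
        HSSFRow row = sheet.getRow(0);             // first row
        HSSFCell cell = row.getCell(0);            // first cell
        System.out.println(cell.toString());
        in.close();
    }
}
If the imports still cannot be resolved inside Eclipse itself, the same jars from lib/ also need to be added to the project's Java Build Path, because Eclipse does not read the Ant classpath automatically.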
