Error: Invalid or corrupt jarfile occurred while trying to build recommendation engine of PredictionIO on a Linux machine

An error occurred while trying to build the recommendation engine using PredictionIO. Does anyone know how to solve this issue?
root@testing:~/PredictionIO/engines# pio build --verbose
[INFO] [Console$] Using command '/root/PredictionIO/sbt/sbt' at the current working directory to build.
[INFO] [Console$] If the path above is incorrect, this process will fail.
[INFO] [Console$] Uber JAR disabled. Making sure lib/pio-assembly-0.9.4.jar is absent.
[INFO] [Console$] Going to run: /root/PredictionIO/sbt/sbt package assemblyPackageDependency
[ERROR] [Console$] Error: Invalid or corrupt jarfile /root/PredictionIO/sbt/sbt-launch-0.13.7.jar
[ERROR] [Console$] Return code of previous step is 1. Aborting.

What helped for me was downloading this file:
https://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.7/sbt-launch.jar
Rename the downloaded file to sbt-launch-0.13.7.jar and replace the previous file in PredictionIO/sbt/.
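For example, the replacement could be done like this (a minimal sketch assuming the /root/PredictionIO paths from the log above; adjust if your install lives elsewhere):
# download a fresh copy of the sbt launcher
wget https://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.7/sbt-launch.jar
# rename it and put it where pio expects it, replacing the corrupt file
mv sbt-launch.jar /root/PredictionIO/sbt/sbt-launch-0.13.7.jar
# rebuild the engine
pio build --verbose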

Related

Azure Databricks - Library Installation Fails

I am running a job from a jar file in Azure Databricks. This jar has a dependency on azure-storage-file-share. Previously, there was no issue installing this dependency using Maven within the Databricks UI. Now, I get a failure with this error message:
Run result unavailable: job failed with error message Library installation failed for library due to user error for maven { coordinates: "com.azure:azure-storage-file-share:12.16.2" } Error messages: Library installation attempted on the driver node of cluster 0131-144423-r1g81134 and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: java.util.concurrent.ExecutionException: java.io.FileNotFoundException: File file:/local_disk0/tmp/clusterWideResolutionDir/maven/ivy/jars/io.netty_netty-transport-native-kqueue-4.1.86.Final.jar does not exist
To try to work around this, I manually installed this netty library (and several others) as well. I can see in the logs that it was able to download the jar successfully:
downloading https://maven-central.storage-download.googleapis.com/maven2/io/netty/netty-transport-native-kqueue/4.1.86.Final/netty-transport-native-kqueue-4.1.86.Final-osx-x86_64.jar ... [SUCCESSFUL ] io.netty#netty-transport-native-kqueue;4.1.86.Final!netty-transport-native-kqueue.jar (202ms)
However, it still fails. This error message appears in the same log, after the one above:
23/01/31 14:50:09 WARN LibraryState: [Thread 137] Failed to install library maven;azure-storage-file-share;com.azure;12.16.2;; java.util.concurrent.ExecutionException: java.io.FileNotFoundException: File file:/local_disk0/tmp/clusterWideResolutionDir/maven/ivy/jars/io.netty_netty-transport-native-kqueue-4.1.86.Final.jar does not exist
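One possible workaround (untested here, an assumption rather than a confirmed fix) is to exclude the macOS-only kqueue artifact when installing the library, since the Databricks driver is a Linux node and should only need the Linux transport. The Libraries API accepts an exclusions list for Maven coordinates; the workspace URL, token, and cluster ID below are placeholders:
curl -X POST "https://<workspace-url>/api/2.0/libraries/install" \
  -H "Authorization: Bearer <personal-access-token>" \
  -d '{
        "cluster_id": "<cluster-id>",
        "libraries": [{
          "maven": {
            "coordinates": "com.azure:azure-storage-file-share:12.16.2",
            "exclusions": ["io.netty:netty-transport-native-kqueue"]
          }
        }]
      }'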

SSL problems running azure-webapp-maven-plugin config

When running:
mvn com.microsoft.azure:azure-webapp-maven-plugin:1.14.0:config
I end up with
[ERROR] Failed to execute goal com.microsoft.azure:azure-webapp-maven-plugin:1.14.0:config (default-cli) on project storingen-api: Max retries 0 times exceeded. Error Details: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal com.microsoft.azure:azure-webapp-maven-plugin:1.14.0:config (default-cli) on project storingen-api: Max retries 0 times exceeded. Error Details: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Clearly this is a problem with a certificate that cannot be found. However, I have no clue which certificate is needed nor which keystore it needs to go into. I suppose it belongs in the cacerts of my Java distribution, but that is not clear at all.
Does anyone know which cert is needed and where to put it?
I just ran into the same problem while following Azure Learn Java Azure.
Basically, this error occurs because Java is trying to connect to your Azure subscription in the Azure console. I found the origin of the problem by executing the Maven command with -X; the log trace can then be inspected, like this:
mvn com.microsoft.azure:azure-webapp-maven-plugin:1.12.0:config -X
What I did to solve it:
I tried inserting the Azure certificates into the Java cacerts keystore, following the explanation here, but it did not change or solve the problem.
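For reference, importing a certificate into the JDK's default truststore generally looks like the command below; the certificate file and alias are placeholders, the default store password is usually changeit, and on Java 8 the keystore sits under $JAVA_HOME/jre/lib/security/cacerts instead:
# import a trusted certificate into the JDK cacerts keystore (placeholder file and alias)
keytool -importcert -trustcacerts -alias azure-endpoint -file azure-cert.pem \
  -keystore "$JAVA_HOME/lib/security/cacerts" -storepass changeit -noprompt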
I looked for a newer version of azure-webapp-maven-plugin. At the time of writing, the latest version is 2.3.0; you can check the Maven repository for the latest version. Upgrading fixed my problem, and the error from Java is no longer thrown.
before: mvn com.microsoft.azure:azure-webapp-maven-plugin:1.12.0:config
after (that version fixes the problem for me): mvn com.microsoft.azure:azure-webapp-maven-plugin:2.3.0:config
Thanks

JitPack build failed with ERROR: Time-out getting container status

I am trying to deploy some of my jar libraries through JitPack. So far I am still testing things out, so the version codes for the libraries are dev-SNAPSHOT or master-SNAPSHOT.
For most libraries this seems to work well (at least as far as fetching the artifacts goes); however, one library had a failed build for master-SNAPSHOT.
The corresponding build, master-36ef0715cd-1, reports failure. The last lines of the log read:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18.995 s
[INFO] Finished at: 2021-01-23T06:52:54Z
[INFO] ------------------------------------------------------------------------
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8 -Dhttps.protocols=TLSv1.2
Found module: org.traffxml:traff-source-android:0.0.1-SNAPSHOT
Build tool exit code: 0
Looking for artifacts...
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8 -Dhttps.protocols=TLSv1.2
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8 -Dhttps.protocols=TLSv1.2
Looking for pom.xml in build directory and ~/.m2
Found artifact: org.traffxml:traff-source-android:0.0.1-SNAPSHOT
ERROR: Time-out getting container status
Error building
So, apparently, the jar got built but fetching the container status failed after that. This could well be an issue with JitPack's infrastructure: over the last couple of hours, it has taken several retries to get the artifacts to build, and I do not see any difference between this library and the others that built successfully.
How can I retry the failed build, or otherwise fix this (other than by going the crude way of pushing a new commit to my repo)?
Eventually I had to commit more changes, which solved this for the moment.
In the long run, one thing to try:
Go to https://jitpack.io/
In the box, enter your group and artifact ID, and click Look up.
In the list of available builds, click the Branches tab.
Look for the corresponding -SNAPSHOT build for your branch, and click Get it.
No guarantee this works, but it did trigger a new build after pushing a new commit. Feedback appreciated.
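If you prefer to stay on the command line, JitPack also builds on demand when a version's artifacts are requested, so fetching the snapshot POM can kick off a build; whether it retries an already failed version is not guaranteed. A rough sketch using the group and artifact from the log above:
# request the snapshot POM to make JitPack resolve (and, if needed, build) the version
curl -I "https://jitpack.io/org/traffxml/traff-source-android/master-SNAPSHOT/traff-source-android-master-SNAPSHOT.pom"
# the build log for a specific version is usually published next to its artifacts
curl "https://jitpack.io/org/traffxml/traff-source-android/master-36ef0715cd-1/build.log"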

jsreport failing on looking up extensions

So I recently started working on a Linux machine (Ubuntu 16.04) and followed the installation instructions here: http://jsreport.net/downloads/. When I run npm start --production, I get:
2016-09-13T16:51:57.134Z - info: Initializing jsreport in production mode using configuration file prod.config.json
2016-09-13T16:51:57.138Z - info: Setting process based strategy for rendering. Please visit http://jsreport.net/learn/configuration for information how to get more performance.
2016-09-13T16:51:57.146Z - info: Searching for available extensions in /home/ross/rifiniti/optimo_ui/
2016-09-13T16:51:57.152Z - info: Extensions location cache not found, crawling directories
2016-09-13T16:51:57.317Z - error: Error occured during reporter init Error: ENOENT, no such file or directory '/home/ross/rifiniti/optimo_ui/ember/tmp/tree_merger-tmp_dest_dir-2yGnPLMA.tmp/optimo-ui/config'
at Error (native)
at Object.fs.statSync (fs.js:801:18)
at /home/ross/rifiniti/optimo_ui/node_modules/jsreport/node_modules/jsreport-core/lib/util/util.js:51:21
at Array.forEach (native)
at Object.exports.walkSync (/home/ross/rifiniti/optimo_ui/node_modules/jsreport/node_modules/jsreport-core/lib/util/util.js:44:10)
at /home/ross/rifiniti/optimo_ui/node_modules/jsreport/node_modules/jsreport-core/lib/extensions/locationCache.js:50:20
From previous event:
at /home/ross/rifiniti/optimo_ui/node_modules/jsreport/node_modules/jsreport-core/lib/reporter.js:90:8
From previous event:
at Reporter.init (/home/ross/rifiniti/optimo_ui/node_modules/jsreport/node_modules/jsreport-core/lib/reporter.js:69:30)
at Reporter.reporter.init (/home/ross/rifiniti/optimo_ui/node_modules/jsreport/lib/extendInit.js:8:21)
at Object.<anonymous> (/home/ross/rifiniti/optimo_ui/server.js:1:85)
And I have no idea why. It was working fine on Mac, but now it is looking for this non-existent tmp file in ember. I'm at a loss; any help would be greatly appreciated!
SOLVED!
All I had to do was remove the tmp directory in ember and it worked.
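In other words, something like the following, using the path from the stack trace above:
# remove the stale Ember tmp directory that jsreport's extension crawler trips over
rm -rf /home/ross/rifiniti/optimo_ui/ember/tmp
# start jsreport again
npm start --production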

Scala syntax errors when building Spark

I cloned a fresh copy of the branch-2.0 branch of Spark from GitHub onto a CentOS 7 system. When executing the suggested command to build from source,
./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
I get the following errors:
[INFO] Total time: 5.368 s (Wall Clock)
[INFO] Finished at: 2016-08-18T11:56:49-05:00
[INFO] Final Memory: 71M/1963M
[INFO] ------------------------------------------------------------------------
[error] /home/rprechelt/ada/spark/common/tags/src/main/scala/org/apache/spark/annotation/Since.scala:30: error writing class Since: /home/rprechelt/ada/spark/common/tags/target/scala-2.11/classes/org/apache/spark/annotation/Since.class: /home/rprechelt/ada/spark/common/tags/target/scala-2.11/classes/org is not a directory
[error] private[spark] class Since(version: String) extends StaticAnnotation
[error] ^
[error] /home/rprechelt/ada/spark/common/tags/src/main/scala/org/apache/spark/annotation/package.scala:25: error writing package object annotation: /home/rprechelt/ada/spark/common/tags/target/scala-2.11/classes/org/apache/spark/annotation/package.class: /home/rprechelt/ada/spark/common/tags/target/scala-2.11/classes/org is not a directory
[error] package object annotation
[error] ^
[error] two errors found
[error] Compile failed at Aug 18, 2016 11:56:49 AM [0.358s]
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-sketch_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
I'm at a loss for what could be occurring here - I've tried building it with different versions of Scala, but they all report the same error.
Does anyone have any suggestions about how I might go about fixing this?
I got the same problem; it happens because the zinc server was started by some other user. Try killing zinc and then starting the build again.
It's basically an access error: a zinc server started by another user is trying to write to a different directory.
sudo ps -ef | grep zinc
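After identifying the zinc process with the grep above, a rough sequence is to stop it and rebuild; the PID is a placeholder for whatever the grep reports, and the bundled zinc launcher path is an assumption that may differ by Spark version:
# stop the zinc server owned by the other user; the build will start a fresh one
sudo kill <PID>
# alternatively, if Spark's bundled zinc launcher is present:
# ./build/zinc-0.3.9/bin/zinc -shutdown
./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn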
