I am working with DeepLearning4J, using the 1.0.0-beta7 release. I am getting two errors at run time:
jnind4jcpu.dll unsupported jni version 0xffffffff
no nd4jcpu in java.library.path
I set up a path to a folder where I have a few other DLLs for this effort. I am using Java JVM 1.8.
So what version of the JVM should I use to fix the first error, and where in the DL4J Maven project can I find the missing library for the second? I tried the uber jar for nd4j and still get the same errors.
Thanks for any help!
Your issue doesn't have anything to do with the java version. Make sure you're not mixing versions of dl4j.
You don't really need to dig into the internals or deal with any of the manual workarounds that you normally see in JNI-based libraries.
All you need to do is include nd4j-native-platform in your classpath:
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-native-platform</artifactId>
    <version>1.0.0-beta7</version>
</dependency>
Nd4j/dl4j is based on javacpp and takes care of all of that for you.
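As a quick sanity check (a minimal sketch; the class name and array shape below are my own examples, not from the DL4J docs), creating an INDArray forces the native CPU backend to load, so any remaining classpath or native-library problem shows up immediately:

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class Nd4jSmokeTest {
    public static void main(String[] args) {
        // Creating an array forces the native CPU backend to load;
        // if the classpath is correct this prints a 2x2 matrix instead of a JNI error.
        INDArray ones = Nd4j.ones(2, 2);
        System.out.println(ones);
    }
}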
To give you even more targeted advice, I would have to know more about your environment (ideally reproducible on GitHub).
I am trying to integrate BDD using Cucumber, but I am really confused about the difference between the io.cucumber and info.cukes libraries, and which one to use when.
I tried to read and understand the GitHub README.md file but still can't make heads or tails of it.
Further, I am not sure what cucumber-jvm is. Why do we need cucumber-junit (can't the standalone JUnit library suffice)?
Thanks in advance. Any help is much appreciated.
Refer to the release notes for more details: https://github.com/cucumber/cucumber-jvm/blob/master/CHANGELOG.md.
There have been substantial changes in Cucumber 2. Refer to this for more: https://cucumber.io/blog/2017/08/29/announcing-cucumber-jvm-2-0-0
io.cucumber and info.cukes are Maven group ids. info.cukes was used for Cucumber versions up to 1.2.5. The later versions, starting from 2.0.0, are published under io.cucumber. There is also a new version 3 with more goodies on the master branch in GitHub, as mentioned in the release notes.
The group id was changed because Gherkin changed its group id in the same way.
cucumber-jvm is the Java implementation of the Cucumber framework. There are implementations in many other languages: https://github.com/cucumber.
When you put @RunWith(Cucumber.class) on top of the test class, a specialized runner is used which will execute the feature files. The default JUnit runner will not get you anywhere, though it might cough up some exceptions.
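For illustration, here is a minimal sketch of such a runner class, assuming the cucumber-junit 2.x packages (cucumber.api.junit.Cucumber); the class name, feature path, and glue package are my own examples, and the package names moved again in later Cucumber versions:

import org.junit.runner.RunWith;

import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;

// Hypothetical runner: Cucumber, not the default JUnit runner, discovers and executes the feature files.
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/test/resources/features", // where the .feature files live
        glue = "com.example.steps")               // package containing the step definitions
public class RunCucumberTest {
    // Intentionally empty; the annotations carry all the configuration.
}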
Is there any reference for which versions of the AWS Java SDK, Hadoop, the hadoop-aws bundle, Hive, and Spark are compatible with one another?
For example, I know Spark is not compatible with Hive versions above 2.1.1.
You cannot drop in a later version of the AWS SDK than the one hadoop-aws was built with and expect the s3a connector to work. Ever. That is now written down quite clearly in the S3A troubleshooting docs.
Whatever problem you have, changing the AWS SDK version will not fix things, only change the stack traces you see.
This may seem frustrating, given the rate at which the AWS team push out a new SDK, but you have to understand that (a) the API often changes incompatibly between versions (as you have seen), and (b) every release introduces/moves bugs which end up causing problems.
Here is the 3.x timeline of things which broke on updates of the AWS SDK.
Move to 1.11.86 and some tests hang under load.
Fix: move to 1.11.134, leading to logs full of AWS telling us off for deliberately calling abort() on a read.
Fix: move to 1.11.199, leading to logs full of stack traces.
Fix: move to 1.11.271, and the shaded JAR pulls in netty unshaded.
Every upgrade of the AWS SDK JAR causes a problem somewhere. Sometimes it is an edit to the code and a recompile; most commonly it is logs filling up with false-alarm messages, dependency problems, threading quirks, etc. Things which can take time to surface.
What you get with a Hadoop release is not just an aws-sdk JAR which it was compiled against; you get a hadoop-aws JAR which contains the workarounds and fixes for whatever problems that release has introduced, identified in the minimum of 4 weeks of testing before the Hadoop release ships.
Which is why, no, you shouldn't be changing JARs unless you plan to do a complete end-to-end retest of the s3a client code, including load tests. You are encouraged to do that; the Hadoop project always welcomes more testing of our pre-release code, with the Hadoop 3.1 binaries ready to play with. But trying to do it yourself by changing JARs? Sadly, an isolated exercise in pain.
The Hadoop documentation states that adding the hadoop-aws JAR to the build dependencies will pull in a compatible aws-sdk JAR.
So I created a dummy Maven project with these dependencies to download the compatible versions:
<properties>
    <!-- Your exact Hadoop version here -->
    <hadoop.version>3.3.1</hadoop.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-aws</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>
Then I checked the resolved dependency versions, used them in my project, and it worked.
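To double-check that the resolved hadoop-aws and aws-sdk JARs actually agree, a small smoke test along these lines can help (a sketch only; the bucket name and credential settings are placeholders I made up):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class S3ASmokeTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Credentials can also come from environment variables or an instance profile.
        conf.set("fs.s3a.access.key", System.getenv("AWS_ACCESS_KEY_ID"));
        conf.set("fs.s3a.secret.key", System.getenv("AWS_SECRET_ACCESS_KEY"));

        // If this succeeds, the s3a connector loaded with the SDK classes it expects.
        FileSystem fs = FileSystem.get(URI.create("s3a://my-example-bucket/"), conf);
        for (FileStatus status : fs.listStatus(new Path("s3a://my-example-bucket/"))) {
            System.out.println(status.getPath());
        }
    }
}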
My project is using log4j2, and everything looks fine until running an application that uses a third party library that uses log4j 1.x. When our application starts, we get an annoying stack trace involving a ClassNotFoundException on org.apache.log4j.ConsoleAppender. I noticed that one of our dependencies has a log4j.properties inside its jar referencing org.apache.log4j.ConsoleAppender, so I'm guessing that's the reason for the stack trace. A couple other dependencies causing this error include most anything using JBoss logging classes, like embedded glassfish and the eclipse persistence packages.
I tried adding the log4j 1.2 bridge API jar to the classpath and it had no effect.
It seems a little ridiculous to include both the jars for log4j2 and log4j1.x in our application classpath. Is there any other alternative or fix?
These links provided answers for me:
https://issues.apache.org/jira/browse/LOG4J2-172
https://issues.jboss.org/browse/JBLOGGING-95
It looks like we are using an outdated version of JBoss logging that doesn't support log4j2. However, I'm still not sure what to do for the one dependency that includes a log4j.properties.
[edit] It turns out adding the log4j-jcl 2.0 jar worked for that dependency.
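For context, here is a minimal sketch of how the log4j 1.x bridge (log4j-1.2-api) is intended to behave once it is picked up; the class name below is my own example, not from the project in question. Code compiled against the old org.apache.log4j API keeps working, with the calls routed into log4j2:

// Legacy code written against the log4j 1.x API.
import org.apache.log4j.Logger;

public class LegacyComponent {
    private static final Logger LOG = Logger.getLogger(LegacyComponent.class);

    public void doWork() {
        // With log4j-1.2-api on the classpath, this call is routed to log4j2,
        // so no log4j 1.x jar is needed at runtime.
        LOG.info("running legacy component");
    }
}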
When I clean my project I get the following error:
[2011-10-05 13:47:53 - The Basics] Dx
trouble processing "java/nio/CharBuffer.class":
Ill-advised or mistaken usage of a core class (java.* or javax.*)
when not building a core library.
This is often due to inadvertently including a core library file
in your application's project, when using an IDE (such as
Eclipse). If you are sure you're not intentionally defining a
core class, then this is the most likely explanation of what's
going on.
However, you might actually be trying to define a class in a core
namespace, the source of which you may have taken, for example,
from a non-Android virtual machine project. This will most
assuredly not work. At a minimum, it jeopardizes the
compatibility of your app with future versions of the platform.
It is also often of questionable legality.
If you really intend to build a core library -- which is only
appropriate as part of creating a full virtual machine
distribution, as opposed to compiling an application -- then use
the "--core-library" option to suppress this error message.
If you go ahead and use "--core-library" but are in fact
building an application, then be forewarned that your application
will still fail to build or run, at some point. Please be
prepared for angry customers who find, for example, that your
application ceases to function once they upgrade their operating
system. You will be to blame for this problem.
If you are legitimately using some code that happens to be in a
core package, then the easiest safe alternative you have is to
repackage that code. That is, move the classes in question into
your own package namespace. This means that they will never be in
conflict with core system classes. JarJar is a tool that may help
you in this endeavor. If you find that you cannot do this, then
that is an indication that the path you are on will ultimately
lead to pain, suffering, grief, and lamentation.
[2011-10-05 13:47:53 - The Basics] Dx 1 error; aborting
[2011-10-05 13:47:53 - The Basics] Conversion to Dalvik format failed with error 1
I had this problem. I use Maven to build my Android projects. My problem was caused by one of my dependencies depending on the Android jars. I updated my POM to exclude android from that dependency and that solved it for me:
<dependency>
    <groupId>org.reassembler</groupId>
    <artifactId>synth-android</artifactId>
    <version>2.5.8</version>
    <exclusions>
        <exclusion>
            <artifactId>junit</artifactId>
            <groupId>junit</groupId>
        </exclusion>
        <exclusion>
            <artifactId>android</artifactId>
            <groupId>android</groupId>
        </exclusion>
    </exclusions>
</dependency>
Hope this helps someone, it took me a while to figure out what was going on.
For the benefit of anyone who may have stumbled upon this, this problem can be caused by inclusion of an older library such as android.jar. Removing the .jar file from your build path will allow you to compile. Otherwise, you can use the "jarjar" tool mentioned in the error message to repackage the offending classes into your own namespace.
Your IDE is misconfigured. Make sure that your scripts or IDE isn't passing rt.jar or android.jar to dx.
In Android Studio, I had included java and javax jar files in my libs folder, and there were dependencies related to them in build.gradle at the app level. I commented them out:
//compile files('libs/java-rt-jar-stubs-1.5.0.jar')
//compile 'javax.annotation:jsr250-api:1.0'
//compile files('libs/javax.annotation.jar')
Then I went to the Project view and deleted the jar files. There were some legacy references to java.awt.geom which I had to purge, but after rebuilding the project I was fine.
I have been developing a small project meant to run under WebLogic 8.1.
According to its documentation, RichFaces supports WebLogic 8.1.
WebLogic 8.1 uses Servlet specification 2.3 with JSP 1.2.
This has been working on my locally installed version of WebLogic 8.1, but when deploying to the SPARC server I started running into trouble. I have worked through some of the initial headaches, but then I got an error 500 and couldn't get the details. After some effort I have come out with this...
javax.servlet.ServletException: javax/servlet/jsp/JspContext
at weblogic.servlet.internal.RequestDispatcherImpl.forward(RequestDispatcherImpl.java:344)
at com.sun.faces.context.ExternalContextImpl.dispatch(ExternalContextImpl.java:346)
at com.sun.faces.application.ViewHandlerImpl.renderView(ViewHandlerImpl.java:152)
at org.ajax4jsf.application.ViewHandlerWrapper.renderView(ViewHandlerWrapper.java:108)
at org.ajax4jsf.application.AjaxViewHandler.renderView(AjaxViewHandler.java:216)
at com.sun.faces.lifecycle.RenderResponsePhase.execute(RenderResponsePhase.java:107)
at com.sun.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:245)
at com.sun.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:137)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:214)
at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run(ServletStubImpl.java:1072)
at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:465)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:28)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:27)
at org.ajax4jsf.webapp.BaseXMLFilter.doXmlFilter(BaseXMLFilter.java:141)
at org.ajax4jsf.webapp.BaseFilter.doFilter(BaseFilter.java:281)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:27)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:6987)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppServletContext.java:3892)
at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestImpl.java:2766)
at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:224)
at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:183)
JspContext is not part of JSP 1.2; it's newer. I am thinking this exception originally comes from a ClassNotFoundException or something similar, considering the message. There is no 'cause' attached to the exception.
Following are the jar files contained in my web-application.
antlr-2.7.6.jar
asm-1.5.3.jar
asm-attrs-1.5.3.jar
cglib-2.1_3.jar
commons-beanutils-1.6.jar
commons-collections-3.2.jar
commons-digester-1.5.jar
commons-lang-2.4.jar
commons-logging-1.0.3.jar
dom4j-1.6.1.jar
ehcache-1.2.3.jar
hibernate-3.2.4.sp1.jar
jsf-api-1.1_02.jar
jsf-impl-1.1_02.jar
jstl-1.0.jar
jta-1.1.jar
log4j-1.2.15.jar
richfaces-api-3.1.6.SR1.jar
richfaces-impl-3.1.6.SR1.jar
richfaces-ui-3.1.6.SR1.jar
xercesImpl-2.9.1.jar
xml-apis-1.3.04.jar
I'm running out of options. I'll be trying to figure out what has the dependency on the JspContext class... but if someone could give me some insight it would be greatly appreciated. Oh, I cannot make many changes to the production WebLogic server. I'd prefer not to make any at all; chances are those changes would be denied.
Also, this error occurs when attempting to view the page, so deployment is successful.
OK, I have solved my problem. Not the way I wanted to, but it's working (at least as far as I know right now).
After using Google (again) I found a comment from someone mentioning using RichFaces 3.0.1.
Now, I have seen many people say to use version xxx, even 3.3.x. After all, I did get it working with 3.1.6, but on the Windows version of WebLogic (which could have possibly been tainted by some other WebLogic version I have installed).
So I have modified my Maven POM to depend on:
<dependency>
    <groupId>org.richfaces</groupId>
    <artifactId>richfaces</artifactId>
    <version>3.0.1</version>
</dependency>
This is not optimal, but I guess it works.
Some tag library references needed changing, and the packages changed to some extent.
I could post my project configuration for this if anyone ever needs it. Getting this all to work on WebLogic 8.1 was not straightforward. But the reference documentation for 3.1.6 is, to my knowledge, incorrect in saying it supports WebLogic 8.1.
jsp-api-2.1.jar contains the missing class, so you could try using it (either replace it in WebLogic, or try it in your /lib), but I can't predict what would happen.
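Before swapping any JARs, it may be worth confirming whether the class is visible on the server at all and, if so, which JAR it is loaded from. A small probe along these lines should tell you (the class name is my own; run the same logic from a servlet or scriptlet in the deployed app so it uses the webapp classloader, and note it deliberately avoids generics to stay compatible with the old JVM):

public class JspContextProbe {
    public static void main(String[] args) {
        try {
            Class c = Class.forName("javax.servlet.jsp.JspContext");
            // CodeSource may be null for classes loaded by the bootstrap/container classloader.
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            System.out.println("JspContext loaded from: "
                    + (src != null ? src.getLocation() : "<bootstrap/container classloader>"));
        } catch (ClassNotFoundException e) {
            System.out.println("javax.servlet.jsp.JspContext is not on the classpath");
        }
    }
}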