Sun One Web Server 6.1 and Tomahawk - jsf

Does anyone know which version of Tomahawk is suitable for use with Sun ONE Web Server 6.1?
Thanks in advance,
Alejo

Well, here is a requirements table for JSF:
JSF       | 1.0   | 1.1   | 1.2 (JEE5) | 2.0
----------+-------+-------+------------+-----
Java      | 1.3   | 1.3   | 5          | *
JSP       | 1.2   | 1.2   | 2.1        | *
Servlet   | 2.3   | 2.3   | 2.5        | *
JavaBeans | 1.0.1 | 1.0.1 | 1.0.1      | *
JSTL      | 1.0   | 1.0   | 1.2        | *

* The JSF 2.0 Public Review Draft requires JEE5.
The Sun ONE Web Server documentation says this:
Sun ONE Web Server 6.1 supports the Java Servlet 2.3 specification, including web application and WAR file (Web ARchive file) support, and the JavaServer Pages (JSP) 1.2 specification.
So I'd use the compatibility matrix to check for likely candidates: with only Servlet 2.3 and JSP 1.2 available, you are limited to JSF 1.0/1.1, which in practice means a Tomahawk 1.1.x release built against JSF 1.1.
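For reference, a JSF 1.1 web application on a Servlet 2.3 container registers the FacesServlet in web.xml roughly like this (a sketch, not verbatim from any product docs; the *.faces extension mapping is an assumption, adjust it to your setup):

```xml
<!-- web.xml fragment for a Servlet 2.3 container (sketch) -->
<servlet>
    <servlet-name>Faces Servlet</servlet-name>
    <servlet-class>javax.faces.webapp.FacesServlet</servlet-class>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
    <servlet-name>Faces Servlet</servlet-name>
    <url-pattern>*.faces</url-pattern>
</servlet-mapping>
```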


Cassandra configuration query via cqlsh

Cassandra version: 3.9, CQLSH version: 5.0.1
Can I query Cassandra configuration (cassandra.yaml) using cqlsh?
No, that's not possible in your version. It became possible only in Cassandra 4.0, which introduced so-called virtual tables; there is a dedicated table for configuration, system_views.settings:
cqlsh:test> select * from system_views.settings ;
name | value
-------------------------------------------------+-------
transparent_data_encryption_options_enabled | false
transparent_data_encryption_options_iv_length | 16
trickle_fsync | false
trickle_fsync_interval_in_kb | 10240
truncate_request_timeout_in_ms | 60000
....
You can find more information on the virtual tables in the following blog post from TLP.
In the meantime, you can access configuration parameters via JMX.
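As a minimal, runnable sketch of the JMX getAttribute pattern (shown here against the JVM's own platform MBean server so it runs standalone; against a live Cassandra node you would instead connect to its JMX port, 7199 by default, and read Cassandra's own MBeans, whose names and attributes you can browse with jconsole):

```java
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import java.lang.management.ManagementFactory;

public class JmxReadDemo {
    public static void main(String[] args) throws Exception {
        // Against a live node you would connect remotely, e.g.:
        //   JMXServiceURL url = new JMXServiceURL(
        //       "service:jmx:rmi:///jndi/rmi://localhost:7199/jmxrmi");
        //   MBeanServerConnection conn =
        //       JMXConnectorFactory.connect(url).getMBeanServerConnection();
        // Here we use the in-process platform server so the example is self-contained:
        MBeanServerConnection conn = ManagementFactory.getPlatformMBeanServer();

        // Read one attribute from a well-known MBean; the same call shape
        // works for Cassandra MBeans once connected remotely.
        Object procs = conn.getAttribute(
                new ObjectName("java.lang:type=OperatingSystem"),
                "AvailableProcessors");
        System.out.println("AvailableProcessors = " + procs);
    }
}
```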

Cassandra solr_query not working after upgrading to DSE 5

After upgrading to DSE 5, solr_query is not working. Below are the new DSE, cqlsh and Cassandra versions.
[cqlsh 5.0.1 | Cassandra 3.0.7.1158 | DSE 5.0.0 | CQL spec 3.4.0 |
Native protocol v4]
I am connecting using the PHP driver. The exception caught is
Must not send frame with CUSTOM_PAYLOAD flag for native protocol
version < 4
and the error code is 33554442.
When I run the same query in cqlsh it works, but not through the PHP driver.
$countSearchParam = '{"q":"' . $searchParam . '"}';
try {
    $countStatement = $this->session->prepare(
        "SELECT count(*) FROM table WHERE solr_query = ? ");
    $countresults = $this->session->execute($countStatement, new Cassandra\ExecutionOptions(array(
        'arguments' => array($countSearchParam)
    )));
    foreach ($countresults as $row) {
        $cntArr = get_object_vars($row['count']);
        $totCount = $cntArr['value'];
    }
} catch (Exception $e) {
    // Note: an empty catch block hides the protocol error shown above; at least log it.
    error_log($e->getMessage());
}
PHP driver v1.1 does not support native protocol v4. However, v1.2 is in the testing stages of development and will support v4, along with the new features introduced in Cassandra 2.2 and 3.x. Work on a version with DSE 5.0-specific features will begin after PHP driver v1.2 is released.
You can follow v1.2 release here.

Spark job deployment failure on Cloudera

I am using the Guice framework in my Spark Streaming program. It runs in Eclipse without any error. However, after compiling and deploying it with the spark-submit command, it returns an error:
java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
After googling, I noticed that this error usually appears when using Guice 3.0, but I am using Guice 4.0. My Spark version is 1.5.2 and my Cloudera version is 5.3.2. Is there any workaround for this error?
Unfortunately for you, Spark v1.5.2 depends on com.google.inject:guice:3.0.
So I suspect that your project is pulling in both:
Guice 4.0 (as a direct dependency stated in your dependencies file like pom.xml or build.sbt); and
Guice 3.0 (a transitive dependency pulled by Spark v1.5.2)
Basically, your classpath ends up being a mess, and depending on how the classloader loads classes at runtime, you will (or will not) see this kind of error.
You will have to use the version of Guice already provided (pulled in by Spark), or start juggling with classloaders.
UPDATE:
Indeed, org.apache.spark:spark-core_2.10:1.5.2 pulls in com.google.inject:guice:3.0:
+-org.apache.spark:spark-core_2.10:1.5.2 [S]
+ ...
...
+-org.apache.hadoop:hadoop-client:2.2.0
| +-org.apache.hadoop:hadoop-mapreduce-client-app:2.2.0
| | +-com.google.protobuf:protobuf-java:2.5.0
| | +-org.apache.hadoop:hadoop-mapreduce-client-common:2.2.0
| | | +-com.google.protobuf:protobuf-java:2.5.0
| | | +-org.apache.hadoop:hadoop-mapreduce-client-core:2.2.0
| | | | +-com.google.protobuf:protobuf-java:2.5.0
| | | | +-org.apache.hadoop:hadoop-yarn-common:2.2.0 (VIA PARENT org.apache.hadoop:hadoop-yarn:2.2.0 and then VIA ITS PARENT org.apache.hadoop:hadoop-project:2.2.0)
| | | | | +-com.google.inject:guice:3.0
...
The spark-core pom.xml is here.
The hadoop-yarn-common pom.xml is here.
The hadoop-yarn pom.xml is here.
The hadoop-project pom.xml is here.
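If you decide to keep your own Guice 4.0 and exclude the transitive 3.0, a Maven sketch would look like this (assuming spark-core is declared directly in your pom.xml; be aware that Hadoop/YARN code compiled against Guice 3.0 may break at runtime after the exclusion, so relying on the Spark-provided version, or relocating your own copy with the shade plugin, is often safer):

```xml
<!-- pom.xml fragment (sketch): exclude the Guice 3.0 pulled in transitively by Spark -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.5.2</version>
    <exclusions>
        <exclusion>
            <groupId>com.google.inject</groupId>
            <artifactId>guice</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```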

Tanuki upgrade: JVM configuration version

I am currently using the old Tanuki version 3.2.3 and moving to the newest one, 3.5.25.
I followed the upgrade documentation: modified my script, replaced the jar and the wrapper binary, etc.
While debugging the JVM launch, I could see that every additional parameter defined in my wrapper.conf appears as follows:
DEBUG | wrapper | 2014/07/03 13:41:08 | Command[0] : java
DEBUG | wrapper | 2014/07/03 13:41:08 | Command[1] : -Djava.system.class.loader=myClass
DEBUG | wrapper | 2014/07/03 13:41:08 | Command[2] : -Dcom.sun.management.jmxremote=true
But there are some extra params, and I don't know where they are set up:
DEBUG | wrapper | 2014/07/03 13:41:08 | Command[30] : -Dwrapper.version=3.2.3
DEBUG | wrapper | 2014/07/03 13:41:08 | Command[31] : -Dwrapper.native_library=wrapper
DEBUG | wrapper | 2014/07/03 13:41:08 | Command[32] : -Dwrapper.service=TRUE
DEBUG | wrapper | 2014/07/03 13:41:08 | Command[33] : -Dwrapper.cpu.timeout=10
Especially annoying is the version parameter: it is still the old one, as the startup banner shows. Does anybody know where I could change these configuration params?
Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
Thanks!
Best
The Tanuki library was being imported by a transitive dependency, so the properties in my project did not matter; they were overwritten.
The solution: use Maven to exclude those dependencies or override them.
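Both options sketched in Maven terms (the coordinates below are assumptions for illustration; run `mvn dependency:tree` to find which artifact actually pulls in the old wrapper and what its real groupId/artifactId are):

```xml
<!-- Option 1 (sketch): exclude the transitive wrapper from the offending dependency -->
<dependency>
    <groupId>some.group</groupId>           <!-- hypothetical: the dependency pulling the wrapper -->
    <artifactId>some-artifact</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <groupId>tanukisoft</groupId>   <!-- assumed coordinates of the old wrapper jar -->
            <artifactId>wrapper</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<!-- Option 2 (sketch): pin the version for every transitive occurrence -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>tanukisoft</groupId>
            <artifactId>wrapper</artifactId>
            <version>3.5.25</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```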

Tomcat 7 Error renderResponse: /send-receive-updates

I am getting the error "Problem in renderResponse: /send-receive-updates Not Found in ExternalContext as a Resource".
The application was working fine on JBoss 4.2.3; now I am migrating it to Tomcat 7 with ICEfaces version 1.8.2.
I need your help with this.
The jars in my project's WEB-INF are:
ant-1.5.1,
backport-util-concurrent,
bsf,
c3p0-0.9.1.2,
ch,
commons-beanutils,
commons-codec,
commons-collections,
commons-digester,
commons-el,
commons-fileupload,
commons-logging,
commons-net-1.3.0,
commons-pool-1.3,
dom4j-1.4,
el-ri,
hibernate3,
icefaces,
icefaces-comps,
icefaces-facelets,
iReport,
itext-1.3.1,
krysalis-jCharts-1.0.0-alpha-1,
servlet-api,
slf4j-api-1.5.0,
slf4j-log4j12-1.5.0,
spring,
spring-mock,
xercesImpl,
xml-apis,
log4j-1.2.15,
jsf-api-1.2,
jsf-impl-1.2,
jstl.
