Register iWidgets in WebSphere Portal, registered but portlet not visible - websphere-7

I am new to iWidgets and am trying to use them in WebSphere Portal. So far I have been able to create an iWidget, register it, and add it to a portal page.
/ConfigEngine.sh register-iwidget-definition -DIWidgetDefinition=/HelloIWidget/HelloWidget.xml
The first one, HelloIWidget, was successfully registered and added as a portlet on a portal page. But when I try to register another iWidget, the registration task succeeds, yet I am not able to see it in the Portlets section of the portal administration console.
C:\IBM\WebSphere\wp_profile\ConfigEngine>ConfigEngine.bat register-iwidget-definition -DIWidgetDefinition=/MyHelloWidgetEVSUB/HelloEventSub.xml -DPortalAdminPwd=wpsadmin -DWasPassword=wpsadmin
Licensed Materials - Property of IBM
5724-E76, 5724-E77, 5655-M44
(C) Copyright IBM Corp. All Rights Reserved.
Running Configuration Engine task 'register-iwidget-definition'
propertiesPath is ConfigEngine_temp.prop
rootDir is C:\IBM\WebSphere\wp_profile\ConfigEngine
Executing native2ascii with native encoding 'Cp1252': ConfigEngine_temp.prop_ -> ConfigEngine_temp_ascii.prop_
Native2ascii execution was successful!
Loading system properties from ConfigEngine_temp_ascii.prop_
ConfigEngine: setting system property server.root=C:/IBM/WebSphere/AppServer
ConfigEngine: setting system property was.repository.root=C:/IBM/WebSphere/wp_profile/config
ConfigEngine: setting system property JAVA_HOME=C:/IBM/WebSphere/AppServer/java
ConfigEngine: setting system property CellName=IBMNC9REKB1JLG
ConfigEngine: setting system property ws.ext.dirs=C:/IBM/WebSphere/AppServer/java/lib;C:/IBM/WebSphere/AppServer/classes;C:/IBM/WebSphere/AppServer/lib;C:/IBM/WebSphere/AppServer/installedChannels;C:/IBM/WebSphere/AppServer/lib/ext;C:/IBM/WebSphere/AppServer/web/help;C:/IBM/WebSphere/AppServer/deploytool/itp/plugins/com.ibm.etools.ejbdeploy/runtime;./lib;./shared/app
ConfigEngine: setting system property jvmArgFor64bit=-D64bit.args=none
ConfigEngine: setting system property NodeName=IBMNC9REKB1JLG
ConfigEngine: setting system property local.node=IBMNC9REKB1JLG
ConfigEngine: setting system property was.root=C:/IBM/WebSphere/AppServer
ConfigEngine: setting system property was.install.root=C:/IBM/WebSphere/AppServer
ConfigEngine: setting system property cfg.trace=C:/IBM/WebSphere/wp_profile/ConfigEngine/log/ConfigTrace.log
ConfigEngine: setting system property local.cell=IBMNC9REKB1JLG
RegistrySynchronized: true
Registry already in sync
[05/07/13 16:52:12.485 SGT] ssl.default.password.in.use.CWPKI0041W
[05/07/13 16:52:12.803 SGT] ssl.disable.url.hostname.verification.CWPKI0027I
[05/07/13 16:52:12.845 SGT] Client code attempting to load security configuration
[05/07/13 16:52:20.445 SGT] Client code attempting to load security configuration
Created admin client: com.ibm.ws.management.AdminClientImpl#11f311f3
Created config Service Proxy: com.ibm.websphere.management.configservice.ConfigServiceProxy#7bda7bda
CELL: IBMNC9REKB1JLG
NODE: IBMNC9REKB1JLG
Websphere:_Websphere_Config_Data_Type=Registry,_Websphere_Config_Data_Id=cells/IBMNC9REKB1JLG|registry.xml#Registry_1365043526167,_WEBSPHERE_CONFIG_SESSION=anonymous1367916748358
[05/07/13 16:52:29.971 SGT] WSVR0801I
loaded registry from WAS: registry.xml
wasUserHome now set to: C:/IBM/WebSphere/wp_profile
Buildfile: base_dynamic.xml
Trying to override old definition of task property
Trying to override old definition of task sleep
Trying to override old definition of task java
Trying to override old definition of task exec
cleanup-work-dir:
Tue May 07 16:52:49 SGT 2013
[echo] Cleaning up...
[delete] Deleting directory C:\IBM\WebSphere\wp_profile\ConfigEngine\config\work
[echo] Done.
[mkdir] Created dir: C:\IBM\WebSphere\wp_profile\ConfigEngine\config\work
action-set-time-property:
Tue May 07 16:52:49 SGT 2013
action-init-zos:
Tue May 07 16:52:49 SGT 2013
[echo] Setting property isZos to ${isZos}
[echo] Setting property jvmArgForZos to -Dzos.argsconversion=none
action-set-managed-node-flag:
Tue May 07 16:52:49 SGT 2013
[echo] Is this a Managed Node ? false
action-set-conntype-property:
Tue May 07 16:52:50 SGT 2013
[echo] wsadminConnType set to: SOAP
init-cfg-files:
Tue May 07 16:52:50 SGT 2013
[delete] Deleting directory C:\IBM\WebSphere\wp_profile\ConfigEngine\config\work
[mkdir] Created dir: C:\IBM\WebSphere\wp_profile\ConfigEngine\config\work
[copy] Copying 5 files to C:\IBM\WebSphere\wp_profile\ConfigEngine\config\work
[copy] Copying 1 file to C:\IBM\WebSphere\wp_profile\ConfigEngine\config\work
[copy] Copying 7 files to C:\IBM\WebSphere\wp_profile\ConfigEngine\config\work
set-wsadmin-scripting-classpath-in-jacl-properties-1:
Tue May 07 16:52:52 SGT 2013
[echo] C:/IBM/WebSphere/wp_profile
[echo] wsadmin.properties com.ibm.ws.scripting.classpath: '${com.ibm.ws.scripting.classpath}'
[echo] com.ibm.ws.scripting.classpath.initial: 'C:/IBM/WebSphere/wp_profile/ConfigEngine/lib/wkplc.misc.jar;C:/IBM/WebSphere/AppServer/deploytool/itp/batchboot.jar;C:/IBM/WebSphere/AppServer/deploytool/itp/batch2.jar;C:/IBM/WebSphere/PortalServer/base/wp.base/shared/app/wp.base.jar;C:/IBM/WebSphere/PortalServer/shared/app/wp.base.jar;C:/IBM/WebSphere/wp_profile/ConfigEngine/shared/app/lotusworkplacelib/lwp.clbcmpAPI.jar'
set-wsadmin-scripting-classpath-in-jacl-properties-2:
Tue May 07 16:52:52 SGT 2013
[echo] jacl.properties com.ibm.ws.scripting.classpath: 'C:/IBM/WebSphere/wp_profile/ConfigEngine/lib/wkplc.misc.jar;C:/IBM/WebSphere/AppServer/deploytool/itp/batchboot.jar;C:/IBM/WebSphere/AppServer/deploytool/itp/batch2.jar;C:/IBM/WebSphere/PortalServer/base/wp.base/shared/app/wp.base.jar;C:/IBM/WebSphere/PortalServer/shared/app/wp.base.jar;C:/IBM/WebSphere/wp_profile/ConfigEngine/shared/app/lotusworkplacelib/lwp.clbcmpAPI.jar'
[echo] jacl.properties com.ibm.ws.scripting.port: '10025'
[echo] jacl.properties com.ibm.ws.scripting.host: 'local.portal7.com'
set-wsadmin-scripting-classpath-in-jacl-properties:
Tue May 07 16:52:52 SGT 2013
action-init-cfg-files-zos:
Tue May 07 16:52:52 SGT 2013
setup-additional-init-files:
Tue May 07 16:52:52 SGT 2013
init:
Tue May 07 16:52:52 SGT 2013
[echo] 2013-05-07-04-52
Trying to override old definition of task wplc-modify-server
Trying to override old definition of task wplc-create-server
Trying to override old definition of task wplc-remove-server
set-properties:
Tue May 07 16:52:57 SGT 2013
[setproperty] Property PortalAdminId was set to wpsadmin
[setproperty] Property PortalAdminGroupId was set to wpsadmins
[setproperty] Property WpsDocReviewer was set to ${WpsDocReviewer}
[setproperty] Property WpsContentAdministrators was set to ${WpsContentAdministrators}
[setproperty] Property UserSuffix was set to ${LDAPUserSuffix},${LDAPSuffix}
[setproperty] Property GroupSuffix was set to ${LDAPGroupSuffix},${LDAPSuffix}
action-pre-config:
Tue May 07 16:52:58 SGT 2013
[echo] executing pre-configuration tasks
[isWas7] overwriting previous definition of property: null
[isWas7] +++value of property is 7.0.0.11
action-set-config:
Tue May 07 16:53:00 SGT 2013
[echo] executing set-configuration tasks
[echo] contains#empty_string#
wait-for-sync-to-complete:
Tue May 07 16:53:00 SGT 2013
start-portal-server:
Tue May 07 16:53:00 SGT 2013
set-instance-properties:
Tue May 07 16:53:01 SGT 2013
action-start-portal-server-service:
Tue May 07 16:53:01 SGT 2013
[logmsg] [05/07/13 16:53:05.561 SGT] EJPCA3163I: Starting Server "WebSphere_Portal"
[echo] Port '10039' is in use on host 'localhost'
[echo] An instance of the server 'WebSphere_Portal' may already be running
action-set-managed-node-flag:
Tue May 07 16:53:07 SGT 2013
[echo] Is this a Managed Node ? false
action-set-conntype-property:
Tue May 07 16:53:07 SGT 2013
[echo] wsadminConnType set to: SOAP
set-wsadmin-scripting-classpath-in-jacl-properties-1:
Tue May 07 16:53:08 SGT 2013
[echo] C:/IBM/WebSphere/wp_profile
[echo] wsadmin.properties com.ibm.ws.scripting.classpath: '${com.ibm.ws.scripting.classpath}'
[echo] com.ibm.ws.scripting.classpath.initial: 'C:/IBM/WebSphere/wp_profile/ConfigEngine/lib/wkplc.misc.jar;C:/IBM/WebSphere/AppServer/deploytool/itp/batchboot.jar;C:/IBM/WebSphere/AppServer/deploytool/itp/batch2.jar;C:/IBM/WebSphere/PortalServer/base/wp.base/shared/app/wp.base.jar;C:/IBM/WebSphere/PortalServer/shared/app/wp.base.jar;C:/IBM/WebSphere/wp_profile/ConfigEngine/shared/app/lotusworkplacelib/lwp.clbcmpAPI.jar'
set-wsadmin-scripting-classpath-in-jacl-properties-2:
Tue May 07 16:53:08 SGT 2013
[echo] jacl.properties com.ibm.ws.scripting.classpath: 'C:/IBM/WebSphere/wp_profile/ConfigEngine/lib/wkplc.misc.jar;C:/IBM/WebSphere/AppServer/deploytool/itp/batchboot.jar;C:/IBM/WebSphere/AppServer/deploytool/itp/batch2.jar;C:/IBM/WebSphere/PortalServer/base/wp.base/shared/app/wp.base.jar;C:/IBM/WebSphere/PortalServer/shared/app/wp.base.jar;C:/IBM/WebSphere/wp_profile/ConfigEngine/shared/app/lotusworkplacelib/lwp.clbcmpAPI.jar'
[echo] jacl.properties com.ibm.ws.scripting.port: '10025'
[echo] jacl.properties com.ibm.ws.scripting.host: 'local.portal7.com'
set-wsadmin-scripting-classpath-in-jacl-properties:
Tue May 07 16:53:08 SGT 2013
register-iwidget-definition:
Tue May 07 16:53:08 SGT 2013
[wplc-get-host-port-in-server] Task parameters:
[wplc-get-host-port-in-server] Global attributes:
[wplc-get-host-port-in-server] server="WebSphere_Portal"
[wplc-get-host-port-in-server] osarch="amd64"
[wplc-get-host-port-in-server] node="IBMNC9REKB1JLG"
[wplc-get-host-port-in-server] pathseparator=";"
[wplc-get-host-port-in-server] engineinstalllocation="C:/IBM/WebSphere/wp_profile/ConfigEngine"
[wplc-get-host-port-in-server] cell="IBMNC9REKB1JLG"
[wplc-get-host-port-in-server] Instance attributes (Set 1 of 1):
[wplc-get-host-port-in-server] endPointName="SOAP_CONNECTOR_ADDRESS"
[wplc-get-host-port-in-server] attribute=[ *** NONE_SPECIFIED *** ]
[wplc-get-host-port-in-server] end point: SOAP_CONNECTOR_ADDRESS found
[wplc-get-host-port-in-server] Settings the host: local.portal7.com as ant property: ${hostInJMX}
[wplc-get-host-port-in-server] Settings the port: 10025 as ant property: ${portInJMX}
[wplc-get-host-port-in-server] Status = Complete
[echo] Determined soap host: local.portal7.com
[echo] Determined soap connector: 10025
iseries-switch-to-was-user:
Tue May 07 16:53:16 SGT 2013
[wsadmin] WASX7209I: Connected to process "WebSphere_Portal" on node IBMNC9REKB1JLG using SOAP connector; The type of process is: UnManagedProcess
[wsadmin] WASX7303I: The following options are passed to the scripting environment and are available as arguments that are stored in the argv variable: "[C:/IBM/WebSphere/wp_profile/ConfigEngine, IWidgetDefinition=/MyHelloWidgetEVSUB/HelloEventSub.xml, IWidgetCatalog=${IWidgetCatalog}, PortletDefinition=${PortletDefinition}, PortletUniqueName=${PortletUniqueName}]"
[wsadmin] false
[wsadmin] logged in as "uid=wpsadmin,o=defaultWIMFileBasedRealm"
[wsadmin] [05/07/13 16:54:16.715 SGT] EJPXD0001I
[wsadmin] EJPFD0085I: Report started at 5/7/13 4:54 PM.EJPFD0087I: Object [ObjectIDImpl 'Z3_J1RDFTVJ849QD0IFUPU3HP1046', PORTLET_DEFINITION, VP: 0, [Domain: rel], DB: 0000-33ECF6FA9F8824DD807C3EFB117300C4] processed successfully.EJPFD0086I: Report completetd at 5/7/13 4:54 PM.
[wsadmin] success
delete-temp-dirs:
Tue May 07 16:54:25 SGT 2013
[delete] Deleting: C:\IBM\WebSphere\wp_profile\ConfigEngine\config\work\was\wp_portal.properties
[delete] Deleting: C:\IBM\WebSphere\wp_profile\ConfigEngine\properties\wkplc_comp_ascii.properties
[delete] Deleting: C:\IBM\WebSphere\wp_profile\ConfigEngine\properties\wkplc_ascii.properties
[delete] Deleting 5 files from C:\IBM\WebSphere\wp_profile\ConfigEngine\properties
cleanup-work-dir:
Tue May 07 16:54:26 SGT 2013
[echo] Cleaning up...
[delete] Deleting directory C:\IBM\WebSphere\wp_profile\ConfigEngine\config\work
[echo] Done.
[mkdir] Created dir: C:\IBM\WebSphere\wp_profile\ConfigEngine\config\work
action-post-config:
Tue May 07 16:54:26 SGT 2013
[echo] executing post-configuration tasks
BUILD SUCCESSFUL
Total time: 1 minute 47 seconds
isIseries currently set to: null
uploading registry
Created admin client: com.ibm.ws.management.AdminClientImpl#11f311f3
Created config Service Proxy: com.ibm.websphere.management.configservice.ConfigS
erviceProxy#5b905b9
CELL: IBMNC9REKB1JLG
NODE: IBMNC9REKB1JLG
Websphere:_Websphere_Config_Data_Type=Registry,_Websphere_Config_Data_Id=cells/I
BMNC9REKB1JLG|registry.xml#Registry_1365043526167,_WEBSPHERE_CONFIG_SESSION=anon
ymous1367916867190
update-registry-sync-property:
Tue May 07 16:54:28 SGT 2013
[echo] updated RegistrySynchronized in file wkplc.properties with value: tr
ue
Return Value: 0

iWidgets are registered by cloning a generic iWidget portlet and assigning your URL to it. The clone usually gets an odd, generated name. Since portlets are listed in the order they were added, go to Administration - Portlets and page to the last page; your widget is most likely there. I was unable to find a way to rename the widget, so I made a copy of it (when copying you can give the widget a new name) and then removed the original.
Please note that the iWidget portlet requires the widget_container capability, which is provided by the mm_enabler module. It is not included in the theme by default. It can be added either by modifying the theme profile JSON files, as described at http://infolib.lotus.com/resources/portal/8.0.0/doc/en_us/PT800ACD001/dev/themeopt_add_oobmod.html, or by using the add-theme-modules ConfigEngine command, which does exactly the same thing.
I also had to change the page's theme profile in order to get my widget added; otherwise I always got an error about the missing widget_container 2.1 capability.
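For illustration only, here is a minimal sketch of what such a profile change looks like, assuming a Portal 8 style theme profile JSON. The profile file name, the existing module IDs, and the exact ID of the enabler module in your environment should be taken from your theme and the documentation linked above; the placeholder IDs below are hypothetical:
{
  "moduleIDs": [
    "existing_module_id_1",
    "existing_module_id_2",
    "mm_enabler"
  ]
}
After updating the profile, the page may need to be reloaded (and the theme caches may need to be invalidated) before the widget_container capability becomes available.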

Related

Apache LockFile issue

I am trying to migrate from Apache 2.2 on Debian 7 to Apache 2.4 on CentOS 7. When httpd starts, it fails as shown below.
root# journalctl -xe
Jun 23 14:26:04 ww-test httpd[17716]: AH00526: Syntax error on line 47 of /etc/httpd/conf/httpd.conf:
Jun 23 14:26:04 ww-test httpd[17716]: Invalid command 'LockFile', perhaps misspelled or defined by a module not included in the server configuration
Jun 23 14:26:04 ww-test systemd[1]: httpd.service: main process exited, code=exited, status=1/FAILURE
Jun 23 14:26:04 ww-test kill[17718]: kill: cannot find process ""
Jun 23 14:26:04 ww-test systemd[1]: httpd.service: control process exited, code=exited status=1
Jun 23 14:26:04 ww-test systemd[1]: Failed to start The Apache HTTP Server.
vi httpd.conf
45 # The accept serialization lock file MUST BE STORED ON A LOCAL DISK.
46 #
47 LockFile ${APACHE_LOCK_DIR}/accept.lock
48
I installed the related package as below although I am not sure if it is the right one.
============================================================================================================================================================================================
Package Arch Version Repository Size
============================================================================================================================================================================================
Installing:
lockfile-progs x86_64 0.1.15-7.el7 /lockfile-progs-0.1.15-7.el7.x86_64 50 k
Installing for dependencies:
liblockfile x86_64 1.08-17.el7 base 21 k
Transaction Summary
Thanks for reading it.
Replace this line:
LockFile ${APACHE_LOCK_DIR}/accept.lock
with:
Mutex file:/var/httpd/locks default
The AcceptMutex, LockFile, RewriteLock, SSLMutex, SSLStaplingMutex, and WatchdogMutexPath directives have been replaced with a single Mutex directive. You will need to evaluate any use of these removed directives in your 2.2 configuration to determine whether they can simply be deleted or need to be replaced with Mutex.
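To catch any other removed 2.2 directives and validate the result before restarting, something along these lines works on a standard CentOS 7 httpd layout:
# look for other directives that were removed in Apache 2.4
grep -rnE 'AcceptMutex|LockFile|RewriteLock|SSLMutex|SSLStaplingMutex|WatchdogMutexPath' /etc/httpd/
# syntax-check the configuration, then restart the service
httpd -t
systemctl restart httpd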

spark-shell first launch error

I followed these instructions
http://davidssysadminnotes.blogspot.com/2016/01/installing-spark-centos-7.html
http://tecadmin.net/setup-hadoop-2-4-single-node-cluster-on-linux/#
for CentOS 7.2. When I launch pyspark, everything seems ok:
[idf@node1 ~]$ pyspark
Python 2.7.11 |Anaconda 4.0.0 (64-bit)| (default, Dec 6 2015, 18:08:32)
Type "copyright", "credits" or "license" for more information.
IPython 4.1.2 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object', use 'object??' for extra details.
16/04/01 16:02:34 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/__ / .__/\_,_/_/ /_/\_\ version 1.6.1
/_/
Using Python version 2.7.11 (default, Dec 6 2015 18:08:32)
SparkContext available as sc, HiveContext available as sqlContext.
In [1]:
But when I launch spark-shell, I get errors:
[idf@node1 ~]$ spark-shell
16/04/01 15:59:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 1.6.1
/_/
Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
16/04/01 15:59:42 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar."
16/04/01 15:59:42 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar."
16/04/01 15:59:42 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar."
16/04/01 15:59:42 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
Fri Apr 01 15:59:42 EDT 2016 Thread[main,5,main] java.io.FileNotFoundException: derby.log (Permission denied)
16/04/01 15:59:43 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
----------------------------------------------------------------
Fri Apr 01 15:59:43 EDT 2016:
Booting Derby version The Apache Software Foundation - Apache Derby - 10.10.1.1 - (1458268): instance a816c00e-0153-d368-eecc-000031edf8d8
on database directory /tmp/spark-f22a65d6-4bda-4426-8b80-0f34b39b28dd/metastore with class loader sun.misc.Launcher$AppClassLoader#4e25154f
Loaded from file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar
java.vendor=Oracle Corporation
java.runtime.version=1.8.0_45-b14
user.dir=/home/idf
os.name=Linux
os.arch=amd64
os.version=3.10.0-327.10.1.el7.x86_64
derby.system.home=null
Database Class Loader started - derby.database.classpath=''
16/04/01 16:00:02 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/04/01 16:00:02 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/04/01 16:00:06 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar."
16/04/01 16:00:06 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar."
16/04/01 16:00:06 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar."
16/04/01 16:00:06 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] java.io.FileNotFoundException: derby.log (Permission denied)
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.dataDictionary in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.lockManagerJ1 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.classes.dvfJ2 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.javaCompiler in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.replication.slave in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.rawStore.transactionJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.ef in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.rawStore.transactionJ1 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.database in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.NoneAuthentication in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.netServer.autoStart in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.dvfJ2 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.mgmt.null in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.nativeAuthentication in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.lockManagerJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.replication.master in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.dvfCDC in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.access.btree in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.lockManagerJ1 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.uuidJ1 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.cryptographyJ2 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.rawStore.data.genericJ4 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.rawStore.data.genericJ1 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.access in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.jdbc169 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.cryptographyJ2 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.optimizer in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.mgmt.jmx in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.dvfJ2 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.specificAuthentication in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.JNDIAuthentication in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.basicAuthentication in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.rawStore.data.genericJ4 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.validation in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.rawStore.data.genericJ1 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.classManagerJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.streams in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.classManagerJ2 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.classes.resourceAdapterJ2 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.classes.rawStore.transactionJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.jdbcJ8 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.jdbcJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.classManagerJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.jdbcJ4 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.rawStore.log in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.rawStore.log.readonly in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.access.heap in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.classes.jdbcJ8 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.daemon in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.cacheManagerJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.tcf in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.classes.jdbcJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.classes.jdbcJ4 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.access.uniquewithduplicatenullssort in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.cacheManagerJ1 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.cacheManagerJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.resultSetStatisticsFactory in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.cacheManagerJ1 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.classes.cryptographyJ2 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.classes.rawStore.data.genericJ4 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.database.slave in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.XPLAINFactory in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.resourceAdapterJ2 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.access.sort in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.mgmt.jmx in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.rawStore.transactionJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.classes.JNDIAuthentication in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.rawStore.transactionJ1 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.timer in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.lcf in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.rawStore in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.resourceAdapterJ2 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.classes.jdbc169 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.lf in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.jdbcJ8 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.jdbcJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.env.jdk.jdbcJ4 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.nodeFactory in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main] Ignored duplicate property derby.module.lockManagerJ6 in jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/derby/modules.properties
Fri Apr 01 16:00:06 EDT 2016 Thread[main,5,main]
16/04/01 16:00:06 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
Fri Apr 01 16:00:07 EDT 2016 Thread[main,5,main] Cleanup action starting
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1#3531509c, see the next exception for details.
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
spark-shell already provides a sqlContext with Hive support:
[meox@debian-meox: ~/spark-1.6.1-bin-hadoop2.6] # ./bin/spark-shell --master local[4] --packages com.databricks:spark-csv_2.10:1.4.0
Ivy Default Cache set to: /home/meox/.ivy2/cache
The jars for the packages stored in: /home/meox/.ivy2/jars
:: loading settings :: url = jar:file:/home/meox/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.databricks#spark-csv_2.10 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found com.databricks#spark-csv_2.10;1.4.0 in central
found org.apache.commons#commons-csv;1.1 in list
found com.univocity#univocity-parsers;1.5.1 in list
:: resolution report :: resolve 429ms :: artifacts dl 21ms
:: modules in use:
com.databricks#spark-csv_2.10;1.4.0 from central in [default]
com.univocity#univocity-parsers;1.5.1 from list in [default]
org.apache.commons#commons-csv;1.1 from list in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 3 | 0 | 0 | 0 || 3 | 0 |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
confs: [default]
0 artifacts copied, 3 already retrieved (0kB/10ms)
16/04/02 11:02:35 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/04/02 11:02:35 INFO SecurityManager: Changing view acls to: meox
16/04/02 11:02:35 INFO SecurityManager: Changing modify acls to: meox
16/04/02 11:02:35 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(meox); users with modify permissions: Set(meox)
16/04/02 11:02:35 INFO HttpServer: Starting HTTP Server
16/04/02 11:02:35 INFO Utils: Successfully started service 'HTTP class server' on port 40697.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 1.6.1
/_/
Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_95)
Type in expressions to have them evaluated.
Type :help for more information.
16/04/02 11:02:41 INFO SparkContext: Running Spark version 1.6.1
16/04/02 11:02:41 INFO SecurityManager: Changing view acls to: meox
16/04/02 11:02:41 INFO SecurityManager: Changing modify acls to: meox
16/04/02 11:02:41 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(meox); users with modify permissions: Set(meox)
16/04/02 11:02:42 INFO Utils: Successfully started service 'sparkDriver' on port 41544.
16/04/02 11:02:42 INFO Slf4jLogger: Slf4jLogger started
16/04/02 11:02:42 INFO Remoting: Starting remoting
16/04/02 11:02:42 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem#192.168.0.50:37295]
16/04/02 11:02:42 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 37295.
16/04/02 11:02:42 INFO SparkEnv: Registering MapOutputTracker
16/04/02 11:02:42 INFO SparkEnv: Registering BlockManagerMaster
16/04/02 11:02:42 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-e5d301cf-bb94-4738-aecd-9d97f0f6174a
16/04/02 11:02:42 INFO MemoryStore: MemoryStore started with capacity 511.5 MB
16/04/02 11:02:42 INFO SparkEnv: Registering OutputCommitCoordinator
16/04/02 11:02:43 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/04/02 11:02:43 INFO SparkUI: Started SparkUI at http://192.168.0.50:4040
16/04/02 11:02:43 INFO HttpFileServer: HTTP File server directory is /tmp/spark-af3ded9a-8e45-4d5a-8d14-a3114dd899aa/httpd-90b07dcf-a8bc-4c32-9e0a-4d698033c7aa
16/04/02 11:02:43 INFO HttpServer: Starting HTTP Server
16/04/02 11:02:43 INFO Utils: Successfully started service 'HTTP file server' on port 45007.
16/04/02 11:02:43 INFO SparkContext: Added JAR file:/home/meox/.ivy2/jars/com.databricks_spark-csv_2.10-1.4.0.jar at http://192.168.0.50:45007/jars/com.databricks_spark-csv_2.10-1.4.0.jar with timestamp 1459587763133
16/04/02 11:02:43 INFO SparkContext: Added JAR file:/home/meox/.ivy2/jars/org.apache.commons_commons-csv-1.1.jar at http://192.168.0.50:45007/jars/org.apache.commons_commons-csv-1.1.jar with timestamp 1459587763134
16/04/02 11:02:43 INFO SparkContext: Added JAR file:/home/meox/.ivy2/jars/com.univocity_univocity-parsers-1.5.1.jar at http://192.168.0.50:45007/jars/com.univocity_univocity-parsers-1.5.1.jar with timestamp 1459587763135
16/04/02 11:02:43 INFO Executor: Starting executor ID driver on host localhost
16/04/02 11:02:43 INFO Executor: Using REPL class URI: http://192.168.0.50:40697
16/04/02 11:02:43 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 58692.
16/04/02 11:02:43 INFO NettyBlockTransferService: Server created on 58692
16/04/02 11:02:43 INFO BlockManagerMaster: Trying to register BlockManager
16/04/02 11:02:43 INFO BlockManagerMasterEndpoint: Registering block manager localhost:58692 with 511.5 MB RAM, BlockManagerId(driver, localhost, 58692)
16/04/02 11:02:43 INFO BlockManagerMaster: Registered BlockManager
16/04/02 11:02:43 INFO SparkILoop: Created spark context..
Spark context available as sc.
16/04/02 11:02:44 INFO HiveContext: Initializing execution hive, version 1.2.1
16/04/02 11:02:44 INFO ClientWrapper: Inspected Hadoop version: 2.6.0
16/04/02 11:02:44 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
16/04/02 11:02:44 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
16/04/02 11:02:44 INFO ObjectStore: ObjectStore, initialize called
16/04/02 11:02:44 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
16/04/02 11:02:44 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
16/04/02 11:02:45 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/04/02 11:02:45 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/04/02 11:02:47 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
16/04/02 11:02:48 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/04/02 11:02:48 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/04/02 11:02:51 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/04/02 11:02:51 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/04/02 11:02:51 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
16/04/02 11:02:51 INFO ObjectStore: Initialized ObjectStore
16/04/02 11:02:51 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/04/02 11:02:52 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/04/02 11:02:52 INFO HiveMetaStore: Added admin role in metastore
16/04/02 11:02:52 INFO HiveMetaStore: Added public role in metastore
16/04/02 11:02:52 INFO HiveMetaStore: No user is added in admin role, since config is empty
16/04/02 11:02:52 INFO HiveMetaStore: 0: get_all_databases
16/04/02 11:02:52 INFO audit: ugi=meox ip=unknown-ip-addr cmd=get_all_databases
16/04/02 11:02:52 INFO HiveMetaStore: 0: get_functions: db=default pat=*
16/04/02 11:02:52 INFO audit: ugi=meox ip=unknown-ip-addr cmd=get_functions: db=default pat=*
16/04/02 11:02:52 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
16/04/02 11:02:53 INFO SessionState: Created local directory: /tmp/c9aee4a3-05b4-4606-8cc9-eb6e3af28fa1_resources
16/04/02 11:02:53 INFO SessionState: Created HDFS directory: /tmp/hive/meox/c9aee4a3-05b4-4606-8cc9-eb6e3af28fa1
16/04/02 11:02:53 INFO SessionState: Created local directory: /tmp/meox/c9aee4a3-05b4-4606-8cc9-eb6e3af28fa1
16/04/02 11:02:53 INFO SessionState: Created HDFS directory: /tmp/hive/meox/c9aee4a3-05b4-4606-8cc9-eb6e3af28fa1/_tmp_space.db
16/04/02 11:02:53 INFO HiveContext: default warehouse location is /user/hive/warehouse
16/04/02 11:02:53 INFO HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
16/04/02 11:02:53 INFO ClientWrapper: Inspected Hadoop version: 2.6.0
16/04/02 11:02:53 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
16/04/02 11:02:54 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
16/04/02 11:02:54 INFO ObjectStore: ObjectStore, initialize called
16/04/02 11:02:54 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
16/04/02 11:02:54 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
16/04/02 11:02:54 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/04/02 11:02:54 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/04/02 11:02:55 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
16/04/02 11:02:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/04/02 11:02:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/04/02 11:02:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/04/02 11:02:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/04/02 11:02:56 INFO Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery#0" since the connection used is closing
16/04/02 11:02:56 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
16/04/02 11:02:56 INFO ObjectStore: Initialized ObjectStore
16/04/02 11:02:57 INFO HiveMetaStore: Added admin role in metastore
16/04/02 11:02:57 INFO HiveMetaStore: Added public role in metastore
16/04/02 11:02:57 INFO HiveMetaStore: No user is added in admin role, since config is empty
16/04/02 11:02:57 INFO HiveMetaStore: 0: get_all_databases
16/04/02 11:02:57 INFO audit: ugi=meox ip=unknown-ip-addr cmd=get_all_databases
16/04/02 11:02:57 INFO HiveMetaStore: 0: get_functions: db=default pat=*
16/04/02 11:02:57 INFO audit: ugi=meox ip=unknown-ip-addr cmd=get_functions: db=default pat=*
16/04/02 11:02:57 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
16/04/02 11:02:57 INFO SessionState: Created local directory: /tmp/2d5b72da-368c-4c00-8d14-ca8c5cfbf66c_resources
16/04/02 11:02:57 INFO SessionState: Created HDFS directory: /tmp/hive/meox/2d5b72da-368c-4c00-8d14-ca8c5cfbf66c
16/04/02 11:02:57 INFO SessionState: Created local directory: /tmp/meox/2d5b72da-368c-4c00-8d14-ca8c5cfbf66c
16/04/02 11:02:57 INFO SessionState: Created HDFS directory: /tmp/hive/meox/2d5b72da-368c-4c00-8d14-ca8c5cfbf66c/_tmp_space.db
16/04/02 11:02:57 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.
If the last lines say "16/04/02 11:02:57 INFO SparkILoop: Created sql context (with Hive support)..", the Hive-enabled sqlContext was created correctly.
Remember to:
enable the log (cp conf/log4j.properties.template conf/log4j.properties)
create folder /user/hive/warehouse/ (with the right permissions)
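A rough sketch of those two steps, assuming an HDFS-backed warehouse as in the linked single-node Hadoop setup and that spark-shell is started from a directory the user can write to (the derby.log/metastore_db errors above are ordinary file-permission failures in the current working directory):
# enable the default logging configuration (run from the Spark install directory)
cp conf/log4j.properties.template conf/log4j.properties
# create the Hive warehouse directory with group write access
hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -chmod g+w /user/hive/warehouse
# launch spark-shell from a writable directory so Derby can create metastore_db and derby.log
cd ~ && spark-shell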

struts2 web app is not working in tomcat7 installed on SUSE Linux

I have developed a web application based on Struts 2.2.1 using the Eclipse Indigo IDE with apache-tomcat-7.0.28. To make it work without the IDE, I exported it from Eclipse via right-click on the project -> Export -> WAR.
The resulting WAR file works on any Windows server with apache-tomcat-7.0.28: I just copy it to Tomcat's webapps folder and start the server. Everything works fine there.
But when I take the same WAR file to a Linux server with the same apache-tomcat-7.0.28, it cannot even open the JSP file specified in web.xml and shows a 404 error page.
I have also tried different Linux machines (like Ubuntu), but it does not work and I get the same 404 error.
Why is that?
UPDATE:
This is what I found in the catalina.out file inside the logs folder:
Jul 17, 2012 9:35:55 AM org.apache.catalina.startup.HostConfig checkResources
INFO: Undeploying context [/ai]
Jul 17, 2012 9:36:55 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive /usr/apache-tomcat-7.0.28/webapps/ai.war
Jul 17, 2012 9:36:56 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/core_rt is already defined
Jul 17, 2012 9:36:56 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/core is already defined
Jul 17, 2012 9:36:56 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/fmt_rt is already defined
Jul 17, 2012 9:36:56 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/fmt is already defined
Jul 17, 2012 9:36:56 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://jakarta.apache.org/taglibs/standard/permittedTaglibs is already defined
Jul 17, 2012 9:36:56 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://jakarta.apache.org/taglibs/standard/scriptfree is already defined
Jul 17, 2012 9:36:56 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/sql_rt is already defined
Jul 17, 2012 9:36:56 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/sql is already defined
Jul 17, 2012 9:36:56 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/xml_rt is already defined
Jul 17, 2012 9:36:56 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/xml is already defined
log4j:WARN No appenders could be found for logger (com.opensymphony.xwork2.config.providers.XmlConfigurationProvider).
log4j:WARN Please initialize the log4j system properly.
Jul 17, 2012 9:36:56 AM org.apache.catalina.core.StandardContext startInternal
SEVERE: Error filterStart
Jul 17, 2012 9:36:56 AM org.apache.catalina.core.StandardContext startInternal
SEVERE: Context [/ai] startup failed due to previous errors
Jul 17, 2012 9:36:56 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/ai] created a ThreadLocal with key of type [com.opensymphony.xwork2.inject.ContainerImpl$10] (value [com.opensymphony.xwork2.inject.ContainerImpl$10#626028]) and a value of type [java.lang.Object[]] (value [[Ljava.lang.Object;#970110]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
Jul 17, 2012 9:36:56 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/ai] created a ThreadLocal with key of type [com.opensymphony.xwork2.inject.ContainerImpl$10] (value [com.opensymphony.xwork2.inject.ContainerImpl$10#1c6f1f4]) and a value of type [java.lang.Object[]] (value [[Ljava.lang.Object;#36d036]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
Jul 17, 2012 9:38:46 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory /usr/apache-tomcat-7.0.28/webapps/GeneTest
Jul 17, 2012 10:21:48 AM org.apache.catalina.startup.HostConfig checkResources
INFO: Undeploying context [/ai]
Jul 17, 2012 10:22:38 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive /usr/apache-tomcat-7.0.28/webapps/ai.war
Jul 17, 2012 10:22:39 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/core_rt is already defined
Jul 17, 2012 10:22:39 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/core is already defined
Jul 17, 2012 10:22:39 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/fmt_rt is already defined
Jul 17, 2012 10:22:39 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/fmt is already defined
Jul 17, 2012 10:22:39 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://jakarta.apache.org/taglibs/standard/permittedTaglibs is already defined
Jul 17, 2012 10:22:39 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://jakarta.apache.org/taglibs/standard/scriptfree is already defined
Jul 17, 2012 10:22:39 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/sql_rt is already defined
Jul 17, 2012 10:22:39 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/sql is already defined
Jul 17, 2012 10:22:39 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/xml_rt is already defined
Jul 17, 2012 10:22:39 AM org.apache.catalina.startup.TaglibUriRule body
INFO: TLD skipped. URI: http://java.sun.com/jstl/xml is already defined
log4j:WARN No appenders could be found for logger (com.opensymphony.xwork2.config.providers.XmlConfigurationProvider).
log4j:WARN Please initialize the log4j system properly.
Jul 17, 2012 10:22:39 AM org.apache.catalina.core.StandardContext startInternal
SEVERE: Error filterStart
Jul 17, 2012 10:22:39 AM org.apache.catalina.core.StandardContext startInternal
SEVERE: Context [/ai] startup failed due to previous errors
Jul 17, 2012 10:22:39 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/ai] created a ThreadLocal with key of type [com.opensymphony.xwork2.inject.ContainerImpl$10] (value [com.opensymphony.xwork2.inject.ContainerImpl$10#1c8e80d]) and a value of type [java.lang.Object[]] (value [[Ljava.lang.Object;#fadb88]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
Jul 17, 2012 10:22:39 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/ai] created a ThreadLocal with key of type [com.opensymphony.xwork2.inject.ContainerImpl$10] (value [com.opensymphony.xwork2.inject.ContainerImpl$10#162c87a]) and a value of type [java.lang.Object[]] (value [[Ljava.lang.Object;#57f389]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
web.xml
<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://java.sun.com/xml/ns/javaee" xmlns:web="http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd" id="WebApp_ID" version="3.0">
<display-name>ai</display-name>
<welcome-file-list>
<welcome-file>login.jsp</welcome-file>
</welcome-file-list>
<filter>
<filter-name>struts2</filter-name>
<filter-class>org.apache.struts2.dispatcher.ng.filter.StrutsPrepareAndExecuteFilter</filter-class>
</filter>
<filter-mapping>
<filter-name>struts2</filter-name>
<url-pattern>/*</url-pattern>
</filter-mapping>
</web-app>
UPDATE: after upgrading from Struts 2.2.1 to Struts 2.3.4
Jul 18, 2012 10:38:20 AM org.apache.catalina.core.StandardServer await
INFO: A valid shutdown command was received via the shutdown port. Stopping the Server instance.
Jul 18, 2012 10:38:20 AM org.apache.coyote.AbstractProtocol pause
INFO: Pausing ProtocolHandler ["http-bio-8080"]
Jul 18, 2012 10:38:20 AM org.apache.coyote.AbstractProtocol pause
INFO: Pausing ProtocolHandler ["ajp-bio-8009"]
Jul 18, 2012 10:38:20 AM org.apache.catalina.core.StandardService stopInternal
INFO: Stopping service Catalina
Jul 18, 2012 10:38:20 AM org.apache.coyote.AbstractProtocol stop
INFO: Stopping ProtocolHandler ["http-bio-8080"]
Jul 18, 2012 10:38:20 AM org.apache.coyote.AbstractProtocol destroy
INFO: Destroying ProtocolHandler ["http-bio-8080"]
Jul 18, 2012 10:38:20 AM org.apache.coyote.AbstractProtocol stop
INFO: Stopping ProtocolHandler ["ajp-bio-8009"]
Jul 18, 2012 10:38:20 AM org.apache.coyote.AbstractProtocol destroy
INFO: Destroying ProtocolHandler ["ajp-bio-8009"]
Jul 18, 2012 10:38:25 AM org.apache.catalina.core.AprLifecycleListener init
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/lib64/jvm/jdk1.6.0_22/jre/lib/i386/server:/usr/lib64/jvm/jdk1.6.0_22/jre/lib/i386:/usr/lib64/jvm/jdk1.6.0_22/jre/../lib/i386:/usr/java/packages/lib/i386:/lib:/usr/lib
Jul 18, 2012 10:38:25 AM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["http-bio-8080"]
Jul 18, 2012 10:38:25 AM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["ajp-bio-8009"]
Jul 18, 2012 10:38:25 AM org.apache.coyote.AbstractProtocol init
SEVERE: Failed to initialize end point associated with ProtocolHandler ["ajp-bio-8009"]
java.net.BindException: Address already in use <null>:8009
at org.apache.tomcat.util.net.JIoEndpoint.bind(JIoEndpoint.java:406)
at org.apache.tomcat.util.net.AbstractEndpoint.init(AbstractEndpoint.java:610)
at org.apache.coyote.AbstractProtocol.init(AbstractProtocol.java:423)
at org.apache.catalina.connector.Connector.initInternal(Connector.java:974)
at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:102)
at org.apache.catalina.core.StandardService.initInternal(StandardService.java:559)
at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:102)
at org.apache.catalina.core.StandardServer.initInternal(StandardServer.java:814)
at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:102)
at org.apache.catalina.startup.Catalina.load(Catalina.java:624)
at org.apache.catalina.startup.Catalina.load(Catalina.java:649)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.catalina.startup.Bootstrap.load(Bootstrap.java:281)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:450)
Caused by: java.net.BindException: Address already in use
at java.net.PlainSocketImpl.socketBind(Native Method)
at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:365)
at java.net.ServerSocket.bind(ServerSocket.java:319)
at java.net.ServerSocket.<init>(ServerSocket.java:185)
at java.net.ServerSocket.<init>(ServerSocket.java:141)
at org.apache.tomcat.util.net.DefaultServerSocketFactory.createSocket(DefaultServerSocketFactory.java:49)
at org.apache.tomcat.util.net.JIoEndpoint.bind(JIoEndpoint.java:393)
... 16 more
Jul 18, 2012 10:38:25 AM org.apache.catalina.core.StandardService initInternal
SEVERE: Failed to initialize connector [Connector[AJP/1.3-8009]]
org.apache.catalina.LifecycleException: Failed to initialize component [Connector[AJP/1.3-8009]]
at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:106)
at org.apache.catalina.core.StandardService.initInternal(StandardService.java:559)
at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:102)
at org.apache.catalina.core.StandardServer.initInternal(StandardServer.java:814)
at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:102)
at org.apache.catalina.startup.Catalina.load(Catalina.java:624)
at org.apache.catalina.startup.Catalina.load(Catalina.java:649)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.catalina.startup.Bootstrap.load(Bootstrap.java:281)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:450)
Caused by: org.apache.catalina.LifecycleException: Protocol handler initialization failed
at org.apache.catalina.connector.Connector.initInternal(Connector.java:976)
at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:102)
... 12 more
Caused by: java.net.BindException: Address already in use <null>:8009
at org.apache.tomcat.util.net.JIoEndpoint.bind(JIoEndpoint.java:406)
at org.apache.tomcat.util.net.AbstractEndpoint.init(AbstractEndpoint.java:610)
at org.apache.coyote.AbstractProtocol.init(AbstractProtocol.java:423)
at org.apache.catalina.connector.Connector.initInternal(Connector.java:974)
... 13 more
Caused by: java.net.BindException: Address already in use
at java.net.PlainSocketImpl.socketBind(Native Method)
at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:365)
at java.net.ServerSocket.bind(ServerSocket.java:319)
at java.net.ServerSocket.<init>(ServerSocket.java:185)
at java.net.ServerSocket.<init>(ServerSocket.java:141)
at org.apache.tomcat.util.net.DefaultServerSocketFactory.createSocket(DefaultServerSocketFactory.java:49)
at org.apache.tomcat.util.net.JIoEndpoint.bind(JIoEndpoint.java:393)
... 16 more
Jul 18, 2012 10:38:25 AM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 544 ms
Jul 18, 2012 10:38:25 AM org.apache.catalina.core.StandardService startInternal
INFO: Starting service Catalina
Jul 18, 2012 10:38:25 AM org.apache.catalina.core.StandardEngine startInternal
INFO: Starting Servlet Engine: Apache Tomcat/7.0.28
Jul 18, 2012 10:38:25 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive /usr/apache-tomcat-7.0.28/webapps/ai3.war
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Parsing configuration file [struts-default.xml]
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Parsing configuration file [struts-plugin.xml]
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Parsing configuration file [struts.xml]
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.ObjectFactory)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.FileManager)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.conversion.impl.XWorkConverter)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.TextProvider)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.ActionProxyFactory)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.conversion.ObjectTypeDeterminer)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (org.apache.struts2.dispatcher.mapper.ActionMapper)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (jakarta) for (org.apache.struts2.dispatcher.multipart.MultiPartRequest)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (org.apache.struts2.views.freemarker.FreemarkerManager)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (org.apache.struts2.components.UrlRenderer)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.validator.ActionValidatorManager)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.util.ValueStackFactory)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.util.reflection.ReflectionProvider)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.util.reflection.ReflectionContextFactory)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.util.PatternMatcher)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (org.apache.struts2.dispatcher.StaticContentLoader)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (com.opensymphony.xwork2.UnknownHandlerManager)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Choosing bean (struts) for (org.apache.struts2.views.util.UrlHelper)
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Overriding property struts.i18n.reload - old value: false new value: true
Jul 18, 2012 10:38:27 AM com.opensymphony.xwork2.util.logging.jdk.JdkLogger info
INFO: Overriding property struts.configuration.xml.reload - old value: false new value: true
Jul 18, 2012 10:38:27 AM org.apache.catalina.core.StandardContext startInternal
SEVERE: Error filterStart
Jul 18, 2012 10:38:27 AM org.apache.catalina.core.StandardContext startInternal
SEVERE: Context [/ai3] startup failed due to previous errors
Jul 18, 2012 10:38:27 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory /usr/apache-tomcat-7.0.28/webapps/ROOT
Jul 18, 2012 10:38:27 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory /usr/apache-tomcat-7.0.28/webapps/docs
Jul 18, 2012 10:38:27 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory /usr/apache-tomcat-7.0.28/webapps/examples
Jul 18, 2012 10:38:27 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory /usr/apache-tomcat-7.0.28/webapps/host-manager
Jul 18, 2012 10:38:27 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory /usr/apache-tomcat-7.0.28/webapps/manager
Jul 18, 2012 10:38:27 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory /usr/apache-tomcat-7.0.28/webapps/GeneTest
Jul 18, 2012 10:38:27 AM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["http-bio-8080"]
Jul 18, 2012 10:38:27 AM org.apache.catalina.startup.Catalina start
INFO: Server startup in 2106 ms
After the upgrade: this stack trace shows that the application's port is already in use (here the AJP connector on 8009), which means another process is still bound to it. Either stop that process or configure a different port for the connector, as sketched below.
Before the upgrade: the runtime classpath had duplicate TLDs. Check the app server's /lib folder and the JDK's /lib folder, then check the /WEB-INF/lib folder of the WAR build again; remove the TLDs from WEB-INF/lib if they are already present in the app server's or JDK's /lib folder.
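For the port clash, one option besides stopping whatever is still bound to 8009 is to move the AJP connector to a free port in Tomcat's conf/server.xml. A minimal sketch, assuming the stock connector definition; 8010 is just an example value:

<!-- conf/server.xml: AJP connector moved off the occupied port 8009 -->
<Connector port="8010" protocol="AJP/1.3" redirectPort="8443" />

After changing the port, restart Tomcat so the new connector configuration is picked up.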

log4j - DailyRollingFileAppender, override subAppend()

I am new to log4j's DailyRollingFileAppender class. I would like to use it to perform daily rotation of the log file and, at the same time, to modify the log file manually every time a logging event is triggered.
For example, I would like to increment the "TOTAL COUNT:" value inside the log file by one on each event. How can I go about doing that?
Example of the log content:
07 Oct 2011 16:57:51 [INFO ] - Failed
07 Oct 2011 16:57:51 [WARN ] - Failed
07 Oct 2011 16:57:51 [ERROR] - Successful
07 Oct 2011 16:57:51 [FATAL] - Failed
07 Oct 2011 16:57:52 [DEBUG] - Successful
07 Oct 2011 16:57:52 [INFO ] - Failed
07 Oct 2011 16:57:52 [WARN ] - Failed
07 Oct 2011 16:57:52 [ERROR] - Successful
07 Oct 2011 16:57:52 [FATAL] - Failed
07 Oct 2011 16:57:53 [DEBUG] - Successful
07 Oct 2011 16:57:53 [INFO ] - Failed
07 Oct 2011 16:57:53 [WARN ] - Failed
07 Oct 2011 16:57:53 [ERROR] - Successful
07 Oct 2011 16:57:53 [FATAL] - Failed
07 Oct 2011 16:57:54 [DEBUG] - Successful
TOTAL COUNT: 15
It seems overriding subAppend() in the DailyRollingFileAppender is the way to go. You'll also have to be cautious when calling super.subAppend(), as the WriterAppender implements it like this:
protected void subAppend(LoggingEvent event) {
    this.qw.write(this.layout.format(event));
    // ...
}
and you don't want the layout applied to your "TOTAL COUNT" line.
In my subAppend() I'd copy exactly what DailyRollingFileAppender does, adding:
- line-counting logic, if it isn't tracked elsewhere,
- a condition that checks whether the special line should be printed, then formats it and writes it directly to qw (a rough sketch follows below).
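A minimal sketch of that approach, assuming it is acceptable to let super.subAppend() perform the rollover check and the normal layout-formatted write; the message-marker trigger below is purely hypothetical and should be replaced with whatever condition fits your application:

import org.apache.log4j.DailyRollingFileAppender;
import org.apache.log4j.Layout;
import org.apache.log4j.spi.LoggingEvent;

public class CountingDailyRollingFileAppender extends DailyRollingFileAppender {

    private long totalCount = 0;

    @Override
    protected void subAppend(LoggingEvent event) {
        // The parent performs the time-based rollover check and writes the
        // layout-formatted event to the QuietWriter (this.qw).
        super.subAppend(event);
        totalCount++;

        // Hypothetical trigger: emit the summary line only when the message
        // carries a marker; replace with your own condition.
        String msg = event.getRenderedMessage();
        if (msg != null && msg.contains("WRITE_TOTAL")) {
            // Write directly to the QuietWriter so the layout is not applied.
            this.qw.write("TOTAL COUNT: " + totalCount + Layout.LINE_SEP);
            if (this.immediateFlush) {
                this.qw.flush();
            }
        }
    }
}

To use it, point the appender class in log4j.properties (or log4j.xml) at this class instead of DailyRollingFileAppender. Note that this appends a fresh TOTAL COUNT line after the matching event; keeping a single, always-updated TOTAL COUNT line would require rewriting the file, which an appender by itself does not do.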

file upload in JSF using myfaces component

I am creating a JSF application where file-upload functionality is required. I have added all the required JAR files to my /WEB-INF/lib folder:
jsf-api.jar
jsf-impl.jar
jstl.jar
standard.jar
myfaces-extensions.jar
commons-collections.jar
commons-digester.jar
commons-beanutils.jar
commons-logging.jar
commons-fileupload-1.0.jar
but when I try to deploy the application on Apache Tomcat 6.0.29 I still get the following error:
org.apache.catalina.core.StandardContext addApplicationListener
INFO: The listener "com.sun.faces.config.ConfigureListener" is already configured for this context. The duplicate definition has been ignored.
org.apache.catalina.core.StandardContext start
SEVERE: Error listenerStart
org.apache.catalina.core.StandardContext start
SEVERE: Context [/jsfApplication] startup failed due to previous errors
org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc
The web application [/jsfApplication] registered the JDBC driver [com.mysql.jdbc.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/jsfApplication] appears to have started a thread named [Timer-0] but has failed to stop it. This is very likely to create a memory leak.
org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/jsfApplication] appears to have started a thread named [MySQL Statement Cancellation Timer] but has failed to stop it. This is very likely to create a memory leak.
log4j:ERROR LogMananger.repositorySelector was null likely due to error in class reloading, using NOPLoggerRepository.
I am also using Hibernate and the Spring Framework in this application.
Please help.
Thanks.
Update:
This is the complete error message I get whenever I add the myfaces-extensions.jar file to my /WEB-INF/lib folder:
Using CATALINA_BASE: /home/prt/Desktop/apache-tomcat-6.0.29
Using CATALINA_HOME: /home/prt/Desktop/apache-tomcat-6.0.29
Using CATALINA_TMPDIR: /home/prt/Desktop/apache-tomcat-6.0.29/temp
Using JRE_HOME: /usr/jdk1.6.0_20
Using CLASSPATH: /home/prt/Desktop/apache-tomcat-6.0.29/bin/bootstrap.jar
8 Jan, 2011 7:08:54 PM org.apache.catalina.core.AprLifecycleListener init
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/jdk1.6.0_20/jre/lib/i386/client:/usr/jdk1.6.0_20/jre/lib/i386:/usr/jdk1.6.0_20/jre/../lib/i386:/usr/java/packages/lib/i386:/lib:/usr/lib
8 Jan, 2011 7:08:54 PM org.apache.coyote.http11.Http11Protocol init
INFO: Initializing Coyote HTTP/1.1 on http-8080
8 Jan, 2011 7:08:54 PM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 643 ms
8 Jan, 2011 7:08:54 PM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
8 Jan, 2011 7:08:54 PM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.29
8 Jan, 2011 7:08:54 PM org.apache.catalina.startup.HostConfig deployDescriptor
INFO: Deploying configuration descriptor host-manager.xml
8 Jan, 2011 7:08:55 PM org.apache.catalina.startup.HostConfig deployDescriptor
INFO: Deploying configuration descriptor manager.xml
8 Jan, 2011 7:08:55 PM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive jsfApplication.war
8 Jan, 2011 7:08:55 PM org.apache.catalina.loader.WebappClassLoader validateJarFile
INFO: validateJarFile(/home/prt/Desktop/apache-tomcat-6.0.29/webapps/jsfApplication/WEB-INF/lib/servlet-api.jar) - jar not loaded. See Servlet Spec 2.3, section 9.7.2. Offending class: javax/servlet/Servlet.class
8 Jan, 2011 7:08:55 PM org.apache.catalina.core.StandardContext addApplicationListener
INFO: The listener "com.sun.faces.config.ConfigureListener" is already configured for this context. The duplicate definition has been ignored.
8 Jan, 2011 7:08:58 PM org.apache.catalina.core.StandardContext start
SEVERE: Error listenerStart
8 Jan, 2011 7:08:58 PM org.apache.catalina.core.StandardContext start
SEVERE: Context [/jsfApplication] startup failed due to previous errors
8 Jan, 2011 7:08:58 PM org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc
SEVERE: The web application [/jsfApplication] registered the JDBC driver [com.mysql.jdbc.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
8 Jan, 2011 7:08:58 PM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/jsfApplication] appears to have started a thread named [Timer-0] but has failed to stop it. This is very likely to create a memory leak.
8 Jan, 2011 7:08:58 PM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/jsfApplication] appears to have started a thread named [MySQL Statement Cancellation Timer] but has failed to stop it. This is very likely to create a memory leak.
log4j:ERROR LogMananger.repositorySelector was null likely due to error in class reloading, using NOPLoggerRepository.
8 Jan, 2011 7:08:58 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory examples
8 Jan, 2011 7:08:58 PM org.apache.catalina.loader.WebappClassLoader loadClass
INFO: Illegal access: this web application instance has been stopped already. Could not load java.net.BindException. The eventual following stack trace is caused by an error thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access, and has no functional impact.
java.lang.IllegalStateException
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1531)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1491)
at com.mysql.jdbc.CommunicationsException.<init>(CommunicationsException.java:161)
at com.mysql.jdbc.MysqlIO.send(MysqlIO.java:2759)
at com.mysql.jdbc.MysqlIO.quit(MysqlIO.java:1410)
at com.mysql.jdbc.Connection.realClose(Connection.java:4947)
at com.mysql.jdbc.Connection.cleanup(Connection.java:2063)
at com.mysql.jdbc.Connection.finalize(Connection.java:3403)
at java.lang.ref.Finalizer.invokeFinalizeMethod(Native Method)
at java.lang.ref.Finalizer.runFinalizer(Finalizer.java:83)
at java.lang.ref.Finalizer.access$100(Finalizer.java:14)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:160)
8 Jan, 2011 7:08:58 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory docs
8 Jan, 2011 7:08:58 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory ROOT
8 Jan, 2011 7:08:58 PM org.apache.coyote.http11.Http11Protocol start
INFO: Starting Coyote HTTP/1.1 on http-8080
8 Jan, 2011 7:08:58 PM org.apache.jk.common.ChannelSocket init
INFO: JK: ajp13 listening on /0.0.0.0:8009
8 Jan, 2011 7:08:58 PM org.apache.jk.server.JkMain start
INFO: Jk running ID=0 time=0/24 config=null
8 Jan, 2011 7:08:58 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 3905 ms
I also ran into problems with myfaces-extensions and its x: tags. The conflict comes from the fact that myfaces-extensions is a very old solution, and if your other libraries are up to date the versions won't match. As posted in my answer at "tomahawk inputfileupload uploaded file is null", you should use the Tomahawk t: tags and the libraries related to them; the actual solution is posted in the main answer of that same link.
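For reference, a minimal sketch of the Java side of the Tomahawk approach, under a few assumptions: a hypothetical managed bean named uploadBean is bound on the page to <t:inputFileUpload value="#{uploadBean.uploadedFile}"/> inside a form with enctype="multipart/form-data", Tomahawk's ExtensionsFilter is mapped in web.xml, and the target path below is only an example:

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.myfaces.custom.fileupload.UploadedFile;

// Hypothetical backing bean; register it (e.g. as "uploadBean") in faces-config.xml.
public class UploadBean {

    private UploadedFile uploadedFile;

    public UploadedFile getUploadedFile() {
        return uploadedFile;
    }

    public void setUploadedFile(UploadedFile uploadedFile) {
        this.uploadedFile = uploadedFile;
    }

    // Action method for the form's submit button: copies the uploaded content to disk.
    public String save() throws IOException {
        InputStream in = uploadedFile.getInputStream();
        // getName() may include a client-side path on some browsers; this sketch ignores that.
        OutputStream out = new FileOutputStream("/tmp/" + uploadedFile.getName());
        try {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            out.close();
            in.close();
        }
        return "success";
    }
}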
I'm pretty sure this will do the trick. Please accept the answer if it resolves your question :)

Resources