KafkaLog4jAppender not pushing application logs to Kafka topic - log4j

I am pretty new to Kafka.
For a particular requirement I have to push my log4j logs directly to a Kafka topic.
I have a standalone Kafka installation running on CentOS, which I have verified with the bundled publisher and consumer clients. I am also using the bundled ZooKeeper instance.
I have created a standalone Java app with log4j logging enabled and edited its log4j.properties file as below:
log4j.rootCategory=INFO
log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.file.File=/home/edureka/Desktop/Anurag/logMe
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{UTC} %p %C %m%n
log4j.logger.com=INFO,file,KAFKA
#Kafka Appender
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{UTC} %p %C %m%n
log4j.appender.KAFKA.ProducerType=async
log4j.appender.KAFKA.BrokerList=localhost:2181
log4j.appender.KAFKA.Topic=test
log4j.appender.KAFKA.Serializer=kafka.test.AppenderStringSerializer
Now when I run the application, all the logs go into the local log file, but the consumer still shows no entries.
The topic I am using is test in both scenarios.
No error log is being generated either; the detailed logs of the log4j library are as below:
log4j: Trying to find [log4j.xml] using context classloader sun.misc.Launcher$AppClassLoader#a1d92a.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader#a1d92a class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader sun.misc.Launcher$AppClassLoader#a1d92a.
log4j: Using URL [file:/home/edureka/workspace/TestKafkaLog4J/bin/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/home/edureka/workspace/TestKafkaLog4J/bin/log4j.properties
log4j: Parsing for [root] with value=[DEBUG, stdout, file].
log4j: Level token is [DEBUG].
log4j: Category root set to DEBUG
log4j: Parsing appender named "stdout".
log4j: Parsing layout options for "stdout".
log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n].
log4j: End of parsing for "stdout".
log4j: Setting property [target] to [System.out].
log4j: Parsed "stdout" options.
log4j: Parsing appender named "file".
log4j: Parsing layout options for "file".
log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n].
log4j: End of parsing for "file".
log4j: Setting property [file] to [/home/edureka/Desktop/Anurag/logMe].
log4j: Setting property [maxBackupIndex] to [10].
log4j: Setting property [maxFileSize] to [5MB].
log4j: setFile called: /home/edureka/Desktop/Anurag/logMe, true
log4j: setFile ended
log4j: Parsed "file" options.
log4j: Finished configuring.
2015-05-11 19:44:40 DEBUG TestMe:19 - This is debug : anurag
2015-05-11 19:44:40 INFO TestMe:23 - This is info : anurag
2015-05-11 19:44:40 WARN TestMe:26 - This is warn : anurag
2015-05-11 19:44:40 ERROR TestMe:27 - This is error : anurag
2015-05-11 19:44:40 FATAL TestMe:28 - This is fatal : anurag
2015-05-11 19:44:40 INFO TestMe:29 - message from log4j appender
Any help will be really great.
Thanks,
AJ

In your output, I don't see the KAFKA appender being created, so no wonder nothing is logged to Kafka. I'm guessing the reason for that is that you only log from a class named TestMe (probably in the default package), while the KAFKA appender is only added to the logger named "com".
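As a hedged sketch of that fix (assuming the class really does log via a logger outside the "com" hierarchy, and that a Kafka broker listens on localhost:9092 — note that BrokerList in the question points at the ZooKeeper port 2181, which is a second problem), attaching the appender to the root logger would look like:

```properties
# Hypothetical log4j.properties fragment: attach both the file appender
# and the Kafka appender to the root logger so every class logs to both.
log4j.rootLogger=INFO, file, KAFKA

log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{UTC} %p %C %m%n
# BrokerList must name a Kafka broker (default port 9092), not ZooKeeper (2181).
log4j.appender.KAFKA.BrokerList=localhost:9092
log4j.appender.KAFKA.Topic=test
```

The exact property names depend on the Kafka version bundling the appender, so treat this as a sketch rather than a drop-in file.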

Related

Why doesn't org.apache.log4j.RollingFileAppender create the archive files?

This is Log4j 1.2.17. I have file:/C:/workspaces/gitlab/QMT/bin/log4j.properties on the classpath. Its contents are:
log4j.rootLogger=DEBUG, stdout, queuesLog
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Threshold=INFO
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %C{1}:%M:%L - %m%n
log4j.appender.queuesLog=org.apache.log4j.RollingFileAppender
log4j.appender.queuesLog.Threshold=DEBUG
log4j.appender.queuesLog.MaxFileSize=1MB
log4j.appender.queuesLog.MaxBackupIndex=10
log4j.appender.queuesLog.layout=org.apache.log4j.PatternLayout
log4j.appender.queuesLog.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %C{1}:%M:%L - %m%n
log4j.appender.queuesLog.File=C:/QEU/logs/queues.log
So when I saw that all of the archive files (e.g. queues.log.1) were missing, I concluded that Log4j wasn't obeying this configuration file. I checked several things, whose details I don't believe are necessary for this question; those checks told me that Log4j should be obeying the above configuration file. Finally, I learned about -Dlog4j.debug.
log4j: Trying to find [log4j.xml] using context classloader sun.misc.Launcher$AppClassLoader#659e0bfd.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader#659e0bfd class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader sun.misc.Launcher$AppClassLoader#659e0bfd.
log4j: Using URL [file:/C:/workspaces/gitlab/QMT/bin/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/C:/workspaces/gitlab/QMT/bin/log4j.properties
log4j: Parsing for [root] with value=[DEBUG, stdout, queuesLog].
log4j: Level token is [DEBUG].
log4j: Category root set to DEBUG
log4j: Parsing appender named "stdout".
log4j: Parsing layout options for "stdout".
log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss} %-5p %C{1}:%M:%L - %m%n].
log4j: End of parsing for "stdout".
log4j: Setting property [threshold] to [INFO].
log4j: Parsed "stdout" options.
log4j: Parsing appender named "queuesLog".
log4j: Parsing layout options for "queuesLog".
log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss} %-5p %C{1}:%M:%L - %m%n].
log4j: End of parsing for "queuesLog".
log4j: Setting property [file] to [C:/QEU/logs/queues.log].
log4j: Setting property [maxBackupIndex] to [10].
log4j: Setting property [maxFileSize] to [1MB].
log4j: Setting property [threshold] to [DEBUG].
log4j: setFile called: C:/QEU/logs/queues.log, true
log4j: setFile ended
log4j: Parsed "queuesLog" options.
log4j: Finished configuring.
log4j: rolling over count=1025467879
log4j: maxBackupIndex=10
log4j: Renaming file C:\QEU\logs\queues.log to C:\QEU\logs\queues.log.1
log4j: setFile called: C:/QEU/logs/queues.log, true
log4j: setFile ended
This confirms that file:/C:/workspaces/gitlab/QMT/bin/log4j.properties is in use. Furthermore,
log4j: Renaming file C:\QEU\logs\queues.log to C:\QEU\logs\queues.log.1
is explicit, but the archive files aren't there!
queues.log grows until the operating system doesn't like it anymore, which happens at 4 GiB.
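One mechanism worth knowing here (a hedged aside, not a confirmed diagnosis): Log4j 1.x performs the rollover with java.io.File#renameTo, which reports failure only through its boolean return value rather than an exception, and the "Renaming file" debug line is printed before the rename is attempted. So a rename that fails — permissions, antivirus, or a handle held open on Windows — can be announced and yet never happen:

```java
import java.io.File;
import java.io.IOException;

// Hedged sketch: java.io.File#renameTo signals failure only via its
// return value, which is how a log4j 1.x rollover can "announce" a
// rename that never actually takes place.
public class RenameDemo {
    public static void main(String[] args) throws IOException {
        File src = File.createTempFile("queues", ".log");
        File dst = new File(src.getParent(), src.getName() + ".1");
        System.out.println("Renaming " + src + " to " + dst); // printed regardless of outcome
        boolean renamed = src.renameTo(dst);                  // no exception on failure
        System.out.println("renamed=" + renamed);
        dst.delete(); // clean up the demo files
        src.delete();
    }
}
```

On a writable temp directory the rename succeeds; on a locked or read-only target it would print renamed=false with no stack trace, matching the silent symptom above.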

In a Java application, if all three Log4j APIs exist on the classpath, how do I choose which one takes effect?

In a Cloudera Spark-on-YARN installation, I encounter an application that has multiple log4j APIs in its classpath:
log4j:1.2
log4j-1.2-api:2.x
log4j-core:2.x
When the application is launched, I see the following error:
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
It cannot be fixed because the vendor of the platform exclusively uses the log4j:1.2 configuration everywhere: it has integrated a managed log4j.properties file (not log4j2.properties), which is on the classpath, used by many components, and cannot be tampered with.
How do I tell the classloader or the Log4j implementation not to look for a log4j2 configuration file when it is missing?
UPDATE 1: I've tried the following Spark arguments to customise the driver JVM. Effects are as follows:
--conf spark.driver.extraJavaOptions="-XX:+UseG1GC -Dlog4j1.compatibility=true -Dlog4j.configuration=./conf_5/log4j.properties -Dlog4j.debug"
result is:
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console. Set system property 'org.apache.logging.log4j.simplelog.StatusLogger.level' to TRACE to show Log4j2 internal initialization logging.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
log4j: setFile called: /tmp/spark-b190d9df-64bf-4ff6-b72d-2cb8de1d888a/__driver_logs__/driver.log, true
log4j: setFile ended
log4j: Finalizing appender named [_DriverLogAppender].
--conf spark.driver.extraJavaOptions="-XX:+UseG1GC -Dlog4j1.compatibility=true -Dlog4j.configurationFile=./conf_5/log4j.properties -Dlog4j.debug"
result is:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
log4j: setFile called: /tmp/spark-817bc6eb-bcc0-4cee-ba32-dbd4d89048e5/__driver_logs__/driver.log, true
log4j: setFile ended
log4j: Finalizing appender named [_DriverLogAppender].
So neither of them works properly. There seems to be no way either to use the right version or to point it at the existing configuration file.
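A small diagnostic that can help in this situation (a sketch, not a fix): ask the JVM which jar each logging class actually resolves from, since with log4j-1.2-api on the classpath the org.apache.log4j classes may come from the 2.x bridge rather than from log4j:1.2:

```java
// Hedged diagnostic sketch: report where each candidate Log4j class
// resolves from on the current classpath. Absent classes are reported
// instead of crashing the check.
public class Log4jWhich {
    public static void main(String[] args) {
        String[] candidates = {
            "org.apache.log4j.Logger",                    // log4j 1.2 or the 1.2 bridge
            "org.apache.logging.log4j.LogManager",        // log4j-api 2.x
            "org.apache.logging.log4j.core.LoggerContext" // log4j-core 2.x
        };
        for (String name : candidates) {
            try {
                Class<?> c = Class.forName(name);
                Object src = c.getProtectionDomain().getCodeSource();
                System.out.println(name + " -> " + (src == null ? "bootstrap/unknown" : src));
            } catch (ClassNotFoundException e) {
                System.out.println(name + " -> not on classpath");
            }
        }
    }
}
```

If org.apache.log4j.Logger resolves from log4j-1.2-api, the 2.x engine is in charge and only a log4j2 configuration (or -Dlog4j1.compatibility support, where available) will be honored.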

k8s spark executor not able to parse log4j rootLogger Level

In my k8s Spark application, I would like to change the log4j log level in the executor pods.
In the log4j properties file I have set rootLogger to WARN, but in the executor pods I can see it is still being parsed as INFO.
log4j.properties:
log4j.rootLogger=WARN,console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d %-5p [%t] %c{2}(%L): %m%n
spark-submit:
spark-submit \
--master k8s://.. \
--deploy-mode client \
--conf spark.driver.extraJavaOptions='-Dlog4j.debug=true -Dlog4j.configuration=file:///opt/log4j.properties'\
--conf spark.executor.extraJavaOptions='-Dlog4j.debug=true -Dlog4j.configuration=file:///opt/log4j.properties'\
--class org.log4jTestRunner \
logger-test-1.0.jar
Driver logs:
log4j: Using URL [file:/opt/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/opt/log4j.properties
log4j: Parsing for [root] with value=[WARN,console].
log4j: Level token is [WARN].
log4j: Category root set to WARN
log4j: Parsing appender named "console".
log4j: Parsing layout options for "console".
log4j: Setting property [conversionPattern] to [%d %-5p [%t] %c{2}(%L): %m%n].
log4j: End of parsing for "console".
log4j: Setting property [target] to [System.err].
log4j: Parsed "console" options.
Executor logs:
log4j: Using URL [file:/opt/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/opt/log4j.properties
log4j: Parsing for [root] with value=[INFO,console].
log4j: Level token is [INFO].
log4j: Category root set to INFO
log4j: Parsing appender named ""console"".
log4j: Parsing layout options for ""console"".
log4j: Setting property [conversionPattern] to [%d %-5p [%t] %c{2}(%L): %m%n].
log4j: End of parsing for ""console"".
log4j: Setting property [target] to [System.err].
log4j: Parsed ""console"" options.
I can see the driver parsing it correctly, and the log4j log level is respected.
I am not sure why the executors behave differently.
I am using k8s with Spark 3.x.
Thank you in advance.

log4j RollingFileAppender - issue with file rename

I have a web application hosted in WebSphere App Server. It uses log4j version 1.2.15 for logging, and below is my log4j config. The log file is not rotating as soon as it reaches the threshold, although the log4j debugger says it is renaming to a new file. The same configuration works fine in another environment. Please note that only a single JVM instance is running in both of my server environments, so the log file cannot be locked by a different JVM.
log4j.properties
log4j.appender.local=org.apache.log4j.RollingFileAppender
log4j.appender.local.MaxFileSize=1MB
log4j.appender.local.MaxBackupIndex=10
log4j.appender.local.File=${applogs.home}\\web-app.log
log4j.appender.local.layout=org.apache.log4j.PatternLayout
log4j.appender.local.layout.ConversionPattern=%d\t%r\t%p\t%c\t%m%n
log4j.appender.local.Threshold=DEBUG
system.out log
[1/9/20 15:41:42:520 EST] 00000086 SystemOut O log4j: rolling over count=1048745
[1/9/20 15:41:42:536 EST] 00000086 SystemOut O log4j: maxBackupIndex=10
[1/9/20 15:41:42:551 EST] 00000086 SystemOut O log4j: Renaming file <nas_path_of_server>\web-app.log to <nas_path_of_server>\web-app.log.1
[1/9/20 15:41:42:551 EST] 00000086 SystemOut O log4j: setFile called: <nas_path_of_server>\web-app.log, true
[1/9/20 15:41:42:551 EST] 00000086 SystemOut O log4j: setFile ended
Resolved
I was able to fix the issue: it turned out that nas_path_of_server did not have modify permission for the application service ID, which caused this behavior.
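For anyone hitting the same symptom, a quick hedged check of whether the process can actually create and rename files in the log directory — the operations a rollover needs (the directory argument here is a placeholder):

```java
import java.io.File;
import java.io.IOException;

// Hedged sketch: probe the log directory for the create/rename/delete
// permissions a log4j rollover depends on.
public class LogDirCheck {
    public static void main(String[] args) throws IOException {
        File dir = new File(args.length > 0 ? args[0] : System.getProperty("java.io.tmpdir"));
        File probe = File.createTempFile("rollover-probe", ".log", dir);
        File renamed = new File(dir, probe.getName() + ".1");
        boolean canRename = probe.renameTo(renamed); // the exact operation that fails silently
        System.out.println("dir=" + dir + " canWrite=" + dir.canWrite() + " canRename=" + canRename);
        renamed.delete(); // clean up whichever name the probe ended with
        probe.delete();
    }
}
```

Running this under the application's service account against the NAS path would have surfaced the missing modify permission immediately.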

When using an Oozie Spark action, why can the Spark driver find the custom log4j config file when spawned on one node, but not on the other nodes?

Running a Spark action workflow with Oozie has a problem.
If the driver is spawned on node 172.12.0.27, the log config is always right.
If the driver is spawned on either of the other nodes (172.12.0.18, 172.12.0.20), the log config is always wrong.
Running the same job with spark-submit does not show this problem: the driver works correctly on every node.
How can I track down the problem?
There are three nodes (172.12.0.27, 172.12.0.18, 172.12.0.20). With spark-submit and the custom log4j config file, it works on all of them:
spark-submit --master yarn --deploy-mode cluster --driver-memory 1g --num-executors 4 --executor-memory 1g --files "/root/alenym/log4j.properties" --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties -Dtb.spark.prod.env=true" --class com.stc.data.thingsboard.jobs.example.TestLogJob /root/alenym/dp_advance_analysis/bigdata/tb-sql-analysis/target/tb-sql-analysis-1.0.0-SNAPSHOT.jar
The Oozie workflow, however, goes wrong. workflow.xml looks like this:
<workflow-app xmlns='uri:oozie:workflow:0.5' name='spark-test'>
<start to='spark-node' />
<action name='spark-node'>
<spark xmlns="uri:oozie:spark-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>${nameNode}/user/${wf:user()}/${examplesRoot}/spark/hive-site.xml</job-xml>
<master>${master}</master>
<mode>cluster</mode>
<name>spark-test-oozie</name>
<class>com.stc.data.thingsboard.jobs.example.TestLogJob</class>
<jar>${nameNode}/user/${wf:user()}/${examplesRoot}/spark/lib/tb-sql-analysis-1.0.0-SNAPSHOT.jar</jar>
<spark-opts>--driver-memory 1g --num-executors 10 --executor-memory 1g --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-ym.properties -Dtb.spark.prod.env=true </spark-opts>
</spark>
<ok to="end" />
<error to="fail" />
</action>
<kill name="fail">
<message>Workflow failed, error
message[${wf:errorMessage(wf:lastErrorNode())}]
</message>
</kill>
<end name='end' />
</workflow-app>
job.properties
nameNode=hdfs://HDFS80599
jobTracker=rm1
master=yarn
queueName=default
examplesRoot=batchtest
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/spark
The custom "log4j-ym.properties" file is in the "./lib" directory, so --files includes
hdfs://HDFS80599/user/root/batchtest/spark/lib/log4j-ym.properties#log4j-ym.properties,
Log Type: stdout
Log Upload Time: Mon Aug 26 10:05:44 +0800 2019
Log Length: 261750
Oozie Launcher starts
Oozie Java/Map-Reduce/Pig action launcher-job configuration
=================================================================
Workflow job id : 0000993-190728182827383-oozie-hado-W
Workflow action id: 0000993-190728182827383-oozie-hado-W#spark-node
Classpath :
------------------------
...
...
------------------------
Main class : org.apache.oozie.action.hadoop.SparkMain
Maximum output : 2048
Arguments :
Java System Properties:
------------------------
#
#Mon Aug 26 10:04:54 CST 2019
java.runtime.name=Java(TM) SE Runtime Environment
sun.boot.library.path=/usr/local/jdk/jre/lib/amd64
java.vm.version=25.191-b12
oozie.action.externalChildIDs=/data/emr/yarn/local/usercache/root/appcache/application_1565412953433_3795/container_e22_1565412953433_3795_01_000002/externalChildIDs
hadoop.root.logger=INFO,CLA
java.vm.vendor=Oracle Corporation
java.vendor. url=/emr-yarn-jobhistory/http://172.21.0.48:5024/http\://java.oracle.com/
path.separator=\:
java.vm.name=Java HotSpot(TM) 64-Bit Server VM
file.encoding.pkg=sun.io
oozie.job.launch.time=1566785085000
user.country=US
sun.java.launcher=SUN_STANDARD
sun.os.patch.level=unknown
java.vm.specification.name=Java Virtual Machine Specification
user.dir=/data/emr/yarn/local/usercache/root/appcache/application_1565412953433_3795/container_e22_1565412953433_3795_01_000002
oozie.action.newId=/data/emr/yarn/local/usercache/root/appcache/application_1565412953433_3795/container_e22_1565412953433_3795_01_000002/newId
java.runtime.version=1.8.0_191-b12
java.awt.graphicsenv=sun.awt.X11GraphicsEnvironment
java.endorsed.dirs=/usr/local/jdk/jre/lib/endorsed
os.arch=amd64
oozie.job.id=0000993-190728182827383-oozie-hado-W
oozie.action.id=0000993-190728182827383-oozie-hado-W#spark-node
yarn.app.container.log.dir=/data/emr/yarn/logs/application_1565412953433_3795/container_e22_1565412953433_3795_01_000002
java.io.tmpdir=./tmp
...
...
>>> Invoking Main class now >>>
Fetching child yarn jobs
tag id : oozie-e1db067250aafedb4df7ee644cd82ab4
Child yarn jobs are found -
Warning: Spark Log4J settings are overwritten. Child job IDs may not be available
Spark Version 2.3
Spark Action Main class : org.apache.spark.deploy.SparkSubmit
Oozie Spark action configuration
=================================================================
--master
yarn
--deploy-mode
cluster
--name
spark-test-oozie
--class
com.stc.data.thingsboard.jobs.example.TestLogJob
--conf
spark.oozie.action.id=0000993-190728182827383-oozie-hado-W#spark-node
--conf
spark.oozie.child.mapreduce.job.tags=oozie-e1db067250aafedb4df7ee644cd82ab4
--conf
spark.oozie.action.rootlogger.log.level=INFO
--conf
spark.oozie.job.id=0000993-190728182827383-oozie-hado-W
--conf
spark.oozie.action.spark.setup.hadoop.conf.dir=false
--conf
spark.oozie.HadoopAccessorService.created=true
--driver-memory
1g
--num-executors
10
--executor-memory
1g
--conf
spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-ym.properties -Dtb.spark.prod.env=true
--conf
spark.executor.extraClassPath=$PWD/*
--conf
spark.driver.extraClassPath=$PWD/*
--conf
spark.yarn.tags=oozie-e1db067250aafedb4df7ee644cd82ab4
--conf
spark.yarn.security.tokens.hadoopfs.enabled=false
--conf
spark.yarn.security.tokens.hive.enabled=false
--conf
spark.yarn.security.tokens.hbase.enabled=false
--conf
spark.yarn.security.credentials.hadoopfs.enabled=false
--conf
spark.yarn.security.credentials.hive.enabled=false
--conf
spark.yarn.security.credentials.hbase.enabled=false
--conf
spark.executor.extraJavaOptions=-Dlog4j.configuration=spark-log4j.properties
--files
hdfs://HDFS80599/user/root/batchtest/spark/lib/breeze_2.11-0.13.2.jar,...,hdfs://HDFS80599/user/root/batchtest/spark/lib/calcite-core-1.2.0-incubating.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/calcite-linq4j-1.2.0-incubating.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/chill-java-0.8.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/chill_2.11-0.8.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/commons-beanutils-1.7.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/commons-beanutils-core-1.8.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/commons-cli-1.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/commons-codec-1.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/commons-collections-3.2.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/commons-compiler-3.0.8.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/commons-compress-1.4.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/commons-compress-1.9.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/commons-configuration-1.6.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/commons-crypto-1.0.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/datanucleus-core-4.1.17.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/datanucleus-rdbms-4.1.19.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/derby-10.10.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/disruptor-3.3.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/eigenbase-properties-1.1.5.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/fastutil-6.5.6.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/findbugs-annotations-1.3.9-1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/flume-ng-configuration-1.6.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/flume-ng-core-1.6.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/flume-ng-sdk-1.6.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/geronimo-an
notation_1.0_spec-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/geronimo-jaspic_1.0_spec-1.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/geronimo-jta_1.1_spec-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/gson-2.7.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/guava-11.0.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/guava-14.0.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/guice-3.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/guice-assistedinject-3.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/guice-servlet-3.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-common-2.8.4-tests.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-common-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-hdfs-2.8.4-tests.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-hdfs-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-hdfs-client-2.8.4-tests.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-hdfs-client-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-hdfs-native-client-2.8.4-tests.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-hdfs-native-client-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-hdfs-nfs-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-mapreduce-client-app-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-mapreduce-client-common-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-mapreduce-client-core-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-mapreduce-client-hs-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-mapreduce-client-hs-plugins-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-mapreduce-client-jobclient-2.8.4-tests.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-mapreduce-client-jobclient-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-mapreduce-client-shuffle-2.8.4.jar,hdfs://HDFS8
0599/user/root/batchtest/spark/lib/hadoop-mapreduce-examples-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-nfs-2.8.4.jar,hdfs://HDFS80599/user/hadoop/share/lib/lib_20190728182750/oozie/hadoop-temrfs-1.0.6.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-api-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-applications-distributedshell-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-applications-unmanaged-am-launcher-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-client-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-common-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-registry-2.7.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-registry-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-server-applicationhistoryservice-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-server-common-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-server-nodemanager-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-server-resourcemanager-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-server-sharedcachemanager-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-server-tests-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-server-timeline-pluginstorage-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hadoop-yarn-server-web-proxy-2.8.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hbase-annotations-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hbase-client-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hbase-common-1.1.1-tests.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hbase-common-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hbase-hadoop-compat-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hbase-hadoop2-compat-1.1.1.jar,hdfs://HDFS80599
/user/root/batchtest/spark/lib/hbase-prefix-tree-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hbase-procedure-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hbase-protocol-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hbase-server-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hive-beeline-1.2.1.spark2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hive-cli-1.2.1.spark2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hive-exec-1.2.1.spark2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hive-hbase-handler-2.3.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hive-hcatalog-core-2.3.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hive-jdbc-1.2.1.spark2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hive-metastore-1.2.1.spark2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hk2-api-2.4.0-b34.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hk2-locator-2.4.0-b34.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/hk2-utils-2.4.0-b34.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/htrace-core-3.0.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/htrace-core-3.1.0-incubating.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/httpclient-4.3.6.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/httpcore-4.3.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/ivy-2.4.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jackson-annotations-2.6.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jackson-annotations-2.6.5.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jackson-core-2.6.5.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jackson-core-asl-1.9.13.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jackson-databind-2.6.5.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jackson-jaxrs-1.9.13.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jackson-mapper-asl-1.9.13.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jackson-module-paranamer-2.6.5.jar,hdfs://HDFS80599/user/root/batchtest
/spark/lib/jackson-module-scala_2.11-2.6.5.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jackson-xc-1.9.13.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jamon-runtime-2.3.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/janino-3.0.8.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jasper-compiler-5.5.23.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jasper-runtime-5.5.23.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/java-xmlbuilder-0.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/java-xmlbuilder-1.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/javassist-3.18.1-GA.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/javax.annotation-api-1.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/javax.inject-1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/javax.inject-2.4.0-b34.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/javax.jdo-3.2.0-m3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/javax.servlet-api-3.1.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/javax.ws.rs-api-2.0.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/javolution-5.5.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jaxb-api-2.2.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jaxb-impl-2.2.3-1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jcl-over-slf4j-1.7.16.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jcodings-1.0.8.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jcommander-1.30.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jdo-api-3.0.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jersey-client-1.9.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jersey-client-2.22.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jersey-common-2.22.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jersey-container-servlet-2.22.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jersey-container-servlet-core-2.22.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jersey-core-1.9.jar,hdfs://HDFS80599/user/root/bat
chtest/spark/lib/jersey-guava-2.22.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jersey-json-1.9.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jersey-media-jaxb-2.22.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jersey-server-1.9.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jersey-server-2.22.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jets3t-0.9.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jets3t-0.9.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jettison-1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jetty-6.1.14.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jetty-all-7.6.0.v20120127.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jetty-sslengine-6.1.26.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jetty-util-6.1.26.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jline-0.9.94.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jline-2.12.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/joda-time-2.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jodd-core-3.5.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/joni-2.1.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jpam-1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jsch-0.1.42.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/json-1.8.jar,hdfs://HDFS80599/user/hadoop/share/lib/lib_20190728182750/oozie/json-simple-1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/json4s-ast_2.11-3.2.11.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/json4s-core_2.11-3.2.11.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/json4s-jackson_2.11-3.2.11.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jsp-2.1-6.1.14.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jsp-api-2.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jsp-api-2.1-6.1.14.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jsr305-1.3.9.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jsr305-3.0.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jta-1.1.jar
,hdfs://HDFS80599/user/root/batchtest/spark/lib/jtransforms-2.4.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/jul-to-slf4j-1.7.16.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/kafka-clients-0.8.2.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/kafka_2.11-0.8.2.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/kryo-shaded-3.0.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/leveldbjni-all-1.8.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/libfb303-0.9.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/libthrift-0.9.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/log4j-1.2-api-2.6.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/log4j-1.2.17.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/log4j-api-2.6.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/log4j-core-2.6.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/log4j-web-2.6.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/log4j-ym.properties#log4j-ym.properties,hdfs://HDFS80599/user/root/batchtest/spark/lib/log4j2.xml#log4j2.xml,hdfs://HDFS80599/user/root/batchtest/spark/lib/lz4-java-1.4.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/machinist_2.11-0.6.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/macro-compat_2.11-1.1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/mail-1.4.7.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/metrics-core-2.2.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/metrics-core-3.1.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/metrics-graphite-3.1.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/metrics-json-3.1.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/metrics-json-3.1.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/metrics-jvm-3.1.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/metrics-jvm-3.1.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/mina-core-2.0.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/minlog-1.3.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/mx4j-3
.0.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/objenesis-2.1.jar,hdfs://HDFS80599/user/hadoop/share/lib/lib_20190728182750/oozie/oozie-hadoop-utils-hadoop-2-4.3.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/oozie-sharelib-hive2-4.3.1.jar,hdfs://HDFS80599/user/hadoop/share/lib/lib_20190728182750/oozie/oozie-sharelib-oozie-4.3.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/oozie-sharelib-spark-4.3.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/opencsv-2.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/orc-core-1.3.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/oro-2.0.8.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/osgi-resource-locator-1.0.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/paranamer-2.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/parquet-column-1.8.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/parquet-common-1.8.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/parquet-encoding-1.8.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/parquet-format-2.3.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/parquet-hadoop-1.8.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/parquet-hadoop-bundle-1.6.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/parquet-hadoop-bundle-1.8.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/parquet-jackson-1.8.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/pmml-model-1.2.15.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/pmml-schema-1.2.15.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/protobuf-java-2.5.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/py4j-0.10.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/pyrolite-4.13.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/scala-compiler-2.11.8.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/scala-library-2.11.8.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/scala-parser-combinators_2.11-1.0.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/scala-reflect-2.11.8.jar,hdfs
://HDFS80599/user/root/batchtest/spark/lib/scala-xml_2.11-1.0.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/scalap-2.11.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/servlet-api-2.5-6.1.14.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/shapeless_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/slf4j-api-1.7.16.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/slf4j-log4j12-1.7.16.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/slider-core-0.90.2-incubating.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/snappy-java-1.0.5.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/snappy-java-1.1.2.6.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-catalyst_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-core_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-graphx_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-hive-thriftserver_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-hive_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-kvstore_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-launcher_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-mllib-local_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-mllib_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-network-common_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-network-shuffle_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-repl_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-sketch_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-sql_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-streaming_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-tags_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-unsafe_2.11-2.3.2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spire-mac
ros_2.11-0.13.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/spire_2.11-0.13.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/stax-api-1.0-2.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/stream-2.7.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/stringtemplate-3.2.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/super-csv-2.2.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/tephra-api-0.6.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/tephra-core-0.6.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/tephra-hbase-compat-1.0-0.6.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/transaction-api-1.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/twill-api-0.6.0-incubating.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/twill-common-0.6.0-incubating.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/twill-core-0.6.0-incubating.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/twill-discovery-api-0.6.0-incubating.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/twill-discovery-core-0.6.0-incubating.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/twill-zookeeper-0.6.0-incubating.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/univocity-parsers-2.2.1.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/unused-1.0.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/validation-api-1.1.0.Final.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/xbean-asm5-shaded-4.4.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/xercesImpl-2.11.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/xml-apis-1.4.01.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/xmlenc-0.52.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/xz-1.0.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/zkclient-0.3.jar,hdfs://HDFS80599/user/root/batchtest/spark/lib/zookeeper-3.4.6.jar,spark-log4j.properties,hive-site.xml
--conf
spark.yarn.jars=hdfs://HDFS80599/user/root/batchtest/spark/lib/spark-yarn_2.11-2.3.2.jar
--verbose
hdfs://HDFS80599/user/root/batchtest/spark/lib/tb-sql-analysis-1.0.0-SNAPSHOT.jar
The expected result is below; the log output is proper UTF-8:
TestLogJob:41 - This is from log.info|zhong wen (中文)
Log Type: stderr
Log Upload Time: Mon Aug 26 10:05:38 +0800 2019
Log Length: 493
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/emr/yarn/local/filecache/0/34352/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/service/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Log Type: stdout
Log Upload Time: Mon Aug 26 10:05:38 +0800 2019
Log Length: 286
2019-08-26 10:05:05 INFO AbstractTbSparkSqlJob:130 - tb.spark.prod.env=true
2019-08-26 10:05:05 INFO AbstractTbSparkSqlJob:131 - tb.spark.test.env=false
2019-08-26 10:05:36 INFO TestLogJob:41 - This is from log.info|zhong wen (中文)
The wrong log output looks like the example below. The driver uses Spark's default log4j profile (org/apache/spark/log4j-defaults.properties), so the log content is not UTF-8:
TestLogJob:41 - This is from log.info|zhong wen (??)
Log Type: stderr
Log Upload Time: Mon Aug 26 10:05:37 +0800 2019
Log Length: 571
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/emr/yarn/local/filecache/41582/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/service/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Log Type: stdout
Log Upload Time: Mon Aug 26 10:05:37 +0800 2019
Log Length: 398
2019-08-26 10:05:05.284 [Driver] INFO com.stc.data.thingsboard.AbstractTbSparkSqlJob - tb.spark.prod.env=true
2019-08-26 10:05:05.288 [Driver] INFO com.stc.data.thingsboard.AbstractTbSparkSqlJob - tb.spark.test.env=false
2019-08-26 10:05:36.599 [Driver] INFO com.stc.data.thingsboard.jobs.example.TestLogJob - This is from log.info|zhong wen (??)
The Spark job is supposed to be launchable from any node, but your config file is not accessible from every node.
You can ship it with your Spark action using a <file> element (available since spark-action:0.2):
<action name='spark-node'>
<spark xmlns="uri:oozie:spark-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>${nameNode}/user/${wf:user()}/${examplesRoot}/spark/hive-site.xml</job-xml>
<master>${master}</master>
<mode>cluster</mode>
<name>spark-test-oozie</name>
<class>com.stc.data.thingsboard.jobs.example.TestLogJob</class>
<jar>${nameNode}/user/${wf:user()}/${examplesRoot}/spark/lib/tb-sql-analysis-1.0.0-SNAPSHOT.jar</jar>
<spark-opts>--driver-memory 1g --num-executors 10 --executor-memory 1g --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-ym.properties -Dtb.spark.prod.env=true </spark-opts>
<file>path/to/log4j-ym.properties#log4j-ym.properties</file>
</spark>
<ok to="end" />
<error to="fail" />
</action>
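For reference, a minimal sketch of what the shipped log4j-ym.properties could contain, reconstructed from the log4j debug output later in this thread (the UTF-8 encoding on the console appender is the key part; `com.stc.data` matches the job's package):

```properties
# Root logger: WARN, routed to the console appender named "stdout"
log4j.rootLogger=WARN, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
# UTF-8 encoding is what keeps non-ASCII characters (e.g. Chinese) intact
log4j.appender.stdout.Encoding=UTF-8
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

# Application packages at INFO
log4j.logger.com.stc.data=INFO
```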
I have resolved my problem, although not completely.
When I add -Dlog4j.debug=true, I can see more debug output from log4j.
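The debug flag is simply appended to the driver's extraJavaOptions in the <spark-opts> (a sketch; memory settings and property names as elsewhere in this thread):

```xml
<spark-opts>--driver-memory 1g --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-ym.properties -Dlog4j.debug=true</spark-opts>
```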
Below is the output from 172.12.0.27, where the log is correct.
Log Type: stderr
Log Upload Time: Mon Aug 26 17:07:39 +0800 2019
Log Length: 493
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/emr/yarn/local/filecache/0/36267/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/service/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Log Type: stdout
Log Upload Time: Mon Aug 26 17:07:39 +0800 2019
Log Length: 1445
log4j: Trying to find [log4j-ym.properties] using context classloader sun.misc.Launcher$AppClassLoader#629f0666.
log4j: Using URL [file:/data/emr/yarn/local/usercache/root/appcache/application_1565412953433_3905/container_e22_1565412953433_3905_01_000002/log4j-ym.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/data/emr/yarn/local/usercache/root/appcache/application_1565412953433_3905/container_e22_1565412953433_3905_01_000002/log4j-ym.properties
log4j: Parsing for [root] with value=[WARN, stdout].
log4j: Level token is [WARN].
log4j: Category root set to WARN
log4j: Parsing appender named "stdout".
log4j: Parsing layout options for "stdout".
log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n].
log4j: End of parsing for "stdout".
log4j: Setting property [encoding] to [UTF-8].
log4j: Setting property [target] to [System.out].
log4j: Parsed "stdout" options.
log4j: Parsing for [com.stc.data] with value=[INFO].
log4j: Level token is [INFO].
log4j: Category com.stc.data set to INFO
log4j: Handling log4j.additivity.com.stc.data=[null]
log4j: Finished configuring.
2019-08-26 17:07:06 INFO AbstractTbSparkSqlJob:130 - tb.spark.prod.env=true
2019-08-26 17:07:06 INFO AbstractTbSparkSqlJob:131 - tb.spark.test.env=false
2019-08-26 17:07:37 INFO TestLogJob:41 - This is from log.info|zhong wen (中文)
Below is the output from 172.12.0.20, where the log is not correct.
Log Type: stderr
Log Upload Time: Mon Aug 26 17:07:34 +0800 2019
Log Length: 19110
DEBUG StatusLogger Using ShutdownCallbackRegistry class org.apache.logging.log4j.core.util.DefaultShutdownCallbackRegistry
DEBUG StatusLogger Took 0.051332 seconds to load 209 plugins from sun.misc.Launcher$AppClassLoader#629f0666
DEBUG StatusLogger PluginManager 'Converter' found 44 plugins
DEBUG StatusLogger Starting OutputStreamManager SYSTEM_OUT.false.false-1
DEBUG StatusLogger Starting LoggerContext[name=629f0666, org.apache.logging.log4j.core.LoggerContext#3bf9ce3e]...
DEBUG StatusLogger Reconfiguration started for context[name=629f0666] at URI null (org.apache.logging.log4j.core.LoggerContext#3bf9ce3e) with optional ClassLoader: null
DEBUG StatusLogger PluginManager 'ConfigurationFactory' found 4 plugins
DEBUG StatusLogger Missing dependencies for Yaml support
DEBUG StatusLogger Using configurationFactory org.apache.logging.log4j.core.config.ConfigurationFactory$Factory#66d3eec0
TRACE StatusLogger Trying to find [log4j2-test629f0666.properties] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test629f0666.properties] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.properties] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.properties] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test629f0666.yml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test629f0666.yml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.yml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.yml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test629f0666.yaml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test629f0666.yaml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.yaml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.yaml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test629f0666.json] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test629f0666.json] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.json] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.json] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test629f0666.jsn] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test629f0666.jsn] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.jsn] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.jsn] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test629f0666.xml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test629f0666.xml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.xml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test629f0666.xml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test.properties] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test.properties] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.properties] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.properties] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test.yml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test.yml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.yml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.yml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test.yaml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test.yaml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.yaml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.yaml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test.json] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test.json] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.json] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.json] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test.jsn] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test.jsn] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.jsn] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.jsn] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2-test.xml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2-test.xml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.xml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2-test.xml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2629f0666.properties] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2629f0666.properties] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.properties] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.properties] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2629f0666.yml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2629f0666.yml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.yml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.yml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2629f0666.yaml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2629f0666.yaml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.yaml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.yaml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2629f0666.json] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2629f0666.json] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.json] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.json] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2629f0666.jsn] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2629f0666.jsn] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.jsn] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.jsn] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2629f0666.xml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2629f0666.xml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.xml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2629f0666.xml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2.properties] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2.properties] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2.properties] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2.properties] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2.yml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2.yml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2.yml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2.yml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2.yaml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2.yaml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2.yaml] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2.yaml] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2.json] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2.json] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2.json] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2.json] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2.jsn] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
TRACE StatusLogger Trying to find [log4j2.jsn] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2.jsn] using sun.misc.Launcher$AppClassLoader#629f0666 class loader.
TRACE StatusLogger Trying to find [log4j2.jsn] using ClassLoader.getSystemResource().
TRACE StatusLogger Trying to find [log4j2.xml] using context class loader sun.misc.Launcher$AppClassLoader#629f0666.
DEBUG StatusLogger Initializing configuration XmlConfiguration[location=jar:file:/data/emr/yarn/local/filecache/42404/tb-sql-analysis-1.0.0-SNAPSHOT.jar!/log4j2.xml]
DEBUG StatusLogger Installed script engines
DEBUG StatusLogger Scala Interpreter Version: 1.0, Language: Scala, Threading: Not Thread Safe, Compile: true, Names: {scala}
DEBUG StatusLogger Oracle Nashorn Version: 1.8.0_191, Language: ECMAScript, Threading: Not Thread Safe, Compile: true, Names: {nashorn, Nashorn, js, JS, JavaScript, javascript, ECMAScript, ecmascript}
DEBUG StatusLogger PluginManager 'Core' found 119 plugins
DEBUG StatusLogger PluginManager 'Level' found 0 plugins
DEBUG StatusLogger No scheduled items
DEBUG StatusLogger PluginManager 'Lookup' found 14 plugins
DEBUG StatusLogger Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
TRACE StatusLogger TypeConverterRegistry initializing.
DEBUG StatusLogger PluginManager 'TypeConverter' found 26 plugins
DEBUG StatusLogger PatternLayout$Builder(pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n", PatternSelector=null, Configuration(jar:file:/data/emr/yarn/local/filecache/42404/tb-sql-analysis-1.0.0-SNAPSHOT.jar!/log4j2.xml), Replace=null, charset="null", alwaysWriteExceptions="null", noConsoleNoAnsi="null", header="null", footer="null")
DEBUG StatusLogger PluginManager 'Converter' found 44 plugins
DEBUG StatusLogger Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.ConsoleAppender].
DEBUG StatusLogger PluginManager 'Converter' found 44 plugins
DEBUG StatusLogger Starting OutputStreamManager SYSTEM_OUT.false.false-2
DEBUG StatusLogger ConsoleAppender$Builder(PatternLayout(%d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n), Filter=null, target="SYSTEM_OUT", name="Console", follow="null", direct="null", ignoreExceptions="null")
DEBUG StatusLogger Starting OutputStreamManager SYSTEM_OUT.false.false
DEBUG StatusLogger Building Plugin[name=appenders, class=org.apache.logging.log4j.core.config.AppendersPlugin].
DEBUG StatusLogger createAppenders(={Console})
DEBUG StatusLogger Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
DEBUG StatusLogger createAppenderRef(ref="Console", level="null", Filter=null)
DEBUG StatusLogger Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig].
DEBUG StatusLogger createLogger(additivity="false", level="INFO", name="com.stc.data", includeLocation="null", ={Console}, ={}, Configuration(jar:file:/data/emr/yarn/local/filecache/42404/tb-sql-analysis-1.0.0-SNAPSHOT.jar!/log4j2.xml), Filter=null)
DEBUG StatusLogger Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
DEBUG StatusLogger createAppenderRef(ref="Console", level="null", Filter=null)
DEBUG StatusLogger Building Plugin[name=root, class=org.apache.logging.log4j.core.config.LoggerConfig$RootLogger].
DEBUG StatusLogger createLogger(additivity="null", level="ERROR", includeLocation="null", ={Console}, ={}, Configuration(jar:file:/data/emr/yarn/local/filecache/42404/tb-sql-analysis-1.0.0-SNAPSHOT.jar!/log4j2.xml), Filter=null)
DEBUG StatusLogger Building Plugin[name=loggers, class=org.apache.logging.log4j.core.config.LoggersPlugin].
DEBUG StatusLogger createLoggers(={com.stc.data, root})
DEBUG StatusLogger Configuration XmlConfiguration[location=jar:file:/data/emr/yarn/local/filecache/42404/tb-sql-analysis-1.0.0-SNAPSHOT.jar!/log4j2.xml] initialized
DEBUG StatusLogger Starting configuration XmlConfiguration[location=jar:file:/data/emr/yarn/local/filecache/42404/tb-sql-analysis-1.0.0-SNAPSHOT.jar!/log4j2.xml]
DEBUG StatusLogger Started configuration XmlConfiguration[location=jar:file:/data/emr/yarn/local/filecache/42404/tb-sql-analysis-1.0.0-SNAPSHOT.jar!/log4j2.xml] OK.
TRACE StatusLogger Stopping org.apache.logging.log4j.core.config.DefaultConfiguration#128d2484...
TRACE StatusLogger DefaultConfiguration notified 1 ReliabilityStrategies that config will be stopped.
TRACE StatusLogger DefaultConfiguration stopping root LoggerConfig.
TRACE StatusLogger DefaultConfiguration notifying ReliabilityStrategies that appenders will be stopped.
TRACE StatusLogger DefaultConfiguration stopping remaining Appenders.
DEBUG StatusLogger Shutting down OutputStreamManager SYSTEM_OUT.false.false-1
TRACE StatusLogger DefaultConfiguration stopped 1 remaining Appenders.
TRACE StatusLogger DefaultConfiguration cleaning Appenders from 1 LoggerConfigs.
DEBUG StatusLogger Stopped org.apache.logging.log4j.core.config.DefaultConfiguration#128d2484 OK
TRACE StatusLogger Reregistering MBeans after reconfigure. Selector=org.apache.logging.log4j.core.selector.ClassLoaderContextSelector#20765ed5
TRACE StatusLogger Reregistering context (1/1): '629f0666' org.apache.logging.log4j.core.LoggerContext#3bf9ce3e
TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=629f0666'
TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=629f0666,component=StatusLogger'
TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=629f0666,component=ContextSelector'
TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=629f0666,component=Loggers,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=629f0666,component=Appenders,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=629f0666,component=AsyncAppenders,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=629f0666,component=AsyncLoggerRingBuffer'
TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=629f0666,component=Loggers,name=*,subtype=RingBuffer'
DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=629f0666
DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=629f0666,component=StatusLogger
DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=629f0666,component=ContextSelector
DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=629f0666,component=Loggers,name=
DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=629f0666,component=Loggers,name=com.stc.data
DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=629f0666,component=Appenders,name=Console
TRACE StatusLogger Using default SystemClock for timestamps.
TRACE StatusLogger Using DummyNanoClock for nanosecond timestamps.
DEBUG StatusLogger Reconfiguration complete for context[name=629f0666] at URI jar:file:/data/emr/yarn/local/filecache/42404/tb-sql-analysis-1.0.0-SNAPSHOT.jar!/log4j2.xml (org.apache.logging.log4j.core.LoggerContext#3bf9ce3e) with optional ClassLoader: null
DEBUG StatusLogger Shutdown hook enabled. Registering a new one.
DEBUG StatusLogger LoggerContext[name=629f0666, org.apache.logging.log4j.core.LoggerContext#3bf9ce3e] started OK.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/emr/yarn/local/filecache/42556/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/service/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Log Type: stdout
Log Upload Time: Mon Aug 26 17:07:34 +0800 2019
Log Length: 398
2019-08-26 17:07:01.622 [Driver] INFO com.stc.data.thingsboard.AbstractTbSparkSqlJob - tb.spark.prod.env=true
2019-08-26 17:07:01.650 [Driver] INFO com.stc.data.thingsboard.AbstractTbSparkSqlJob - tb.spark.test.env=false
2019-08-26 17:07:32.600 [Driver] INFO com.stc.data.thingsboard.jobs.example.TestLogJob - This is from log.info|zhong wen (??)
This is from System.out.println |zhong wen (??)
So I found that the other nodes always try to find a log4j 2 config file. I don't know why, but when I add -Dlog4j.configurationFile (the log4j 2 configuration property), the other two nodes log UTF-8 characters correctly.
<spark-opts>--driver-memory 1g --num-executors 10 --executor-memory 1g --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-ym.properties -Dtb.spark.prod.env=true -Dlog4j.configurationFile=log4j2-ym.xml</spark-opts>
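A sketch of what log4j2-ym.xml could contain, mirroring the bundled log4j2.xml seen in the StatusLogger output above but with an explicit UTF-8 charset on the layout (the file would need to be shipped to the containers via a <file> element just like the properties file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical log4j2-ym.xml: same appender/logger structure as the jar's
     bundled log4j2.xml, plus charset="UTF-8" on the PatternLayout -->
<Configuration status="warn">
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout charset="UTF-8"
          pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <Logger name="com.stc.data" level="INFO" additivity="false">
      <AppenderRef ref="Console"/>
    </Logger>
    <Root level="ERROR">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```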
