Grails 2.4.3 considers FATAL log level as ERROR - log4j

I'm learning about log4j configuration in Grails. Below is my Config.groovy. The logger grails.app.controllers.logging.FatalController is configured to log at the FATAL level only.
log4j.main = {
    // Example of changing the log pattern for the default console appender:
    //
    //appenders {
    //    console name: 'stdout', layout: pattern(conversionPattern: '%c{2} %m%n')
    //}
    fatal 'grails.app.controllers.logging.FatalController'
    error 'org.codehaus.groovy.grails.web.servlet',        // controllers
          'org.codehaus.groovy.grails.web.pages',          // GSP
          'org.codehaus.groovy.grails.web.sitemesh',       // layouts
          'org.codehaus.groovy.grails.web.mapping.filter', // URL mapping
          'org.codehaus.groovy.grails.web.mapping',        // URL mapping
          'org.codehaus.groovy.grails.commons',            // core / classloading
          'org.codehaus.groovy.grails.plugins',            // plugins
          'org.codehaus.groovy.grails.orm.hibernate',      // hibernate integration
          'org.springframework',
          'org.hibernate',
          'net.sf.ehcache.hibernate'
    warn  'grails.app.services.logging.WarnService',
          'grails.app.controllers.logging.WarnController'
}
This is my FatalController.groovy:
package logging

class FatalController {
    def index() {
        log.debug("This is not shown")
        log.warn("neither this")
        log.error("or that")
        log.fatal("but this does")
        render "logged"
    }
}
Now, when I execute this, I expected it to log "but this does". However, it doesn't. When I change this line in Config.groovy:
fatal 'grails.app.controllers.logging.FatalController'
to this:
all 'grails.app.controllers.logging.FatalController'
The output I get is this:
2014-10-15 12:33:04,070 [http-bio-8080-exec-2] DEBUG logging.FatalController - This is not shown
2014-10-15 12:33:04,071 [http-bio-8080-exec-2] WARN logging.FatalController - neither this
| Error 2014-10-15 12:33:04,072 [http-bio-8080-exec-2] ERROR logging.FatalController - or that
| Error 2014-10-15 12:33:04,072 [http-bio-8080-exec-2] ERROR logging.FatalController - but this does
Notice that the message "but this does" is defined in FatalController.groovy to be logged as FATAL:
log.fatal("but this does")
But the log message says it is an ERROR-level message:
| Error 2014-10-15 12:33:04,072 [http-bio-8080-exec-2] ERROR logging.FatalController - but this does
So there are two problems: 1) FATAL log messages are not shown when the logger level is set to FATAL, and 2) when I call log.fatal("something"), the log shows it as an ERROR-level message.
What am I doing wrong here?

I think it's a Grails bug (I'm using Grails 2.4.5).
I found a workaround: use log4j the "old way", with a logger obtained directly from Log4j instead of the injected one.
import org.apache.log4j.Logger

class FatalController {
    Logger log = Logger.getLogger(getClass())

    def index() {
        // ...
    }
}
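For contrast, the threshold behaviour the question expects (a logger set to its highest level passing only messages at that level) can be demonstrated with plain java.util.logging from the JDK. This is an analogy only, not Grails or Log4j code; JUL's top level is SEVERE and it has no separate FATAL.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class FatalLevelDemo {
    // Log at three levels against a logger whose threshold is SEVERE
    // (JUL's highest level) and return the messages that got through.
    static List<String> loggedAtSevere() {
        Logger logger = Logger.getLogger("demo.FatalLevelDemo");
        logger.setUseParentHandlers(false);
        final List<String> captured = new ArrayList<>();
        logger.addHandler(new Handler() {
            @Override public void publish(LogRecord r) {
                captured.add(r.getLevel() + ": " + r.getMessage());
            }
            @Override public void flush() {}
            @Override public void close() {}
        });
        logger.setLevel(Level.SEVERE); // only SEVERE and above pass
        logger.fine("This is not shown");
        logger.warning("neither this");
        logger.severe("but this does");
        return captured;
    }

    public static void main(String[] args) {
        System.out.println(loggedAtSevere()); // [SEVERE: but this does]
    }
}
```

This is exactly the filtering the asker expected from `fatal 'grails.app.controllers.logging.FatalController'`; the Grails wrapper in the affected versions is what collapses fatal into error.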

Related

To suppress or not to suppress (deprecation warning)?

I have the following code in Android Studio Dolphin:
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.KITKAT_WATCH) {
    powerManager.isScreenOn();
} else {
    powerManager.isInteractive();
}
When I run Code > Inspect code... I see warning messages:
warning: [deprecation] isScreenOn() in PowerManager has been deprecated
But if I add
@SuppressWarnings("deprecation")
I then see the warning
Redundant suppression
which obviously it is.
So, no matter what I do, I see warning messages...
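A common way out of this catch-22 (not specific to Android) is to shrink the suppression scope: isolate the deprecated call in a tiny wrapper method and annotate only that method, so the rest of the class keeps full deprecation checking. The sketch below uses the JDK's deprecated Date(year, month, day) constructor as a stand-in for the PowerManager call:

```java
import java.util.Date;

public class DeprecationScope {
    // Only this wrapper carries the suppression; callers stay clean and
    // still get deprecation warnings for any other deprecated API they use.
    @SuppressWarnings("deprecation")
    static long legacyMillis() {
        // Deprecated Date(year - 1900, month, day) constructor as a stand-in
        return new Date(114, 9, 15).getTime();
    }

    public static void main(String[] args) {
        System.out.println(legacyMillis() > 0); // true
    }
}
```

Whether the inspection still flags the annotation as redundant inside a version-checked branch is a separate question, but narrowing the scope at least keeps the suppression away from the non-deprecated path.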

Serilog MinimumLevel Override with AspNetCore

Serilog with ASP.NET 5 Razor Pages.
Reducing log verbosity is very useful for Informational logs.
However, for debug logs, how do I get a MinimumLevel.Override("Microsoft.AspNetCore") that is specific to a debug file sink?
Creating 2 configurations could be a solution, but it feels like something more elegant may be possible.
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Debug()
    .Enrich.FromLogContext()
    // for the debug file sink I want the override to be Debug
    .MinimumLevel.Override("Microsoft.AspNetCore", LogEventLevel.Debug)
    .WriteTo.File("debug.txt", restrictedToMinimumLevel: LogEventLevel.Debug)
    // for the info and warning file sinks I want the override to be Warning
    .MinimumLevel.Override("Microsoft.AspNetCore", LogEventLevel.Warning)
    .WriteTo.File("info.txt", restrictedToMinimumLevel: LogEventLevel.Information)
    .WriteTo.File("warning.txt", restrictedToMinimumLevel: LogEventLevel.Warning)
    .CreateLogger();
Everything works as expected using just one override. But not together.
In the example above, the Warning override takes precedence and no Microsoft.AspNetCore Debug events are written to debug.txt.
Edit
In summary, I'd like my debug log to include the Information event level from Microsoft.AspNetCore, and my info log file to include the Warning event level from Microsoft.AspNetCore.
I got the 2 log files the way I wanted by commenting out and in 1. and 2. below:
// 1. for the debug file sink I want an AspNetCore Information or Debug level override
.MinimumLevel.Override("Microsoft.AspNetCore", LogEventLevel.Information)
.WriteTo.File($@"{logFilePath}debugx.txt", restrictedToMinimumLevel: LogEventLevel.Debug, rollingInterval: RollingInterval.Day)
// 2. for the info and warning file sinks below I want only AspNetCore warnings
//.MinimumLevel.Override("Microsoft.AspNetCore", LogEventLevel.Warning)
It's an interesting one.
You want to filter log events and route them to different file sinks,
for example /Logs/Error/Errlog.txt and /Logs/Info/InfoLog.txt.
You can achieve this by using the Serilog.Expressions NuGet package. If time permits, I will paste a working example here.
Serilog.Expressions sample from Serilog
https://github.com/serilog/serilog-expressions/blob/dev/example/Sample/Program.cs
In the example below, the filter excludes the Name = 'User' event, so only the second line is printed to the console:
using var log = new LoggerConfiguration()
    .Filter.ByExcluding("#m like 'Welcome!%' and Name = 'User'")
    .WriteTo.Console()
    .CreateLogger();

// Excluded by the filter
log.Information("Welcome!, {Name}", "User");

// Logged normally
log.Information("Welcome!, {Name}", "Domain\\UserName");
Here is a filtering example for \Logs\Info\Info-20210720.txt which excludes the Error, Fatal and Warning levels. More information here.
var exprInfo = "#l='Error' or #l='Fatal' or #l='Warning'";
var loggerInfo = new LoggerConfiguration()
    .WriteTo.File(
        @"C:\Temp\Logs\Info\Info-.txt",
        fileSizeLimitBytes: 1_000_000,
        outputTemplate: "{Timestamp:yyyy-MM-dd HH:mm:ss.fff} [{Level}] [{SourceContext}] [{EventId}] {Message}{NewLine}{Exception}",
        rollingInterval: RollingInterval.Day,
        rollOnFileSizeLimit: true,
        shared: true,
        flushToDiskInterval: TimeSpan.FromSeconds(1))
    .MinimumLevel.Override("Microsoft", LogEventLevel.Debug)
    .Filter.ByExcluding(exprInfo)
    .CreateLogger();
try
{
    loggerInfo.Debug("TEST");
    SelfLog.Enable(Console.Out);
    var sw = System.Diagnostics.Stopwatch.StartNew();
    for (var i = 0; i < 100; ++i)
    {
        loggerInfo.Information("Hello, file logger!>>>>>>{Count}", i);
        loggerInfo.Information("Writing to log file with INFORMATION severity level.");
        loggerInfo.Debug("Writing to log file with DEBUG severity level.");
        loggerInfo.Warning("Writing to log file with WARNING severity level.");
        loggerInfo.Error("Writing to log file with ERROR severity level.");
        loggerInfo.Fatal("Writing to log file with CRITICAL severity level.");
    }
    sw.Stop();
    Console.WriteLine($"Elapsed: {sw.ElapsedMilliseconds} ms");
    Console.WriteLine($"Size: {new FileInfo("log.txt").Length}");
    Console.WriteLine("Press any key to delete the temporary log file...");
    Console.ReadKey(true);
}
catch (Exception ex)
{
    loggerInfo.Fatal(ex, "Application Start-up for Serilog failed");
    throw;
}
finally
{
    Log.CloseAndFlush();
}
I solved it by using sub-loggers and filters, as described here: How can I override Serilog levels differently for different sinks?
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Debug()
    .Enrich.FromLogContext()
    // Includes Debug from Microsoft.AspNetCore (noisy)
    // useful for deep debugging
    .WriteTo.File("logs/debug.txt", rollingInterval: RollingInterval.Day)
    // Info-with-framework (useful for debugging)
    .WriteTo.Logger(lc => lc
        .MinimumLevel.Information()
        .Filter.ByExcluding("RequestPath in ['/health-check', '/health-check-db']")
        .WriteTo.File("logs/info-with-framework.txt", rollingInterval: RollingInterval.Day)
        .WriteTo.Console()
    )
    // Info
    // framework minimum level is Warning (normal everyday looking at logs)
    .WriteTo.Logger(lc => lc
        .MinimumLevel.Information()
        .Filter.ByExcluding("RequestPath in ['/health-check', '/health-check-db']")
        .Filter.ByExcluding("SourceContext = 'Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddleware'")
        .Filter.ByExcluding(logEvent =>
            logEvent.Level < LogEventLevel.Warning &&
            Matching.FromSource("Microsoft.AspNetCore").Invoke(logEvent))
        .WriteTo.File("logs/info.txt", rollingInterval: RollingInterval.Day))
    // Warning (bad things - Warnings, Error and Fatal)
    .WriteTo.Logger(lc => lc
        .MinimumLevel.Warning()
        // stopping duplicate stacktraces, see blog 2021/03/10/a11-serilog-logging-in-razor-pages
        .Filter.ByExcluding("SourceContext = 'Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddleware'")
        .WriteTo.File("logs/warning.txt", rollingInterval: RollingInterval.Day))
    // SignalR - tweak levels by filtering on these namespaces
    //   Microsoft.AspNetCore.SignalR
    //   Microsoft.AspNetCore.Http.Connections
    .CreateLogger();
Although this works, there may be a better way: https://nblumhardt.com/2016/07/serilog-2-write-to-logger/
I feel like you don't need those minimum level override calls. The restrictedToMinimumLevel parameter on the sinks will take care of the filtering.
You do need to set the minimum level to Info so the info sink can work.
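The split in that last comment, one global minimum plus a per-sink threshold, can be sketched with java.util.logging as a stand-in: a handler's own level plays the role of restrictedToMinimumLevel. This is an analogy for the mechanism, not Serilog code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class PerSinkLevels {
    // A capturing handler with its own threshold, standing in for a file
    // sink configured with restrictedToMinimumLevel.
    static class CapturingHandler extends Handler {
        final List<String> lines = new ArrayList<>();
        CapturingHandler(Level threshold) { setLevel(threshold); }
        @Override public void publish(LogRecord r) {
            if (isLoggable(r)) lines.add(r.getLevel() + ": " + r.getMessage());
        }
        @Override public void flush() {}
        @Override public void close() {}
    }

    static final CapturingHandler debugSink = new CapturingHandler(Level.FINE);
    static final CapturingHandler warningSink = new CapturingHandler(Level.WARNING);
    static final Logger logger = Logger.getLogger("demo.sinks");
    static {
        logger.setUseParentHandlers(false);
        logger.setLevel(Level.FINE); // global minimum, like MinimumLevel.Debug()
        logger.addHandler(debugSink);
        logger.addHandler(warningSink);
    }

    static void run() {
        debugSink.lines.clear();
        warningSink.lines.clear();
        logger.fine("debug detail");
        logger.warning("something bad");
    }

    public static void main(String[] args) {
        run();
        System.out.println(debugSink.lines);   // both messages
        System.out.println(warningSink.lines); // only the warning
    }
}
```

The logger's level must sit at the lowest level any sink wants (here FINE), just as the Serilog configuration needs MinimumLevel.Debug() for the debug file sink to see anything.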

npm winston set log level for basic logging instance

Setting the winston log level to 'debug' in 'easy mode' is not well documented, so I've shown an example below (and will submit a PR soon).
The answer is winston.level = 'debug'.
I want to use the winston logging package in a node script without bothering with any config: just call winston.debug, winston.info, winston.error and pass the log level in as a command-line param. The docs for 'easy mode' do not include how to set the log level, so I've shown it below.
The code:
var winston = require('winston');
winston.transports.Console.level = "debug";
winston.log("error", "error test 1");
winston.log("info", "info test 1");
winston.log("debug", "debug test 1");
winston.level = "debug";
winston.log("error", "error test 2");
winston.log("info", "info test 2");
winston.log("debug", "debug test 2");
Will output:
error: error test 1
info: info test 1
error: error test 2
info: info test 2
debug: debug test 2
Hope this helps

Spring-Kafka Integration 1.0.0.RELEASE Issue with Producer

I am not able to publish a message using Spring Integration Kafka, though my Kafka Java client works fine.
The Java code is running on Windows and Kafka is running on Linux box.
KafkaProducerContext<String, String> kafkaProducerContext = new KafkaProducerContext<String, String>();

ProducerMetadata<String, String> producerMetadata = new ProducerMetadata<String, String>("test-cass");
producerMetadata.setValueClassType(String.class);
producerMetadata.setKeyClassType(String.class);

Encoder<String> encoder = new StringEncoder<String>();
producerMetadata.setValueEncoder(encoder);
producerMetadata.setKeyEncoder(encoder);

ProducerFactoryBean<String, String> producer = new ProducerFactoryBean<String, String>(producerMetadata, "172.16.1.42:9092");
ProducerConfiguration<String, String> config = new ProducerConfiguration<String, String>(producerMetadata, producer.getObject());
kafkaProducerContext.setProducerConfigurations(Collections.singletonMap("test-cass", config));

KafkaProducerMessageHandler<String, String> handler = new KafkaProducerMessageHandler<String, String>(kafkaProducerContext);
handler.handleMessage(MessageBuilder.withPayload("foo")
        .setHeader("messagekey", "3")
        .setHeader("topic", "test-cass")
        .build());
I am getting the following error:
"C:\Program Files\Java\jdk1.7.0_71\bin\java" -Didea.launcher.port=7542 "-Didea.launcher.bin.path=C:\Program Files (x86)\JetBrains\IntelliJ IDEA 13.1.6\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.7.0_71\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\jce.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\jfxrt.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\resources.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\rt.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\sunmscapi.jar;C:\Program 
Files\Java\jdk1.7.0_71\jre\lib\ext\zipfs.jar;C:\projects\SpringCassandraInt\target\classes;C:\Users\hs\.m2\repository\org\springframework\data\spring-data-cassandra\1.1.2.RELEASE\spring-data-cassandra-1.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\data\spring-cql\1.1.2.RELEASE\spring-cql-1.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-context\4.1.4.RELEASE\spring-context-4.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-aop\4.1.4.RELEASE\spring-aop-4.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\aopalliance\aopalliance\1.0\aopalliance-1.0.jar;C:\Users\hs\.m2\repository\org\springframework\spring-beans\4.0.9.RELEASE\spring-beans-4.0.9.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-core\4.1.2.RELEASE\spring-core-4.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\commons-logging\commons-logging\1.1.3\commons-logging-1.1.3.jar;C:\Users\hs\.m2\repository\org\springframework\spring-expression\4.1.2.RELEASE\spring-expression-4.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-tx\4.1.4.RELEASE\spring-tx-4.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\data\spring-data-commons\1.9.2.RELEASE\spring-data-commons-1.9.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\slf4j\slf4j-api\1.7.10\slf4j-api-1.7.10.jar;C:\Users\hs\.m2\repository\org\slf4j\jcl-over-slf4j\1.7.10\jcl-over-slf4j-1.7.10.jar;C:\Users\hs\.m2\repository\com\datastax\cassandra\cassandra-driver-dse\2.0.4\cassandra-driver-dse-2.0.4.jar;C:\Users\hs\.m2\repository\com\datastax\cassandra\cassandra-driver-core\2.0.4\cassandra-driver-core-2.0.4.jar;C:\Users\hs\.m2\repository\io\netty\netty\3.9.0.Final\netty-3.9.0.Final.jar;C:\Users\hs\.m2\repository\com\codahale\metrics\metrics-core\3.0.2\metrics-core-3.0.2.jar;C:\Users\hs\.m2\repository\com\google\guava\guava\15.0\guava-15.0.jar;C:\Users\hs\.m2\repository\org\liquibase\liquibase-core\3.1.1\liquibase-core-3.1.1.jar;C:\Users\hs\.m2\repository\org\yaml\snakeyaml\1.13\s
nakeyaml-1.13.jar;C:\Users\hs\.m2\repository\ch\qos\logback\logback-classic\1.1.2\logback-classic-1.1.2.jar;C:\Users\hs\.m2\repository\ch\qos\logback\logback-core\1.1.2\logback-core-1.1.2.jar;C:\Users\hs\.m2\repository\org\springframework\integration\spring-integration-core\4.1.2.RELEASE\spring-integration-core-4.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\projectreactor\reactor-core\1.1.4.RELEASE\reactor-core-1.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\com\goldmansachs\gs-collections\5.0.0\gs-collections-5.0.0.jar;C:\Users\hs\.m2\repository\com\goldmansachs\gs-collections-api\5.0.0\gs-collections-api-5.0.0.jar;C:\Users\hs\.m2\repository\com\lmax\disruptor\3.2.1\disruptor-3.2.1.jar;C:\Users\hs\.m2\repository\io\gatling\jsr166e\1.0\jsr166e-1.0.jar;C:\Users\hs\.m2\repository\org\springframework\retry\spring-retry\1.1.1.RELEASE\spring-retry-1.1.1.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-messaging\4.1.4.RELEASE\spring-messaging-4.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\integration\spring-integration-stream\4.1.2.RELEASE\spring-integration-stream-4.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\integration\spring-integration-xml\4.1.2.RELEASE\spring-integration-xml-4.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-oxm\4.1.4.RELEASE\spring-oxm-4.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\ws\spring-xml\2.2.0.RELEASE\spring-xml-2.2.0.RELEASE.jar;C:\Users\hs\.m2\repository\com\jayway\jsonpath\json-path\1.2.0\json-path-1.2.0.jar;C:\Users\hs\.m2\repository\net\minidev\json-smart\2.1.0\json-smart-2.1.0.jar;C:\Users\hs\.m2\repository\net\minidev\asm\1.0.2\asm-1.0.2.jar;C:\Users\hs\.m2\repository\asm\asm\3.3.1\asm-3.3.1.jar;C:\Users\hs\.m2\repository\org\springframework\integration\spring-integration-kafka\1.0.0.RELEASE\spring-integration-kafka-1.0.0.RELEASE.jar;C:\Users\hs\.m2\repository\org\apache\avro\avro-compiler\1.7.6\avro-compiler-1.7.6.jar;C:\Users\hs\.m2\reposito
ry\org\apache\avro\avro\1.7.6\avro-1.7.6.jar;C:\Users\hs\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;C:\Users\hs\.m2\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;C:\Users\hs\.m2\repository\com\thoughtworks\paranamer\paranamer\2.3\paranamer-2.3.jar;C:\Users\hs\.m2\repository\org\xerial\snappy\snappy-java\1.0.5\snappy-java-1.0.5.jar;C:\Users\hs\.m2\repository\org\apache\commons\commons-compress\1.4.1\commons-compress-1.4.1.jar;C:\Users\hs\.m2\repository\org\tukaani\xz\1.0\xz-1.0.jar;C:\Users\hs\.m2\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;C:\Users\hs\.m2\repository\org\apache\velocity\velocity\1.7\velocity-1.7.jar;C:\Users\hs\.m2\repository\commons-collections\commons-collections\3.2.1\commons-collections-3.2.1.jar;C:\Users\hs\.m2\repository\com\yammer\metrics\metrics-annotation\2.2.0\metrics-annotation-2.2.0.jar;C:\Users\hs\.m2\repository\com\yammer\metrics\metrics-core\2.2.0\metrics-core-2.2.0.jar;C:\Users\hs\.m2\repository\org\apache\kafka\kafka_2.10\0.8.1.1\kafka_2.10-0.8.1.1.jar;C:\Users\hs\.m2\repository\org\apache\zookeeper\zookeeper\3.3.4\zookeeper-3.3.4.jar;C:\Users\hs\.m2\repository\log4j\log4j\1.2.15\log4j-1.2.15.jar;C:\Users\hs\.m2\repository\javax\mail\mail\1.4\mail-1.4.jar;C:\Users\hs\.m2\repository\javax\activation\activation\1.1\activation-1.1.jar;C:\Users\hs\.m2\repository\javax\jms\jms\1.1\jms-1.1.jar;C:\Users\hs\.m2\repository\com\sun\jdmk\jmxtools\1.2.1\jmxtools-1.2.1.jar;C:\Users\hs\.m2\repository\com\sun\jmx\jmxri\1.2.1\jmxri-1.2.1.jar;C:\Users\hs\.m2\repository\jline\jline\0.9.94\jline-0.9.94.jar;C:\Users\hs\.m2\repository\net\sf\jopt-simple\jopt-simple\3.2\jopt-simple-3.2.jar;C:\Users\hs\.m2\repository\org\scala-lang\scala-library\2.10.1\scala-library-2.10.1.jar;C:\Users\hs\.m2\repository\com\101tec\zkclient\0.3\zkclient-0.3.jar;C:\Program Files (x86)\JetBrains\IntelliJ IDEA 13.1.6\lib\idea_rt.jar" 
com.intellij.rt.execution.application.AppMain com.agillic.dialogue.kafka.outbound.SpringKafkaTest
15:39:11.736 [main] INFO o.s.i.k.support.ProducerFactoryBean - Using producer properties => {metadata.broker.list=172.16.1.42:9092, compression.codec=0}
2015-02-19 15:39:12 INFO VerifiableProperties:68 - Verifying properties
2015-02-19 15:39:12 INFO VerifiableProperties:68 - Property compression.codec is overridden to 0
2015-02-19 15:39:12 INFO VerifiableProperties:68 - Property metadata.broker.list is overridden to 172.16.1.42:9092
15:39:12.164 [main] INFO o.s.b.f.config.PropertiesFactoryBean - Loading properties file from URL [jar:file:/C:/Users/hs/.m2/repository/org/springframework/integration/spring-integration-core/4.1.2.RELEASE/spring-integration-core-4.1.2.RELEASE.jar!/META-INF/spring.integration.default.properties]
15:39:12.208 [main] DEBUG o.s.i.k.o.KafkaProducerMessageHandler - org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler#5204db6b received message: GenericMessage [payload=foo, headers={timestamp=1424356752208, id=00c483d9-ecf8-2937-4a2c-985bd3afcae4, topic=test-cass, messagekey=3}]
Exception in thread "main" org.springframework.messaging.MessageHandlingException: error occurred in message handler [org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler#5204db6b]; nested exception is java.lang.NullPointerException
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:84)
at com.agillic.dialogue.kafka.outbound.SpringKafkaTest.main(SpringKafkaTest.java:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.NullPointerException
at org.springframework.integration.kafka.support.KafkaProducerContext.getTopicConfiguration(KafkaProducerContext.java:58)
at org.springframework.integration.kafka.support.KafkaProducerContext.send(KafkaProducerContext.java:190)
at org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler.handleMessageInternal(KafkaProducerMessageHandler.java:81)
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:78)
... 6 more
Process finished with exit code 1
Actually, when we introduced KafkaHeaders we made the appropriate documentation changes: https://github.com/spring-projects/spring-integration-kafka/blob/master/README.md. See the important note:
Since the last Milestone, we have introduced the KafkaHeaders interface with constants. The messageKey and topic default headers now require a kafka_ prefix. When migrating from an earlier version, you need to specify message-key-expression="headers.messageKey" and topic-expression="headers.topic" on the <int-kafka:outbound-channel-adapter>, or simply change the headers upstream to the new headers from KafkaHeaders using a <header-enricher> or MessageBuilder. Or, of course, configure them on the adapter if you are using constant values.
UPDATE
Regarding the NullPointerException: it's really an issue. Feel free to raise a JIRA ticket and we'll take care of it. Contributions are welcome, too!

Error.getStackTrace() returns a string unparseable in flashdevelop?

Accessing a string created by an Error's getStackTrace function is resulting in very unusual behaviour in the FlashDevelop IDE.
package
{
    import flash.display.Sprite;

    public class Main extends Sprite
    {
        public function Main():void
        {
            print("Start");
            var err:Error = new Error();
            var stack:String = err.getStackTrace();
            print(stack);
            // also occurs when this is replaced with stack.length or stack[0]
            print("End");
        }

        private function print(input:*):void
        {
            trace(input);
            trace("---");
        }
    }
}
When run in Flash CS4, it outputs:
Start
---
Error
at Main()
---
End
---
But when run in FlashDevelop (replacing trace() with FlashConnect.trace()), it outputs:
Start
---
Is that a bug, or does FlashDevelop intentionally handle errors in a different way?
If it is the latter, is there a workaround to access the stack trace of an error?
I managed to fix this by switching to a debugging version of the Flash player; I hope this helps anyone else with this problem.
Instructions for specifying a debug player.
Make sure you are compiling in the Debug configuration, and you may have to enable (set to True) "Verbose Stack Trace" in Project Properties > Compiler Options.
