I am running my fat JAR on a Flink cluster; it reads from Kafka and saves to Cassandra. The code is:
final Properties prop = getProperties();
final FlinkKafkaConsumer<String> flinkConsumer = new FlinkKafkaConsumer<>
(kafkaTopicName, new SimpleStringSchema(), prop);
flinkConsumer.setStartFromEarliest();
final DataStream<String> stream = env.addSource(flinkConsumer);
DataStream<Person> sensorStreaming = stream.flatMap(new FlatMapFunction<String, Person>() {
@Override
public void flatMap(String value, Collector<Person> out) throws Exception {
try {
out.collect(objectMapper.readValue(value, Person.class));
} catch (JsonProcessingException e) {
logger.error("Json Processing Exception", e);
}
}
});
savePersonDetails(sensorStreaming);
env.execute();
The Person POJO contains:
@Column(name = "event_time")
private Instant eventTime;
On the Cassandra side, a codec is required to store Instant, registered as below:
final Cluster cluster = ClusterManager.getCluster(cassandraIpAddress);
cluster.getConfiguration().getCodecRegistry().register(InstantCodec.instance);
When I run it standalone it works fine, but when I run it on a local cluster it throws the error below:
Caused by: com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [timestamp <-> java.time.Instant]
at com.datastax.driver.core.CodecRegistry.notFound(CodecRegistry.java:679)
at com.datastax.driver.core.CodecRegistry.createCodec(CodecRegistry.java:526)
at com.datastax.driver.core.CodecRegistry.findCodec(CodecRegistry.java:506)
at com.datastax.driver.core.CodecRegistry.access$200(CodecRegistry.java:140)
at com.datastax.driver.core.CodecRegistry$TypeCodecCacheLoader.load(CodecRegistry.java:211)
at com.datastax.driver.core.CodecRegistry$TypeCodecCacheLoader.load(CodecRegistry.java:208)
I read the document below about registering serializers:
https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/custom_serializers.html
but InstantCodec is a third-party one. How can I register it?
I solved the problem: a LocalDateTime was being emitted, and converting it with the same type caused the error above. I changed the type to java.util.Date and then it worked.
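For reference, a minimal sketch of the change on the POJO side, assuming the same Person class and that the @Column annotation comes from the DataStax object mapper (the accessors are illustrative):

import java.util.Date;
import com.datastax.driver.mapping.annotations.Column;

public class Person {
    // Changed from java.time.Instant to java.util.Date, which the driver maps
    // to the Cassandra timestamp type without needing a custom codec.
    @Column(name = "event_time")
    private Date eventTime;

    public Date getEventTime() { return eventTime; }
    public void setEventTime(Date eventTime) { this.eventTime = eventTime; }
}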
Using Jackrabbit Oak, I've been attempting to configure security through SecurityProvider and SecurityConfigurations. In particular, I've been using restrictions, which generally work as expected. However, when dealing with JCR-SQL2 queries, more gets filtered out than expected.
Details
It can be reproduced with the repository below.
/
node [nt:unstructured]
subnode [nt:unstructured]
On node, I add an access control entry with privilege JCR_ALL for user together with a restriction for rep:glob -> "", such that user does not have access to any children of node.
It works as expected when using session.getNode:
session.getNode("/node") returns the node
session.getNode("/node/subnode") throws PathNotFoundException as expected due to the restriction.
However, when I execute the following JCR-SQL2 query:
SELECT * FROM [nt:unstructured]
I get no results back. Here I would have expected to get /node, as it is otherwise available when using session.getNode.
Code
public static void main(String[] args) throws Exception {
Repository repository = new Jcr().with(new MySecurityProvider()).createRepository();
Session session = repository.login(new UserIdCredentials("")); // principal is "SystemPrincipal.INSTANCE"
// Create nodes
Node node = session.getRootNode().addNode("node", "nt:unstructured");
node.addNode("subnode", "nt:unstructured");
// Add access control entry + restriction
AccessControlManager acm = session.getAccessControlManager();
JackrabbitAccessControlList acl = (JackrabbitAccessControlList) acm
.getApplicablePolicies("/node").nextAccessControlPolicy();
Privilege[] privileges = new Privilege[]{acm.privilegeFromName(Privilege.JCR_ALL)};
Map<String, Value> restrictions = new HashMap<String, Value>() {{put("rep:glob", new StringValue(""));}};
acl.addEntry(new PrincipalImpl("user"), privileges, true, restrictions);
acm.setPolicy("/node", acl);
session.save();
// executes query
RowIterator rows = repository.login(new UserIdCredentials("user")).getWorkspace().getQueryManager()
.createQuery("SELECT * FROM [nt:unstructured]", Query.JCR_SQL2).execute().getRows();
System.out.println("Number of rows: " + rows.getSize()); //Prints 0
}
If one were to remove restrictions from the code above, both node and subnode appear in the query results as expected.
MySecurityProvider uses ConfigurationParameters.EMPTY and the default implementations of all SecurityConfigurations, except for AuthenticationConfiguration which I've implemented myself:
class MyAuthenticationConfiguration extends AuthenticationConfigurationImpl {
public MyAuthenticationConfiguration(SecurityProvider securityProvider) {
super(securityProvider);
}
@NotNull
@Override
public LoginContextProvider getLoginContextProvider(ContentRepository contentRepository) {
return new LoginContextProvider() {
@NotNull
public LoginContext getLoginContext(Credentials credentials, String workspaceName) {
String userId = ((UserIdCredentials) credentials).getUserId();
Set<Principal> principalSets = new HashSet<>();
if (userId.isEmpty()) {
principalSets.add(SystemPrincipal.INSTANCE);
} else {
principalSets.add(new PrincipalImpl(userId));
}
Map<String, ? extends Principal> publicPrivileges = new HashMap<>();
AuthInfoImpl authInfoImpl = new AuthInfoImpl(userId, publicPrivileges, principalSets);
Subject subject = new Subject(true, principalSets, Collections.singleton(authInfoImpl), new HashSet<Principal>());
return new PreAuthContext(subject);
}
};
}
}
I am using Jackrabbit Oak version 1.10.0
This turned out to be a bug in Jackrabbit Oak - Link to issue.
It has been resolved as of version 1.12.0.
Is there a way to achieve the behavior of the code below using annotation driven code?
@Bean
@ServiceActivator(inputChannel = "toKafka")
public MessageHandler handler() throws Exception {
KafkaProducerMessageHandler<String, String> handler =
new KafkaProducerMessageHandler<>(kafkaTemplate());
handler.setTopicExpression(new LiteralExpression("someTopic"));
handler.setMessageKeyExpression(new LiteralExpression("someKey"));
handler.setSendSuccessChannel(success());
handler.setSendFailureChannel(failure());
return handler;
}
@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
return new KafkaTemplate<>(producerFactory());
}
@Bean
public ProducerFactory<String, String> producerFactory() {
Map<String, Object> props = new HashMap<>();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, this.brokerAddress);
// set more properties
return new DefaultKafkaProducerFactory<>(props);
}
Can I specify the send success/failure channels using Spring Integration annotations?
I'd like as much as possible to keep a consistent pattern of doing things (e.g., specifying the flow of messages) throughout my app, and I like the Spring Integration diagrams (e.g., of how channels are connected) IntelliJ automatically generates when you configure your Spring Integration app with XML or Java annotations.
No, it is not possible; the success/failure channels have to be set explicitly when using Java configuration.
This configuration is specific to the Kafka handler, while @ServiceActivator is a generic annotation for all types of message handlers.
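That said, a hedged sketch of how the rest of the flow can stay annotation-driven: declare the success/failure channels as @Bean methods (the question's handler() already references success() and failure()) and consume them with @ServiceActivator methods inside the same @Configuration class; only the wiring onto the KafkaProducerMessageHandler itself has to stay in Java code. The method names below are illustrative:

@Bean
public MessageChannel success() {
    return new DirectChannel();
}

@Bean
public MessageChannel failure() {
    return new DirectChannel();
}

// Downstream handling of send results remains annotation-driven.
@ServiceActivator(inputChannel = "success")
public void onSendSuccess(Message<?> result) {
    // handle a successful send here
}

@ServiceActivator(inputChannel = "failure")
public void onSendFailure(Message<?> failed) {
    // handle a failed send here (the payload carries the failure details)
}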
I am setting up a mail:inbound-channel-adapter using Java annotations and ImapIdleChannelAdapter.
It is not clear what object type to pass from @InboundChannelAdapter to @ServiceActivator.
Code snippet:
@InboundChannelAdapter(value = "inputChannel", poller = @Poller(fixedDelay = "5000"))
public ImapIdleChannelAdapter getMailAdapter() {
ImapMailReceiver mailReceiver = new ImapMailReceiver("imaps://username:password@map-mail.outlook.com:993/INBOX");
...
return new ImapIdleChannelAdapter(mailReceiver);
}
@ServiceActivator(inputChannel = "inputChannel")
public void readMessage(Message<javax.mail.Message> message) {
System.out.println(message.getPayload().getAllRecipients());
}
The ImapIdleChannelAdapter source says that "The Message payload will be the javax.mail.Message instance that was received". Nevertheless, I get a ClassCastException (ImapIdleChannelAdapter cannot be cast to javax.mail.Message) when running the code above.
If I change the service activator's method argument to javax.mail.Message, I get spel.SpelEvaluationException: EL1004E: Method call cannot be found on .. type:
@ServiceActivator(inputChannel = "inputChannel")
public void readMessage(javax.mail.Message message) throws MessagingException {
System.out.println(message.getAllRecipients());
}
The ImapIdleChannelAdapter is an event-driven component. It is not a source to poll; it produces messages from its own internal task.
You must remove the @InboundChannelAdapter from your configuration and declare the adapter as a plain @Bean. The output channel must be configured on the ImapIdleChannelAdapter object directly, as sketched below.
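A minimal sketch of that configuration, assuming the same receiver URL and channel name as in the question; the output channel is set on the adapter bean itself rather than via @InboundChannelAdapter:

@Bean
public ImapIdleChannelAdapter mailAdapter() {
    ImapMailReceiver mailReceiver = new ImapMailReceiver("imaps://username:password@map-mail.outlook.com:993/INBOX");
    // ... other receiver settings ...
    ImapIdleChannelAdapter adapter = new ImapIdleChannelAdapter(mailReceiver);
    adapter.setOutputChannelName("inputChannel"); // received mail is pushed straight to this channel
    return adapter;
}

@ServiceActivator(inputChannel = "inputChannel")
public void readMessage(Message<javax.mail.Message> message) throws MessagingException {
    System.out.println(message.getPayload().getAllRecipients());
}

Because the adapter is event-driven, no @Poller is involved; it starts with the application context and sends each received mail to inputChannel.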
I have created a simple Kafka producer & consumer. I am using kafka_2.11-0.9.0.0. Here is my producer code:
public class KafkaProducerTest {
public static String topicName = "test-topic-2";
public static void main(String[] args) {
// TODO Auto-generated method stub
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("acks", "all");
props.put("retries", 0);
props.put("batch.size", 16384);
props.put("linger.ms", 1);
props.put("buffer.memory", 33554432);
props.put("key.serializer",
StringSerializer.class.getName());
props.put("value.serializer",
StringSerializer.class.getName());
Producer<String, String> producer = new KafkaProducer(props);
for (int i = 0; i < 100; i++) {
ProducerRecord<String, String> producerRecord = new ProducerRecord<String, String>(
topicName, Integer.toString(i), Integer.toString(i));
System.out.println(producerRecord);
producer.send(producerRecord);
}
producer.close();
}
}
While starting the bundle I am facing the below error:
2016-05-20 09:44:57,792 | ERROR | nsole user karaf | ShellUtil | 44 - org.apache.karaf.shell.core - 4.0.3 | Exception caught while executing command
org.apache.karaf.shell.support.MultiException: Error executing command on bundles:
Error starting bundle162: Activator start error in bundle NewKafkaArtifact [162].
at org.apache.karaf.shell.support.MultiException.throwIf(MultiException.java:61)
at org.apache.karaf.bundle.command.BundlesCommand.doExecute(BundlesCommand.java:69)[24:org.apache.karaf.bundle.core:4.0.3]
at org.apache.karaf.bundle.command.BundlesCommand.execute(BundlesCommand.java:54)[24:org.apache.karaf.bundle.core:4.0.3]
at org.apache.karaf.shell.impl.action.command.ActionCommand.execute(ActionCommand.java:83)[44:org.apache.karaf.shell.core:4.0.3]
at org.apache.karaf.shell.impl.console.osgi.secured.SecuredCommand.execute(SecuredCommand.java:67)[44:org.apache.karaf.shell.core:4.0.3]
at org.apache.karaf.shell.impl.console.osgi.secured.SecuredCommand.execute(SecuredCommand.java:87)[44:org.apache.karaf.shell.core:4.0.3]
at org.apache.felix.gogo.runtime.Closure.executeCmd(Closure.java:480)[44:org.apache.karaf.shell.core:4.0.3]
at org.apache.felix.gogo.runtime.Closure.executeStatement(Closure.java:406)[44:org.apache.karaf.shell.core:4.0.3]
at org.apache.felix.gogo.runtime.Pipe.run(Pipe.java:108)[44:org.apache.karaf.shell.core:4.0.3]
at org.apache.felix.gogo.runtime.Closure.execute(Closure.java:182)[44:org.apache.karaf.shell.core:4.0.3]
at org.apache.felix.gogo.runtime.Closure.execute(Closure.java:119)[44:org.apache.karaf.shell.core:4.0.3]
at org.apache.felix.gogo.runtime.CommandSessionImpl.execute(CommandSessionImpl.java:94)[44:org.apache.karaf.shell.core:4.0.3]
at org.apache.karaf.shell.impl.console.ConsoleSessionImpl.run(ConsoleSessionImpl.java:270)[44:org.apache.karaf.shell.core:4.0.3]
at java.lang.Thread.run(Thread.java:745)[:1.8.0_66]
Caused by: java.lang.Exception: Error starting bundle162: Activator start error in bundle NewKafkaArtifact [162].
at org.apache.karaf.bundle.command.BundlesCommand.doExecute(BundlesCommand.java:66)[24:org.apache.karaf.bundle.core:4.0.3]
... 12 more
Caused by: org.osgi.framework.BundleException: Activator start error in bundle NewKafkaArtifact [162].
at org.apache.felix.framework.Felix.activateBundle(Felix.java:2276)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.felix.framework.Felix.startBundle(Felix.java:2144)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:998)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.karaf.bundle.command.Start.executeOnBundle(Start.java:38)[24:org.apache.karaf.bundle.core:4.0.3]
at org.apache.karaf.bundle.command.BundlesCommand.doExecute(BundlesCommand.java:64)[24:org.apache.karaf.bundle.core:4.0.3]
... 12 more
Caused by: org.apache.kafka.common.config.ConfigException: Invalid value org.apache.kafka.common.serialization.StringSerializer for configuration key.serializer: Class org.apache.kafka.common.serialization.StringSerializer could not be found.
at org.apache.kafka.common.config.ConfigDef.parseType(ConfigDef.java:255)[141:kafka-examples:1.0.0.SNAPSHOT-jar-with-dependencies]
at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:145)[141:kafka-examples:1.0.0.SNAPSHOT-jar-with-dependencies]
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:49)[141:kafka-examples:1.0.0.SNAPSHOT-jar-with-dependencies]
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:56)[141:kafka-examples:1.0.0.SNAPSHOT-jar-with-dependencies]
at org.apache.kafka.clients.producer.ProducerConfig.<init>(ProducerConfig.java:317)[141:kafka-examples:1.0.0.SNAPSHOT-jar-with-dependencies]
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:181)[141:kafka-examples:1.0.0.SNAPSHOT-jar-with-dependencies]
at com.NewKafka.NewKafkaArtifact.KafkaProducerTest.main(KafkaProducerTest.java:25)[162:NewKafkaArtifact:0.0.1.SNAPSHOT]
at com.NewKafka.NewKafkaArtifact.StartKafka.start(StartKafka.java:11)[162:NewKafkaArtifact:0.0.1.SNAPSHOT]
at org.apache.felix.framework.util.SecureAction.startActivator(SecureAction.java:697)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.felix.framework.Felix.activateBundle(Felix.java:2226)[org.apache.felix.framework-5.4.0.jar:]
... 16 more
I have tried setting the key.serializer and value.serializer like below:
props.put("key.serializer",StringSerializer.class.getName());
props.put("value.serializer",StringSerializer.class.getName());
And also like below, but I am still getting the same error. What am I doing wrong here?
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
It's an issue with the version you are using.
Version 0.8.2.2_1 was also suggested.
I suggest you adjust the Kafka version you are using and give it a try.
Code-wise, I cross-checked many code samples on the Kafka dev list and it seems you have written it the right way.
The suggested workaround is: Thread.currentThread().setContextClassLoader(null);
I found the reason by reading the Kafka client source code.
The Kafka client uses Class.forName(trimmed, true, Utils.getContextOrKafkaClassLoader()) to get the Class object and then creates the instance. The key point is the class loader, which is specified by the last parameter. The implementation of Utils.getContextOrKafkaClassLoader() is:
public static ClassLoader getContextOrKafkaClassLoader() {
ClassLoader cl = Thread.currentThread().getContextClassLoader();
if (cl == null)
return getKafkaClassLoader();
else
return cl;
}
So, by default, the Class object of org.apache.kafka.common.serialization.StringSerializer is loaded by the application class loader. If your target class is not loaded by the application class loader, this problem will happen!
To solve the problem, simply set the context class loader of the current thread to null before creating the new KafkaProducer instance, like this:
Thread.currentThread().setContextClassLoader(null);
Producer<String, String> producer = new KafkaProducer(props);
I hope my answer helps you understand what happened.
The issue appears to be with the class loader, as @Ram Ghadiyaram indicated in his answer. In order to get this working with kafka-clients 2.x, I had to do the following:
public Producer<String, String> createProducer() {
ClassLoader original = Thread.currentThread().getContextClassLoader();
Thread.currentThread().setContextClassLoader(null);
Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,
BOOTSTRAP_SERVERS);
... etc ...
KafkaProducer<String, String> producer = new KafkaProducer<>(props);
Thread.currentThread().setContextClassLoader(original);
return producer;
}
This allows the system to continue loading additional classes with the original classloader. This was needed in Wildfly/JBoss (the specific app I'm working with is Keycloak).
Try using these props instead of your props:
props.put("key.serializer",
"org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer",
"org.apache.kafka.common.serialization.StringSerializer");
Here is a full Kafka producer example:
import java.util.Properties;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
public class FxDateProducer {
public static void main(String[] args) throws Exception{
if(args.length == 0){
System.out.println("Enter topic name");
return;
}
String topicName = args[0].toString();
Properties props = new Properties();
//Assign localhost id
props.put("bootstrap.servers", "localhost:9092");
//Set acknowledgements for producer requests.
props.put("acks", "all");
//If the request fails, the producer can automatically retry,
props.put("retries", 0);
//Specify buffer size in config
props.put("batch.size", 16384);
//Reduce the no of requests less than 0
props.put("linger.ms", 1);
//The buffer.memory controls the total amount of memory available to the producer for buffering.
props.put("buffer.memory", 33554432);
props.put("key.serializer",
"org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer",
"org.apache.kafka.common.serialization.StringSerializer");
Producer<String, String> producer = new KafkaProducer
<String, String>(props);
for(int i = 0; i < 10; i++)
producer.send(new ProducerRecord<String, String>(topicName,
Integer.toString(i), Integer.toString(i)));
System.out.println("Message sent successfully");
producer.close();
}
}
Recently I found the solution. Setting the thread context class loader to null resolved the issue for me. Thanks.
Thread.currentThread().setContextClassLoader(null);
Producer<String, String> producer = new KafkaProducer(props);
It happens because of a Kafka version issue. Make sure you use the correct Kafka version; the version that I used is 'kafka_2.12-1.0.1'.
Also try using the properties below in your code. This fixed my issue:
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,"org.apache.kafka.common.serialization.StringSerializer");
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,"org.apache.kafka.common.serialization.StringSerializer");
Earlier I was using the properties below, which were causing the issue (note the lowercase 's' in Stringserializer; the class name is case-sensitive, so it could not be found):
//props.put("key.serializer","org.apache.kafka.common.serialization.Stringserializer");
//props.put("value.serializer","org.apache.kafka.common.serialization.Stringserializer");
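As a side note, a hedged sketch of building the same properties from class references instead of hand-typed strings, which rules out case-sensitive typos like the Stringserializer above (ProducerProps is just an illustrative helper name):

import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerProps {
    // Using the ProducerConfig constants and StringSerializer.class keeps both
    // the config keys and the serializer class names checked by the compiler.
    public static Properties build(String bootstrapServers) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        return props;
    }
}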