Exception in agent.RunOnServer in Lotus Notes

I have the following code, which calls a Java agent from a LotusScript agent:
Sub insertDealDetails()
    On Error GoTo errhandler
    MsgBox "inside deal details"
    Dim agent As NotesAgent
    Set agent = db.GetAgent("Procs")
    If agent.RunOnServer(doc.Noteid) = 0 Then
        MessageBox "Agent ran",, "Success"
    Else
        MessageBox "Agent did not run",, "Failure"
    End If
    Exit Sub
errhandler:
    MsgBox "Error in function insertDealDetails in agtSubmit Agent: " & Erl & " " & Error
End Sub
Now, if any exception occurs in the Procs agent, how can the exception be passed back to the main agent calling insertDealDetails(), so that the main agent stops?

Use an in-memory document: write your error message into this document in the Java agent, and read the error message back in your LotusScript code.
LotusScript:
Call agent.RunWithDocumentContext(doc)
If doc.ErrorMessage(0) <> "" Then
    Print doc.ErrorMessage(0)
    ' handle the error
End If
Java agent:
Document doc = agentContext.getDocumentContext();
...
doc.replaceItemValue("ErrorMessage", "Your Error Message from Java Agent");
You don't need to save the in-memory document at any time.

Here is the updated code:
Sub insertDealDetails()
    On Error GoTo errhandler
    MsgBox "inside deal details"
    Dim agent As NotesAgent
    Dim in_doc As NotesDocument
    Set agent = db.GetAgent("Procs")
    Set in_doc = db.CreateDocument()
    If agent.RunWithDocumentContext(in_doc, doc.Noteid) Then
        MsgBox "doc.ErrorMessage(0):::::::::" & in_doc.ErrorMessage(0)
        If in_doc.ErrorMessage(0) <> "" Then
            Call prompt("2")
        End If
    End If
    Exit Sub
errhandler:
    MsgBox "Error in function insertDealDetails in agtSubmit Agent: " & Erl & " " & Error
End Sub
The problem now is that execution is not returning from the Java code called in the Procs agent. What am I doing wrong here?
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;

import lotus.domino.AgentBase;
import lotus.domino.AgentContext;
import lotus.domino.Database;
import lotus.domino.NotesException;
import lotus.domino.Session;

public class JavaAgent extends AgentBase {
    public void NotesMain() {
        Connection con = null;
        CallableStatement stmt = null;
        Database db;
        lotus.domino.Document doc = null;
        try {
            Session session = getSession();
            AgentContext agentContext = session.getAgentContext();
            doc = agentContext.getDocumentContext();
            con = JavaAgent.getConnection(); // making connection here
            // executing code here and exception occurs
            System.out.println("success");
        } catch (Exception e) {
            try {
                doc.replaceItemValue("ErrorMessage", e.getMessage());
            } catch (NotesException e1) {
                e1.printStackTrace();
            }
            e.printStackTrace();
        } finally {
            try {
                stmt.close();
                con.close();
            } catch (SQLException e) {
                try {
                    doc.replaceItemValue("ErrorMessage", e.getMessage());
                } catch (NotesException e1) {
                    e1.printStackTrace();
                }
                e.printStackTrace();
            }
        }
    }
}
The log for this run is as follows:
[0DCC:01AD-053C] 01/27/2016 01:24:56 PM HTTP JVM: class load: JavaAgent from: <unknown>
[0DCC:04B2-199C] 01/27/2016 01:24:56 PM HTTP JVM: before connection:::::
[0DCC:04B2-199C] 01/27/2016 01:24:56 PM HTTP JVM: inside dobi
[0DCC:04B2-199C] 01/27/2016 01:24:56 PM HTTP JVM: inside dobi 2
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: inside dobi 3
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: after connection:::::Oracle Database 11g Release 11.1.0.0.0 - Production
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: message is Invalid column index
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: java.sql.SQLException: Invalid column index
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:162)
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:227)
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: at oracle.jdbc.driver.OraclePreparedStatement.setStringInternal(OraclePreparedStatement.java:4596)
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: at oracle.jdbc.driver.OracleCallableStatement.setString(OracleCallableStatement.java:4249)
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: at JavaAgent.NotesMain(Unknown Source)
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: at lotus.domino.AgentBase.runNotes(Unknown Source)
[0DCC:04B2-199C] 01/27/2016 01:25:28 PM HTTP JVM: at lotus.domino.NotesThread.run(Unknown Source)
[0DF8:000A-0F84] Router: DNS server returned an error searching for MX records. The destination domain may not exist: 11.17.108.223, Error: Not implemented(NOTIMP)

The working code, with the steps, is as follows:
%REM
Function insertDetailsOracle
Description: Insert details into an Oracle table
Date: 28/03/2014
'******************* Logic ***************************
'Validation in submit agent:
'1) Create an in-memory document
'2) Run the Java agent with RunWithDocumentContext, passing the newly created
'   in-memory document as well as the note ID of the original request document
'If pass, i.e. no exceptions in the Java agent:
'1) Submit the case
'Else:
'1) Log the error message and exit the agent
'***************************************************
%END REM
Function insertDetailsOracle() As String
    On Error GoTo errhandler
    Dim agent As NotesAgent
    Dim agentValue As Boolean
    Dim in_doc As NotesDocument
    Set agent = db.GetAgent("Procs")
    'Create in-memory document
    Set in_doc = db.CreateDocument()
    'Run the Java agent with RunWithDocumentContext
    agentValue = agent.RunWithDocumentContext(in_doc, doc.Noteid)
    'Return the error message passed back by the Java agent in the in-memory document's field
    If in_doc.ErrorMessage(0) <> "" Then
        insertDetailsOracle = in_doc.ErrorMessage(0)
    Else
        insertDetailsOracle = "1"
    End If
    Exit Function
errhandler:
    MsgBox "Error in function insertDetailsOracle in agtSubmit Agent: " & Erl & " " & Error
End Function
The Java agent method is the same as shown above.

Related

Vanilla JHipster: can't run a test (SocketTimeoutException)

I generated a new application with JHipster, choosing Gradle and MongoDB.
Gradle compiles fine:
c:\webs\workspace-jhipster\jpoc>gradle clean compileJava compileTestJava
:clean
:cleanResources UP-TO-DATE
:bootBuildInfo
:nodeSetup SKIPPED
:npmSetup SKIPPED
:webpackBuildDev SKIPPED
:processResources
:compileJava
:classes
:compileTestJava
BUILD SUCCESSFUL
Total time: 6.704 secs
The problem appears when I try to run a single test:
gradle test --tests com.jpoc.service.UserServiceIntTest
which outputs:
com.jpoc.service.UserServiceIntTest > assertThatUserMustExistToResetPassword FAILED
java.lang.IllegalStateException
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException
Caused by: org.springframework.beans.factory.BeanCreationException
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException
Caused by: org.springframework.beans.factory.BeanCreationException
Caused by: org.springframework.beans.BeanInstantiationException
Caused by: de.flapdoodle.embed.process.exceptions.DistributionException
Caused by: java.io.IOException
Caused by: java.net.SocketTimeoutException
I'm pretty sure this is a misconfiguration problem, but I don't see which one.
I'm using the latest JHipster, 4.2.0.
Thank you.
To set up the MongoDB proxy in the test suite with Spring Boot, I did, for instance:
@BeforeClass
public static void setup_mongo() throws UnknownHostException, IOException {
    String proxyHost = "proxy.priv.atos.fr";
    String proxyPort = "3128";
    String proxy = System.getenv("http_proxy");
    System.out.println("Proxy URL : " + proxy);
    if (proxy != null) {
        if (proxyHost == null && proxyPort == null) {
            URL proxyurl = new URL(proxy);
            proxyHost = proxyurl.getHost();
            proxyPort = String.valueOf(proxyurl.getPort());
        }
    }
    MongodStarter starter;
    System.out.println("Proxy Host : " + proxyHost);
    System.out.println("Proxy Port : " + proxyPort);
    if (proxyHost != null && proxyPort != null) {
        IRuntimeConfig runtimeConfig = new RuntimeConfigBuilder().defaults(Command.MongoD)
            .artifactStore(
                new ArtifactStoreBuilder().defaults(Command.MongoD)
                    .download(
                        new DownloadConfigBuilder()
                            .defaultsForCommand(Command.MongoD)
                            .proxyFactory(
                                new HttpProxyFactory(
                                    proxyHost,
                                    Integer.parseInt(proxyPort)))
                            .build()).build()).build();
        starter = MongodStarter.getInstance(runtimeConfig);
    } else {
        starter = MongodStarter.getDefaultInstance();
    }
    IMongodConfig mongodConfig = new MongodConfigBuilder()
        .version(Version.Main.PRODUCTION)
        .net(new Net(0, Network.localhostIsIPv6())).build();
    MongodExecutable mongodExecutable = null;
    mongodExecutable = starter.prepare(mongodConfig);
    mongodExecutable.start();
}
This way, it downloads the MongoDB server and tries to run it. The next problem is that I don't have permission to run this executable inside the JVM.

I get an error when creating a job collection using the Azure SDK

The error message is
BadRequest: The condition specified by the ETag is not satisfied.
This is the code I am using:
_cloudServiceName = ConfigurationManager.AppSettings["CloudServiceName"];
_maxJobCount = Int32.Parse(ConfigurationManager.AppSettings["MaxJobCount"]);
string certpath = HttpContext.Current.Server.MapPath("/App_Data/DanAzureCertificate.pfx");
X509Certificate2 cert = new X509Certificate2(certpath, "SOLID");
CertificateCloudCredentials creds = new CertificateCloudCredentials(_subScriptionID, cert);
smClient = new SchedulerManagementClient(creds);
smClient.LongRunningOperationInitialTimeout = int.MaxValue;
schedulerClient = new SchedulerClient(_cloudServiceName, "", creds);
schedulerClient.LongRunningOperationInitialTimeout = int.MaxValue;
JobCollectionCreateParameters pc = new JobCollectionCreateParameters();
pc.IntrinsicSettings = new JobCollectionIntrinsicSettings();
pc.IntrinsicSettings.Quota = new JobCollectionQuota();
pc.IntrinsicSettings.Quota.MaxJobCount = _maxJobCount;
pc.IntrinsicSettings.Plan = JobCollectionPlan.Standard;
pc.Label = jobCollectionName;
// smClient = new SchedulerManagementClient();
smClient.LongRunningOperationInitialTimeout = 3600;
var createreposnse = smClient.JobCollections.Create(_cloudServiceName, jobCollectionName, pc);
This is what I get in the output window:
The thread 0x2524 has exited with code 0 (0x0).
Application Insights Telemetry: {"name":"Microsoft.ApplicationInsights.Dev.ac922776334349f8bcc2e2ed951af0a9.RemoteDependency","time":"2017-01-14T12:13:41.2610531Z","iKey":"ac922776-3343-49f8-bcc2-e2ed951af0a9","tags":{"ai.device.roleInstance":"HP","ai.internal.sdkVersion":"rddf: 2.1.0.363"},"data":{"baseType":"RemoteDependencyData","baseData":{"ver":2,"name":"https://management.core.windows.net/c44db09b-bb78-417c-a130-934ce4edb2f5/cloudservices/newCloudService/resources/scheduler/~/JobCollections/JC1","id":"e9odaNEE8sI=","value":1888.9979,"resultCode":"404","dependencyKind":1,"success":false,"properties":{"DeveloperMode":"true"}}}}
Exception thrown: 'Hyak.Common.CloudException' in Microsoft.Threading.Tasks.dll
Application Insights Telemetry: {"name":"Microsoft.ApplicationInsights.Dev.ac922776334349f8bcc2e2ed951af0a9.RemoteDependency","time":"2017-01-14T12:13:48.7034807Z","iKey":"ac922776-3343-49f8-bcc2-e2ed951af0a9","tags":{"ai.device.roleInstance":"HP","ai.internal.sdkVersion":"rddf: 2.1.0.363"},"data":{"baseType":"RemoteDependencyData","baseData":{"ver":2,"name":"https://management.core.windows.net/c44db09b-bb78-417c-a130-934ce4edb2f5/cloudservices/newCloudService/resources/scheduler/JobCollections/JC1","id":"2Y8TsJ/Dv/Y=","value":1418.5722,"resultCode":"400","dependencyKind":1,"success":false,"properties":{"DeveloperMode":"true"}}}}
Exception thrown: 'Hyak.Common.CloudException' in Microsoft.Threading.Tasks.dll
'iisexpress.exe' (CLR v4.0.30319: /LM/W3SVC/2/ROOT-1-131288695585980488): Loaded 'C:\Windows\assembly\GAC_MSIL\Microsoft.VisualStudio.Debugger.Runtime\14.0.0.0__b03f5f7f11d50a3a\Microsoft.VisualStudio.Debugger.Runtime.dll'. Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
The program '[11560] iisexpress.exe: Program Trace' has exited with code 0 (0x0).
The program '[11560] iisexpress.exe' has exited with code -1 (0xffffffff).
The program '[9564] iexplore.exe' has exited with code -1 (0xffffffff).
Any idea how to solve this?

Using ODA's isBefore and isAfter methods for date comparisons

In an XPages repeat control I'm trying to compute a string based upon date values in the underlying Notes view. The first two columns of the view are StartDate and EndDate respectively.
In my code (see below) the print statements work fine and print nicely formatted dates to the console. As soon as it gets to the date comparisons, it throws some horrible errors.
var vReturn = "unknown";
try {
    var vNow = new java.util.Date();
    var vDateToday:org.openntf.domino.DateTime = session.createDateTime(vNow);
    print("Today=" + vDateToday);
    var vStartDate:org.openntf.domino.DateTime = row.getColumnValues()[0];
    print("vStartDate=" + vStartDate);
    var vEndDate:org.openntf.domino.DateTime = row.getColumnValues()[1];
    print("vEndDate=" + vEndDate);
    if (vDateToday.isBefore(vStartDate)) {
        vReturn = "Forthcoming";
    }
    if (vDateToday.isAfter(vStartDate) && vDateToday.isBefore(vEndDate)) {
        vReturn = "Current";
    }
    if (vDateToday.isAfter(vEndDate)) {
        vReturn = "Completed";
    }
} catch(e) {
    print("Travellog: " + e.toString());
}
return vReturn;
The first dozen or so lines of console output look like this:
19/12/2016 11:25:45 HTTP JVM: Today=19/12/2016 11:25:45 GMT
19/12/2016 11:25:45 HTTP JVM: vStartDate=19/12/2016 00:00:00 GMT
19/12/2016 11:25:45 HTTP JVM: vEndDate=27/12/2016 00:00:00 GMT
19/12/2016 11:25:45 HTTP JVM: java.lang.NullPointerException
19/12/2016 11:25:45 HTTP JVM: at org.openntf.domino.xsp.script.WrapperOpenDomino$OpenFunction.call(WrapperOpenDomino.java:400)
19/12/2016 11:25:45 HTTP JVM: at com.ibm.jscript.types.BuiltinFunction.call(BuiltinFunction.java:75)
19/12/2016 11:25:45 HTTP JVM: at com.ibm.jscript.types.FBSObject.call(FBSObject.java:161)
19/12/2016 11:25:45 HTTP JVM: at com.ibm.jscript.ASTTree.ASTCall.interpret(ASTCall.java:197)
19/12/2016 11:25:45 HTTP JVM: at com.ibm.jscript.ASTTree.ASTIf.interpret(ASTIf.java:79)
19/12/2016 11:25:45 HTTP JVM: at com.ibm.jscript.ASTTree.ASTBlock.interpret(ASTBlock.java:100)
19/12/2016 11:25:45 HTTP JVM: at com.ibm.jscript.ASTTree.ASTTry.interpret(ASTTry.java:109)
19/12/2016 11:25:45 HTTP JVM: at com.ibm.jscript.ASTTree.ASTProgram.interpret(ASTProgram.java:119)
19/12/2016 11:25:45 HTTP JVM: at com.ibm.jscript.ASTTree.ASTProgram.interpretEx(ASTProgram.java:139)
...
I have tried wrapping getColumnValues in session.createDateTime like so:
var vStartDate:org.openntf.domino.DateTime = session.createDateTime(row.getColumnValues()[0])
but that throws errors too.
Can anyone point me in the right direction? I've tried every variation I can think of!
P.S. The examples in the OpenNTF Domino example database look simple but they only ever use the current system date, never dates from documents or view entries.
It sounds like your columns aren't displaying dates. The following button code in that demo database works successfully for me:
<xp:button value="Run SSJS Tests" id="button4"
xp:key="SSJSButton">
<xp:eventHandler event="onclick" submit="true"
refreshMode="partial" refreshId="SSJSDiv">
<xp:this.action><![CDATA[#{javascript:try {
var now = new java.util.Date();
var vw:NotesView = database.getView("AllContacts");
var ec:NotesViewEntryCollection = vw.getAllEntries();
var ent1 = ec.getFirstEntry();
var ent2 = ec.getNextEntry();
print(ent1.getColumnValues());
print(ent1.getColumnValues().get(6).getClass().getName());
var date1:org.openntf.domino.DateTime = ent1.getColumnValues().get(6);
var date2:org.openntf.domino.DateTime = ent2.getColumnValues().get(6);
date1.adjustDay(1);
retVal = "Running SSJS date1.isAfter(date2)<br/>";
if (date1.isAfter(date2)) {
retVal += #Text(date1) + " is after " + #Text(date2) + "<br/>";
} else {
retVal += #Text(date1) + " is NOT after " + #Text(date2) + "<br/>";
}
retVal += "<br/><br/>Running SSJS date2.isAfter(date1)<br/>";
if (date2.isAfter(date1)) {
retVal += #Text(date2) + " is after " + #Text(date1) + "<br/>";
} else {
retVal += #Text(date2) + " is NOT after " + #Text(date1) + "<br/>";
}
viewScope.put("SSJSTest",retVal);
} catch (e) {
#ErrorMessage(e.toString());
}}]]></xp:this.action>
</xp:eventHandler>
</xp:button>
Possibly a better option is to use row.setPreferJavaDates(). That ensures a Java date (java.util.Date) is returned instead of a NotesDateTime, which also removes the need to recycle. The isBefore() and isAfter() methods just convert the NotesDateTime to a java.util.Date and use the built-in before() and after() methods available on that class anyway.
Paul's comment pointing out that there is a difference between NotesViewEntry and NotesXspViewEntry pointed me in the right direction. My code is in a repeat control, and it turns out that repeat controls return NotesXspViewEntries. I messed about trying to get the underlying NotesViewEntry, but after reading a comment by Tim Tripcony about the dangers of doing this (see How do I get the parent NotesViewEntry from the NotesXSPViewEntry?) I decided to go for the underlying document instead. Here is my code:
var vReturn = "unknown";
try {
    var vNow = new java.util.Date();
    var vDateToday:org.openntf.domino.DateTime = session.createDateTime(vNow);
    var vDoc:NotesDocument = row.getDocument();
    var vStartDate:org.openntf.domino.DateTime = vDoc.getItemValueDateTimeArray("DateStart").get(0);
    var vEndDate:org.openntf.domino.DateTime = vDoc.getItemValueDateTimeArray("DateEnd").get(0);
    if (vDateToday.isBeforeIgnoreTime(vStartDate)) {
        vReturn = "Forthcoming";
    }
    if ((vDateToday.equalsIgnoreTime(vStartDate) || vDateToday.isAfterIgnoreTime(vStartDate))
            && (vDateToday.equalsIgnoreTime(vEndDate) || vDateToday.isBefore(vEndDate))) {
        vReturn = "Currently Away";
    }
    if (vDateToday.isAfterIgnoreTime(vEndDate)) {
        vReturn = "Completed";
    }
} catch(e) {
    print("Travellog: error in itinerary status link on home page: " + e.toString());
}
return vReturn;
I probably need to make it a bit more robust, e.g. in case one of the date fields does not contain a value (a date field with no value returns an empty string, which causes getItemValueDateTimeArray to throw an error).
Thanks for your help Paul. Once again you've pulled me out of the coding bottomless pit!

Error using sstableloader on cassandra

I am a Cassandra newbie. I see the following error with Cassandra cqlsh 5.0.1 | Cassandra 2.1.2 | CQL spec 3.2.0.
Here is the error I see when using sstableloader:
./sstableloader -d <hostname> -u <user> -pw <pass> <filename>
Could not retrieve endpoint ranges:
java.lang.IllegalArgumentException
java.lang.RuntimeException: Could not retrieve endpoint ranges:
at org.apache.cassandra.tools.BulkLoader$ExternalClient.init(BulkLoader.java:337)
at org.apache.cassandra.io.sstable.SSTableLoader.stream(SSTableLoader.java:157)
at org.apache.cassandra.tools.BulkLoader.main(BulkLoader.java:105)
Caused by: java.lang.IllegalArgumentException
at java.nio.Buffer.limit(Buffer.java:275)
at org.apache.cassandra.utils.ByteBufferUtil.readBytes(ByteBufferUtil.java:543)
at org.apache.cassandra.serializers.CollectionSerializer.readValue(CollectionSerializer.java:122)
at org.apache.cassandra.serializers.MapSerializer.deserializeForNativeProtocol(MapSerializer.java:99)
at org.apache.cassandra.serializers.MapSerializer.deserializeForNativeProtocol(MapSerializer.java:28)
at org.apache.cassandra.serializers.CollectionSerializer.deserialize(CollectionSerializer.java:48)
at org.apache.cassandra.db.marshal.AbstractType.compose(AbstractType.java:66)
at org.apache.cassandra.cql3.UntypedResultSet$Row.getMap(UntypedResultSet.java:282)
at org.apache.cassandra.config.CFMetaData.fromSchemaNoTriggers(CFMetaData.java:1793)
at org.apache.cassandra.config.CFMetaData.fromThriftCqlRow(CFMetaData.java:1101)
at org.apache.cassandra.tools.BulkLoader$ExternalClient.init(BulkLoader.java:329)
... 2 more
What is weird is that I get this error only for a particular keyspace. When I create a new keyspace (with the exact same command as the problem keyspace) and try sstableloader, I don't see the same issue. When I set the DEBUG log level I see the following:
DEBUG [Thrift:1] 2015-02-20 00:32:38,006 CustomTThreadPoolServer.java:212 - Thrift transport error occurred during processing of message.
org.apache.thrift.transport.TTransportException: null
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[libthrift-0.9.1.jar:0.9.1]
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84) ~[libthrift-0.9.1.jar:0.9.1]
at org.apache.thrift.transport.TFramedTransport.readFrame(TFramedTransport.java:129) ~[libthrift-0.9.1.jar:0.9.1]
at org.apache.thrift.transport.TFramedTransport.read(TFramedTransport.java:101) ~[libthrift-0.9.1.jar:0.9.1]
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84) ~[libthrift-0.9.1.jar:0.9.1]
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:362) ~[libthrift-0.9.1.jar:0.9.1]
at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:284) ~[libthrift-0.9.1.jar:0.9.1]
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:191) ~[libthrift-0.9.1.jar:0.9.1]
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:27) ~[libthrift-0.9.1.jar:0.9.1]
at org.apache.cassandra.thrift.CustomTThreadPoolServer$WorkerProcess.run(CustomTThreadPoolServer.java:202) ~[apache-cassandra-2.1.2.jar:2.1.2]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_65]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_65]
at java.lang.Thread.run(Thread.java:745) [na:1.7.0_65]
I'm not sure if this is actually an error, since per some links online this message appears regardless when the DEBUG log level is set.
I'm trying this with Cassandra 2.1.8. The error you're seeing is the product of this block of Cassandra code:
try
{
    // Query endpoint to ranges map and schemas from thrift
    InetAddress host = hostiter.next();
    Cassandra.Client client = createThriftClient(host.getHostAddress(), rpcPort, this.user, this.passwd, this.transportFactory);
    setPartitioner(client.describe_partitioner());
    Token.TokenFactory tkFactory = getPartitioner().getTokenFactory();
    for (TokenRange tr : client.describe_ring(keyspace))
    {
        Range<Token> range = new Range<>(tkFactory.fromString(tr.start_token), tkFactory.fromString(tr.end_token), getPartitioner());
        for (String ep : tr.endpoints)
        {
            addRangeForEndpoint(range, InetAddress.getByName(ep));
        }
    }
    String cfQuery = String.format("SELECT * FROM %s.%s WHERE keyspace_name = '%s'",
                                   Keyspace.SYSTEM_KS,
                                   SystemKeyspace.SCHEMA_COLUMNFAMILIES_CF,
                                   keyspace);
    CqlResult cfRes = client.execute_cql3_query(ByteBufferUtil.bytes(cfQuery), Compression.NONE, ConsistencyLevel.ONE);
    for (CqlRow row : cfRes.rows)
    {
        String columnFamily = UTF8Type.instance.getString(row.columns.get(1).bufferForName());
        String columnsQuery = String.format("SELECT * FROM %s.%s WHERE keyspace_name = '%s' AND columnfamily_name = '%s'",
                                            Keyspace.SYSTEM_KS,
                                            SystemKeyspace.SCHEMA_COLUMNS_CF,
                                            keyspace,
                                            columnFamily);
        CqlResult columnsRes = client.execute_cql3_query(ByteBufferUtil.bytes(columnsQuery), Compression.NONE, ConsistencyLevel.ONE);
        CFMetaData metadata = CFMetaData.fromThriftCqlRow(row, columnsRes);
        knownCfs.put(metadata.cfName, metadata);
    }
    break;
}
catch (Exception e)
{
    if (!hostiter.hasNext())
        throw new RuntimeException("Could not retrieve endpoint ranges: ", e);
}
So, what you have is a large variety of errors all rolled up in the message, "Could not retrieve endpoint ranges." You will not be able to tell what your specific error is without downloading the Cassandra source and debugging through it. That's what I did.
My schema is built in a multi-step process using https://github.com/DonBranson/cql_schema_versioning. One step does this:
ALTER TABLE user_reputation DROP ban_votes;
The DROP triggers a Cassandra BulkLoader bug that prints the error message you're seeing. However, the myriad other error conditions show the same message. The error message gives us absolutely nothing to help actually solve the problem.
I also found that the BulkLoader will not work if you're encrypting internode communication like this:
internode_encryption: all
So in the DigitalOcean cloud where I'm running, where internode communication has to be encrypted, it will fail, but at least it displays a message that indicates a connection failure:
../apache-cassandra-2.1.8/bin/sstableloader -d 192.168.56.101 makeyourcase/arenas
Established connection to initial hosts
Opening sstables and calculating sections to stream
Skipping file makeyourcase-arenas.arenas_name_idx-jb-2-Data.db: column family makeyourcase.arenas.arenas_name_idx doesn't exist
Skipping file makeyourcase-arenas.arenas_name_idx-jb-1-Data.db: column family makeyourcase.arenas.arenas_name_idx doesn't exist
Streaming relevant part of makeyourcase/arenas/makeyourcase-arenas-jb-2-Data.db makeyourcase/arenas/makeyourcase-arenas-jb-1-Data.db to [/192.168.56.102, /192.168.56.101]
ERROR 01:46:39 [Stream #a7e5fb80-3593-11e5-9b52-cdde6a46fde5] Streaming error occurred
java.net.ConnectException: Connection refused
at sun.nio.ch.Net.connect0(Native Method) ~[na:1.7.0_71]

Spring-Kafka Integration 1.0.0.RELEASE Issue with Producer

I am not able to publish a message using Spring Kafka Integration, though my Kafka Java client works fine.
The Java code is running on Windows and Kafka is running on Linux box.
KafkaProducerContext<String, String> kafkaProducerContext = new KafkaProducerContext<String, String>();
ProducerMetadata<String, String> producerMetadata = new ProducerMetadata<String, String>("test-cass");
producerMetadata.setValueClassType(String.class);
producerMetadata.setKeyClassType(String.class);
Encoder<String> encoder = new StringEncoder<String>();
producerMetadata.setValueEncoder(encoder);
producerMetadata.setKeyEncoder(encoder);
ProducerFactoryBean<String, String> producer = new ProducerFactoryBean<String, String>(producerMetadata, "172.16.1.42:9092");
ProducerConfiguration<String, String> config = new ProducerConfiguration<String, String>(producerMetadata, producer.getObject());
kafkaProducerContext.setProducerConfigurations(Collections.singletonMap("test-cass", config));
KafkaProducerMessageHandler<String, String> handler = new KafkaProducerMessageHandler<String, String>(kafkaProducerContext);
handler.handleMessage(MessageBuilder.withPayload("foo")
        .setHeader("messagekey", "3")
        .setHeader("topic", "test-cass")
        .build());
I am getting the following error:
"C:\Program Files\Java\jdk1.7.0_71\bin\java" -Didea.launcher.port=7542 "-Didea.launcher.bin.path=C:\Program Files (x86)\JetBrains\IntelliJ IDEA 13.1.6\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.7.0_71\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\jce.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\jfxrt.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\resources.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\rt.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.7.0_71\jre\lib\ext\zipfs.jar;C:\projects\SpringCassandraInt\target\classes;C:\Users\hs\.m2\repository\org\springframework\data\spring-data-cassandra\1.1.2.RELEASE\spring-data-cassandra-1.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\data\spring-cql\1.1.2.RELEASE\spring-cql-1.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-context\4.1.4.RELEASE\spring-context-4.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-aop\4.1.4.RELEASE\spring-aop-4.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\aopalliance\aopalliance\1.0\aopalliance-1.0.jar;C:\Users\hs\.m2\repository\org\springframework\spring-beans\4.0.9.RELEASE\spring-beans-4.0.9.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-core\4.1.2.RELEASE\spring-core-4.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\commons-logging\commons-logging\1.1.3\commons-logging-1.1.3.jar;C:\Users\hs\.m2\repository\org\springframework\spring-expression\4.1.2.RELEASE\spring-expression-4.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-tx\4.1.4.RELEASE\spring-tx-4.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\data\spring-data-commons\1.9.2.RELEASE\spring-data-commons-1.9.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\slf4j\slf4j-api\1.7.10\slf4j-api-1.7.10.jar;C:\Users\hs\.m2\repository\org\slf4j\jcl-over-slf4j\1.7.10\jcl-over-slf4j-1.7.10.jar;C:\Users\hs\.m2\repository\com\datastax\cassandra\cassandra-driver-dse\2.0.4\cassandra-driver-dse-2.0.4.jar;C:\Users\hs\.m2\repository\com\datastax\cassandra\cassandra-driver-core\2.0.4\cassandra-driver-core-2.0.4.jar;C:\Users\hs\.m2\repository\io\netty\netty\3.9.0.Final\netty-3.9.0.Final.jar;C:\Users\hs\.m2\repository\com\codahale\metrics\metrics-core\3.0.2\metrics-core-3.0.2.jar;C:\Users\hs\.m2\repository\com\google\guava\guava\15.0\guava-15.0.jar;C:\Users\hs\.m2\repository\org\liquibase\liquibase-core\3.1.1\liquibase-core-3.1.1.jar;C:\Users\hs\.m2\repository\org\yaml\snakeyaml\1.13\snakeyaml-1.13.jar;C:\Users\hs\.m2\repository\ch\qos\logback\logback-classic\1.1.2\logback-classic-1.1.2.jar;C:\Users\hs\.m2\repository\ch\qos\logback\logback-core\1.1.2\logback-core-1.1.2.jar;C:\Users\hs\.m2\repository\org\springframework\integration\spring-integration-core\4.1.2.RELEASE\spring-integration-core-4.1.2.RELEASE.jar;C:\Users\hs\.m
2\repository\org\projectreactor\reactor-core\1.1.4.RELEASE\reactor-core-1.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\com\goldmansachs\gs-collections\5.0.0\gs-collections-5.0.0.jar;C:\Users\hs\.m2\repository\com\goldmansachs\gs-collections-api\5.0.0\gs-collections-api-5.0.0.jar;C:\Users\hs\.m2\repository\com\lmax\disruptor\3.2.1\disruptor-3.2.1.jar;C:\Users\hs\.m2\repository\io\gatling\jsr166e\1.0\jsr166e-1.0.jar;C:\Users\hs\.m2\repository\org\springframework\retry\spring-retry\1.1.1.RELEASE\spring-retry-1.1.1.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-messaging\4.1.4.RELEASE\spring-messaging-4.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\integration\spring-integration-stream\4.1.2.RELEASE\spring-integration-stream-4.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\integration\spring-integration-xml\4.1.2.RELEASE\spring-integration-xml-4.1.2.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\spring-oxm\4.1.4.RELEASE\spring-oxm-4.1.4.RELEASE.jar;C:\Users\hs\.m2\repository\org\springframework\ws\spring-xml\2.2.0.RELEASE\spring-xml-2.2.0.RELEASE.jar;C:\Users\hs\.m2\repository\com\jayway\jsonpath\json-path\1.2.0\json-path-1.2.0.jar;C:\Users\hs\.m2\repository\net\minidev\json-smart\2.1.0\json-smart-2.1.0.jar;C:\Users\hs\.m2\repository\net\minidev\asm\1.0.2\asm-1.0.2.jar;C:\Users\hs\.m2\repository\asm\asm\3.3.1\asm-3.3.1.jar;C:\Users\hs\.m2\repository\org\springframework\integration\spring-integration-kafka\1.0.0.RELEASE\spring-integration-kafka-1.0.0.RELEASE.jar;C:\Users\hs\.m2\repository\org\apache\avro\avro-compiler\1.7.6\avro-compiler-1.7.6.jar;C:\Users\hs\.m2\repository\org\apache\avro\avro\1.7.6\avro-1.7.6.jar;C:\Users\hs\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;C:\Users\hs\.m2\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;C:\Users\hs\.m2\repository\com\thoughtworks\paranamer\paranamer\2.3\paranamer-2.3.jar;C:\Users\hs\.m2\repository\org\xerial\snappy\snappy-java\1.0.5\snappy-java-1.0.5.jar;C:\Users\hs\.m2\repository\org\apache\commons\commons-compress\1.4.1\commons-compress-1.4.1.jar;C:\Users\hs\.m2\repository\org\tukaani\xz\1.0\xz-1.0.jar;C:\Users\hs\.m2\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;C:\Users\hs\.m2\repository\org\apache\velocity\velocity\1.7\velocity-1.7.jar;C:\Users\hs\.m2\repository\commons-collections\commons-collections\3.2.1\commons-collections-3.2.1.jar;C:\Users\hs\.m2\repository\com\yammer\metrics\metrics-annotation\2.2.0\metrics-annotation-2.2.0.jar;C:\Users\hs\.m2\repository\com\yammer\metrics\metrics-core\2.2.0\metrics-core-2.2.0.jar;C:\Users\hs\.m2\repository\org\apache\kafka\kafka_2.10\0.8.1.1\kafka_2.10-0.8.1.1.jar;C:\Users\hs\.m2\repository\org\apache\zookeeper\zookeeper\3.3.4\zookeeper-3.3.4.jar;C:\Users\hs\.m2\repository\log4j\log4j\1.2.15\log4j-1.2.15.jar;C:\Users\hs\.m2\repository\javax\mail\mail\1.4\mail-1.4.jar;C:\Users\hs\.m2\repository\javax\activation\activation\1.1\activation-1.1.jar;C:\Users\hs\.m2\repository\javax\jms\jms\1.1\jms-1.1.jar;C:\Users\hs\.m2\repository\com\sun\jdmk\jmxtools\1.2.1\jmxtools-1.2.1.jar;C:\Users\hs\.m2\repository\com\sun\jmx\jmxri\1.2.1\jmxri-1.2.1.jar;C:\Users\hs\.m2\repository\jline\jline\0.9.94\jline-0.9.94.jar;C:\Users\hs\.m2\repository\net\sf\jopt-simple\jopt-simple\3.2\jopt-simple-3.2.jar;C:\Users\hs\.m2\repository\org\scala-lang\scala-library\2.10.1\scala-library-2.10.1.jar;C:\Users\hs\.m2\repository\com\101tec\zkclient\0.3\zkclient-0.3.jar;C:\Program 
Files (x86)\JetBrains\IntelliJ IDEA 13.1.6\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain com.agillic.dialogue.kafka.outbound.SpringKafkaTest
15:39:11.736 [main] INFO o.s.i.k.support.ProducerFactoryBean - Using producer properties => {metadata.broker.list=172.16.1.42:9092, compression.codec=0}
2015-02-19 15:39:12 INFO VerifiableProperties:68 - Verifying properties
2015-02-19 15:39:12 INFO VerifiableProperties:68 - Property compression.codec is overridden to 0
2015-02-19 15:39:12 INFO VerifiableProperties:68 - Property metadata.broker.list is overridden to 172.16.1.42:9092
15:39:12.164 [main] INFO o.s.b.f.config.PropertiesFactoryBean - Loading properties file from URL [jar:file:/C:/Users/hs/.m2/repository/org/springframework/integration/spring-integration-core/4.1.2.RELEASE/spring-integration-core-4.1.2.RELEASE.jar!/META-INF/spring.integration.default.properties]
15:39:12.208 [main] DEBUG o.s.i.k.o.KafkaProducerMessageHandler - org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler#5204db6b received message: GenericMessage [payload=foo, headers={timestamp=1424356752208, id=00c483d9-ecf8-2937-4a2c-985bd3afcae4, topic=test-cass, messagekey=3}]
Exception in thread "main" org.springframework.messaging.MessageHandlingException: error occurred in message handler [org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler#5204db6b]; nested exception is java.lang.NullPointerException
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:84)
at com.agillic.dialogue.kafka.outbound.SpringKafkaTest.main(SpringKafkaTest.java:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.NullPointerException
at org.springframework.integration.kafka.support.KafkaProducerContext.getTopicConfiguration(KafkaProducerContext.java:58)
at org.springframework.integration.kafka.support.KafkaProducerContext.send(KafkaProducerContext.java:190)
at org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler.handleMessageInternal(KafkaProducerMessageHandler.java:81)
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:78)
... 6 more
Process finished with exit code 1
Actually, when we introduced KafkaHeaders we made the appropriate documentation changes: https://github.com/spring-projects/spring-integration-kafka/blob/master/README.md. See the important note:
Since the last Milestone, we have introduced the KafkaHeaders interface with constants. The messageKey and topic default headers now require a kafka_ prefix. When migrating from an earlier version, you need to specify message-key-expression="headers.messageKey" and topic-expression="headers.topic" on the <int-kafka:outbound-channel-adapter>, or simply change the headers upstream to the new headers from KafkaHeaders using a <header-enricher> or MessageBuilder. Or, of course, configure them on the adapter if you are using constant values.
UPDATE
Regarding the NullPointerException: it really is an issue. Feel free to raise a JIRA ticket and we'll take care of it. Contributions are welcome, too!