Why is saveAsTextFile hanging when I try to output an RDD to HDFS? - apache-spark

I'm trying to write an RDD[String] to HDFS (in spark-shell) using:
output.saveAsTextFile("hdfs://localhost:9000/datasets/result")
However, it just hangs, and the job doesn't even appear in the web UI. I have to kill the SparkSubmit process.
I'm reading data successfully from HDFS using:
val input = sc.textFile("hdfs://localhost:9000/datasets/data.csv")
I can output.collect successfully, and writing to local files works as expected.
I'm using Spark 1.4 and Hadoop 2.6. Everything is running on a local machine.
Any ideas?
The comments made me realize I should switch on DEBUG-level logs. An initial log extract is below. There's something about the connection being closed, but I read the data from HDFS at the start of this short script, so I'm confused.
Solution (kind of)
It must be something to do with local routing. I replaced localhost with 127.0.0.1 in the HDFS URL and it's working now.
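This fits what the debug log below shows: the client connects to "localhost/81.200.64.50:9000", i.e. localhost is resolving to a non-loopback address (most likely a stray /etc/hosts entry). A quick, hypothetical way to check that from plain Java (not part of the original question) is to resolve the name and test whether it is a loopback address:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class CheckLocalhost {

    // Returns true if the given name resolves to a loopback address
    // (127.0.0.0/8 or ::1); false if it resolves elsewhere or not at all.
    static boolean resolvesToLoopback(String name) {
        try {
            return InetAddress.getByName(name).isLoopbackAddress();
        } catch (UnknownHostException e) {
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        // On the broken machine this printed a public IP, matching the
        // "Connecting to localhost/81.200.64.50:9000" line in the log.
        System.out.println("localhost -> "
                + InetAddress.getByName("localhost").getHostAddress());
        System.out.println("loopback?  " + resolvesToLoopback("localhost"));
    }
}
```

If this prints anything other than a 127.x address, fixing the hosts file (or using 127.0.0.1 explicitly, as above) is the workaround.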
Debug logs
15/06/30 17:04:42 DEBUG ClosureCleaner: +++ Cleaning closure <function1> (org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1$$anonfun$30) +++
15/06/30 17:04:42 DEBUG ClosureCleaner: + declared fields: 1
15/06/30 17:04:42 DEBUG ClosureCleaner: public static final long org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1$$anonfun$30.serialVersionUID
15/06/30 17:04:42 DEBUG ClosureCleaner: + declared methods: 2
15/06/30 17:04:42 DEBUG ClosureCleaner: public final java.lang.Object org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1$$anonfun$30.apply(java.lang.Object)
15/06/30 17:04:42 DEBUG ClosureCleaner: public final scala.collection.Iterator org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1$$anonfun$30.apply(scala.collection.Iterator)
15/06/30 17:04:42 DEBUG ClosureCleaner: + inner classes: 1
15/06/30 17:04:42 DEBUG ClosureCleaner: org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1$$anonfun$30$$anonfun$apply$49
15/06/30 17:04:42 DEBUG ClosureCleaner: + outer classes: 0
15/06/30 17:04:42 DEBUG ClosureCleaner: + outer objects: 0
15/06/30 17:04:42 DEBUG ClosureCleaner: + populating accessed fields because this is the starting closure
15/06/30 17:04:42 DEBUG ClosureCleaner: + fields accessed by starting closure: 0
15/06/30 17:04:42 DEBUG ClosureCleaner: + there are no enclosing objects!
15/06/30 17:04:42 DEBUG ClosureCleaner: +++ closure <function1> (org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1$$anonfun$30) is now cleaned +++
15/06/30 17:04:42 DEBUG BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
15/06/30 17:04:42 DEBUG BlockReaderLocal: dfs.client.read.shortcircuit = false
15/06/30 17:04:42 DEBUG BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
15/06/30 17:04:42 DEBUG BlockReaderLocal: dfs.domain.socket.path =
15/06/30 17:04:42 DEBUG DFSClient: No KeyProvider found.
15/06/30 17:04:43 DEBUG Client: IPC Client (832019786) connection to localhost/127.0.0.1:9000 from user: closed
15/06/30 17:04:43 DEBUG Client: IPC Client (832019786) connection to localhost/127.0.0.1:9000 from user: stopped, remaining connections 0
15/06/30 17:04:48 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(Heartbeat(driver,[Lscala.Tuple2;#4eb1e256,BlockManagerId(driver, localhost, 51349)),true) from Actor[akka://sparkDriver/temp/$L]
15/06/30 17:04:48 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(Heartbeat(driver,[Lscala.Tuple2;#4eb1e256,BlockManagerId(driver, localhost, 51349)),true)
15/06/30 17:04:48 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.290877 ms) AkkaMessage(Heartbeat(driver,[Lscala.Tuple2;#4eb1e256,BlockManagerId(driver, localhost, 51349)),true) from Actor[akka://sparkDriver/temp/$L]
15/06/30 17:04:48 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(BlockManagerHeartbeat(BlockManagerId(driver, localhost, 51349)),true) from Actor[akka://sparkDriver/temp/$M]
15/06/30 17:04:48 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(BlockManagerHeartbeat(BlockManagerId(driver, localhost, 51349)),true)
15/06/30 17:04:48 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.257991 ms) AkkaMessage(BlockManagerHeartbeat(BlockManagerId(driver, localhost, 51349)),true) from Actor[akka://sparkDriver/temp/$M]
15/06/30 17:04:57 DEBUG RetryUtils: multipleLinearRandomRetry = null
15/06/30 17:04:57 DEBUG Client: getting client out of cache: org.apache.hadoop.ipc.Client#7d1fc150
15/06/30 17:04:57 DEBUG DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
15/06/30 17:04:57 DEBUG PairRDDFunctions: Saving as hadoop file of type (NullWritable, Text)
15/06/30 17:04:57 DEBUG Client: The ping interval is 60000 ms.
15/06/30 17:04:57 DEBUG Client: Connecting to localhost/81.200.64.50:9000
15/06/30 17:04:58 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(Heartbeat(driver,[Lscala.Tuple2;#7aba4180,BlockManagerId(driver, localhost, 51349)),true) from Actor[akka://sparkDriver/temp/$N]
15/06/30 17:04:58 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(Heartbeat(driver,[Lscala.Tuple2;#7aba4180,BlockManagerId(driver, localhost, 51349)),true)
15/06/30 17:04:58 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.277965 ms) AkkaMessage(Heartbeat(driver,[Lscala.Tuple2;#7aba4180,BlockManagerId(driver, localhost, 51349)),true) from Actor[akka://sparkDriver/temp/$N]
15/06/30 17:04:58 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(BlockManagerHeartbeat(BlockManagerId(driver, localhost, 51349)),true) from Actor[akka://sparkDriver/temp/$O]
15/06/30 17:04:58 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(BlockManagerHeartbeat(BlockManagerId(driver, localhost, 51349)),true)
15/06/30 17:04:58 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.258037 ms) AkkaMessage(BlockManagerHeartbeat(BlockManagerId(driver, localhost, 51349)),true) from Actor[akka://sparkDriver/temp/$O]

Related

I am receiving multiple responses on the client side

I am able to read the response from the server, but after about a minute I get this message on the client side: "cannot correlate response - no pending reply for cached:localhost:3002:46550:f6234e17-c486-4506-82c8-a757a08ba73d", even though the server is not sending any message back. On the server side, I can see the message "Connection lost: 127.0.0.1" printed in the log.
@Bean
public AbstractClientConnectionFactory clientConnectionFactory() {
    TcpNioClientConnectionFactory tcpNioClientConnectionFactory = new TcpNioClientConnectionFactory(host, port);
    tcpNioClientConnectionFactory.setUsingDirectBuffers(true);
    tcpNioClientConnectionFactory.setApplicationEventPublisher(applicationEventPublisher);
    return new CachingClientConnectionFactory(tcpNioClientConnectionFactory, connectionPoolSize);
}

@Bean
public MessageChannel outboundChannel() {
    return new DirectChannel();
}

@Bean
@ServiceActivator(inputChannel = "outboundChannel")
public MessageHandler outboundGateway(AbstractClientConnectionFactory clientConnectionFactory) {
    TcpOutboundGateway tcpOutboundGateway = new TcpOutboundGateway();
    tcpOutboundGateway.setConnectionFactory(clientConnectionFactory);
    tcpOutboundGateway.setRemoteTimeout(Long.MAX_VALUE);
    tcpOutboundGateway.setRequestTimeout(5_000);
    return tcpOutboundGateway;
}
2022-03-22 13:14:46,878 [http-nio-9000-exec-4][][][][][][][] DEBUG org.springframework.integration.channel.DirectChannel - preSend on channel 'bean 'outboundChannel'; defined in: 'class path resource [com/wibmo/aerionpg/config/TcpClientConfig.class]'; from source: 'com.wibmo.aerionpg.config.TcpClientConfig.outboundChannel()'', message: GenericMessage [payload=byte[602], headers={replyChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel#2936326d, errorChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel#2936326d, id=062c0c73-addf-e90c-4b80-aebc70ec009b, timestamp=1647935086878}]
2022-03-22 13:14:46,878 [http-nio-9000-exec-4][][][][][][][] DEBUG org.springframework.integration.ip.tcp.TcpOutboundGateway - bean 'outboundGateway'; defined in: 'class path resource [com/wibmo/aerionpg/config/TcpClientConfig.class]'; from source: 'com.wibmo.aerionpg.config.TcpClientConfig.outboundGateway(org.springframework.integration.ip.tcp.connection.AbstractClientConnectionFactory)' received message: GenericMessage [payload=byte[602], headers={replyChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel#2936326d, errorChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel#2936326d, id=062c0c73-addf-e90c-4b80-aebc70ec009b, timestamp=1647935086878}]
2022-03-22 13:14:46,879 [http-nio-9000-exec-4][][][][][][][] DEBUG org.springframework.integration.util.SimplePool - Obtained TcpNioConnection:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 from pool.
2022-03-22 13:14:46,879 [http-nio-9000-exec-4][][][][][][][] DEBUG org.springframework.integration.ip.tcp.TcpOutboundGateway - Added pending reply Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29
2022-03-22 13:14:46,879 [http-nio-9000-exec-4][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 writing 604
2022-03-22 13:14:46,879 [http-nio-9000-exec-4][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Message sent GenericMessage [payload=byte[602], headers={replyChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel#2936326d, errorChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel#2936326d, id=062c0c73-addf-e90c-4b80-aebc70ec009b, timestamp=1647935086878}]
2022-03-22 13:14:47,382 [pool-2-thread-1][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioClientConnectionFactory - null: Connection is open: 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29
2022-03-22 13:14:47,382 [pool-2-thread-1][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioClientConnectionFactory - Host 127.0.0.1 port 6001 SelectionCount: 1
2022-03-22 13:14:47,382 [pool-2-thread-2][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Reading...
2022-03-22 13:14:47,382 [pool-2-thread-2][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Running an assembler
2022-03-22 13:14:47,383 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - Before read: 0/61440
2022-03-22 13:14:47,383 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - After read: 614/61440
2022-03-22 13:14:47,383 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - After flip: 0/614
2022-03-22 13:14:47,383 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Nio message assembler running...
2022-03-22 13:14:47,383 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail: 0 pending: true
2022-03-22 13:14:47,383 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail (convert): 0 pending: true
2022-03-22 13:14:47,383 [pool-2-thread-2][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - Read 614 into raw buffer
2022-03-22 13:14:47,383 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Sending 614 to pipe
2022-03-22 13:14:47,383 [pool-2-thread-3][][][][][][][] DEBUG org.springframework.integration.ip.tcp.serializer.ByteArrayCrLfSerializer - Available to read: 614
2022-03-22 13:14:47,383 [pool-2-thread-1][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioClientConnectionFactory - null: Connection is open: 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29
2022-03-22 13:14:47,383 [pool-2-thread-1][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioClientConnectionFactory - Host 127.0.0.1 port 6001 SelectionCount: 0
2022-03-22 13:14:47,383 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail: 0 pending: false
2022-03-22 13:14:47,383 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.TcpOutboundGateway - onMessage: Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29(GenericMessage [payload=byte[612], headers={ip_tcp_remotePort=6001, ip_connectionId=Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29, ip_actualConnectionId=127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=f9619787-4d0c-e894-ae05-f2eca0c4a1b4, ip_hostname=127.0.0.1, timestamp=1647935087383}])
2022-03-22 13:14:47,383 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail: 0 pending: false
2022-03-22 13:14:47,383 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Nio message assembler exiting... avail: 0
2022-03-22 13:14:47,383 [http-nio-9000-exec-4][][][][][][][] DEBUG org.springframework.integration.ip.tcp.TcpOutboundGateway - Response GenericMessage [payload=byte[612], headers={ip_tcp_remotePort=6001, ip_connectionId=Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29, ip_actualConnectionId=127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=f9619787-4d0c-e894-ae05-f2eca0c4a1b4, ip_hostname=127.0.0.1, timestamp=1647935087383}]
2022-03-22 13:14:47,383 [http-nio-9000-exec-4][][][][][][][] DEBUG org.springframework.integration.ip.tcp.TcpOutboundGateway - Removed pending reply Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29
2022-03-22 13:14:47,383 [http-nio-9000-exec-4][][][][][][][] DEBUG org.springframework.integration.util.SimplePool - Releasing TcpNioConnection:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 back to the pool
2022-03-22 13:14:47,383 [http-nio-9000-exec-4][][][][][][][] DEBUG org.springframework.integration.channel.DirectChannel - postSend (sent=true) on channel 'bean 'outboundChannel'; defined in: 'class path resource [com/wibmo/aerionpg/config/TcpClientConfig.class]'; from source: 'com.wibmo.aerionpg.config.TcpClientConfig.outboundChannel()'', message: GenericMessage [payload=byte[602], headers={replyChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel#2936326d, errorChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel#2936326d, id=062c0c73-addf-e90c-4b80-aebc70ec009b, timestamp=1647935086878}]
2022-03-22 13:15:47,383 [pool-2-thread-1][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioClientConnectionFactory - null: Connection is open: 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29
2022-03-22 13:15:47,383 [pool-2-thread-1][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioClientConnectionFactory - Host 127.0.0.1 port 6001 SelectionCount: 1
2022-03-22 13:15:47,384 [pool-2-thread-3][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Reading...
2022-03-22 13:15:47,384 [pool-2-thread-3][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Running an assembler
2022-03-22 13:15:47,384 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - Before read: 0/61440
2022-03-22 13:15:47,384 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - After read: 14/61440
2022-03-22 13:15:47,384 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - After flip: 0/14
2022-03-22 13:15:47,384 [pool-2-thread-3][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - Read 14 into raw buffer
2022-03-22 13:15:47,384 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Sending 14 to pipe
2022-03-22 13:15:47,384 [pool-2-thread-1][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioClientConnectionFactory - null: Connection is open: 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29
2022-03-22 13:15:47,384 [pool-2-thread-1][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioClientConnectionFactory - Host 127.0.0.1 port 6001 SelectionCount: 0
2022-03-22 13:15:47,384 [pool-2-thread-1][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioClientConnectionFactory - null: Connection is open: 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29
2022-03-22 13:15:47,384 [pool-2-thread-1][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioClientConnectionFactory - Host 127.0.0.1 port 6001 SelectionCount: 1
2022-03-22 13:15:47,384 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Nio message assembler running...
2022-03-22 13:15:47,384 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail: 14 pending: false
2022-03-22 13:15:47,384 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail (convert): 14 pending: false
2022-03-22 13:15:47,384 [pool-2-thread-2][][][][][][][] DEBUG org.springframework.integration.ip.tcp.serializer.ByteArrayCrLfSerializer - Available to read: 14
2022-03-22 13:15:47,385 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail: 0 pending: false
2022-03-22 13:15:47,385 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.TcpOutboundGateway - onMessage: Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29(GenericMessage [payload=byte[12], headers={ip_tcp_remotePort=6001, ip_connectionId=Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29, ip_actualConnectionId=127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=b9f23b01-6816-65f1-c1e9-f4cf1c21bccc, ip_hostname=127.0.0.1, timestamp=1647935147385}])
2022-03-22 13:15:47,385 [pool-2-thread-2][][][][][][][] ERROR org.springframework.integration.ip.tcp.TcpOutboundGateway - Cannot correlate response - no pending reply for Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29
2022-03-22 13:15:47,386 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail: 0 pending: false
2022-03-22 13:15:47,386 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Nio message assembler exiting... avail: 0
2022-03-22 13:15:47,386 [pool-2-thread-3][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Reading...
2022-03-22 13:15:47,386 [pool-2-thread-3][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Running an assembler
2022-03-22 13:15:47,386 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Nio message assembler running...
2022-03-22 13:15:47,386 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail: 0 pending: true
2022-03-22 13:15:47,386 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail (convert): 0 pending: true
2022-03-22 13:15:47,386 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - Before read: 0/61440
2022-03-22 13:15:47,386 [pool-2-thread-3][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.CachingClientConnectionFactory$CachedConnection - Connection Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 has already been released
2022-03-22 13:15:47,387 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - Published: TcpConnectionCloseEvent [source=TcpNioConnection:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29], [factory=unknown, connectionId=127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29] **CLOSED**
2022-03-22 13:15:47,387 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - After read: 0/61440
2022-03-22 13:15:47,387 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - After flip: 0/0
2022-03-22 13:15:47,387 [pool-2-thread-3][][][][][][][] DEBUG org.springframework.integration.ip.tcp.connection.TcpNioConnection - Read 0 into raw buffer
2022-03-22 13:15:47,387 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Sending 0 to pipe
2022-03-22 13:15:47,387 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail: 0 pending: false
2022-03-22 13:15:47,387 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 checking data avail: 0 pending: false
2022-03-22 13:15:47,387 [pool-2-thread-2][][][][][][][] TRACE org.springframework.integration.ip.tcp.connection.TcpNioConnection - 127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29 Nio message assembler exiting... avail: 0
2022-03-22 13:15:47,387 [pool-2-thread-3][][][][][][][] TRACE org.springframework.integration.ip.tcp.TcpOutboundGateway - onMessage: Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29(ErrorMessage [payload=java.io.EOFException: Connection is closed, headers={id=78de443c-cf7a-5e4d-89bc-0b8c4027b5a5, ip_connectionId=Cached:127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29, timestamp=1647935147387, ip_actualConnectionId=127.0.0.1:6001:54856:37e334df-5c89-4b51-af4f-727d0f484a29}])
It looks to me like the server is closing the socket 1 minute later, but before doing that, it sends a message of 14 bytes; presumably 12 bytes plus \r\n, given your configuration.
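The arithmetic follows from the serializer in play: ByteArrayCrLfSerializer frames each message by appending "\r\n" to the payload, so the 14-byte read in the log implies a 12-byte payload. A tiny illustrative sketch (the helper method is hypothetical, not from the answer):

```java
public class CrLfFraming {

    // With CRLF framing, the bytes on the wire are payload + "\r\n".
    static int framedLength(byte[] payload) {
        return payload.length + 2; // 2 bytes for \r\n
    }

    public static void main(String[] args) {
        byte[] payload = new byte[12]; // a 12-byte server message, as inferred from the log
        System.out.println(framedLength(payload)); // 14, matching "Read 14 into raw buffer"
    }
}
```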
Starting with version 5.4, you can add an unsolicitedMessageChannel to the outbound gateway; these extra messages will be sent there, instead of generating that log message.
/**
 * Set the channel name for unsolicited incoming messages, or late replies.
 * @param unsolicitedMessageChannelName the channel name.
 * @since 5.4
 */
public void setUnsolicitedMessageChannelName(String unsolicitedMessageChannelName) {
    this.unsolicitedMessageChannelName = unsolicitedMessageChannelName;
}

/**
 * Set the channel for unsolicited incoming messages, or late replies.
 * @param unsolicitedMessageChannel the channel.
 * @since 5.4
 */
public void setUnsolicitedMessageChannel(MessageChannel unsolicitedMessageChannel) {
    this.unsolicitedMessageChannel = unsolicitedMessageChannel;
}
You can add a message handler (e.g. @ServiceActivator) to consume from that channel to see what the content is, or you can set it to "nullChannel" to just discard it.
EDIT
Here is an example, using the Java DSL:
@SpringBootApplication
public class So71552834Application {

    public static void main(String[] args) {
        SpringApplication.run(So71552834Application.class, args);
    }

    @Bean
    IntegrationFlow clientFlow() {
        return IntegrationFlows.from(Gate.class)
                .handle(Tcp.outboundGateway(Tcp.netClient("localhost", 1234))
                        .unsolicitedMessageChannelName("unsolicited"))
                .get();
    }

    @ServiceActivator(inputChannel = "unsolicited")
    public void handleUnsolicited(String in) {
        System.out.println("Unsolicited response: " + in);
    }

    // emulate the server side.

    @Bean
    IntegrationFlow serverFlow() {
        return IntegrationFlows.from(Tcp.inboundGateway(Tcp.netServer(1234)))
                .transform(Transformers.objectToString())
                .<String>handle((p, h) -> p.toUpperCase())
                .get();
    }

    @Bean
    IntegrationFlow serverOut(AbstractServerConnectionFactory scf) {
        return f -> f.handle(Tcp.outboundAdapter(scf));
    }

    @Bean
    ApplicationRunner runner(Gate gate, IntegrationFlow serverOut, AbstractServerConnectionFactory scf) {
        return args -> {
            System.out.println(new String(gate.exchange("test")));
            System.out.println("Hit enter to send an unsolicited message");
            System.in.read();
            String id = scf.getOpenConnectionIds().iterator().next();
            serverOut.getInputChannel().send(MessageBuilder.withPayload("what's this?")
                    .setHeader(IpHeaders.CONNECTION_ID, id)
                    .build());
        };
    }

}

interface Gate {

    byte[] exchange(String out);

}
TEST
Hit enter to send an unsolicited message
Unsolicited response: what's this?

Spring Integration Bridge with poller not working as expected for JMS

Using spring-integration 5.0.7 to throttle the bridging of messages between two JMS queues.
The docs at: https://docs.spring.io/spring-integration/docs/5.0.7.RELEASE/reference/html/messaging-channels-section.html#bridge-namespace
suggest:
<int:bridge input-channel="pollable" output-channel="subscribable">
    <int:poller max-messages-per-poll="10" fixed-rate="5000"/>
</int:bridge>
But the schema validator complains "no nested poller allowed for subscribable input channel" on the bridge element.
But if I put the poller on the inbound-channel-adapter, as in:
<?xml version="1.0" encoding="UTF-8"?>
<beans:beans xmlns:int="http://www.springframework.org/schema/integration"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:beans="http://www.springframework.org/schema/beans"
    xmlns:int-jms="http://www.springframework.org/schema/integration/jms"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans.xsd
        http://www.springframework.org/schema/integration
        http://www.springframework.org/schema/integration/spring-integration.xsd
        http://www.springframework.org/schema/integration/jms
        http://www.springframework.org/schema/integration/jms/spring-integration-jms.xsd">

    <int:channel id="inChannel" />
    <int:channel id="outChannel" />

    <int-jms:inbound-channel-adapter id="jmsIn" connection-factory="jmsConnectionFactory"
            destination-name="_dev.inQueue" channel="inChannel">
        <int:poller fixed-delay="5000" max-messages-per-poll="2"/>
    </int-jms:inbound-channel-adapter>

    <int-jms:outbound-channel-adapter id="jmsOut" connection-factory="jmsConnectionFactory"
            destination-name="_dev.outQueue" channel="outChannel"/>

    <int:bridge input-channel="inChannel" output-channel="outChannel"/>

</beans:beans>
Nothing is ever moved from input to output.
How can I bridge from one JMS queue to another with a rate-limit?
Update:
Turning on logging confirms nothing is getting picked up from the input channel, but it is otherwise not helpful:
2018-08-10 15:36:33.345 DEBUG 112066 --- [ask-scheduler-1] o.s.i.e.SourcePollingChannelAdapter : Received no Message during the poll, returning 'false'
2018-08-10 15:36:38.113 DEBUG 112066 --- [ask-scheduler-2] o.s.integration.jms.DynamicJmsTemplate : Executing callback on JMS Session: ActiveMQSession {id=ID:whitechapel-35247-1533940593148-3:2:1,started=true} java.lang.Object#5c278302
2018-08-10 15:36:38.116 DEBUG 112066 --- [ask-scheduler-2] o.s.i.e.SourcePollingChannelAdapter : Received no Message during the poll, returning 'false'
2018-08-10 15:36:43.115 DEBUG 112066 --- [ask-scheduler-1] o.s.integration.jms.DynamicJmsTemplate : Executing callback on JMS Session: ActiveMQSession {id=ID:whitechapel-35247-1533940593148-3:3:1,started=true} java.lang.Object#1c09a81e
2018-08-10 15:36:43.118 DEBUG 112066 --- [ask-scheduler-1] o.s.i.e.SourcePollingChannelAdapter : Received no Message during the poll, returning 'false'
Here is a Spring Boot app using Java DSL configuration, which is the exact equivalent of what you have in XML (minus the bridge); it works fine.
@SpringBootApplication
public class So51792909Application {

    private static final Logger logger = LoggerFactory.getLogger(So51792909Application.class);

    public static void main(String[] args) {
        SpringApplication.run(So51792909Application.class, args);
    }

    @Bean
    public ApplicationRunner runner(JmsTemplate template) {
        return args -> {
            for (int i = 0; i < 10; i++) {
                template.convertAndSend("foo", "test");
            }
        };
    }

    @Bean
    public IntegrationFlow flow(ConnectionFactory connectionFactory) {
        return IntegrationFlows.from(Jms.inboundAdapter(connectionFactory)
                        .destination("foo"), e -> e
                    .poller(Pollers
                            .fixedDelay(5000)
                            .maxMessagesPerPoll(2)))
                .handle(Jms.outboundAdapter(connectionFactory)
                        .destination("bar"))
                .get();
    }

    @JmsListener(destination = "bar")
    public void listen(String in) {
        logger.info(in);
    }

}
and
2018-08-10 19:38:52.534 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:38:52.543 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:38:57.566 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:38:57.582 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:02.608 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:02.622 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:07.640 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:07.653 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:12.672 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:12.687 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
As you can see, the consumer gets 2 messages every 5 seconds.
Your debug log implies there are no messages in the queue.
EDIT
I figured it out; the XML parser sets the JmsTemplate receiveTimeout to nowait (-1). Since you are not using a caching connection factory, we'll never get a message because the ActiveMQ client returns immediately if there's not already a message present in the client (see this answer). Since there's no caching going on, we get a new consumer on every poll (and do a no-wait receive each time).
The DSL leaves the JmsTemplate at its default (infinite wait), which is actually wrong, since it blocks the poller thread indefinitely if there are no messages.
To fix the XML version, add receive-timeout="1000".
However, it's better to use a CachingConnectionFactory to avoid creating a new connection/session/consumer on each poll.
Unfortunately, configuring a CachingConnectionFactory turns off Spring Boot's auto-configuration. This is fixed in Boot 2.1.
I have opened an issue to resolve the inconsistency between the DSL and XML here.
If you stick with the DSL, I would recommend setting the receive timeout to something reasonable, rather than indefinite:
@Bean
public IntegrationFlow flow(ConnectionFactory connectionFactory) {
    return IntegrationFlows.from(Jms.inboundAdapter(connectionFactory)
                    .configureJmsTemplate(t -> t.receiveTimeout(1000))
                    .destination("foo"), e -> e
                .poller(Pollers
                        .fixedDelay(5000)
                        .maxMessagesPerPoll(2)))
            .handle(Jms.outboundAdapter(connectionFactory)
                    .destination("bar"))
            .get();
}
But the best solution is to use a CachingConnectionFactory.
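For reference, a minimal sketch of wiring a CachingConnectionFactory around the broker's native factory; the bean name, broker URL, and ActiveMQConnectionFactory target are assumptions, so substitute whatever your app actually uses (and remember that in Boot versions before 2.1 defining your own ConnectionFactory bean switches off the auto-configured one):

```java
@Bean
public CachingConnectionFactory connectionFactory() {
    // Assumed target factory/URL; replace with your broker's native factory.
    ActiveMQConnectionFactory target = new ActiveMQConnectionFactory("tcp://localhost:61616");
    CachingConnectionFactory caching = new CachingConnectionFactory(target);
    caching.setSessionCacheSize(10);  // reuse sessions instead of one per poll
    caching.setCacheConsumers(true);  // keep consumers open between polls
    return caching;
}
```

With the consumer cached, the no-wait receive problem largely disappears because the consumer stays subscribed between polls.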

Slave configuration for listening to messages in spring batch using spring integration

Can I have a job with just the slaves and no master and listen to a rabbitmq queue? I want to listen to a queue and process the messages in chunk oriented manner using spring batch and spring integration in a spring boot app.
I want to use the chunkProcessorChunkHandler configuration explained in the remote chunking example for Spring Batch by Michael Minella (https://www.youtube.com/watch?v=30Tdp1mfR0g), but without a master configuration.
Below is my configuration for the job.
@Configuration
@EnableIntegration
public class QueueIntegrationConfiguration {

    @Autowired
    private CassandraItemWriter cassandraItemWriter;

    @Autowired
    private VendorProcessor vendorProcessor;

    @Autowired
    ConnectionFactory connectionFactory;

    @Bean
    public AmqpInboundChannelAdapter inboundChannelAdapter(
            SimpleMessageListenerContainer listenerContainer) {
        AmqpInboundChannelAdapter adapter = new AmqpInboundChannelAdapter(listenerContainer);
        adapter.setOutputChannel(inboundQueueChannel());
        adapter.setAutoStartup(true);
        return adapter;
    }

    @Bean
    public SimpleMessageListenerContainer listenerContainer(ConnectionFactory connectionFactory,
            MessageConverter jsonMessageConverter) {
        SimpleMessageListenerContainer container = new SimpleMessageListenerContainer(
                connectionFactory);
        container.setQueueNames("ProductStore_Partial");
        container.setAutoStartup(true);
        container.setMessageConverter(jsonMessageConverter);
        return container;
    }

    @Bean
    @ServiceActivator(inputChannel = "ProductStore_Partial")
    public ChunkProcessorChunkHandler chunkProcessorChunkHandler()
            throws Exception {
        SimpleChunkProcessor chunkProcessor = new SimpleChunkProcessor(vendorProcessor,
                cassandraItemWriter);
        chunkProcessor.afterPropertiesSet();
        ChunkProcessorChunkHandler<Vendor> chunkHandler = new ChunkProcessorChunkHandler<>();
        chunkHandler.setChunkProcessor(chunkProcessor);
        chunkHandler.afterPropertiesSet();
        return chunkHandler;
    }

    @Bean
    public QueueChannel inboundQueueChannel() {
        return new QueueChannel();
    }

}
Below is my Application.java class for Spring Boot.
@SpringBootApplication
@EnableBatchProcessing
public class BulkImportProductApplication {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(BulkImportProductApplication.class);
        app.setWebEnvironment(false);
        app.run(args).close();
    }

}
From what I understand of Spring Integration, I have an AmqpInboundChannelAdapter for listening to messages from the queue, a ServiceActivator, an inboundQueueChannel, and an autowired ItemProcessor and ItemWriter. I am not sure what I am missing here.
The batch job starts, consumes one message from the queue, gets a cancelOk, and my job terminates without processing the message.
I am also sharing my debug logging if that would help.
2017-12-04 09:58:49.679 INFO 7450 --- [ main] c.a.s.p.b.BulkImportProductApplication : Started BulkImportProductApplication in 9.412 seconds (JVM running for 10.39)
2017-12-04 09:58:49.679 INFO 7450 --- [ main] s.c.a.AnnotationConfigApplicationContext : Closing org.springframework.context.annotation.AnnotationConfigApplicationContext#31c88ec8: startup date [Mon Dec 04 09:58:40 PST 2017]; root of context hierarchy
2017-12-04 09:58:49.679 DEBUG 7450 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean 'org.springframework.integration.config.IdGeneratorConfigurer#0'
2017-12-04 09:58:49.680 DEBUG 7450 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean 'inboundChannelAdapter'
2017-12-04 09:58:49.680 DEBUG 7450 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean 'listenerContainer'
2017-12-04 09:58:49.680 DEBUG 7450 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean 'integrationHeaderChannelRegistry'
2017-12-04 09:58:49.680 DEBUG 7450 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean 'org.springframework.amqp.rabbit.config.internalRabbitListenerEndpointRegistry'
2017-12-04 09:58:49.680 DEBUG 7450 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean '_org.springframework.integration.errorLogger'
2017-12-04 09:58:49.680 DEBUG 7450 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean 'queueIntegrationConfiguration.chunkProcessorChunkHandler.serviceActivator.handler'
2017-12-04 09:58:49.680 DEBUG 7450 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean 'queueIntegrationConfiguration.chunkProcessorChunkHandler.serviceActivator'
2017-12-04 09:58:49.680 DEBUG 7450 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean 'lifecycleProcessor'
2017-12-04 09:58:49.680 INFO 7450 --- [ main] o.s.c.support.DefaultLifecycleProcessor : Stopping beans in phase 2147483647
2017-12-04 09:58:49.680 DEBUG 7450 --- [ main] o.s.c.support.DefaultLifecycleProcessor : Asking bean 'inboundChannelAdapter' of type [class org.springframework.integration.amqp.inbound.AmqpInboundChannelAdapter] to stop
2017-12-04 09:58:49.680 DEBUG 7450 --- [ main] o.s.a.r.l.SimpleMessageListenerContainer : Shutting down Rabbit listener container
2017-12-04 09:58:49.814 DEBUG 7450 --- [pool-1-thread-5] o.s.a.r.listener.BlockingQueueConsumer : Storing delivery for Consumer#7c52fc81: tags=[{}], channel=Cached Rabbit Channel: AMQChannel(amqp://admin#xxxx:5672/,2), conn: Proxy#26f1249d Shared Rabbit Connection: SimpleConnection#680bddf5 [delegate=amqp://admin#xxxx:5672/, localPort= 65035], acknowledgeMode=AUTO local queue size=0
2017-12-04 09:58:49.814 DEBUG 7450 --- [enerContainer-1] o.s.a.r.listener.BlockingQueueConsumer : Received message: (Body:'[B#358a5358(byte[618])' MessageProperties [headers={__TypeId__=com.art.service.product.bulkimportproduct.data.model.Vendor}, timestamp=null, messageId=null, userId=null, receivedUserId=null, appId=null, clusterId=null, type=null, correlationId=null, correlationIdString=null, replyTo=null, contentType=json, contentEncoding=UTF-8, contentLength=0, deliveryMode=null, receivedDeliveryMode=NON_PERSISTENT, expiration=null, priority=0, redelivered=false, receivedExchange=ProductStore, receivedRoutingKey=, receivedDelay=null, deliveryTag=2, messageCount=0, consumerTag=amq.ctag-nWGbRxjFiaeTEoZylv6Hrg, consumerQueue=null])
2017-12-04 09:58:49.815 DEBUG 7450 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[amqp_receivedDeliveryMode] WILL be mapped, matched pattern=*
2017-12-04 09:58:49.815 DEBUG 7450 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[amqp_contentEncoding] WILL be mapped, matched pattern=*
2017-12-04 09:58:49.815 DEBUG 7450 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[amqp_receivedExchange] WILL be mapped, matched pattern=*
2017-12-04 09:58:49.815 DEBUG 7450 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[amqp_deliveryTag] WILL be mapped, matched pattern=*
2017-12-04 09:58:49.815 DEBUG 7450 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[json__TypeId__] WILL be mapped, matched pattern=*
2017-12-04 09:58:49.815 DEBUG 7450 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[amqp_redelivered] WILL be mapped, matched pattern=*
2017-12-04 09:58:49.815 DEBUG 7450 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[contentType] WILL be mapped, matched pattern=*
2017-12-04 09:58:49.815 DEBUG 7450 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[__TypeId__] WILL be mapped, matched pattern=*
2017-12-04 09:58:49.815 DEBUG 7450 --- [enerContainer-1] o.s.integration.channel.QueueChannel : preSend on channel 'inboundQueueChannel', message: GenericMessage [payload=byte[618], headers={amqp_receivedDeliveryMode=NON_PERSISTENT, amqp_contentEncoding=UTF-8, amqp_receivedExchange=ProductStore, amqp_deliveryTag=2, json__TypeId__=com.art.service.product.bulkimportproduct.data.model.Vendor, amqp_redelivered=false, id=a4868670-240f-ddf2-8a8c-ac4b8d234cdd, amqp_consumerTag=amq.ctag-nWGbRxjFiaeTEoZylv6Hrg, contentType=json, __TypeId__=com.art.service.product.bulkimportproduct.data.model.Vendor, timestamp=1512410329815}]
2017-12-04 09:58:49.815 DEBUG 7450 --- [enerContainer-1] o.s.integration.channel.QueueChannel : postSend (sent=true) on channel 'inboundQueueChannel', message: GenericMessage [payload=byte[618], headers={amqp_receivedDeliveryMode=NON_PERSISTENT, amqp_contentEncoding=UTF-8, amqp_receivedExchange=ProductStore, amqp_deliveryTag=2, json__TypeId__=com.art.service.product.bulkimportproduct.data.model.Vendor, amqp_redelivered=false, id=a4868670-240f-ddf2-8a8c-ac4b8d234cdd, amqp_consumerTag=amq.ctag-nWGbRxjFiaeTEoZylv6Hrg, contentType=json, __TypeId__=com.art.service.product.bulkimportproduct.data.model.Vendor, timestamp=1512410329815}]
2017-12-04 09:58:49.853 INFO 7450 --- [ main] o.s.a.r.l.SimpleMessageListenerContainer : Waiting for workers to finish.
2017-12-04 09:58:49.853 DEBUG 7450 --- [pool-1-thread-6] o.s.a.r.listener.BlockingQueueConsumer : Received cancelOk for tag amq.ctag-nWGbRxjFiaeTEoZylv6Hrg (null); Consumer#7c52fc81: tags=[{}], channel=Cached Rabbit Channel: AMQChannel(amqp://admin#xxxx:5672/,2), conn: Proxy#26f1249d Shared Rabbit Connection: SimpleConnection#680bddf5 [delegate=amqp://admin#xxxx:5672/, localPort= 65035], acknowledgeMode=AUTO local queue size=0
2017-12-04 09:58:49.853 DEBUG 7450 --- [enerContainer-1] o.s.a.r.l.SimpleMessageListenerContainer : Cancelling Consumer#7c52fc81: tags=[{}], channel=Cached Rabbit Channel: AMQChannel(amqp://admin#xxxx:5672/,2), conn: Proxy#26f1249d Shared Rabbit Connection: SimpleConnection#680bddf5 [delegate=amqp://admin#xxxx:5672/, localPort= 65035], acknowledgeMode=AUTO local queue size=0
2017-12-04 09:58:49.853 DEBUG 7450 --- [enerContainer-1] o.s.a.r.listener.BlockingQueueConsumer : Closing Rabbit Channel: Cached Rabbit Channel: AMQChannel(amqp://admin#xxxx:5672/,2), conn: Proxy#26f1249d Shared Rabbit Connection: SimpleConnection#680bddf5 [delegate=amqp://admin#xxxx:5672/, localPort= 65035]
2017-12-04 09:58:49.853 DEBUG 7450 --- [enerContainer-1] o.s.a.r.c.CachingConnectionFactory : Closing cached Channel: AMQChannel(amqp://admin#xxxx:5672/,2)
2017-12-04 09:58:50.027 INFO 7450 --- [ main] o.s.a.r.l.SimpleMessageListenerContainer : Successfully waited for workers to finish.
2017-12-04 09:58:50.027 DEBUG 7450 --- [ main] o.s.c.support.DefaultLifecycleProcessor : Bean 'inboundChannelAdapter' completed its stop procedure
What am I missing here? Why is my message not getting processed? Please correct me if I'm overlooking something. Also, feel free to ask for any other configuration that you feel would help analyze the situation.
EDIT: After removing the code that closes the application context manually (app.run(args).close()), I was able to receive the messages, but it looks like they are lost after a successful retrieval. Sharing the debug log for this behavior:
2017-12-04 14:39:11.297 DEBUG 1498 --- [pool-1-thread-5] o.s.a.r.listener.BlockingQueueConsumer : Storing delivery for Consumer#7219ac49: tags=[{amq.ctag-Z8siptJMdxGU6sXdOHkVCA=ProductStore_Partial}], channel=Cached Rabbit Channel: AMQChannel(amqp://admin#xxxx:5672/,2), conn: Proxy#6df20ade Shared Rabbit Connection: SimpleConnection#7ba63fe5 [delegate=amqp://admin#xxxx:5672/, localPort= 51172], acknowledgeMode=AUTO local queue size=0
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] o.s.a.r.listener.BlockingQueueConsumer : Received message: (Body:'[B#347c8f87(byte[624])' MessageProperties [headers={__TypeId__=com.art.service.product.bulkimportproduct.data.model.Vendor}, timestamp=null, messageId=null, userId=null, receivedUserId=null, appId=null, clusterId=null, type=null, correlationId=null, correlationIdString=null, replyTo=null, contentType=json, contentEncoding=UTF-8, contentLength=0, deliveryMode=null, receivedDeliveryMode=NON_PERSISTENT, expiration=null, priority=0, redelivered=false, receivedExchange=ProductStore, receivedRoutingKey=, receivedDelay=null, deliveryTag=2, messageCount=0, consumerTag=amq.ctag-Z8siptJMdxGU6sXdOHkVCA, consumerQueue=ProductStore_Partial])
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[amqp_receivedDeliveryMode] WILL be mapped, matched pattern=*
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[amqp_contentEncoding] WILL be mapped, matched pattern=*
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[amqp_receivedExchange] WILL be mapped, matched pattern=*
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[amqp_deliveryTag] WILL be mapped, matched pattern=*
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[json__TypeId__] WILL be mapped, matched pattern=*
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[amqp_redelivered] WILL be mapped, matched pattern=*
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[contentType] WILL be mapped, matched pattern=*
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] s.i.m.AbstractHeaderMapper$HeaderMatcher : headerName=[__TypeId__] WILL be mapped, matched pattern=*
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] o.s.integration.channel.QueueChannel : preSend on channel 'inboundQueueChannel', message: GenericMessage [payload=byte[624], headers={amqp_receivedDeliveryMode=NON_PERSISTENT, amqp_contentEncoding=UTF-8, amqp_receivedExchange=ProductStore, amqp_deliveryTag=2, json__TypeId__=com.art.service.product.bulkimportproduct.data.model.Vendor, amqp_consumerQueue=ProductStore_Partial, amqp_redelivered=false, id=540399a5-62a6-7178-2524-e274bad4ed13, amqp_consumerTag=amq.ctag-Z8siptJMdxGU6sXdOHkVCA, contentType=json, __TypeId__=com.art.service.product.bulkimportproduct.data.model.Vendor, timestamp=1512427151297}]
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] o.s.integration.channel.QueueChannel : postSend (sent=true) on channel 'inboundQueueChannel', message: GenericMessage [payload=byte[624], headers={amqp_receivedDeliveryMode=NON_PERSISTENT, amqp_contentEncoding=UTF-8, amqp_receivedExchange=ProductStore, amqp_deliveryTag=2, json__TypeId__=com.art.service.product.bulkimportproduct.data.model.Vendor, amqp_consumerQueue=ProductStore_Partial, amqp_redelivered=false, id=540399a5-62a6-7178-2524-e274bad4ed13, amqp_consumerTag=amq.ctag-Z8siptJMdxGU6sXdOHkVCA, contentType=json, __TypeId__=com.art.service.product.bulkimportproduct.data.model.Vendor, timestamp=1512427151297}]
2017-12-04 14:39:11.297 DEBUG 1498 --- [enerContainer-1] o.s.a.r.listener.BlockingQueueConsumer : Retrieving delivery for Consumer#7219ac49: tags=[{amq.ctag-Z8siptJMdxGU6sXdOHkVCA=ProductStore_Partial}], channel=Cached Rabbit Channel: AMQChannel(amqp://admin#xxxx:5672/,2), conn: Proxy#6df20ade Shared Rabbit Connection: SimpleConnection#7ba63fe5 [delegate=amqp://admin#xxxx:5672/, localPort= 51172], acknowledgeMode=AUTO local queue size=0
This goes on repeating, and new messages are consumed, but the messages are not getting processed and written to the data store by the itemWriter provided. Come to think of it, since I have not provided a tasklet/step bean reference anywhere in this code, is that something I am missing here?
app.run(args).close();
You are explicitly closing the application context, which shuts everything down.
Closing org.springframework.context.annotation.AnnotationConfigApplicationContext.
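A minimal sketch of the fix, using the same main class as in the question: let the context returned by run() stay open instead of closing it immediately, so the listener container keeps consuming.

```java
@SpringBootApplication
@EnableBatchProcessing
public class BulkImportProductApplication {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(BulkImportProductApplication.class);
        app.setWebEnvironment(false);
        // Do NOT call close() on the returned context; closing it stops the
        // listener container before the queued messages can be processed.
        app.run(args);
    }

}
```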

How to submit a spark Job Programmatically

When I submit the application with spark-submit, it works.
But when I try to submit it programmatically using the command below
mvn exec:java -Dexec.mainClass="org.cybergen.SubmitJobExample" -Dexec.args="/opt/spark/current/README.md Please"
I get the following error:
Application Log
15/05/12 17:19:46 INFO AppClient$ClientActor: Connecting to master spark://cyborg:7077...
15/05/12 17:19:46 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkMaster#cyborg:7077] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
15/05/12 17:20:06 INFO AppClient$ClientActor: Connecting to master spark://cyborg:7077...
15/05/12 17:20:06 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkMaster#cyborg:7077] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
Spark Master Log
15/05/12 17:33:22 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster#cyborg:7077] <- [akka.tcp://sparkDriver#10.18.26.116:49592]: Error [org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = 2596819202403185464] [
java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = 2596819202403185464
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
at scala.util.Try$.apply(Try.scala:161)
at akka.serialization.Serialization.deserialize(Serialization.scala:98)
at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:63)
at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
at scala.util.Try$.apply(Try.scala:161)
at akka.serialization.Serialization.deserialize(Serialization.scala:98)
at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)
at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)
at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)
at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
]
15/05/12 17:33:22 INFO Master: akka.tcp://sparkDriver#10.18.26.116:49592 got disassociated, removing it.
15/05/12 17:33:22 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkDriver#10.18.26.116:49592] has failed, address is now gated for [5000] ms. Reason is: [org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = 2596819202403185464].
15/05/12 17:33:22 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/endpointManager/reliableEndpointWriter-akka.tcp%3A%2F%2FsparkDriver%4010.18.26.116%3A49592-6/endpointWriter/endpointReader-akka.tcp%3A%2F%2FsparkDriver%4010.18.26.116%3A49592-0#1749840468] was not delivered. [10] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
15/05/12 17:33:22 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40127.0.0.1%3A50366-7#-1224275483] was not delivered. [11] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
15/05/12 17:33:42 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster#cyborg:7077] <- [akka.tcp://sparkDriver#10.18.26.116:49592]: Error [org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = 2596819202403185464] [
java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = 2596819202403185464
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
SparkTestJob is the Spark job class:
class SparkTestJob(val filePath: String = "", val filter: String = "") extends Serializable {

  def runWordCount(): Long = {
    val conf = new SparkConf()
      .setAppName("word count for the word " + filter)
      .setMaster("spark://cyborg:7077")
      .setJars(Seq("/tmp/spark-example-1.0-SNAPSHOT-driver.jar"))
      .setSparkHome("/opt/spark/current")
    val sc = new SparkContext(conf)
    val file = sc.textFile(filePath)
    file.filter(line => line.contains(filter)).count()
  }
}
SubmitJobExample is the object that instantiates the SparkTestJob class:
object SubmitJobExample {

  def main(args: Array[String]): Unit = {
    if (args.length == 2) {
      val fileName = args(0)
      val filterByWord = args(1)
      println("Reading file " + fileName + " for word " + filterByWord)
      val jobObject = new SparkTestJob(fileName, filterByWord)
      println("word count for the file " + fileName + " is " + jobObject.runWordCount())
    } else {
      val jobObject = new SparkTestJob("/opt/spark/current/README.md", "Please")
      println("word count for the file /opt/spark/current/README.md is " + jobObject.runWordCount())
    }
  }
}
Your code looks fine to me. To debug serialization problems, run with -Dsun.io.serialization.extendedDebugInfo=true. This prints extra output upon a NotSerializableException, and you will see what it's trying to serialize.
The actual problem was a Spark version mismatch in one of the dependencies; changing all the dependencies to the same Spark version fixed the problem.
The reason it worked with spark-submit is Java's JAR class-path precedence, which picked up the correct Spark jar version.
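The InvalidClassException above comes from how Java computes a default serialVersionUID: when a class does not declare one, the JVM derives it from the class's shape, so the same class compiled in two different Spark versions rarely produces the same UID. A small self-contained sketch (class names SerialUidDemo and Payload are illustrative, not from the question) shows how to inspect the computed UID:

```java
import java.io.ObjectStreamClass;
import java.io.Serializable;

public class SerialUidDemo {

    // No explicit serialVersionUID: the JVM computes one from fields,
    // methods, and interfaces, so any change to the class changes the UID.
    static class Payload implements Serializable {
        int count;
    }

    public static void main(String[] args) {
        long uid = ObjectStreamClass.lookup(Payload.class).getSerialVersionUID();
        System.out.println("computed serialVersionUID = " + uid);
    }
}
```

Comparing this value for the same class across two jar versions is a quick way to confirm a wire-incompatibility like the one in the master log.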

Unable to rename temporary file in SFTP remote directory

In continuation to the post - http://forum.spring.io/forum/spring-projects/integration/119697-unable-to-rename-file-in-sftp-remote-directory-please-help
I am using the sftp:outbound-channel-adapter to upload a file to one of the SFTP servers. Once the file is uploaded, the API is unable to rename the temporary file sample_test.pgp.writing to sample_test.pgp.
Before uploading the file, I verified that the SFTP remote folder was completely empty and the file did not already exist.
When I looked at the debug-level log, I could see the messages below; it fails at the end with an invalid-path error message.
[main] DEBUG: com.ftp.util.FileUploadUtil - Upload for file /sample_test.pgp triggered
[main] DEBUG: org.springframework.integration.channel.DirectChannel - preSend on channel 'ftp.uploadgateway.request.channel', message: [Payload=/sample_test.pgp][Headers={timestamp=1406654118428, id=bbba360d-492d-4348-b2e7-566aec7f4209}]
[main] DEBUG: org.springframework.integration.filter.MessageFilter - org.springframework.integration.filter.MessageFilter#3970ae0 received message: [Payload=/sample_test.pgp][Headers={timestamp=1406654118428, id=bbba360d-492d-4348-b2e7-566aec7f4209}]
[main] DEBUG: org.springframework.integration.channel.DirectChannel - preSend on channel 'upload.file.to.sftp', message: [Payload=/sample_test.pgp][Headers={timestamp=1406654118428, id=bbba360d-492d-4348-b2e7-566aec7f4209}]
[main] DEBUG: org.springframework.integration.channel.DirectChannel - preSend on channel 'logger', message: [Payload=/sample_test.pgp][Headers={timestamp=1406654118428, id=bbba360d-492d-4348-b2e7-566aec7f4209}]
[main] DEBUG: org.springframework.integration.handler.LoggingHandler - org.springframework.integration.handler.LoggingHandler#0 received message: [Payload=/sample_test.pgp][Headers={timestamp=1406654118428, id=bbba360d-492d-4348-b2e7-566aec7f4209}]
[main] INFO : org.springframework.integration.handler.LoggingHandler - [Payload=/sample_test.pgp][Headers={timestamp=1406654118428, id=bbba360d-492d-4348-b2e7-566aec7f4209}]
[main] DEBUG: org.springframework.integration.channel.DirectChannel - postSend (sent=true) on channel 'logger', message: [Payload=/sample_test.pgp][Headers={timestamp=1406654118428, id=bbba360d-492d-4348-b2e7-566aec7f4209}]
[main] DEBUG: org.springframework.integration.file.remote.handler.FileTransferringMessageHandler - org.springframework.integration.file.remote.handler.FileTransferringMessageHandler#0 received message: [Payload=/sample_test.pgp][Headers={timestamp=1406654118428, id=bbba360d-492d-4348-b2e7-566aec7f4209}]
[main] INFO : com.jcraft.jsch - Connecting to remote.sever.com port 10022
[main] INFO : com.jcraft.jsch - Connection established
[main] INFO : com.jcraft.jsch - Remote version string: SSH-2.0-SSHD
[main] INFO : com.jcraft.jsch - Local version string: SSH-2.0-JSCH-0.1.49
[main] INFO : com.jcraft.jsch - CheckCiphers: aes256-ctr,aes192-ctr,aes128-ctr,aes256-cbc,aes192-cbc,aes128-cbc,3des-ctr,arcfour,arcfour128,arcfour256
[main] INFO : com.jcraft.jsch - CheckKexes: diffie-hellman-group14-sha1
[main] INFO : com.jcraft.jsch - diffie-hellman-group14-sha1 is not available.
[main] INFO : com.jcraft.jsch - SSH_MSG_KEXINIT sent
[main] INFO : com.jcraft.jsch - SSH_MSG_KEXINIT received
[main] INFO : com.jcraft.jsch - kex: server: diffie-hellman-group14-sha1,diffie-hellman-group1-sha1,diffie-hellman-group-exchange-sha1
[main] INFO : com.jcraft.jsch - kex: server: ssh-rsa
[main] INFO : com.jcraft.jsch - kex: server: aes128-cbc,aes192-cbc,aes256-cbc,3des-cbc,blowfish-cbc
[main] INFO : com.jcraft.jsch - kex: server: aes128-cbc,blowfish-cbc,aes192-cbc,aes256-cbc,arcfour,arcfour128,arcfour256
[main] INFO : com.jcraft.jsch - kex: server: hmac-sha1,hmac-md5,hmac-sha1-96,hmac-md5-96,hmac-sha256,hmac-sha256#ssh.com
[main] INFO : com.jcraft.jsch - kex: server: hmac-sha1,hmac-md5,hmac-sha1-96,hmac-md5-96,hmac-sha256,hmac-sha256#ssh.com
[main] INFO : com.jcraft.jsch - kex: server: none,zlib
[main] INFO : com.jcraft.jsch - kex: server: none,zlib
[main] INFO : com.jcraft.jsch - kex: server:
[main] INFO : com.jcraft.jsch - kex: server:
[main] INFO : com.jcraft.jsch - kex: client: diffie-hellman-group1-sha1,diffie-hellman-group-exchange-sha1
[main] INFO : com.jcraft.jsch - kex: client: ssh-rsa,ssh-dss
[main] INFO : com.jcraft.jsch - kex: client: aes128-ctr,aes128-cbc,3des-ctr,3des-cbc,blowfish-cbc,aes192-cbc,aes256-cbc
[main] INFO : com.jcraft.jsch - kex: client: aes128-ctr,aes128-cbc,3des-ctr,3des-cbc,blowfish-cbc,aes192-cbc,aes256-cbc
[main] INFO : com.jcraft.jsch - kex: client: hmac-md5,hmac-sha1,hmac-sha2-256,hmac-sha1-96,hmac-md5-96
[main] INFO : com.jcraft.jsch - kex: client: hmac-md5,hmac-sha1,hmac-sha2-256,hmac-sha1-96,hmac-md5-96
[main] INFO : com.jcraft.jsch - kex: client: none
[main] INFO : com.jcraft.jsch - kex: client: none
[main] INFO : com.jcraft.jsch - kex: client:
[main] INFO : com.jcraft.jsch - kex: client:
[main] INFO : com.jcraft.jsch - kex: server->client aes128-cbc hmac-md5 none
[main] INFO : com.jcraft.jsch - kex: client->server aes128-cbc hmac-md5 none
[main] INFO : com.jcraft.jsch - SSH_MSG_KEXDH_INIT sent
[main] INFO : com.jcraft.jsch - expecting SSH_MSG_KEXDH_REPLY
[main] INFO : com.jcraft.jsch - ssh_rsa_verify: signature true
[main] WARN : com.jcraft.jsch - Permanently added 'remote.sever.com' (RSA) to the list of known hosts.
[main] INFO : com.jcraft.jsch - SSH_MSG_NEWKEYS sent
[main] INFO : com.jcraft.jsch - SSH_MSG_NEWKEYS received
[main] INFO : com.jcraft.jsch - SSH_MSG_SERVICE_REQUEST sent
[main] INFO : com.jcraft.jsch - SSH_MSG_SERVICE_ACCEPT received
[main] INFO : com.jcraft.jsch - Authentications that can continue: publickey,keyboard-interactive,password
[main] INFO : com.jcraft.jsch - Next authentication method: publickey
[main] INFO : com.jcraft.jsch - Authentications that can continue: keyboard-interactive,password
[main] INFO : com.jcraft.jsch - Next authentication method: keyboard-interactive
[main] INFO : com.jcraft.jsch - Authentication succeeded (keyboard-interactive).
[main] DEBUG: org.springframework.integration.util.SimplePool - Obtained new org.springframework.integration.sftp.session.SftpSession#6e75d758.
[main] DEBUG: org.springframework.integration.sftp.session.SftpSession - Initial File rename failed, possibly because file already exists. Will attempt to delete file: /inbox/sample_test.pgp and execute rename again.
[main] DEBUG: org.springframework.integration.file.remote.session.CachingSessionFactory - Releasing Session back to the pool.
[main] DEBUG: org.springframework.integration.util.SimplePool - Releasing org.springframework.integration.sftp.session.SftpSession#6e75d758 back to the pool
[main] DEBUG: com.ftp.service.CtrlMPOJO - ERROR UPLOADING FILES EXCEPTION IS
org.springframework.integration.MessageDeliveryException: Error handling message for file [/sample_test.pgp]
at org.springframework.integration.file.remote.handler.FileTransferringMessageHandler.handleMessageInternal(FileTransferringMessageHandler.java:183)
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:73)
at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:115)
at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:102)
at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:77)
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:157)
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:128)
at org.springframework.integration.core.MessagingTemplate.doSend(MessagingTemplate.java:288)
at org.springframework.integration.core.MessagingTemplate.send(MessagingTemplate.java:149)
at org.springframework.integration.filter.MessageFilter.handleRequestMessage(MessageFilter.java:107)
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:134)
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:73)
at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:115)
at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:102)
at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:77)
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:157)
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:128)
at org.springframework.integration.core.MessagingTemplate.doSend(MessagingTemplate.java:288)
at org.springframework.integration.core.MessagingTemplate.send(MessagingTemplate.java:149)
at org.springframework.integration.core.MessagingTemplate.convertAndSend(MessagingTemplate.java:189)
at org.springframework.integration.gateway.MessagingGatewaySupport.send(MessagingGatewaySupport.java:183)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.invokeGatewayMethod(GatewayProxyFactoryBean.java:309)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.doInvoke(GatewayProxyFactoryBean.java:269)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.invoke(GatewayProxyFactoryBean.java:260)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202)
at $Proxy5.uploadFilesToFTP(Unknown Source)
at com.ftp.util.FileUploadUtil.scanDirectoryAndUpload(FileUploadUtil.java:123)
at com.ftp.service.CtrlMPOJO.main(CtrlMPOJO.java:160)
Caused by: org.springframework.integration.MessagingException: Failed to write to '/inbox/sample_test.pgp.writing' while uploading the file
at org.springframework.integration.file.remote.handler.FileTransferringMessageHandler.sendFileToRemoteDirectory(FileTransferringMessageHandler.java:266)
at org.springframework.integration.file.remote.handler.FileTransferringMessageHandler.handleMessageInternal(FileTransferringMessageHandler.java:172)
... 28 more
Caused by: org.springframework.core.NestedIOException: Failed to delete file /inbox/sample_test.pgp; nested exception is org.springframework.core.NestedIOException: Failed to remove file: 2: Specified file path is invalid.
at org.springframework.integration.sftp.session.SftpSession.rename(SftpSession.java:157)
at org.springframework.integration.file.remote.session.CachingSessionFactory$CachedSession.rename(CachingSessionFactory.java:137)
at org.springframework.integration.file.remote.handler.FileTransferringMessageHandler.sendFileToRemoteDirectory(FileTransferringMessageHandler.java:262)
... 29 more
Caused by: org.springframework.core.NestedIOException: Failed to remove file: 2: Specified file path is invalid.
at org.springframework.integration.sftp.session.SftpSession.remove(SftpSession.java:71)
at org.springframework.integration.sftp.session.SftpSession.rename(SftpSession.java:151)
... 31 more
It works if I set use-temporary-file-name="false", but I do not want to set that flag in case a file watcher job on the remote server picks up an incomplete file mid-upload.
Here is the configuration I have:
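If the only concern is a watcher grabbing the in-progress file, a possible middle ground (a sketch — verify the attribute name against your Spring Integration version) is to keep the temporary-file behaviour but set an explicit temporary-file-suffix that the watcher job is configured to exclude:

```xml
<!-- sketch: same adapter as below, with an explicit temporary suffix;
     the file is written as *.uploading and only renamed once complete -->
<sftp:outbound-channel-adapter id="sftpOutboundAdapter"
    session-factory="sftpSessionFactory"
    channel="upload.file.to.sftp"
    charset="UTF-8"
    remote-directory="${ftp.outbound.remote.directory}"
    use-temporary-file-name="true"
    temporary-file-suffix=".uploading"
    remote-filename-generator-expression="${ftp.outbound.remote.filename.expression}"/>
```

Note this does not address the rename/delete failure itself — it only changes which suffix a watcher would need to ignore.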
<int:gateway id="file.upload.gateway"
service-interface="ftp.outbound.FTPUploadGateway"
default-request-channel="ftp.uploadgateway.request.channel"
default-reply-channel="ftp.uploadgateway.response.channel" />
<int:filter
input-channel="ftp.uploadgateway.request.channel"
output-channel="ftp.file.exist.outbound.channel"
discard-channel="upload.file.to.sftp"
expression="${ftp.outbound.remote.file.check.flag:false}">
</int:filter>
<sftp:outbound-channel-adapter id="sftpOutboundAdapter"
session-factory="sftpSessionFactory"
channel="upload.file.to.sftp"
charset="UTF-8"
remote-directory="${ftp.outbound.remote.directory}"
use-temporary-file-name="${ftp.outbound.use.temporary.filename:true}"
remote-filename-generator-expression="${ftp.outbound.remote.filename.expression}"/>
Here are the property values:
ftp.outbound.remote.file.check.flag=false
ftp.outbound.remote.directory=/inbox/
ftp.outbound.use.temporary.filename=true
ftp.outbound.remote.filename.expression=payload.getName()
Please show the complete configuration.
Is your sftp user chrooted? If not, /inbox/... refers to a directory directly under the server's root, and the user likely doesn't have permission to delete or rename files there.
Try removing the leading / from the remote directory.
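Following that suggestion, the change would be confined to the property file (assuming the property names shown above), so the path is resolved relative to the sftp user's home directory instead of the server root:

```properties
# relative to the sftp user's home directory, not the server root
ftp.outbound.remote.directory=inbox/
```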