failed to create SFTP Session - spring-integration

I'm using the code below to connect to FTP server(s) for transferring files. The server details are entered into a Postgres DB through an HTML interface, and the Spring Integration code should read the config from the DB, rotate through the list, and pull the files from a specific folder.
It tries to connect and reports that the connection is established, but nothing is transferred; after a couple of minutes it disconnects and then tries again.
I have two branches in the DB to rotate through, but it only ever tries the first one.
The error on disconnect is "java.lang.IllegalStateException: failed to create SFTP Session".
Here is the full log:
2018-10-23 11:53:51.694 INFO 1940 --- [ask-scheduler-1] com.jcraft.jsch : Connecting to cai-notes-fs.cai.ei port 21
2018-10-23 11:53:51.834 INFO 1940 --- [ask-scheduler-1] com.jcraft.jsch : Connection established
2018-10-23 11:56:01.227 INFO 1940 --- [ask-scheduler-1] com.jcraft.jsch : Disconnecting from cai-notes-fs.cai.ei port 21
2018-10-23 11:56:01.242 ERROR 1940 --- [ask-scheduler-1] o.s.integration.handler.LoggingHandler : org.springframework.messaging.MessagingException: Problem occurred while synchronizing remote to local directory; nested exception is org.springframework.messaging.MessagingException: Failed to execute on session; nested exception is java.lang.IllegalStateException: failed to create SFTP Session
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.synchronizeToLocalDirectory(AbstractInboundFileSynchronizer.java:331)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizingMessageSource.doReceive(AbstractInboundFileSynchronizingMessageSource.java:260)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizingMessageSource.doReceive(AbstractInboundFileSynchronizingMessageSource.java:65)
at org.springframework.integration.endpoint.AbstractFetchLimitingMessageSource.doReceive(AbstractFetchLimitingMessageSource.java:43)
at org.springframework.integration.endpoint.AbstractMessageSource.receive(AbstractMessageSource.java:154)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:343)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:197)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
at org.springframework.integration.aop.AbstractMessageSourceAdvice.invoke(AbstractMessageSourceAdvice.java:43)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
at com.sun.proxy.$Proxy121.receive(Unknown Source)
at org.springframework.integration.endpoint.SourcePollingChannelAdapter.receiveMessage(SourcePollingChannelAdapter.java:236)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.doPoll(AbstractPollingEndpoint.java:249)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:343)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:197)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:294)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:98)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
at com.sun.proxy.$Proxy120.call(Unknown Source)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$Poller.lambda$run$0(AbstractPollingEndpoint.java:378)
at org.springframework.integration.util.ErrorHandlingTaskExecutor.lambda$execute$0(ErrorHandlingTaskExecutor.java:53)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.integration.util.ErrorHandlingTaskExecutor.execute(ErrorHandlingTaskExecutor.java:51)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$Poller.run(AbstractPollingEndpoint.java:372)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:93)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.springframework.messaging.MessagingException: Failed to execute on session; nested exception is java.lang.IllegalStateException: failed to create SFTP Session
at org.springframework.integration.file.remote.RemoteFileTemplate.execute(RemoteFileTemplate.java:445)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.synchronizeToLocalDirectory(AbstractInboundFileSynchronizer.java:286)
... 43 more
Caused by: java.lang.IllegalStateException: failed to create SFTP Session
at org.springframework.integration.sftp.session.DefaultSftpSessionFactory.getSession(DefaultSftpSessionFactory.java:393)
at org.springframework.integration.sftp.session.DefaultSftpSessionFactory.getSession(DefaultSftpSessionFactory.java:57)
at org.springframework.integration.file.remote.session.DelegatingSessionFactory.getSession(DelegatingSessionFactory.java:111)
at org.springframework.integration.file.remote.session.DelegatingSessionFactory.getSession(DelegatingSessionFactory.java:105)
at org.springframework.integration.file.remote.RemoteFileTemplate.execute(RemoteFileTemplate.java:431)
... 44 more
Caused by: java.lang.IllegalStateException: failed to connect
at org.springframework.integration.sftp.session.SftpSession.connect(SftpSession.java:272)
at org.springframework.integration.sftp.session.DefaultSftpSessionFactory.getSession(DefaultSftpSessionFactory.java:388)
... 48 more
Caused by: com.jcraft.jsch.JSchException: Session.connect: java.net.SocketException: Connection reset
at com.jcraft.jsch.Session.connect(Session.java:565)
at com.jcraft.jsch.Session.connect(Session.java:183)
at org.springframework.integration.sftp.session.SftpSession.connect(SftpSession.java:263)
... 49 more
2018-10-23 11:56:11.248 INFO 1940 --- [ask-scheduler-1] com.jcraft.jsch : Connecting to cai-notes-fs.cai.ei port 21
2018-10-23 11:56:11.414 INFO 1940 --- [ask-scheduler-1] com.jcraft.jsch : Connection established
Here is the full code:
package fefo.springframeworkftp.spring4ftpapp.configuration;
import com.jcraft.jsch.ChannelSftp;
import fefo.springframeworkftp.spring4ftpapp.model.Branch;
import fefo.springframeworkftp.spring4ftpapp.repository.BranchRepository;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.channel.NullChannel;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.dsl.SourcePollingChannelAdapterSpec;
import org.springframework.integration.expression.FunctionExpression;
import org.springframework.integration.file.remote.*;
import org.springframework.integration.file.remote.aop.RotatingServerAdvice;
import org.springframework.integration.file.remote.session.DelegatingSessionFactory;
import org.springframework.integration.file.remote.session.SessionFactory;
import org.springframework.integration.scheduling.PollerMetadata;
import org.springframework.integration.sftp.dsl.Sftp;
import org.springframework.integration.sftp.dsl.SftpInboundChannelAdapterSpec;
import org.springframework.integration.sftp.session.DefaultSftpSessionFactory;
import org.springframework.messaging.MessageChannel;
import org.springframework.stereotype.Component;
import java.io.File;
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;
@Configuration
@Component
public class SFTIntegration {
public static final String TIMEZONE_UTC = "UTC";
public static final String TIMESTAMP_FORMAT_OF_FILES = "yyyyMMddHHmmssSSS";
public static final String TEMPORARY_FILE_SUFFIX = ".part";
public static final int POLLER_FIXED_PERIOD_DELAY = 5000;
public static final int MAX_MESSAGES_PER_POLL = 100;
private static final Logger LOG = LoggerFactory.getLogger(SFTIntegration.class);
private static final String CHANNEL_INTERMEDIATE_STAGE = "intermediateChannel";
/* pulling the server config from postgres DB*/
private final BranchRepository branchRepository;
@Value("${app.temp-dir}")
private String localTempPath;
public SFTIntegration(BranchRepository branchRepository) {
this.branchRepository = branchRepository;
}
/**
* The default poller with 5s, 100 messages, RotatingServerAdvice and transaction.
*
* @return default poller.
*/
@Bean(name = PollerMetadata.DEFAULT_POLLER)
public PollerMetadata poller(){
return Pollers
.fixedDelay(POLLER_FIXED_PERIOD_DELAY)
.advice(advice())
.maxMessagesPerPoll(MAX_MESSAGES_PER_POLL)
.transactional()
.get();
}
/**
* The direct channel for the flow.
*
* @return MessageChannel
*/
@Bean
public MessageChannel stockIntermediateChannel() {
return new DirectChannel();
}
/**
* Get the files from a remote directory. Add a timestamp to the filename
* and write them to a local temporary folder.
*
* @return IntegrationFlow
*/
@Bean
public IntegrationFlow fileInboundFlowFromSFTPServer(){
final SftpInboundChannelAdapterSpec sourceSpec = Sftp.inboundAdapter(delegatingSFtpSessionFactory())
.preserveTimestamp(true)
.patternFilter("*.csv")
.deleteRemoteFiles(true)
.maxFetchSize(MAX_MESSAGES_PER_POLL)
.remoteDirectory("/")
.localDirectory(new File(localTempPath))
.temporaryFileSuffix(TEMPORARY_FILE_SUFFIX)
.localFilenameExpression(new FunctionExpression<String>(s -> {
final int fileTypeSepPos = s.lastIndexOf('.');
return DateTimeFormatter
.ofPattern(TIMESTAMP_FORMAT_OF_FILES)
.withZone(ZoneId.of(TIMEZONE_UTC))
.format(Instant.now())
+ "_"
+ s.substring(0, fileTypeSepPos)
+ s.substring(fileTypeSepPos);
}));
// Poller definition
final Consumer<SourcePollingChannelAdapterSpec> stockInboundPoller = endpointConfigurer -> endpointConfigurer
.id("stockInboundPoller")
.autoStartup(true)
.poller(poller());
return IntegrationFlows
.from(sourceSpec, stockInboundPoller)
.transform(File.class, p -> {
// log step
LOG.info("flow=stockInboundFlowFromAFT, message=incoming file: " + p);
return p;
})
.channel(CHANNEL_INTERMEDIATE_STAGE)
.get();
}
@Bean
public IntegrationFlow stockIntermediateStageChannel() {
return IntegrationFlows
.from(CHANNEL_INTERMEDIATE_STAGE)
.transform(p -> {
//log step
LOG.info("flow=stockIntermediateStageChannel, message=rename file: " + p);
return p;
})
//TODO
.channel(new NullChannel())
.get();
}
public DefaultSftpSessionFactory createNewSftpSessionFactory(final Branch branch){
final DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(false);
factory.setHost(branch.getHost());
factory.setUser(branch.getUsern());
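// note: per the log above, JSch is connecting to port 21 (the FTP port); SFTP servers normally listen on 22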
factory.setPort(branch.getFtpPort());
factory.setPassword(branch.getPassword());
factory.setAllowUnknownKeys(true);
return factory;
}
@Bean
public DelegatingSessionFactory<ChannelSftp.LsEntry> delegatingSFtpSessionFactory(){
final List<Branch> branchConnections = new ArrayList<>();
branchRepository.findAll().forEach(branchConnections::add);
if(branchConnections.isEmpty()){
return null;
}
final Map<Object, SessionFactory<ChannelSftp.LsEntry>> factories = new LinkedHashMap<>(10);
for (Branch br : branchConnections) {
// create a factory for every key containing server type, url and port
if (factories.get(br.getId()) == null) {
factories.put(br.getId(), createNewSftpSessionFactory(br));
}
}
// use the first SF as the default
return new DelegatingSessionFactory<>(factories, factories.values().iterator().next());
}
@Bean
public RotatingServerAdvice advice(){
final List<Branch> branchConnections = new ArrayList<>();
branchRepository.findAll().forEach(branchConnections::add);
LOG.info("Found " + branchConnections.size() + " server entries for FEFO Stock.");
final List<RotatingServerAdvice.KeyDirectory> keyDirectories = new ArrayList<>();
for (final Branch br : branchConnections) {
keyDirectories
.add(new RotatingServerAdvice.KeyDirectory(br.getId().toString(), br.getFolderPath()));
}
/*final RotatingServerAdvice rot = new RotatingServerAdvice(
new MyStandardRotationPolicy(delegatingSFtpSessionFactory(), keyDirectories, true,
getFilter(), partnerConfigRepo));
return rot;*/
return new RotatingServerAdvice(delegatingSFtpSessionFactory(), keyDirectories, true);
}
}

Related

How to put files with SftpOutboundGateway MPUT command properly?

I want to upload all files from the local folder ~/sftp-outbound/Export to an SFTP server. The folder contains two files:
foo1.TEST.txt
foo2.TEST.txt
I am doing it with the MPUT command of an SftpOutboundGateway in a flow/subflow DSL style (there are actually more gateway methods, which I have removed for focus and readability).
My configuration is this:
@Bean
public TransferChannel myChannel() {
LOG.debug("myChannel");
TransferChannel channel = new TransferChannel();
channel.setHost(myEnv.getSftpHost());
channel.setPort(myEnv.getSftpPort());
channel.setUser(myEnv.getSftpUser());
channel.setPrivateKey(myEnv.getSftpPrivateKey());
channel.setPassword(myEnv.getSftpPassword());
return channel;
}
@Bean
public TransferContext myContext(TransferChannel myChannel) {
LOG.debug("myContext");
TransferContext context = new TransferContext();
context.setEnabled(env.isEnabled());
context.setChannel(myChannel);
context.setPreserveTimestamp(true);
context.setLocalDir(env.getLocalDir());
context.setLocalFilename(env.getLocalFilename());
context.setRemoteDir(env.getRemoteDir());
return context;
}
@Bean
public SessionFactory<LsEntry> myFactory(TransferChannel myChannel) {
LOG.debug("myFactory");
DefaultSftpSessionFactory sf = new DefaultSftpSessionFactory();
sf.setHost(myChannel.getHost());
sf.setPort(myChannel.getPort());
sf.setUser(myChannel.getUser());
if (myChannel.getPrivateKey() != null) {
sf.setPrivateKey(myChannel.getPrivateKey());
} else {
sf.setPassword(myChannel.getPassword());
}
sf.setAllowUnknownKeys(true);
return new CachingSessionFactory<LsEntry>(sf);
}
@Bean
public IntegrationFlow myFlow(SessionFactory<LsEntry> myFactory, TransferContext myContext) {
LOG.debug("myFlow");
return IntegrationFlows.from(MyGateway.class, g -> g
.header("method", args -> args.getMethod().getName()))
.log()
.route(Message.class, m -> m.getHeaders().get("method", String.class),
r -> r
.subFlowMapping("mput", f2 -> f2
.handle(Sftp.outboundGateway(
remoteFileTemplate(myFactory,
new SpelExpressionParser().parseExpression(
"headers['" + FileHeaders.REMOTE_DIRECTORY + "']")),
Command.MPUT, "payload"))
))
.get();
}
@Bean
public RemoteFileTemplate<LsEntry> remoteFileTemplate(SessionFactory<LsEntry> sessionFactory,
Expression directory) {
RemoteFileTemplate<LsEntry> template = new SftpRemoteFileTemplate(sessionFactory);
template.setRemoteDirectoryExpression(directory);
template.setAutoCreateDirectory(false);
template.afterPropertiesSet();
return template;
}
public interface MyGateway {
List<String> mput(String localDir,
@Header(FileHeaders.REMOTE_DIRECTORY) String remoteDirectory);
}
The call of the MyGateway method is:
@Autowired
private MyGateway gate;
...
String localFilename = "~/sftp-outbound/Export";
LOG.debug("runAsTask mput={}", localFilename);
jobLOG.info("put files to SFTP: {}", localFilename);
List<String> result = gate.mput(localFilename, env.getRemoteDir());
LOG.debug("runAsTask, files transferred, result={}", result);
It comes with the following LOG output:
2022-10-21 00:02:00.000 INFO [] --- [pool-125-thread-1] job-ExportData : put files to SFTP: ~/sftp-outbound/Export
2022-10-21 00:02:00.001 INFO [] --- [pool-125-thread-1] o.s.integration.handler.LoggingHandler : GenericMessage [payload=~/cdb/sftp-outbound/Export, headers={replyChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel#ce6c62d4, errorChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel#ce6c62d4, id=02f7972e-ef06-2297-b1dd-fed5cd75a2d2, method=mput, file_remoteDirectory=data/export, timestamp=1666303320001}]
2022-10-21 00:02:00.008 DEBUG [] --- [pool-125-thread-1] .l.c.c.j.ExportDataTask : runAsTask, files transferred, result=[data/export/02f7972e-ef06-2297-b1dd-fed5cd75a2d2.msg]
One sees two things:
- only one file is transferred instead of two
- the remote filename is 02f7972e-ef06-2297-b1dd-fed5cd75a2d2.msg instead of foo1.TEST.txt or foo2.TEST.txt
What am I doing wrong?
EDIT 1
Trying to find the problem, I slightly changed the integration flow configuration, setting the remote directory inline so that the RemoteFileTemplate bean is no longer needed:
.subFlowMapping("mput", f2 -> f2
.handle(Sftp.outboundGateway(myFactory, Command.MPUT, "payload")
.autoCreateDirectory(false)
.remoteDirectoryExpression(myContext.getRemoteDir()))
Following your answer, Artem, I changed the SftpOutboundGateway method's parameter to type java.io.File as follows:
public interface MyGateway {
List<String> mput(File localDir);
}
The mput gateway method call is now:
List<String> result = gate.mput(new File(localFilename));
But this did not resolve my problem. Now I get the following error:
2022-10-22 00:11:30.321 ERROR [] --- [pool-18-thread-1] .l.c.c.j.MyExportTask : MyExports failed! Exception: {}
org.springframework.expression.spel.SpelEvaluationException: EL1008E: Property or field 'data' cannot be found on object of type 'org.springframework.integration.support.MutableMessage' - maybe not public or not valid?
at org.springframework.expression.spel.ast.PropertyOrFieldReference.readProperty(PropertyOrFieldReference.java:217)
at org.springframework.expression.spel.ast.PropertyOrFieldReference.getValueInternal(PropertyOrFieldReference.java:104)
at org.springframework.expression.spel.ast.PropertyOrFieldReference.getValueInternal(PropertyOrFieldReference.java:91)
at org.springframework.expression.spel.ast.OpDivide.getValueInternal(OpDivide.java:49)
at org.springframework.expression.spel.ast.OpMinus.getValueInternal(OpMinus.java:98)
at org.springframework.expression.spel.ast.SpelNodeImpl.getTypedValue(SpelNodeImpl.java:117)
at org.springframework.expression.spel.standard.SpelExpression.getValue(SpelExpression.java:375)
at org.springframework.integration.util.AbstractExpressionEvaluator.evaluateExpression(AbstractExpressionEvaluator.java:171)
at org.springframework.integration.util.AbstractExpressionEvaluator.evaluateExpression(AbstractExpressionEvaluator.java:129)
at org.springframework.integration.handler.ExpressionEvaluatingMessageProcessor.processMessage(ExpressionEvaluatingMessageProcessor.java:107)
at org.springframework.integration.file.remote.RemoteFileTemplate.doSend(RemoteFileTemplate.java:325)
at org.springframework.integration.file.remote.RemoteFileTemplate.lambda$send$0(RemoteFileTemplate.java:301)
at org.springframework.integration.file.remote.RemoteFileTemplate.execute(RemoteFileTemplate.java:439)
at org.springframework.integration.file.remote.RemoteFileTemplate.send(RemoteFileTemplate.java:301)
at org.springframework.integration.file.remote.RemoteFileTemplate.send(RemoteFileTemplate.java:289)
at org.springframework.integration.file.remote.gateway.AbstractRemoteFileOutboundGateway.put(AbstractRemoteFileOutboundGateway.java:807)
at org.springframework.integration.file.remote.gateway.AbstractRemoteFileOutboundGateway.lambda$doPut$11(AbstractRemoteFileOutboundGateway.java:793)
at org.springframework.integration.file.remote.RemoteFileTemplate.invoke(RemoteFileTemplate.java:471)
at org.springframework.integration.file.remote.gateway.AbstractRemoteFileOutboundGateway.doPut(AbstractRemoteFileOutboundGateway.java:792)
at org.springframework.integration.file.remote.RemoteFileTemplate.send(RemoteFileTemplate.java:301)
at org.springframework.integration.file.remote.RemoteFileTemplate.send(RemoteFileTemplate.java:289)
at org.springframework.integration.file.remote.gateway.AbstractRemoteFileOutboundGateway.put(AbstractRemoteFileOutboundGateway.java:807)
at org.springframework.integration.file.remote.gateway.AbstractRemoteFileOutboundGateway.lambda$doPut$11(AbstractRemoteFileOutboundGateway.java:793)
at org.springframework.integration.file.remote.RemoteFileTemplate.invoke(RemoteFileTemplate.java:471)
at org.springframework.integration.file.remote.gateway.AbstractRemoteFileOutboundGateway.doMput(AbstractRemoteFileOutboundGateway.java:854)
at org.springframework.integration.file.remote.gateway.AbstractRemoteFileOutboundGateway.handleRequestMessage(AbstractRemoteFileOutboundGateway.java:594)
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:134)
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:62)
at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:115)
at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:133)
at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:106)
at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:72)
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:570)
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:520)
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:187)
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:166)
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:47)
at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:109)
at org.springframework.integration.router.AbstractMessageRouter.doSend(AbstractMessageRouter.java:213)
at org.springframework.integration.router.AbstractMessageRouter.handleMessageInternal(AbstractMessageRouter.java:195)
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:62)
at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:115)
at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:133)
at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:106)
at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:72)
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:570)
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:520)
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:187)
at org.springframework.messaging.core.GenericMessagingTemplate.doSendAndReceive(GenericMessagingTemplate.java:233)
at org.springframework.messaging.core.GenericMessagingTemplate.doSendAndReceive(GenericMessagingTemplate.java:47)
at org.springframework.messaging.core.AbstractMessagingTemplate.sendAndReceive(AbstractMessagingTemplate.java:46)
at org.springframework.integration.core.MessagingTemplate.sendAndReceive(MessagingTemplate.java:97)
at org.springframework.integration.core.MessagingTemplate.sendAndReceive(MessagingTemplate.java:38)
at org.springframework.messaging.core.AbstractMessagingTemplate.convertSendAndReceive(AbstractMessagingTemplate.java:96)
at org.springframework.messaging.core.AbstractMessagingTemplate.convertSendAndReceive(AbstractMessagingTemplate.java:86)
at org.springframework.integration.gateway.MessagingGatewaySupport.doSendAndReceive(MessagingGatewaySupport.java:514)
at org.springframework.integration.gateway.MessagingGatewaySupport.sendAndReceive(MessagingGatewaySupport.java:488)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.sendOrSendAndReceive(GatewayProxyFactoryBean.java:648)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.invokeGatewayMethod(GatewayProxyFactoryBean.java:573)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.doInvoke(GatewayProxyFactoryBean.java:540)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.doInvoke(GatewayProxyFactoryBean.java:540)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.invoke(GatewayProxyFactoryBean.java:529)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
at com.sun.proxy.$Proxy1232.mput(Unknown Source)
at com.lhsystems.cdb.cdbjob.job.MyExportTask.runAsTask(MyExportTask.java:73)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.scheduling.support.ScheduledMethodRunnable.run(ScheduledMethodRunnable.java:84)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:93)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:840)
I have no clue what causes this exception. The changed subflow mapping does not even use a SpEL expression anymore. Any help is highly welcome!
So, you send a String localFilename to that gateway.
The logic there then falls into this branch:
else if (payload instanceof String) {
file = new File((String) payload);
}
And according to your observation and the execution result, we end up here:
else if (!file.isDirectory()) {
return doPut(requestMessage);
}
Somehow it does not recognize your ~/sftp-outbound/Export as a directory and just performs a plain PUT with a single file.
Try resolving your ~/sftp-outbound/Export to a File object first and sending that one to the gateway.
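For what it's worth, a likely reason (my assumption, not confirmed above) is that Java never expands the shell shortcut ~, so new File("~/sftp-outbound/Export") names a literal ~ directory relative to the working directory and isDirectory() returns false. A minimal sketch of resolving the folder against user.home before calling the gateway:
// "~" is not expanded by Java; build the path from the user's home directory
File exportDir = new File(System.getProperty("user.home"), "sftp-outbound/Export");
// exportDir.isDirectory() should now be true, so MPUT iterates the folder contents
List<String> result = gate.mput(exportDir);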

Android Logcat strange errors

I am new to Android app development. While running my app on Genymotion (I have tried multiple devices), the Gradle build succeeds without errors, but the emulator says "Unfortunately, Fictiopedia has stopped." and Logcat shows lots of errors I can't understand. It's an app that invents its own words by randomly combining meaningful parts of actual English words and displaying their definitions accordingly. (I deleted a large part of the code with the words and their definitions to make it fit in this question box.)
Thanks for helping.
Here is what Logcat displays, followed by my code:
LOGCAT:
12-27 11:09:05.099 1946-1946/? E/libprocessgroup: failed to make and chown /acct/uid_10059: Read-only file system
12-27 11:09:05.099 1946-1946/? W/Zygote: createProcessGroup failed, kernel missing CONFIG_CGROUP_CPUACCT?
12-27 11:09:05.099 1946-1946/? I/art: Late-enabling -Xcheck:jni
12-27 11:09:05.143 1946-1956/? I/art: Debugger is no longer active
12-27 11:09:05.360 1946-1946/? I/InstantRun: starting instant run server: is main process
12-27 11:09:05.454 1946-1946/? W/art: Before Android 4.1, method android.graphics.PorterDuffColorFilter android.support.graphics.drawable.VectorDrawableCompat.updateTintFilter(android.graphics.PorterDuffColorFilter, android.content.res.ColorStateList, android.graphics.PorterDuff$Mode) would have incorrectly overridden the package-private method in android.graphics.drawable.Drawable
12-27 11:09:05.690 1946-1946/? W/ResourceType: Failure getting entry for 0x7f060054 (t=5 e=84) (error -75)
12-27 11:09:05.691 1946-1946/? W/ResourceType: Failure getting entry for 0x7f060054 (t=5 e=84) (error -75)
12-27 11:09:05.692 1946-1946/? D/AndroidRuntime: Shutting down VM
--------- beginning of crash
12-27 11:09:05.693 1946-1946/? E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.example.lauramessner.fictiopedia, PID: 1946
java.lang.RuntimeException: Unable to start activity ComponentInfo{com.example.lauramessner.fictiopedia/com.example.lauramessner.fictiopedia.MainActivity}: android.view.InflateException: Binary XML file line #0: Error inflating class ImageView
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2325)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2387)
at android.app.ActivityThread.access$800(ActivityThread.java:151)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1303)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:135)
at android.app.ActivityThread.main(ActivityThread.java:5254)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698)
Caused by: android.view.InflateException: Binary XML file line #0: Error inflating class ImageView
at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:763)
at android.view.LayoutInflater.rInflate(LayoutInflater.java:806)
at android.view.LayoutInflater.inflate(LayoutInflater.java:504)
at android.view.LayoutInflater.inflate(LayoutInflater.java:414)
at android.view.LayoutInflater.inflate(LayoutInflater.java:365)
at android.support.v7.app.AppCompatDelegateImplV9.setContentView(AppCompatDelegateImplV9.java:287)
at android.support.v7.app.AppCompatActivity.setContentView(AppCompatActivity.java:139)
at com.example.lauramessner.fictiopedia.MainActivity.onCreate(MainActivity.java:26)
at android.app.Activity.performCreate(Activity.java:5990)
at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1106)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2278)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2387) 
at android.app.ActivityThread.access$800(ActivityThread.java:151) 
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1303) 
at android.os.Handler.dispatchMessage(Handler.java:102) 
at android.os.Looper.loop(Looper.java:135) 
at android.app.ActivityThread.main(ActivityThread.java:5254) 
at java.lang.reflect.Method.invoke(Native Method) 
at java.lang.reflect.Method.invoke(Method.java:372) 
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903) 
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698) 
Caused by: android.content.res.Resources$NotFoundException: Resource ID #0x7f060054
at android.content.res.Resources.getValue(Resources.java:1266)
at android.support.v7.widget.AppCompatDrawableManager.loadDrawableFromDelegates(AppCompatDrawableManager.java:330)
at android.support.v7.widget.AppCompatDrawableManager.getDrawable(AppCompatDrawableManager.java:195)
at android.support.v7.widget.AppCompatDrawableManager.getDrawable(AppCompatDrawableManager.java:188)
at android.support.v7.content.res.AppCompatResources.getDrawable(AppCompatResources.java:100)
at android.support.v7.widget.AppCompatImageHelper.loadFromAttributes(AppCompatImageHelper.java:58)
at android.support.v7.widget.AppCompatImageView.<init>(AppCompatImageView.java:78)
at android.support.v7.widget.AppCompatImageView.<init>(AppCompatImageView.java:68)
at android.support.v7.app.AppCompatViewInflater.createView(AppCompatViewInflater.java:106)
at android.support.v7.app.AppCompatDelegateImplV9.createView(AppCompatDelegateImplV9.java:1024)
at android.support.v7.app.AppCompatDelegateImplV9.onCreateView(AppCompatDelegateImplV9.java:1081)
at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:725)
at android.view.LayoutInflater.rInflate(LayoutInflater.java:806) 
at android.view.LayoutInflater.inflate(LayoutInflater.java:504) 
at android.view.LayoutInflater.inflate(LayoutInflater.java:414) 
at android.view.LayoutInflater.inflate(LayoutInflater.java:365) 
at android.support.v7.app.AppCompatDelegateImplV9.setContentView(AppCompatDelegateImplV9.java:287) 
at android.support.v7.app.AppCompatActivity.setContentView(AppCompatActivity.java:139) 
at com.example.lauramessner.fictiopedia.MainActivity.onCreate(MainActivity.java:26) 
at android.app.Activity.performCreate(Activity.java:5990) 
at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1106) 
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2278) 
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2387) 
at android.app.ActivityThread.access$800(ActivityThread.java:151) 
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1303) 
at android.os.Handler.dispatchMessage(Handler.java:102) 
at android.os.Looper.loop(Looper.java:135) 
at android.app.ActivityThread.main(ActivityThread.java:5254) 
at java.lang.reflect.Method.invoke(Native Method) 
at java.lang.reflect.Method.invoke(Method.java:372) 
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903) 
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698) 
12-27 11:09:10.334 1946-1946/? I/Process: Sending signal. PID: 1946 SIG: 9
MY CODE:
package com.example.lauramessner.fictiopedia;
import android.content.pm.ActivityInfo;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.widget.ImageButton;
import android.widget.TextView;
public class MainActivity extends AppCompatActivity {
public void happen(View b){
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
final TextView pref = (TextView) findViewById(R.id.pref);
final TextView root = (TextView) findViewById(R.id.root);
final TextView suff = (TextView) findViewById(R.id.suff);
final TextView predef = (TextView) findViewById(R.id.predef);
final TextView rootdef = (TextView) findViewById(R.id.rootdef);
final TextView suffdef = (TextView) findViewById(R.id.suffdef);
final String[] prefixes = {"", "A", "An", "Ab", "Ad", "Ambi", "Ana", "Ante", "Anti", "Apo", "Auto","Bene"};
final String[] roots = {"", "ami", "ann", "anthrop", "aqua"};
final String[] suffixes = {"", "agog", "agogue", "cide", "ectomy"};
ImageButton pushMe = findViewById(R.id.pushMe);
pushMe.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
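// note: (int) (Math.random()) truncates to 0, so rando is always 0 and only the first array entries are ever used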
final int rando = (int) (Math.random());
suff.setText(prefixes[rando]);
root.setText(roots[rando]);
pref.setText(suffixes[rando]);
if (pref.getText().toString().contains("")) {
predef.setText("");
}
if (pref.getText().toString().contains("A")) {
predef.setText("not ");
}
if (pref.getText().toString().contains("An")) {
predef.setText("without ");
}
if (pref.getText().toString().contains("Ab")) {
predef.setText("away from ");
}
if (pref.getText().toString().contains("Ad")) {
predef.setText("toward ");
}
if (pref.getText().toString().contains("Ambi")) {
predef.setText("both ");
}
if (pref.getText().toString().contains("Ana")) {
predef.setText("again ");
}
if (pref.getText().toString().contains("Ante")) {
predef.setText("before ");
}
if (pref.getText().toString().contains("Anti")) {
predef.setText("against ");
}
if (pref.getText().toString().contains("Apo")) {
predef.setText("away from ");
}
if (pref.getText().toString().contains("Auto")) {
predef.setText("self ");
}
if (pref.getText().toString().contains("Bene")) {
predef.setText("good ");
}
if (root.getText().toString().contains("")) {
rootdef.setText("");
}
if (root.getText().toString().contains("ami")) {
rootdef.setText("love");
}
if (root.getText().toString().contains("ann")) {
rootdef.setText("year");
}
if (root.getText().toString().contains("anthrop")) {
rootdef.setText("human");
}
if (root.getText().toString().contains("aqua")) {
rootdef.setText("water");
}
if (suff.getText().toString().contains("")) {
suffdef.setText("");
}
if (suff.getText().toString().contains("agog")) {
suffdef.setText("Leader ");
}
if (suff.getText().toString().contains("agogue")) {
suffdef.setText("Leader ");
}
if (suff.getText().toString().contains("cide")) {
suffdef.setText("Act of killing ");
}
if (suff.getText().toString().contains("ectomy")) {
suffdef.setText("Surgical removal of ");
}
}
});
}
}

While creating sequenceFile getting ERROR nativeio.NativeIO: Unable to initialize NativeIO libraries

I am using standalone Spark, and while running a program that writes a sequence file from a pair RDD, I am getting the error below:
ERROR nativeio.NativeIO: Unable to initialize NativeIO libraries
java.lang.NoSuchFieldError: workaroundNonThreadSafePasswdCalls
at org.apache.hadoop.io.nativeio.NativeIO.initNative(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO.<clinit>(NativeIO.java:58)
at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:653)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:286)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:385)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:364)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:555)
at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:892)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:393)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:354)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:476)
at org.apache.hadoop.mapred.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:58)
at org.apache.spark.SparkHadoopWriter.open(SparkHadoopWriter.scala:89)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$13.apply(PairRDDFunctions.scala:980)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$13.apply(PairRDDFunctions.scala:974)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
at org.apache.spark.scheduler.Task.run(Task.scala:54)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Following is the code that I am using:
import java.util.ArrayList;
import java.util.List;
import scala.Tuple2;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.SequenceFileOutputFormat;
public class BasicSaveSequenceFile {
public static class ConvertToWritableTypes implements PairFunction<Tuple2<String, Integer>, Text, IntWritable> {
public Tuple2<Text, IntWritable> call(Tuple2<String, Integer> record) {
return new Tuple2(new Text(record._1), new IntWritable(record._2));
}
}
public static void main(String[] args) throws Exception {
if (args.length != 2) {
throw new Exception("Usage BasicSaveSequenceFile [sparkMaster] [output]");
}
String master = args[0];
String fileName = args[1];
JavaSparkContext sc = new JavaSparkContext(
master, "basicloadsequencefile", System.getenv("SPARK_HOME"), System.getenv("JARS"));
List<Tuple2<String, Integer>> input = new ArrayList();
input.add(new Tuple2("coffee", 1));
input.add(new Tuple2("coffee", 2));
input.add(new Tuple2("pandas", 3));
JavaPairRDD<String, Integer> rdd = sc.parallelizePairs(input);
JavaPairRDD<Text, IntWritable> result = rdd.mapToPair(new ConvertToWritableTypes());
result.saveAsHadoopFile(fileName, Text.class, IntWritable.class, SequenceFileOutputFormat.class);
}
}
I have added all the dependencies in pom.xml. Can you please help?
I upgraded the spark-core jar from version 1.1.0 to 1.3.1 in pom.xml and removed the dependency on the hadoop-common jar.
That worked for me!

How to connect hbase through Spark-sql context?

I'm working on a scenario where I want to access HBase from Spark SQL and read it as a DataFrame. Below is the code.
import java.io.IOException;
import java.util.HashMap;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.MasterNotRunningException;
import org.apache.hadoop.hbase.ZooKeeperConnectionException;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;
import com.google.protobuf.ServiceException;
public class hbasesql {
public static void main(String args[])
{
SparkConf conf = new SparkConf().setAppName("HbaseStreamTest").setMaster("yarn-client");
conf.set("spark.executor.extraClassPath", "/etc/hbase/conf/hbase-site.xml");
conf.set("hbase.zookeeper.quorum", "=servername:2181");
conf.set("hbase.zookeeper.property.clientPort", "2181");
Configuration hconf = HBaseConfiguration.create();
hconf.addResource(new Path("/etc/hbase/conf/hbase-site.xml"));
hconf.addResource(new Path("/etc/hbase/conf/core-site.xml"));
final JavaSparkContext context = new JavaSparkContext(conf);
SQLContext sqlContext = new org.apache.spark.sql.SQLContext(context);
try {
HBaseAdmin.checkHBaseAvailable(hconf);
System.out.println("HBase is running"); //This works
} catch (ServiceException e) {
System.out.println("HBase is not running");
e.printStackTrace();
} catch (MasterNotRunningException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (ZooKeeperConnectionException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
String sqlMapping = "KEY_FIELD STRING :key" + ", sql_firstname STRING c1:FIRSTNAME" + ","
+ "sql_lastname STRING c1:LASTNAME" ;
HashMap<String, String> colMap = new HashMap<String, String>();
colMap.put("hbase.columns.mapping", sqlMapping);
colMap.put("hbase.table", "testtable");
// DataFrame dfJail =
DataFrame df = sqlContext.read().format("org.apache.hadoop.hbase.spark").options(colMap).load();
//DataFrame df = sqlContext.load("org.apache.hadoop.hbase.spark", colMap);
// This is useful when issuing SQL text queries directly against the
// sqlContext object.
df.registerTempTable("temp_emp");
DataFrame result = sqlContext.sql("SELECT count(*) from temp_emp");
System.out.println("df " + df);
System.out.println("result " + result);
df.show();
}
}
But while running it, I'm getting the exception below.
17/02/22 00:44:28 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
17/02/22 00:44:28 INFO storage.BlockManagerMasterEndpoint: Registering block manager servername:32831 with 2.1 GB RAM, BlockManagerId(2, servername, 32831)
Exception in thread "main" java.lang.NullPointerException
at org.apache.hadoop.hbase.spark.HBaseRelation.<init>(DefaultSource.scala:175)
at org.apache.hadoop.hbase.spark.DefaultSource.createRelation(DefaultSource.scala:78)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
at hbasesql.main(hbasesql.java:65)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/02/22 00:44:28 INFO spark.SparkContext: Invoking stop() from shutdown hook
I guess the SQLContext created from the JavaSparkContext is not able to load the hbase-site.xml configuration, or is there some other problem?
Any help is very much appreciated.
Thanks in advance :)
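One thing worth checking (an assumption on my part, based on the hbase-spark module rather than anything stated above): HBaseRelation obtains its HBaseContext from a static cache that is only populated once an HBaseContext has been constructed, which would explain a NullPointerException in HBaseRelation.&lt;init&gt;. A minimal sketch of creating one before loading the DataFrame:
import org.apache.hadoop.hbase.spark.JavaHBaseContext;
...
// registers hconf (including hbase-site.xml) so the data source can pick it up
JavaHBaseContext hbaseContext = new JavaHBaseContext(context, hconf);
DataFrame df = sqlContext.read().format("org.apache.hadoop.hbase.spark").options(colMap).load();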

JavaStreamingContext nullpointer exception while fetching data from Cassandra

I want to read file data and check whether each file line is present in Cassandra; if it's present it needs to be merged, otherwise it's a fresh insert to C*.
The file data just contains name and address in JSON format. In Cassandra, the student table has a UUID as the primary key and there is a secondary index on name.
Once the data is merged into Cassandra, I want to send the new or existing UUID to Kafka.
When I run locally or on a single machine on the Mesos cluster (keeping the line sparkConf.setMaster("local[4]");), this program works, but when I submit to the Mesos master with 4 slaves (commenting out sparkConf.setMaster("local[4]"); on the cluster), there is a NullPointerException while selecting data from Cassandra through the JavaStreamingContext.
I made the streaming context static because earlier it was throwing a serialization exception, as it was being accessed inside the map transformation of the file DStream.
Is there something wrong with the approach, or is the issue caused by building a Cassandra RDD within the DStream map transformation?
import kafka.producer.KeyedMessage;
import com.datastax.spark.connector.japi.CassandraStreamingJavaUtil;
import com.google.gson.Gson;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import java.util.Properties;
import java.util.UUID;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.cloudera.spark.streaming.kafka.JavaDStreamKafkaWriter;
import org.cloudera.spark.streaming.kafka.JavaDStreamKafkaWriterFactory;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapRowTo;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;
public class DStreamExample {
public DStreamExample() {
}
private static JavaStreamingContext ssc;
public static void main(final String[] args) {
final SparkConf sparkConf = new SparkConf();
sparkConf.setAppName("SparkJob");
sparkConf.setMaster("local[4]"); // for local
sparkConf.set("spark.cassandra.connection.host", cassandra_hosts);
ssc = new JavaStreamingContext(sparkConf,new Duration(2000));
final JavaDStream<Student> studentFileDStream = ssc.textFileStream(
"/usr/local/fileDir/").map(line -> {
final Gson gson = new Gson();
final JsonParser parser = new JsonParser();
final JsonObject jsonObject = parser.parse(line)
.getAsJsonObject();
final Student studentFile = gson.fromJson(jsonObject,
Student.class);
// generating new UUID
studentFile.setId(UUID.randomUUID());
try{
//NullPointer at this line while running on cluster
final JavaRDD<Student> cassandraStudentRDD =
CassandraStreamingJavaUtil.javaFunctions(ssc)
.cassandraTable("keyspace", "student",
mapRowTo(Student.class)).where("name=?",
studentFile.getName());
//If student name is found in cassandra table then assign UUID to fileStudent object
//This way i wont create multiple records for same name student
final Student studentCassandra = cassandraStudentRDD.first();
studentFile.setId(studentCassandra.getId());
}catch(Exception e){
}
return studentFile;
});
//Save student to Cassandra
CassandraStreamingJavaUtil.javaFunctions(studentFileDStream)
.writerBuilder("keyspace", "student", mapToRow(Student.class))
.saveToCassandra();
final JavaDStreamKafkaWriter<Student> writer =
JavaDStreamKafkaWriterFactory.fromJavaDStream(studentFileDStream);
final Properties properties = new Properties();
properties.put("metadata.broker.list", "server:9092");
properties.put("serializer.class", "kafka.serializer.StringEncoder");
//Just send student UUID_PUT to Kafka
writer.writeToKafka(properties,
student ->
new KeyedMessage<>("TOPICNAME", student.getId() + "_PUT"));
ssc.start();
ssc.awaitTermination();
}
}
class Student {
private String address;
private UUID id;
private String name;
public Student() {
}
public String getAddress() {
return address;
}
public void setAddress(String address) {
this.address = address;
}
public UUID getId() {
return id;
}
public void setId(UUID id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}
Exception stacktrace:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, servername): java.lang.NullPointerException
at com.datastax.spark.connector.japi.CassandraStreamingJavaUtil.javaFunctions(CassandraStreamingJavaUtil.java:39)
at com.ebates.ps.batch.sparkpoc.DStreamPOCExample.lambda$main$d2c4cc2c$1(DStreamPOCExample.java:109)
at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1027)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.cloudera.spark.streaming.kafka.RDDKafkaWriter$$anonfun$writeToKafka$1.apply(RDDKafkaWriter.scala:47)
at org.cloudera.spark.streaming.kafka.RDDKafkaWriter$$anonfun$writeToKafka$1.apply(RDDKafkaWriter.scala:45)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:898)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:898)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:88)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1822)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1835)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1848)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1919)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:898)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:896)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:896)
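One observation on the stack trace (my reading; no answer is quoted above): the NullPointerException comes from CassandraStreamingJavaUtil.javaFunctions(ssc) running inside the map function on an executor, where the static ssc field is null, since statics are not shipped with the serialized closure. A hedged sketch of a common alternative, assuming the DataStax connector's CassandraConnector and the Java driver's Session API: do the per-record lookup with a driver session (CassandraConnector is serializable) instead of building an RDD inside the transformation.
import com.datastax.driver.core.Row;
import com.datastax.driver.core.Session;
import com.datastax.spark.connector.cql.CassandraConnector;
...
final CassandraConnector connector = CassandraConnector.apply(sparkConf);
final JavaDStream<Student> studentFileDStream = ssc.textFileStream("/usr/local/fileDir/")
    .map(line -> {
        final Student studentFile = new Gson().fromJson(line, Student.class);
        // fresh UUID by default
        studentFile.setId(UUID.randomUUID());
        try (Session session = connector.openSession()) {
            final Row row = session.execute(
                "SELECT id FROM keyspace.student WHERE name = ?",
                studentFile.getName()).one();
            if (row != null) {
                // reuse the existing UUID so no duplicate record is created
                studentFile.setId(row.getUUID("id"));
            }
        }
        return studentFile;
    });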
