PhantomJS & ChromeDriver both not working on Linux

I have already followed similar questions and answers, but none of them helped me.
I have set the code up on Bitbucket and integrated everything in Jenkins. I added the ChromeDriver and PhantomJS packages to the /opt/ folder.
I set the permissions on /opt/ to 777.
I tried to execute the build with both ChromeDriver and PhantomJS, one at a time, but I get the same error with each.
Stack trace:
java.lang.IllegalStateException: The driver is not executable: /opt/chromedriver
at com.google.common.base.Preconditions.checkState(Preconditions.java:197)
at org.openqa.selenium.remote.service.DriverService.checkExecutable(DriverService.java:126)
at org.openqa.selenium.remote.service.DriverService.findExecutable(DriverService.java:117)
at org.openqa.selenium.chrome.ChromeDriverService.access$000(ChromeDriverService.java:32)
at org.openqa.selenium.chrome.ChromeDriverService$Builder.findDefaultExecutable(ChromeDriverService.java:118)
at org.openqa.selenium.remote.service.DriverService$Builder.build(DriverService.java:291)
at org.openqa.selenium.chrome.ChromeDriverService.createDefaultService(ChromeDriverService.java:82)
at org.openqa.selenium.chrome.ChromeDriver.<init>(ChromeDriver.java:117)
at Testcases.BaseTest.Setup(BaseTest.java:55)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:80)
at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:564)
at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:213)
at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:138)
at org.testng.SuiteRunner.privateRun(SuiteRunner.java:277)
at org.testng.SuiteRunner.run(SuiteRunner.java:240)
at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:52)
at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:86)
at org.testng.TestNG.runSuitesSequentially(TestNG.java:1198)
at org.testng.TestNG.runSuitesLocally(TestNG.java:1123)
at org.testng.TestNG.run(TestNG.java:1031)
at org.apache.maven.surefire.testng.TestNGExecutor.run(TestNGExecutor.java:77)
at org.apache.maven.surefire.testng.TestNGDirectoryTestSuite.execute(TestNGDirectoryTestSuite.java:110)
at org.apache.maven.surefire.testng.TestNGProvider.invoke(TestNGProvider.java:106)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)
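
Note that chmod 777 on the /opt/ directory only changes the directory itself; it does not give the chromedriver or phantomjs binaries inside it the execute bit, which is exactly what DriverService.checkExecutable() verifies. A minimal sketch of the driver setup, assuming the /opt/chromedriver path from the stack trace (the helper class and the chmod-style fix are illustrative, not the original code):

import java.io.File;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class DriverSetup {
    public static WebDriver startChrome() {
        File chromedriver = new File("/opt/chromedriver");
        // This is the condition DriverService.checkExecutable() enforces.
        if (!chromedriver.canExecute()) {
            // Equivalent to running `chmod +x /opt/chromedriver` on the Jenkins agent.
            chromedriver.setExecutable(true, /* ownerOnly = */ false);
        }
        System.setProperty("webdriver.chrome.driver", chromedriver.getAbsolutePath());
        return new ChromeDriver();
    }
}

The same execute-bit requirement applies to the PhantomJS binary under /opt/ when switching drivers.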

Related

I cannot run the Cucumber report successfully

I can run it successfully from a terminal window with the command mvn clean verify -Dcucumber.options="--tags #ValidateLoggedInUser".
Because some of the other test cases have errors, I need to debug to check the failing step and the reason.
But there is an error when I run it from the *.feature file:
java.lang.NullPointerException
at com.cucumber.listener.Reporter.loadXMLConfig(Reporter.java:56)
at com.dxc.vpc.automation.glue.CucumberUtilities.extentReport(CucumberUtilities.java:26)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at cucumber.runtime.Utils$1.call(Utils.java:40)
at cucumber.runtime.Timeout.timeout(Timeout.java:16)
at cucumber.runtime.Utils.invoke(Utils.java:34)
at cucumber.runtime.java.JavaHookDefinition.execute(JavaHookDefinition.java:60)
at cucumber.runtime.Runtime.runHookIfTagsMatch(Runtime.java:224)
at cucumber.runtime.Runtime.runHooks(Runtime.java:212)
at cucumber.runtime.Runtime.runBeforeHooks(Runtime.java:202)
at cucumber.runtime.model.CucumberScenario.run(CucumberScenario.java:40)
at cucumber.runtime.model.CucumberFeature.run(CucumberFeature.java:165)
at cucumber.runtime.Runtime.run(Runtime.java:122)
at cucumber.api.cli.Main.run(Main.java:36)
at cucumber.api.cli.Main.main(Main.java:18)
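
The NullPointerException inside Reporter.loadXMLConfig usually means the ExtentReports instance behind that listener was never created, which happens when a scenario is launched straight from the *.feature file instead of through a runner that registers the Extent formatter. A sketch of such a runner, assuming the cucumber-extentsreport listener and a conventional project layout (the feature path and report location are assumptions; the glue package is taken from the trace):

import org.junit.runner.RunWith;
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;

// Hypothetical runner; the plugin entry is what initialises the ExtentReports
// instance that Reporter.loadXMLConfig(...) later dereferences.
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/test/resources/features",
        glue = "com.dxc.vpc.automation.glue",
        plugin = {"com.cucumber.listener.ExtentCucumberFormatter:target/cucumber-reports/report.html"})
public class RunCucumberTest {
}

Running the feature through this runner (or via Maven, as in the working command above) keeps the @Before hook from hitting the null reference.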

Spark 2 action fails in Oozie workflow: './assembly/target/scala-2.11/jars' does not exist; make sure Spark is built

I am able to run my Spark job with spark-submit, but I am not able to run it from an Oozie workflow.
I get the following error output:
java.lang.IllegalStateException: Library directory '/mnt/resource/hadoop/yarn/local/usercache/sshuser/appcache/application_1521646255340_0010/container_1521646255340_0010_01_000002/./assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:260)
at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:359)
at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:556)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:845)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:170)
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1174)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1233)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:782)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:312)
at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:233)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:58)
at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:62)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:242)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
I have followed these instructions: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.0/bk_spark-component-guide/content/ch_oozie-spark-action.html#spark-config-oozie-spark2
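
That "make sure Spark is built" message usually means the Spark action resolved no spark2 share library, so the launcher fell back to looking for a local build layout. The linked guide boils down to registering a spark2 sharelib and pointing the action at it; a sketch of the relevant pieces, with names following the guide's defaults (treat them as assumptions for your cluster):

# job.properties of the workflow: resolve the spark2 sharelib instead of spark
oozie.use.system.libpath=true
oozie.action.sharelib.for.spark=spark2

# after copying the Spark 2 jars into the sharelib directory, refresh and verify it
oozie admin -sharelibupdate
oozie admin -shareliblist spark2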

spark-shell - failure to initialize terminal (Windows 10, official cmd.exe)

I'm trying to run Spark on Windows 10 with Hadoop, in the official Microsoft terminal cmd.exe.
I don't have a problem with Hadoop; the installation and startup are fine.
I'm using Java 8 x64 (jdk1.8.0_92).
When I start Spark with the command spark-shell, I get the Java error below:
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.NoClassDefFoundError: Could not initialize class scala.tools.fusesource_embedded.jansi.internal.Kernel32
at scala.tools.fusesource_embedded.jansi.internal.WindowsSupport.getConsoleMode(WindowsSupport.java:50)
at scala.tools.jline_embedded.WindowsTerminal.getConsoleMode(WindowsTerminal.java:204)
at scala.tools.jline_embedded.WindowsTerminal.init(WindowsTerminal.java:82)
at scala.tools.jline_embedded.TerminalFactory.create(TerminalFactory.java:101)
at scala.tools.jline_embedded.TerminalFactory.get(TerminalFactory.java:158)
at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:229)
at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:221)
at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:209)
at scala.tools.nsc.interpreter.jline_embedded.JLineConsoleReader.<init>(JLineReader.scala:61)
at scala.tools.nsc.interpreter.jline_embedded.InteractiveReader.<init>(JLineReader.scala:33)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:865)
at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:862)
at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$mkReader$1(ILoop.scala:871)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
at scala.util.Try$.apply(Try.scala:192)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1233)
at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1223)
at scala.collection.immutable.Stream.collect(Stream.scala:435)
at scala.tools.nsc.interpreter.ILoop.chooseReader(ILoop.scala:877)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$2.apply(ILoop.scala:916)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:916)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
at org.apache.spark.repl.Main$.main(Main.scala:49)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I get exactly the same stack trace when opening the Scala console in NetBeans. If I type any Scala expression below the stack trace, it works fine, though.

Cannot launch the Groovy Console

I have just done a fresh install of Groovy 2.4.3 on OS X 10.10.3 by means of the GVM tool.
I also installed, using GVM, related libraries and tools such as GroovyServ, Grails and Gradle.
The Java version I am using is 1.8.0_25.
Everything seems fine except that I cannot start the Groovy Console with the groovyConsole command, since I keep getting this exception:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.groovy.tools.GroovyStarter.rootLoader(GroovyStarter.java:106)
at org.codehaus.groovy.tools.GroovyStarter.main(GroovyStarter.java:128)
Caused by: javax.xml.parsers.FactoryConfigurationError: Provider for javax.xml.parsers.SAXParserFactory cannot be found
at javax.xml.parsers.SAXParserFactory.newInstance(Unknown Source)
at org.apache.ivy.core.settings.XmlSettingsParser.doParse(XmlSettingsParser.java:160)
at org.apache.ivy.core.settings.XmlSettingsParser.parse(XmlSettingsParser.java:150)
at org.apache.ivy.core.settings.IvySettings.load(IvySettings.java:417)
at org.apache.ivy.core.settings.IvySettings$load.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:110)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:122)
at groovy.grape.GrapeIvy.<init>(GrapeIvy.groovy:96)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
at java.lang.Class.newInstance(Class.java:438)
at groovy.grape.Grape.getInstance(Grape.java:117)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at groovy.lang.MetaClassImpl.getProperty(MetaClassImpl.java:1844)
at groovy.lang.MetaClassImpl.getProperty(MetaClassImpl.java:3734)
at org.codehaus.groovy.runtime.callsite.ClassMetaClassGetPropertySite.getProperty(ClassMetaClassGetPropertySite.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callGetProperty(AbstractCallSite.java:293)
at groovy.ui.ConsoleIvyPlugin.addListener(ConsoleIvyPlugin.groovy:41)
at groovy.ui.ConsoleIvyPlugin$addListener.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:110)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:122)
at groovy.ui.Console.<init>(Console.groovy:239)
at groovy.ui.Console.<init>(Console.groovy:221)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:77)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:102)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:57)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:232)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:244)
at groovy.ui.Console.main(Console.groovy:206)
... 6 more
What is the best way to fix this?
I had this before, and it was due to a CLASSPATH environment variable pointing to an invalid jar.
I also had the same issue. In my case the JAVA_HOME path was wrong, so I set JAVA_HOME and then it worked without any problem.
After setting JAVA_HOME, it launched.
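
For completeness, on OS X the JAVA_HOME fix described above can be applied like this (a sketch; selecting the 1.8 JDK matches the version mentioned in the question, and the CLASSPATH check covers the invalid-jar cause from the first answer):

# point JAVA_HOME at the installed 1.8 JDK and put it on the PATH
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
export PATH="$JAVA_HOME/bin:$PATH"
# check for a stale CLASSPATH entry pointing at a missing jar (unset it if so)
echo $CLASSPATH
groovyConsole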

Cassandra snapshot: Unable to create hard link, errno 1

I'm using Cassandra 1.2.8 on Ubuntu 12.04.
I have used the snapshot command before and the backup process was OK!
But now when I use this command, I get an error like the one below:
Exception in thread "main" FSWriteError in /var/lib/cassandra/data/english/word_doc/snapshots/1393155090911/english-word_doc-ic-10-Index.db
at org.apache.cassandra.io.util.FileUtils.createHardLink(FileUtils.java:80)
at org.apache.cassandra.io.sstable.SSTableReader.createLinks(SSTableReader.java:1063)
at org.apache.cassandra.db.ColumnFamilyStore.snapshotWithoutFlush(ColumnFamilyStore.java:1566)
at org.apache.cassandra.db.ColumnFamilyStore.snapshot(ColumnFamilyStore.java:1611)
at org.apache.cassandra.db.Table.snapshot(Table.java:194)
at org.apache.cassandra.service.StorageService.takeSnapshot(StorageService.java:2205)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:74)
at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:277)
at com.sun.jmx.mbeanserver.StandardMBeanIntrospector.invokeM2(StandardMBeanIntrospector.java:112)
at com.sun.jmx.mbeanserver.StandardMBeanIntrospector.invokeM2(StandardMBeanIntrospector.java:46)
at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:226)
at com.sun.jmx.mbeanserver.PerInterface.invoke(PerInterface.java:138)
at com.sun.jmx.mbeanserver.MBeanSupport.invoke(MBeanSupport.java:253)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:856)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:805)
at javax.management.remote.rmi.RMIConnectionImpl.doOperation(RMIConnectionImpl.java:1476)
at javax.management.remote.rmi.RMIConnectionImpl.access$300(RMIConnectionImpl.java:98)
at javax.management.remote.rmi.RMIConnectionImpl$PrivilegedOperation.run(RMIConnectionImpl.java:1317)
at javax.management.remote.rmi.RMIConnectionImpl.doPrivilegedOperation(RMIConnectionImpl.java:1409)
at javax.management.remote.rmi.RMIConnectionImpl.invoke(RMIConnectionImpl.java:839)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:322)
at sun.rmi.transport.Transport$1.run(Transport.java:177)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:173)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:553)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:808)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:667)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.IOException: Unable to create hard link from /var/lib/cassandra/data/english/word_doc/english-word_doc-ic-10-Index.db to /var/lib/cassandra/data/english/word_doc/snapshots/1393155090911/english-word_doc-ic-10-Index.db (errno 1)
at org.apache.cassandra.utils.CLibrary.createHardLink(CLibrary.java:174)
at org.apache.cassandra.io.util.FileUtils.createHardLink(FileUtils.java:76)
... 40 more
Note: my current database is a backup version that I restored into my Cassandra!
Thanks for your help :)
Edit: I have two keyspaces, A and B. B is a backup version that I restored and then stored more data on, but A is not. I can create a backup from A without any problem, but for B I get the above error!
This seems to be an Ubuntu 12.04 security restriction on hard links.
Use this command to disable it:
sudo sysctl -w kernel.yama.protected_nonaccess_hardlinks=0
And after the restore, reset it to its original value:
sudo sysctl -w kernel.yama.protected_nonaccess_hardlinks=1
After struggling with this issue again, I found that I had a few root-owned files inside /var/lib/cassandra/data/.../etcetera, which triggered the hard-link access restriction.
Recursively chowning all directories and files inside /var/lib/cassandra/data solved the problem.
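
A sketch of that fix, assuming Cassandra runs as the cassandra user and group (verify the actual service user on your node before chowning):

# spot any root-owned files first (path taken from the stack trace above)
ls -l /var/lib/cassandra/data/english/word_doc
# then recursively hand ownership back to the user Cassandra runs as
sudo chown -R cassandra:cassandra /var/lib/cassandra/data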
