sqoop import - GSS initiate failed - Failed to find any Kerberos tgt - security

sqoop import errors out.
sqoop list-tables and sqoop eval work fine.
Distribution: Cloudera
Security issue:
GSS initiate failed [Caused by GSSException: No valid credentials
provided (Mechanism level: Failed to find any Kerberos tgt)]
WARN security.UserGroupInformation: PriviledgedActionException
as:pars7611 (auth:KERBEROS) cause:java.io.IOException: Failed on local
exception: java.io.IOException: javax.security.sasl.SaslException: GSS
initiate failed [Caused by GSSException: No valid credentials provided
(Mechanism level: Failed to find any Kerberos tgt)]; ERROR
tool.ImportTool: Import failed: java.io.IOException: Failed on local
exception: java.io.IOException: javax.security.sasl.SaslException: GSS
initiate failed [Caused by GSSException: No valid credentials provided
(Mechanism level: Failed to find any Kerberos tgt)];

The error
GSS initiate failed [Caused by GSSException: No valid credentials
provided (Mechanism level: Failed to find any Kerberos tgt)]
is usually caused by the absence of a Kerberos ticket. Please use kinit to obtain a ticket before running the sqoop command.
That said, it is puzzling that list-tables and eval succeed, since the metastore should be protected by Kerberos authentication too. Please check whether the Hive metastore is secured by Kerberos, i.e. whether hive.metastore.sasl.enabled is set to true.
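For illustration, a minimal sequence along those lines; the keytab path, realm, JDBC URL, and table below are hypothetical placeholders, not values from the original report:

# obtain a ticket for the user (keytab path and realm are hypothetical)
kinit -kt /home/pars7611/pars7611.keytab pars7611@EXAMPLE.COM
# confirm a valid TGT is now in the ticket cache
klist
# then run the import (connection string, table, and target dir are placeholders)
sqoop import --connect jdbc:mysql://db-host/sales --table orders --target-dir /user/pars7611/orders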

Related

Spark Thrift 3.2.2 user impersonation failing with metastore authentication error: SASL negotiation failure, GSS initiate failed

On a Kerberized Hadoop cluster, the Spark Thrift Server works fine when I do not impersonate users. When I enable impersonation, I get an authentication error against the metastore.
I followed these documents:
https://docs.cloudera.com/HDPDocuments/HDP2/HDP-2.6.4/bk_spark-component-guide/content/config-sts-user-imp.html
https://docs.cloudera.com/HDPDocuments/HDP2/HDP-2.6.4/bk_data-access/content/ref-5422cb60-d1d5-425a-b719-ec7bd03ee5d3.1.html
Step 1:
Set hive.server2.enable.doAs = true in Advanced spark-hive-site-override
Add spark.jars = /usr/hdp/current/spark-thriftserver/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-thriftserver/lib/datanucleus-core-3.2.10.jar,/usr/hdp/current/spark-thriftserver/lib/datanucleus-rdbms-3.2.9.jar in Custom spark-thrift-sparkconf
Step 2: in Advanced hiveserver2-site
Set hive.security.authorization.enabled = true
Set hive.server2.enable.doAs = true
Set hive.metastore.pre.event.listeners = org.apache.hadoop.hive.ql.security.authorization.AuthorizationPreEventListener
Set hive.security.metastore.authorization.manager = org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider
Step 3:
I created a user keytab and principal, and ran kinit.
Run cli: beeline -u 'jdbc:hive2://:/default;principal=spark3/#;auth=KERBEROS;transportMode=binary'
Result:
Connecting to jdbc:hive2://<host>:<port>/default;principal=spark3/<HOST>@<REALM>;auth=KERBEROS;transportMode=binary
Connected to: Spark SQL (version 3.2.2)
Driver: Hive JDBC (version 3.1.0.3.1.4.0-315)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 3.1.0.3.1.4.0-315 by Apache Hive
Run cli: show databases;
And I get the following error:
Error: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
....
Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
....
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
....
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
....
Caused by: java.lang.reflect.InvocationTargetException
....
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: GSS initiate failed
I checked the Spark Thrift Server log and it shows the following:
22/10/07 15:07:31 INFO ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V10
22/10/07 15:07:31 INFO HiveSessionImpl: Operation log session directory is created: /tmp/spark3/operation_logs/64eb19a6-1bdc-4ed8-81c9-8881c4251e75
22/10/07 15:07:31 INFO metastore: Trying to connect to metastore with URI thrift://<host>:<port>
22/10/07 15:07:32 INFO metastore: Opened a connection to metastore, current connections: 1
22/10/07 15:07:32 INFO metastore: Connected to metastore.
22/10/07 15:07:39 INFO SparkExecuteStatementOperation: Submitting query 'show databases' with fdcf90cb-74bb-4574-99b7-bfd981ce8010
22/10/07 15:07:39 INFO SparkExecuteStatementOperation: Running query with fdcf90cb-74bb-4574-99b7-bfd981ce8010
22/10/07 15:07:39 INFO metastore: Closed a connection to metastore, current connections: 0
22/10/07 15:07:39 INFO metastore: Trying to connect to metastore with URI thrift://<host>:<port>
22/10/07 15:07:39 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
22/10/07 15:07:39 WARN metastore: Failed to connect to the MetaStore Server...
22/10/07 15:07:39 INFO metastore: Waiting 5 seconds before next connection attempt.
22/10/07 15:07:44 INFO metastore: Trying to connect to metastore with URI thrift://<host>:<port>
22/10/07 15:07:44 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
Connecting to the Spark Thrift Server succeeds, but when I run a query I hit the error above. Where am I going wrong?
Spark Thrift Server is built on a single Spark application and, unfortunately, does not support impersonation yet.
You could try Apache Kyuubi instead: https://github.com/apache/incubator-kyuubi
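For reference, connecting to Kyuubi from beeline looks much like connecting to HiveServer2; the host, port, and principal below are assumptions, not settings from this cluster:

beeline -u 'jdbc:hive2://kyuubi-host:10009/default;principal=kyuubi/_HOST@EXAMPLE.COM'

Kyuubi can launch separate Spark engines per user (or per group/connection), which is how it works around the single-application limitation mentioned above.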

Spark job submitted via Livy throws GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)

I am trying to launch my Spark batch job using Livy. From the logs, I can see that the job starts running but fails when it tries to access the Hive metastore, with the following Kerberos error:
GSSException: No valid credentials provided (Mechanism level: Failed
to find any kerberos tgt)
The same job runs fine when I launch it using spark-submit; there, however, I pass the keytab and principal (--keytab, --principal).
I tried passing the keytab and principal in the Livy REST call using the parameters spark.yarn.keytab and spark.yarn.principal, but adding these options throws the following error:
Error: only one of --proxy-user or --principal can be provided
even though I do not provide the proxyUser parameter in my curl request.
Please let me know how to resolve this issue.
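For context, a sketch of the kind of REST call described above; the Livy endpoint, jar path, class name, and principal are hypothetical placeholders, and this only illustrates the request shape, not a verified fix:

# submit a batch to Livy with the keytab/principal passed as Spark conf (all values are placeholders)
curl -X POST -H 'Content-Type: application/json' http://livy-host:8998/batches -d '{
  "file": "hdfs:///user/jobs/my-batch-job.jar",
  "className": "com.example.MyBatchJob",
  "conf": {
    "spark.yarn.keytab": "/etc/security/keytabs/myuser.keytab",
    "spark.yarn.principal": "myuser@EXAMPLE.COM"
  }
}'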

Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled

I am trying to configure a secure Hadoop pseudo-distributed (single-node) cluster in Azure using Azure AD Domain Services, to confirm that it works properly.
OS - Windows Server 2012 R2 Datacenter
Hadoop Version - 2.7.2
I can run
hadoop fs -ls /
and the example MapReduce job works fine:
yarn jar %HADOOP_HOME%\share\hadoop\mapreduce\hadoop-mapreduce-examples-*.jar pi 16 10000
But when I run
hdfs fsck /
it gives:
Connecting to namenode via https://node1:50470/fsck?ugi=Kumar&path=%2F
Exception in thread "main" java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos credentails)
at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:335)
at org.apache.hadoop.hdfs.tools.DFSck.access$000(DFSck.java:73)
at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:152)
at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:149)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hdfs.tools.DFSck.run(DFSck.java:148)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.hdfs.tools.DFSck.main(DFSck.java:377)
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos credentails)
at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)
at org.apache.hadoop.security.authentication.client.PseudoAuthenticator.authenticate(PseudoAuthenticator.java:77)
at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:214)
at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
at org.apache.hadoop.hdfs.web.URLConnectionFactory.openConnection(URLConnectionFactory.java:161)
at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:333)
... 10 more
When I access the NameNode web UI, it shows:
GSSException: Failure unspecified at GSS-API level (Mechanism level: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled)
But the same configuration works fine with a local Windows Active Directory.
Can someone help me resolve this error and get this working?
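One hedged diagnostic for the AES256 part of the message is to check what AES key length the JVM actually allows; this one-liner is my own suggestion, not something from the original setup:

# prints the maximum AES key length the local JVM permits
jrunscript -e "print(javax.crypto.Cipher.getMaxAllowedKeyLength('AES'))"

A result of 128 means the JVM is limited to the default JCE policy, while 2147483647 means unlimited-strength cryptography (and therefore AES-256) is available to it.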

Accessing HBase through Spark with Security enabled

I'm trying to run a Spark job that accesses HBase with security enabled. When I run the following command:
/usr/local/spark-2/bin/spark-submit --keytab /etc/hadoop/conf/spark.keytab --principal spark/hadoop-master@platalyticsrealm --class com.platalytics.example.spark.App --master yarn --driver-class-path /root/hbase-1.2.2/conf /home/vm6/project-1-jar-with-dependencies.jar
I get the following error:
2016-08-07 20:43:57,617 WARN [hconnection-0x24b5fa45-metaLookup-shared--pool2-t1] ipc.RpcClientImpl: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2016-08-07 20:43:57,619 ERROR [hconnection-0x24b5fa45-metaLookup-shared--pool2-t1] ipc.RpcClientImpl: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1241)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:34094)
at org.apache.hadoop.hbase.client.ClientSmallScanner$SmallScannerCallable.call(ClientSmallScanner.java:201)
at org.apache.hadoop.hbase.client.ClientSmallScanner$SmallScannerCallable.call(ClientSmallScanner.java:180)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:210)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:360)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:334)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
... 25 more
I have Spark running on YARN with security enabled. I have kinit'd from the console and have provided the necessary principals and keytabs. Can you please help me find the issue?
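As a basic check (a diagnostic sketch only, not a confirmed fix), it can be worth re-acquiring a ticket from the same keytab the job uses and confirming it is visible on the machine where the error appears, since the log itself suggests 'kinit':

# log in from the keytab passed to spark-submit, then list the resulting ticket
kinit -kt /etc/hadoop/conf/spark.keytab spark/hadoop-master@platalyticsrealm
klist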

Setting up inter-node encryption in Cassandra

I am new to Cassandra and am looking to set up inter-node encryption in Cassandra 1.2.8.
I have successfully created a keypair for the keystore and truststore following the steps outlined here:
http://docs.oracle.com/javase/6/docs/technotes/guides/security/jsse/JSSERefGuide.html#CreateKeystore
In the cassandra.yaml file, I have adjusted the server encryption options to the following:
server_encryption_options:
    internode_encryption: all
    keystore: conf/keystore
    keystore_password: password
    truststore: conf/truststore
    truststore_password: password
However, when I start the Cassandra server, I receive the following error:
ERROR 18:49:20,883 Fatal configuration error
org.apache.cassandra.exceptions.ConfigurationException: Unable to create ssl socket
at org.apache.cassandra.net.MessagingService.getServerSocket(MessagingService.java:410)
at org.apache.cassandra.net.MessagingService.listen(MessagingService.java:390)
at org.apache.cassandra.service.StorageService.joinTokenRing(StorageService.java:589)
at org.apache.cassandra.service.StorageService.initServer(StorageService.java:554)
at org.apache.cassandra.service.StorageService.initServer(StorageService.java:451)
at org.apache.cassandra.service.CassandraDaemon.setup(CassandraDaemon.java:348)
at org.apache.cassandra.service.CassandraDaemon.activate(CassandraDaemon.java:447)
at org.apache.cassandra.service.CassandraDaemon.main(CassandraDaemon.java:490)
Caused by: java.io.IOException: Error creating the initializing the SSL Context
at org.apache.cassandra.security.SSLFactory.createSSLContext(SSLFactory.java:124)
at org.apache.cassandra.security.SSLFactory.getServerSocket(SSLFactory.java:53)
at org.apache.cassandra.net.MessagingService.getServerSocket(MessagingService.java:406)
... 7 more
Caused by: java.io.FileNotFoundException: conf\truststore\dev (The system cannot find the path specified)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(Unknown Source)
at java.io.FileInputStream.<init>(Unknown Source)
at org.apache.cassandra.security.SSLFactory.createSSLContext(SSLFactory.java:105)
... 9 more
Unable to create ssl socket
Fatal configuration error; unable to start server. See log for stacktrace.
ERROR 18:49:20,887 Exception in thread Thread[StorageServiceShutdownHook,5,main]
java.lang.NullPointerException
at org.apache.cassandra.service.StorageService.stopRPCServer(StorageService.java:321)
at org.apache.cassandra.service.StorageService.shutdownClientServers(StorageService.java:370)
at org.apache.cassandra.service.StorageService.access$000(StorageService.java:88)
at org.apache.cassandra.service.StorageService$1.runMayThrow(StorageService.java:519)
at org.apache.cassandra.utils.WrappedRunnable.run(WrappedRunnable.java:28)
at java.lang.Thread.run(Unknown Source)
Please note that the server runs without issues if the server encryption option is set back to none. Any thoughts/guidance would be appreciated.
Read the exception carefully:
Caused by: java.io.FileNotFoundException: conf\truststore\dev
(The system cannot find the path specified)
You've created the key/trust stores, but you haven't pointed Cassandra at them. In cassandra.yaml you need to enable SSL, but you also need to specify the paths to these two files, e.g.:
server_encryption_options:
    internode_encryption: all
    keystore: C:\some\location
    keystore_password: password
    truststore: C:\some\other\location
    truststore_password: password
Also remember to supply your actual key/trust store passwords rather than the example values in cassandra.yaml.
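For completeness, one common way to produce the keystore and truststore referenced in the question with keytool; the alias, validity, and certificate file name below are illustrative only:

# generate the node's key pair in the keystore (alias, DN, and validity are examples)
keytool -genkeypair -alias node1 -keyalg RSA -validity 365 -dname "CN=node1" -keystore conf/keystore -storepass password -keypass password
# export the node's public certificate
keytool -exportcert -alias node1 -file node1.cer -keystore conf/keystore -storepass password
# import the certificate into the truststore that the other nodes trust
keytool -importcert -alias node1 -file node1.cer -keystore conf/truststore -storepass password -noprompt

Whatever locations you choose, the keystore and truststore entries in cassandra.yaml must point at these files, ideally as absolute paths so the working directory does not matter.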
