WebSphere 7: GSSCredential expired, must login again

I'm receiving this strange error when I try to send a WS request (below). I'm using WebSphere App Server 7.0.0.23. Since I am not familiar with such low-level WebSphere functionality, any help regarding this issue would be appreciated. Thanks!
Caused by: javax.xml.ws.WebServiceException: java.io.IOException: Unable to deserialize the Subjects in this Context, cause: org.ietf.jgss.GSSException, major code: 8, minor code: 0
major string: Credential expired
minor string: GSSCredential expired, must login again.
at org.apache.axis2.jaxws.ExceptionFactory.createWebServiceException(ExceptionFactory.java:175)
at org.apache.axis2.jaxws.ExceptionFactory.makeWebServiceException(ExceptionFactory.java:70)
at org.apache.axis2.jaxws.ExceptionFactory.makeWebServiceException(ExceptionFactory.java:128)
at org.apache.axis2.jaxws.core.controller.impl.AxisInvocationController.execute(AxisInvocationController.java:586)
at org.apache.axis2.jaxws.core.controller.impl.AxisInvocationController.doInvoke(AxisInvocationController.java:130)
at org.apache.axis2.jaxws.core.controller.impl.InvocationControllerImpl.invoke(InvocationControllerImpl.java:93)
at org.apache.axis2.jaxws.client.proxy.JAXWSProxyHandler.invokeSEIMethod(JAXWSProxyHandler.java:364)
at org.apache.axis2.jaxws.client.proxy.JAXWSProxyHandler.invoke(JAXWSProxyHandler.java:185)
... 72 more
Caused by: java.io.IOException: Unable to deserialize the Subjects in this Context, cause: org.ietf.jgss.GSSException, major code: 8, minor code: 0
major string: Credential expired
minor string: GSSCredential expired, must login again.
at com.ibm.ws.security.context.ContextImpl.readObject(ContextImpl.java:1116)
at sun.reflect.GeneratedMethodAccessor413.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:600)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1030)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1855)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1759)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1335)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:354)
at com.ibm.ws.websvcs.utils.SecurityContextMigrator$6.run(SecurityContextMigrator.java:389)
at java.security.AccessController.doPrivileged(AccessController.java:251)
at com.ibm.ws.websvcs.utils.SecurityContextMigrator.migrateThreadToContext(SecurityContextMigrator.java:386)
at org.apache.axis2.util.ThreadContextMigratorUtil.performMigrationToContext(ThreadContextMigratorUtil.java:163)
at org.apache.axis2.jaxws.core.controller.impl.AxisInvocationController.preExecute(AxisInvocationController.java:608)
at org.apache.axis2.jaxws.core.controller.impl.AxisInvocationController.execute(AxisInvocationController.java:570)
... 78 more
Caused by: com.ibm.websphere.security.WSSecurityException: org.ietf.jgss.GSSException, major code: 8, minor code: 0
major string: Credential expired
minor string: GSSCredential expired, must login again.
at com.ibm.ws.security.context.ContextImpl.doLogin(ContextImpl.java:781)
at com.ibm.ws.security.context.ContextImpl.deserializeSubjects(ContextImpl.java:1144)
at com.ibm.ws.security.context.ContextImpl.readObject(ContextImpl.java:1110)
... 92 more

The current credential has expired, or the keytab it was created from is no longer valid. Export a new keytab for your server.
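If the keytab has gone stale, re-exporting it and confirming it can still obtain a ticket is usually enough. A rough sketch with MIT Kerberos tools (the principal and paths are placeholders; on Active Directory you would generate the keytab with ktpass instead):

# Re-export the service keytab on the KDC; ktadd issues new keys, so replace every copy of the keytab.
kadmin -q "ktadd -k /opt/keytabs/was-server.keytab HTTP/appserver.example.com@EXAMPLE.COM"

# Verify the new keytab before pointing the server at it.
kinit -kt /opt/keytabs/was-server.keytab HTTP/appserver.example.com@EXAMPLE.COM
klist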

Related

wildfly 25 JSF Security

I'm fully aware that WildFly 25 has dropped the legacy security realms.
So I tried to move from WildFly 20.0.1 to WildFly 25.0.1.
Following the ee-security quickstart, I ran:
/subsystem=elytron/policy=jacc:add(jacc-policy={})
I also had to remove this value from my jboss-web.xml:
<security-domain>jaspitest</security-domain>
Otherwise I get:
{
"WFLYCTL0412: Required services that are not installed:" => ["jboss.security.security-domain.jaspitest"],
"WFLYCTL0180: Services with missing/unavailable dependencies" => [
"jboss.deployment.unit.\"unite_compte.war\".component.SocieteGestionSIXDAOImpl.CREATE is missing [jboss.security.security-domain.jaspitest]",
I also have my own IdentityStore.
When I try to access the site, the login page appears as expected. When I submit the credentials my IdentityStore is called and the validate(Credential) method returns a valid CredentialValidationResult.
Unfortunately, I then get an exception:
17:05:14,710 WARNING [javax.enterprise.resource.webcontainer.jsf.lifecycle] (default task-3) #{loginView.submit}: java.lang.IllegalStateException: java.io.IOException: java.io.IOException: ELY01177: Authorization failed.: javax.faces.FacesException: #{loginView.submit}: java.lang.IllegalStateException: java.io.IOException: java.io.IOException: ELY01177: Authorization failed.
Caused by: java.io.IOException: ELY01177: Authorization failed.
at org.wildfly.security.jakarta.authentication#1.17.1.Final//org.wildfly.security.auth.jaspi.impl.JaspiAuthenticationContext$1.handleOne(JaspiAuthenticationContext.java:188)
at org.wildfly.security.jakarta.authentication#1.17.1.Final//org.wildfly.security.auth.jaspi.impl.JaspiAuthenticationContext$1.lambda$handle$0(JaspiAuthenticationContext.java:100)
at org.wildfly.security.jakarta.authentication#1.17.1.Final//org.wildfly.security.auth.jaspi.impl.SecurityActions.doPrivileged(SecurityActions.java:39)
at org.wildfly.security.jakarta.authentication#1.17.1.Final//org.wildfly.security.auth.jaspi.impl.JaspiAuthenticationContext$1.handle(JaspiAuthenticationContext.java:99)
What should I do to make it work?
As the quickstart says, you have to update the WildFly configuration as well. Specifically, you have to run the quickstart's configure-elytron.cli script.
More info: https://github.com/wildfly/quickstart/tree/main/ee-security#configure-the-server
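For reference, a minimal sketch of running that script against a local standalone server with the JBoss CLI, assuming the quickstart's default file name and an already started server:

$JBOSS_HOME/bin/jboss-cli.sh --connect --file=configure-elytron.cli

On Windows the equivalent launcher is jboss-cli.bat. The script's actual contents are documented at the link above.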

cucumber defaults publishing to some URL?

I am trying to generate a report using Cucumber-JVM 6.11.0. It works fine on my machine when I put these properties in junit-platform.properties:
cucumber.publish.enabled=true
cucumber.plugin=pretty, json:build/reports/cucumber/report.json
cucumber.junit-platform.naming-strategy=long
However, when I run it on Jenkins, I get a ConnectException during publication:
java.lang.RuntimeException: java.net.ConnectException: Connection timed out (Connection timed out)
at io.cucumber.core.plugin.MessageFormatter.writeMessage(MessageFormatter.java:36)
at io.cucumber.core.eventbus.AbstractEventPublisher.send(AbstractEventPublisher.java:51)
at io.cucumber.core.eventbus.AbstractEventBus.send(AbstractEventBus.java:12)
at io.cucumber.core.runtime.SynchronizedEventBus.send(SynchronizedEventBus.java:47)
at io.cucumber.core.runtime.CucumberExecutionContext.emitTestRunFinished(CucumberExecutionContext.java:102)
at io.cucumber.core.runtime.CucumberExecutionContext.finishTestRun(CucumberExecutionContext.java:74)
at io.cucumber.junit.platform.engine.CucumberEngineExecutionContext.finishTestRun(CucumberEngineExecutionContext.java:98)
at io.cucumber.junit.platform.engine.CucumberEngineDescriptor.after(CucumberEngineDescriptor.java:37)
at io.cucumber.junit.platform.engine.CucumberEngineDescriptor.after(CucumberEngineDescriptor.java:10)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:149)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:149)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
...
Caused by: java.net.ConnectException: Connection timed out (Connection timed out)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at java.base/sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1963)
at java.base/sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1958)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getChainedException(HttpURLConnection.java:1957)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1525)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1509)
at java.base/java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:527)
at java.base/sun.net.www.protocol.https.HttpsURLConnectionImpl.getResponseCode(HttpsURLConnectionImpl.java:329)
at io.cucumber.core.plugin.UrlOutputStream.getResponseBody(UrlOutputStream.java:111)
at io.cucumber.core.plugin.UrlOutputStream.sendRequest(UrlOutputStream.java:83)
I tried different combinations of properties, and the error starts happening the moment I enable publishing, even with only:
cucumber.publish.enabled=true
I can't find the default behavior described in the documentation. Once publishing is enabled, where does the report get published by default? Does it really try to upload it over HTTP? (I guess the proxy is not configured when running on Jenkins, while it is picked up on my machine, hence the different behavior.)
Why do I still get this error when I simply try to write the HTML or JSON report to disk?
When you enable report publishing, Cucumber uploads the test results to its cloud reporting service (reports.cucumber.io) and gives you a unique URL that you (or anyone you share the link with) can use to access the report.
The report self-destructs after 24 hours. You can find more details on the official Cucumber blog.
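If the goal is only to write the HTML or JSON report to disk, with no upload at all, a minimal junit-platform.properties sketch (the report paths are just examples) keeps publishing switched off explicitly:

cucumber.publish.enabled=false
cucumber.publish.quiet=true
cucumber.plugin=pretty, html:build/reports/cucumber/report.html, json:build/reports/cucumber/report.json

With publishing disabled, no connection to the reporting service is attempted, so the proxy problem on Jenkins should not come up.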

Knox SSO integration with Keycloak error - Required Subject Missing

I am working on integrating Knox with Keycloak via OIDC, for SSO and security in a Hadoop cluster.
I have configured everything, and when I access the Knox URL it redirects to the Keycloak URL. After the user authenticates successfully in Keycloak, it redirects back to the configured Knox URL.
But after that redirect, I get the error below:
2020-11-11 08:13:48,098 ERROR knox.gateway (CommonIdentityAssertionFilter.java:doFilter(79)) - Required subject/identity not available. Check authentication/federation provider for proper configuration.
2020-11-11 08:13:48,100 ERROR knox.gateway (AbstractGatewayFilter.java:doFilter(63)) - Failed to execute filter: java.lang.IllegalStateException: Required Subject Missing
2020-11-11 08:13:48,100 ERROR knox.gateway (GatewayFilter.java:doFilter(169)) - Gateway processing failed: javax.servlet.ServletException: java.lang.IllegalStateException: Required Subject Missing
javax.servlet.ServletException: java.lang.IllegalStateException: Required Subject Missing
at org.apache.knox.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:64)
at org.apache.knox.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:349)
at org.apache.knox.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:263)
at org.apache.knox.gateway.GatewayFilter.doFilter(GatewayFilter.java:167)
at org.apache.knox.gateway.GatewayServlet.doFilter(GatewayServlet.java:158)
..........
Caused by: java.lang.IllegalStateException: Required Subject Missing
at org.apache.knox.gateway.identityasserter.common.filter.CommonIdentityAssertionFilter.doFilter(CommonIdentityAssertionFilter.java:80)
at org.apache.knox.gateway.GatewayFilter$Holder.doFilter(GatewayFilter.java:349)
at org.apache.knox.gateway.GatewayFilter$Chain.doFilter(GatewayFilter.java:263)
at org.apache.knox.gateway.filter.XForwardedHeaderFilter.doFilter(XForwardedHeaderFilter.java:50)
at org.apache.knox.gateway.filter.AbstractGatewayFilter.doFilter(AbstractGatewayFilter.java:58)
... 48 more
Any suggestions would be very helpful.
Thanks,
Jithesh

Spark/Phoenix with Kerberos on YARN

I have a Spark (1.4.1) application that runs on a non-kerberized cluster, and I copied it to another cluster that has Kerberos enabled. The application takes data from HDFS and puts it into Phoenix.
However, it does not work:
ERROR ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:734)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:734)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:887)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:856)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1200)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:50918)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1564)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1502)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1524)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1553)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1704)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917)
at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:441)
at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:463)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:815)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1215)
at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:112)
at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1902)
at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:744)
at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:186)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:304)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:296)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:294)
at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1243)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1893)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1862)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1862)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:99)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:57)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:45)
at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
at com.bosch.asc.utils.HBaseUtils$.scanPhoenix(HBaseUtils.scala:123)
at com.bosch.asc.SMTProcess.addLookup(SMTProcess.scala:1125)
at com.bosch.asc.SMTProcess.saveMountTraceLogToPhoenix(SMTProcess.scala:1039)
at com.bosch.asc.SMTProcess.runETL(SMTProcess.scala:87)
at com.bosch.asc.SMTProcessMonitor$delayedInit$body.apply(SMTProcessMonitor.scala:20)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at com.bosch.asc.SMTProcessMonitor$.main(SMTProcessMonitor.scala:5)
at com.bosch.asc.SMTProcessMonitor.main(SMTProcessMonitor.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:486)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
... 70 more
I have added
export _JAVA_OPTIONS="-Djava.security.krb5.conf=/etc/hadoop/krb5.conf"
in my Spark submission script, but to no avail. Do I have to change the code itself to allow for authentication? I had previously assumed that the ticket is just shared between applications, and the code itself does not change.
In case it helps: in the shell I do not see a spark.authenticate option set when I execute:
sc.getConf.getAll.foreach(println)
See: http://spark.apache.org/docs/latest/security.html
I have very little experience with Kerberos, so any help is greatly appreciated.
Assuming that your cluster was properly kerberized, initialize your credentials with:
kinit -kt /path/to/keytab/file user/domain@REALM
I think the reason is that with Phoenix 4.4 the Phoenix/Spark library does not handle Kerberos principals and keytabs: https://issues.apache.org/jira/browse/PHOENIX-2817.
I tried to read data from an existing Phoenix table and got a "no suitable driver found" error; the JDBC connection string did not contain the keytab and principal (even though hbase-site.xml was correctly added and the HBase configuration I passed to Phoenix had these values), as described here: https://phoenix.apache.org/index.html#Connection.
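A workaround sketch, not taken from the original answers: log in from a keytab programmatically with Hadoop's UserGroupInformation before the first Phoenix/HBase call, so the client has valid Kerberos credentials even though the JDBC URL carries no principal. The principal and keytab path below are placeholders:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

// Hypothetical principal and keytab; replace with your own values.
val hadoopConf = new Configuration()
hadoopConf.set("hadoop.security.authentication", "kerberos")
UserGroupInformation.setConfiguration(hadoopConf)
UserGroupInformation.loginUserFromKeytab(
  "appuser@EXAMPLE.COM",
  "/etc/security/keytabs/appuser.keytab")
// Phoenix/HBase client calls made in this JVM after this point run as the logged-in user.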
I was facing the same issue. After a lot of trial and error I was able to fix it; please follow the link below for the answer and explanation:
Spark Streaming and Phoenix Kerberos issue

java.lang.SecurityException: Permission Denial

I am trying to write data into MySQL (WAMP server) using an Android app. It was working perfectly.
Now it shows the following error:
java.lang.SecurityException: Permission Denial: get/set setting for
user asks to run as user -2 but is calling from user 0; this
requires android.permission.INTERACT_ACROSS_USERS_FULL
at com.android.server.am.ActivityManagerService.handleIncomingUser(ActivityManagerService.java:15168)
at android.app.ActivityManager.handleIncomingUser(ActivityManager.java:2498)
at com.android.providers.settings.SettingsProvider.call(SettingsProvider.java:688)
at android.content.ContentProvider$Transport.call(ContentProvider.java:325)
at android.content.ContentProviderNative.onTransact(ContentProviderNative.java:275)
at android.os.Binder.execTransact(Binder.java:404)
at dalvik.system.NativeStart.run(Native Method)
I am using Android Studio on Windows 7, 32-bit.
Can you please help me solve this?
