Broadleaf admin error - broadleaf-commerce

I followed the developer documentation to host Broadleaf Commerce and the Broadleaf admin on standalone Tomcat, standalone Solr, and MySQL. I deployed ROOT.war and admin.war on Tomcat, and I can see the tables are created successfully in MySQL. The issue is that when I hit http://localhost:8080/admin/login, I get errors in the Tomcat logs and can't access the login page. I am unable to identify the real cause of the issue. The error I am getting is:
org.springframework.dao.InvalidDataAccessResourceUsageException:
Unknown column 'localeimpl0_.CREATED_BY' in 'field list'; SQL [n/a];
nested exception is org.hibernate.exception.SQLGrammarException:
Unknown column 'localeimpl0_.CREATED_BY' in 'field list'
at org.springframework.orm.jpa.vendor.HibernateJpaDialect.convertHibernateAccessException(HibernateJpaDialect.java:261) ~[spring-orm-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.orm.jpa.vendor.HibernateJpaDialect.translateExceptionIfPossible(HibernateJpaDialect.java:244) ~[spring-orm-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.translateExceptionIfPossible(AbstractEntityManagerFactoryBean.java:488) ~[spring-orm-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.dao.support.ChainedPersistenceExceptionTranslator.translateExceptionIfPossible(ChainedPersistenceExceptionTranslator.java:59) ~[spring-tx-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.dao.support.DataAccessUtils.translateIfNecessary(DataAccessUtils.java:213) ~[spring-tx-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:147) ~[spring-tx-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) ~[spring-aop-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:673) ~[spring-aop-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.broadleafcommerce.common.locale.dao.LocaleDaoImpl$$EnhancerBySpringCGLIB$$ca72ef81.findDefaultLocale(<generated>) ~[broadleaf-common-5.2.2-GA.jar:na]
at org.broadleafcommerce.common.locale.service.LocaleServiceImpl.findDefaultLocale(LocaleServiceImpl.java:47) ~[broadleaf-common-5.2.2-GA.jar:na]
at org.broadleafcommerce.common.locale.service.LocaleServiceImpl$$FastClassBySpringCGLIB$$e9131ff4.invoke(<generated>) ~[broadleaf-common-5.2.2-GA.jar:na]
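The missing column is one of Broadleaf's audit fields (CREATED_BY, DATE_CREATED, ...) on the BLC_LOCALE table, which suggests the schema in MySQL was generated from an older entity model than the broadleaf-common 5.2.2 jar now deployed. As a sketch of one way to reconcile this in a development environment, assuming the standard blPU persistence-unit property naming from Broadleaf's starter configuration (verify against your own properties files):
# development only: let Hibernate add columns the entities expect but the
# schema lacks (e.g. BLC_LOCALE.CREATED_BY); never use this in production
blPU.hibernate.hbm2ddl.auto=update
For a production database, the safer route is to run the Broadleaf schema migration scripts for your version, or to drop and recreate the schema if the data is disposable.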

Related

WildFly 25 JSF Security

I'm fully aware that WildFly 25 has dropped the legacy security realms, so I tried to move from WildFly 20.0.1 to WildFly 25.0.1.
Following the ee-security quickstart, I ran:
/subsystem=elytron/policy=jacc:add(jacc-policy={})
I also had to remove the following value from my jboss-web.xml:
<security-domain>jaspitest</security-domain>
Otherwise I get:
{
"WFLYCTL0412: Required services that are not installed:" => ["jboss.security.security-domain.jaspitest"],
"WFLYCTL0180: Services with missing/unavailable dependencies" => [
"jboss.deployment.unit.\"unite_compte.war\".component.SocieteGestionSIXDAOImpl.CREATE is missing [jboss.security.security-domain.jaspitest]",
I also have my own IdentityStore.
When I try to access the site, the login page appears as expected. When I submit the credentials, my IdentityStore is called and its validate(Credential) method returns a valid CredentialValidationResult.
Unfortunately, I still get an exception:
17:05:14,710 WARNING [javax.enterprise.resource.webcontainer.jsf.lifecycle] (default task-3) #{loginView.submit}: java.lang.IllegalStateException: java.io.IOException: java.io.IOException: ELY01177: Authorization failed.: javax.faces.FacesException: #{loginView.submit}: java.lang.IllegalStateException: java.io.IOException: java.io.IOException: ELY01177: Authorization failed.
Caused by: java.io.IOException: ELY01177: Authorization failed.
at org.wildfly.security.jakarta.authentication#1.17.1.Final//org.wildfly.security.auth.jaspi.impl.JaspiAuthenticationContext$1.handleOne(JaspiAuthenticationContext.java:188)
at org.wildfly.security.jakarta.authentication#1.17.1.Final//org.wildfly.security.auth.jaspi.impl.JaspiAuthenticationContext$1.lambda$handle$0(JaspiAuthenticationContext.java:100)
at org.wildfly.security.jakarta.authentication#1.17.1.Final//org.wildfly.security.auth.jaspi.impl.SecurityActions.doPrivileged(SecurityActions.java:39)
at org.wildfly.security.jakarta.authentication#1.17.1.Final//org.wildfly.security.auth.jaspi.impl.JaspiAuthenticationContext$1.handle(JaspiAuthenticationContext.java:99)
What should I do to make it work?
As the quickstart says, you have to update the WildFly configuration as well. Specifically, you have to run the quickstart's configure-elytron.cli script.
More info: https://github.com/wildfly/quickstart/tree/main/ee-security#configure-the-server
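For reference, a sketch of applying that script with the JBoss CLI against a running server (the paths below are placeholders for wherever WildFly and the quickstart are unpacked):
# run from the WildFly installation directory while the server is up
./bin/jboss-cli.sh --connect --file=/path/to/quickstart/ee-security/configure-elytron.cli
Without the server-side Elytron configuration that the script applies, authorization can fail with ELY01177 even though your IdentityStore has already returned a valid CredentialValidationResult, which matches the stack trace above.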

Cucumber defaults publishing to some URL?

I am trying to generate a report using Cucumber-JVM 6.11.0, and it works fine on my machine when I put these properties in junit-platform.properties:
cucumber.publish.enabled=true
cucumber.plugin=pretty, json:build/reports/cucumber/report.json
cucumber.junit-platform.naming-strategy=long
However, when I run it on Jenkins, I get a ConnectException during publication:
java.lang.RuntimeException: java.net.ConnectException: Connection timed out (Connection timed out)
at io.cucumber.core.plugin.MessageFormatter.writeMessage(MessageFormatter.java:36)
at io.cucumber.core.eventbus.AbstractEventPublisher.send(AbstractEventPublisher.java:51)
at io.cucumber.core.eventbus.AbstractEventBus.send(AbstractEventBus.java:12)
at io.cucumber.core.runtime.SynchronizedEventBus.send(SynchronizedEventBus.java:47)
at io.cucumber.core.runtime.CucumberExecutionContext.emitTestRunFinished(CucumberExecutionContext.java:102)
at io.cucumber.core.runtime.CucumberExecutionContext.finishTestRun(CucumberExecutionContext.java:74)
at io.cucumber.junit.platform.engine.CucumberEngineExecutionContext.finishTestRun(CucumberEngineExecutionContext.java:98)
at io.cucumber.junit.platform.engine.CucumberEngineDescriptor.after(CucumberEngineDescriptor.java:37)
at io.cucumber.junit.platform.engine.CucumberEngineDescriptor.after(CucumberEngineDescriptor.java:10)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:149)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:149)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
...
Caused by: java.net.ConnectException: Connection timed out (Connection timed out)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at java.base/sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1963)
at java.base/sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1958)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getChainedException(HttpURLConnection.java:1957)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1525)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1509)
at java.base/java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:527)
at java.base/sun.net.www.protocol.https.HttpsURLConnectionImpl.getResponseCode(HttpsURLConnectionImpl.java:329)
at io.cucumber.core.plugin.UrlOutputStream.getResponseBody(UrlOutputStream.java:111)
at io.cucumber.core.plugin.UrlOutputStream.sendRequest(UrlOutputStream.java:83)
I tried different combinations of properties, and the failure starts the moment I enable publishing, even with only:
cucumber.publish.enabled=true
I cannot find the default behavior documented: once publishing is enabled, where does the report get published by default? Does it really try to upload it over HTTP? (I guess the proxy is not configured when running on Jenkins, while it is found when running on my machine, hence the different behavior.)
How come I still get this error when I simply try to write the HTML or JSON report to disk?
When you enable report publishing, Cucumber uploads the test results to the hosted reports service at https://reports.cucumber.io and gives you a unique URL that you (or anyone you share that link with) can use to access your report.
The report self-destructs after 24 hours. You can find more details on the official Cucumber blog.
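If the goal is only local HTML/JSON files with no upload attempt, a minimal junit-platform.properties sketch (property names as documented for Cucumber-JVM 6.x; the output paths are examples):
# write reports to disk only; never contact the hosted reports service
cucumber.publish.enabled=false
# optionally suppress the banner that advertises report publishing
cucumber.publish.quiet=true
cucumber.plugin=pretty, json:build/reports/cucumber/report.json, html:build/reports/cucumber/report.html
On a Jenkins agent without direct internet access, leaving publishing enabled is exactly what produces the ConnectException shown above.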

Liferay: GA4 on an empty MySQL 5.7 results in a fatal exception

I am trying to install Liferay GA4 and a master build for development purposes. However, I keep hitting a fatal exception with MySQL 5.7, as described at https://issues.liferay.com/browse/LPS-73410.
With an empty MySQL 5.7 database, the following exception is raised when the server is brought up (seen with both the com.mysql.jdbc.Driver and com.mysql.cj.jdbc.Driver drivers):
liferay | 21:45:35,927 ERROR [localhost-startStop-1][MainServlet:275] com.liferay.portal.kernel.events.ActionException: com.liferay.portal.verify.VerifyException: com.liferay.portal.verify.VerifyException: java.sql.SQLSyntaxErrorException: Table 'XXXXX.EVENTS' doesn't exist
liferay | com.liferay.portal.kernel.events.ActionException: com.liferay.portal.verify.VerifyException: com.liferay.portal.verify.VerifyException: java.sql.SQLSyntaxErrorException: Table 'XXXXX.EVENTS' doesn't exist
I was wondering if this is something I can get around with some procedure done directly in the database. Any thoughts?
I found my way out of this issue with new JDBC defaults:
jdbc.default.driverClassName=com.mysql.cj.jdbc.Driver
jdbc.default.url=jdbc:mysql://${database.host}/${database.schema}?useUnicode=true&characterEncoding=UTF-8&useFastDateParsing=false&useSSL=false&nullNamePatternMatchesAll=true&nullCatalogMeansCurrent=true
(The metadata-related flags nullNamePatternMatchesAll and nullCatalogMeansCurrent appear to be the key part: their defaults changed in newer Connector/J releases, so DatabaseMetaData lookups stop resolving against the current schema and Liferay's verify step cannot find tables it just created.)
From: https://www.e-systems.tech/web/guest/blog/-/blogs/liferay-with-mysql-5-7-driver-changes
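Putting it together, a minimal portal-ext.properties sketch with the placeholders substituted (host, schema, and credentials below are examples; the Connector/J 8 jar must already be on the application server's classpath):
# portal-ext.properties in the Liferay home directory
jdbc.default.driverClassName=com.mysql.cj.jdbc.Driver
jdbc.default.url=jdbc:mysql://localhost/lportal?useUnicode=true&characterEncoding=UTF-8&useFastDateParsing=false&useSSL=false&nullNamePatternMatchesAll=true&nullCatalogMeansCurrent=true
jdbc.default.username=liferay
jdbc.default.password=liferay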

Alfresco CMIS unauthorized

I'm doing an integration of Liferay and Alfresco; my goal is to use Alfresco 5.2 to store content created in Liferay DXP.
To do that, I have added this line to portal-ext.properties:
dl.store.impl=com.liferay.portal.store.cmis.CMISStore
and added the config file com.liferay.portal.store.cmis.configuration.CMISStoreConfiguration.cfg with this content:
repositoryUrl=http://localhost:8080/alfresco/api/-default-/public/cmis/versions/1.1/atom
credentialsUsername=admin
credentialsPassword=password
systemRootDir=Liferay
Everything worked fine, but suddenly I'm getting this error while rebooting Liferay:
21:19:37,536 ERROR [localhost-startStop-1][com_liferay_portal_store_cmis:97] [com.liferay.portal.store.cmis.CMISStore(1549)] The activate method has thrown an exception
org.apache.chemistry.opencmis.commons.exceptions.CmisUnauthorizedException: Unauthorized
at org.apache.chemistry.opencmis.client.bindings.spi.atompub.AbstractAtomPubService.convertStatusCode(AbstractAtomPubService.java:477)
at org.apache.chemistry.opencmis.client.bindings.spi.atompub.AbstractAtomPubService.read(AbstractAtomPubService.java:645)
at org.apache.chemistry.opencmis.client.bindings.spi.atompub.AbstractAtomPubService.getRepositoriesInternal(AbstractAtomPubService.java:808)
at org.apache.chemistry.opencmis.client.bindings.spi.atompub.RepositoryServiceImpl.getRepositoryInfos(RepositoryServiceImpl.java:65)
at org.apache.chemistry.opencmis.client.bindings.impl.RepositoryServiceImpl.getRepositoryInfos(RepositoryServiceImpl.java:90)
at org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl.getRepositories(SessionFactoryImpl.java:135)
at org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl.getRepositories(SessionFactoryImpl.java:112)
at com.liferay.portal.store.cmis.CMISStore.createSession(CMISStore.java:591)
at com.liferay.portal.store.cmis.CMISStore.activate(CMISStore.java:475)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.felix.scr.impl.inject.BaseMethod.invokeMethod(BaseMethod.java:224)
at org.apache.felix.scr.impl.inject.BaseMethod.access$500(BaseMethod.java:39)
at org.apache.felix.scr.impl.inject.BaseMethod$Resolved.invoke(BaseMethod.java:617)
at org.apache.felix.scr.impl.inject.BaseMethod.invoke(BaseMethod.java:501)
at org.apache.felix.scr.impl.inject.ActivateMethod.invoke(ActivateMethod.java:302)
at org.apache.felix.scr.impl.inject.ActivateMethod.invoke(ActivateMethod.java:294)
at org.apache.felix.scr.impl.manager.SingleComponentManager.createImplementationObject(SingleComponentManager.java:297)
at org.apache.felix.scr.impl.manager.SingleComponentManager.createComponent(SingleComponentManager.java:108)
at org.apache.felix.scr.impl.manager.SingleComponentManager.getService(SingleComponentManager.java:906)
at org.apache.felix.scr.impl.manager.SingleComponentManager.getServiceInternal(SingleComponentManager.java:879)
at org.apache.felix.scr.impl.manager.SingleComponentManager.getService(SingleComponentManager.java:823)
at org.eclipse.osgi.internal.serviceregistry.ServiceFactoryUse$1.run(ServiceFactoryUse.java:212)
at java.security.AccessController.doPrivileged(Native Method)
What does unauthorized mean in this context?
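In this context, CmisUnauthorizedException means Alfresco answered the AtomPub getRepositories call (made while CMISStore activates) with HTTP 401: the configured credentials were rejected. A quick way to reproduce the same check outside Liferay, assuming the URL and credentials from the .cfg file above:
# expect the AtomPub service document on success, an HTTP 401 on bad credentials
curl -u admin:password "http://localhost:8080/alfresco/api/-default-/public/cmis/versions/1.1/atom"
If this also returns 401, the problem is on the Alfresco side (changed password, locked account, or a modified authentication chain) rather than in the Liferay configuration.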

Spark/Phoenix with Kerberos on YARN

I have a Spark (1.4.1) application that runs on a non-kerberized cluster and I copied it to another instance that has Kerberos running. The application takes data from HDFS and puts it into Phoenix.
However, it does not work:
ERROR ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:734)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:734)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:887)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:856)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1200)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:50918)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1564)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1502)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1524)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1553)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1704)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917)
at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:441)
at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:463)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:815)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1215)
at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:112)
at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1902)
at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:744)
at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:186)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:304)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:296)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:294)
at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1243)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1893)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1862)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1862)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:99)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:57)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:45)
at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
at com.bosch.asc.utils.HBaseUtils$.scanPhoenix(HBaseUtils.scala:123)
at com.bosch.asc.SMTProcess.addLookup(SMTProcess.scala:1125)
at com.bosch.asc.SMTProcess.saveMountTraceLogToPhoenix(SMTProcess.scala:1039)
at com.bosch.asc.SMTProcess.runETL(SMTProcess.scala:87)
at com.bosch.asc.SMTProcessMonitor$delayedInit$body.apply(SMTProcessMonitor.scala:20)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at com.bosch.asc.SMTProcessMonitor$.main(SMTProcessMonitor.scala:5)
at com.bosch.asc.SMTProcessMonitor.main(SMTProcessMonitor.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:486)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
... 70 more
I have added
export _JAVA_OPTIONS="-Djava.security.krb5.conf=/etc/hadoop/krb5.conf"
in my Spark submission script, but to no avail. Do I have to change the code itself to allow for authentication? I had previously assumed that the ticket is just shared between applications, and the code itself does not change.
In case it helps: in the shell I do not see a spark.authenticate option set when I execute:
sc.getConf.getAll.foreach(println)
See: http://spark.apache.org/docs/latest/security.html
I have very little experience with Kerberos, so any help is greatly appreciated.
Assuming that your cluster was properly kerberized, initialize your credentials with:
kinit -kt /path/to/keytab/file user/domain@REALM
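Note that on YARN a ticket obtained with kinit lives in the submitter's local credential cache and is not automatically shipped to the application master and executors. For long-running jobs, spark-submit can instead log in from a keytab and renew tickets itself; a sketch, assuming the YARN Kerberos options are available in your Spark build (worth verifying for 1.4.1):
# let Spark log in from the keytab and handle ticket renewal on YARN
# (principal, keytab path, class, and jar are placeholders)
spark-submit --master yarn-cluster \
  --principal user@EXAMPLE.COM \
  --keytab /path/to/user.keytab \
  --class com.example.MyApp my-app.jar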
I think the reason is that in Phoenix 4.4 the Phoenix/Spark library does not handle Kerberos principals and keytabs: https://issues.apache.org/jira/browse/PHOENIX-2817.
I tried to read data from an existing Phoenix table and got a "no suitable driver found" error; the JDBC connection string did not contain the keytab and principal (even though hbase-site.xml was correctly added and the HBase configuration I passed to Phoenix had these values), as described here: https://phoenix.apache.org/index.html#Connection.
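For comparison, later Phoenix releases accept the principal and keytab directly in the JDBC URL, following the documented secure-connection shape (quorum host, znode, and paths below are placeholders):
# jdbc:phoenix:<zookeeper quorum>:<port>:<hbase znode>:<principal>:<keytab>
jdbc:phoenix:zk1.example.com:2181:/hbase-secure:user@EXAMPLE.COM:/path/to/user.keytab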
I was facing the same issue. After a lot of trial and error I was able to fix it; please follow the link below for the answer and explanation:
Spark Streaming and Phoenix Kerberos issue
