Migrating HSQL to MySQL but 404 Error on site - broadleaf-commerce

I have a problem migrating from HSQL to MySQL. I first ran the site with ant tomcat using HSQL, and then wanted to switch to MySQL. I followed all the steps in this documentation: http://www.broadleafcommerce.com/docs/c ... l-tutorial
But when I access the site it gives a 404 error, and when I check the database, all tables are created but some of them have no data.
Here is part of the log:
[artifact:mvn] [ERROR] 15:34:02 SchemaExport - HHH000231: Schema export unsuccessful
[artifact:mvn] org.hibernate.tool.hbm2ddl.ImportScriptException: Error during statement execution (file: '/sql/load_i18n_countries.sql'): INSERT INTO BLC_ISO_COUNTRY (ALPHA_2, NAME, ALPHA_3, NUMERIC_CODE, STATUS) VALUES ('AX', 'Åland Islands', 'ALA', '248', 'OFFICIALLY_ASSIGNED')
[artifact:mvn] at org.hibernate.tool.hbm2ddl.SchemaExport.importScript(SchemaExport.java:451)
[artifact:mvn] at org.hibernate.tool.hbm2ddl.SchemaExport.execute(SchemaExport.java:378)
[artifact:mvn] at org.hibernate.tool.hbm2ddl.SchemaExport.create(SchemaExport.java:304)
[artifact:mvn] at org.hibernate.tool.hbm2ddl.SchemaExport.create(SchemaExport.java:293)
[artifact:mvn] at org.hibernate.internal.SessionFactoryImpl.<init>(SessionFactoryImpl.java:498)
[artifact:mvn] at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1742)
[artifact:mvn] at org.hibernate.ejb.EntityManagerFactoryImpl.<init>(EntityManagerFactoryImpl.java:94)
[artifact:mvn] at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:905)
[artifact:mvn] at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:890)
[artifact:mvn] at org.springframework.orm.jpa.vendor.SpringHibernateEjbPersistenceProvider.createContainerEntityManagerFactory(SpringHibernateEjbPersistenceProvider.java:51)
Please help me. Thank you.

This is likely due to MySQL not running in UTF-8 mode. See this link in the Broadleaf docs: http://www.broadleafcommerce.com/docs/core/current/broadleaf-concepts/key-aspects-and-configuration/database-configuration/mysql.
Basically, ensure that your my.cnf configuration tells MySQL to run with the following:
[mysqld]
lower_case_table_names=1
character-set-server=utf8
collation-server=utf8_general_ci
You will need to restart MySQL for this to take effect. After you do this, you may also need to create a new schema that defaults to UTF-8, since a schema created before the change keeps its old character-set defaults. I'm not sure whether you can cleanly convert an existing schema to UTF-8; it's probably easier to just create a new one.
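To sanity-check the result, here is a minimal sketch in Python (assuming the pymysql package; the credentials and the broadleaf schema name are placeholders, not values from the original post) that verifies the server settings and creates a fresh UTF-8 schema:

import pymysql

# Placeholder credentials; substitute your own.
conn = pymysql.connect(host="localhost", user="root", password="secret")
with conn.cursor() as cur:
    # Confirm the my.cnf settings took effect after the restart.
    cur.execute("SHOW VARIABLES LIKE 'character_set_server'")
    print(cur.fetchone())  # expect ('character_set_server', 'utf8')
    cur.execute("SHOW VARIABLES LIKE 'collation_server'")
    print(cur.fetchone())  # expect ('collation_server', 'utf8_general_ci')
    # Create a new schema whose default character set is UTF-8;
    # 'broadleaf' is a hypothetical name.
    cur.execute("CREATE DATABASE broadleaf CHARACTER SET utf8 COLLATE utf8_general_ci")
conn.close()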

Related

Google Cloud Spanner not implementing check constraint

When trying to create a table with gcloud CLI (using the spanner emulator)
gcloud spanner databases ddl update db_name --instance=instance_name --ddl=$data
I'm attempting to add a check constraint where $data evaluates to:
CREATE TABLE my_table (
my_key STRING(36),
some_column STRING(100),
CONSTRAINT ck_my_table_some_column_enum CHECK (some_column = 'this' OR some_column = 'that'),
) PRIMARY KEY (my_key)
and keep getting the following error:
ERROR: (gcloud.spanner.databases.ddl.update) HttpError accessing <http://localhost:9020/v1/projects/spanner-emulator-test/instances/instance_name/databases/db_name/ddl?alt=json>: response: <{'content-type': 'application/json', 'trailer': 'Grpc-Trailer-Content-Type', 'date': 'Tue, 31 Aug 2021 22:53:52 GMT', 'transfer-encoding': 'chunked', 'status': 501}>, content <{"error":"Check Constraint is not implemented.","code":12,"message":"Check Constraint is not implemented."}>
First inclination might be to verify the emulator is running given the last part of the message. It is. And I've tried immediately executing the same DDL without the check constraint and it works perfectly - table is created.
DBeaver
I've also tried creating this table using the DBeaver db tool and get the following error:
org.jkiss.dbeaver.model.sql.DBSQLException: SQL Error [12]: UNIMPLEMENTED: io.grpc.StatusRuntimeException: UNIMPLEMENTED: Check Constraint is not implemented.
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:133)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:513)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$0(SQLQueryJob.java:444)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:171)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:431)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:816)
at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsContainer.readData(SQLEditor.java:3440)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.lambda$0(ResultSetJobDataRead.java:118)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:171)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.run(ResultSetJobDataRead.java:116)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetViewer$ResultSetDataPumpJob.run(ResultSetViewer.java:4718)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:105)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Caused by: com.google.cloud.spanner.jdbc.JdbcSqlExceptionFactory$JdbcSqlExceptionImpl: UNIMPLEMENTED: io.grpc.StatusRuntimeException: UNIMPLEMENTED: Check Constraint is not implemented.
at com.google.cloud.spanner.jdbc.JdbcSqlExceptionFactory.of(JdbcSqlExceptionFactory.java:222)
at com.google.cloud.spanner.jdbc.AbstractJdbcStatement.execute(AbstractJdbcStatement.java:259)
at com.google.cloud.spanner.jdbc.JdbcStatement.executeStatement(JdbcStatement.java:110)
at com.google.cloud.spanner.jdbc.JdbcStatement.execute(JdbcStatement.java:106)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.execute(JDBCStatementImpl.java:330)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:130)
... 12 more
Caused by: com.google.cloud.spanner.SpannerException: UNIMPLEMENTED: io.grpc.StatusRuntimeException: UNIMPLEMENTED: Check Constraint is not implemented.
at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerExceptionPreformatted(SpannerExceptionFactory.java:284)
at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:61)
at com.google.cloud.spanner.SpannerExceptionFactory.fromApiException(SpannerExceptionFactory.java:299)
at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:174)
at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:110)
at com.google.cloud.spanner.DatabaseAdminClientImpl.lambda$updateDatabaseDdl$7(DatabaseAdminClientImpl.java:337)
at com.google.api.core.ApiFutures$ApiFunctionToGuavaFunction.apply(ApiFutures.java:240)
at com.google.common.util.concurrent.AbstractCatchingFuture$CatchingFuture.doFallback(AbstractCatchingFuture.java:223)
at com.google.common.util.concurrent.AbstractCatchingFuture$CatchingFuture.doFallback(AbstractCatchingFuture.java:211)
at com.google.common.util.concurrent.AbstractCatchingFuture.run(AbstractCatchingFuture.java:124)
at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30)
at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1213)
at com.google.common.util.concurrent.AbstractFuture.addListener(AbstractFuture.java:724)
at com.google.common.util.concurrent.FluentFuture$TrustedFuture.addListener(FluentFuture.java:110)
at com.google.common.util.concurrent.ForwardingListenableFuture.addListener(ForwardingListenableFuture.java:45)
at com.google.api.core.ApiFutureToListenableFuture.addListener(ApiFutureToListenableFuture.java:52)
at com.google.common.util.concurrent.AbstractCatchingFuture.create(AbstractCatchingFuture.java:41)
at com.google.common.util.concurrent.Futures.catching(Futures.java:297)
at com.google.api.core.ApiFutures.catching(ApiFutures.java:99)
at com.google.api.gax.longrunning.OperationFutureImpl.<init>(OperationFutureImpl.java:97)
at com.google.cloud.spanner.DatabaseAdminClientImpl.updateDatabaseDdl(DatabaseAdminClientImpl.java:335)
at com.google.cloud.spanner.connection.DdlClient.executeDdl(DdlClient.java:89)
at com.google.cloud.spanner.connection.DdlClient.executeDdl(DdlClient.java:84)
at com.google.cloud.spanner.connection.SingleUseTransaction.lambda$executeDdlAsync$1(SingleUseTransaction.java:272)
at io.grpc.Context$2.call(Context.java:596)
at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)
at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Suppressed: com.google.cloud.spanner.connection.AbstractBaseUnitOfWork$SpannerAsyncExecutionException: Execution failed for statement: CREATE TABLE my_table (
my_key STRING(36),
some_column STRING(100),
CONSTRAINT ck_my_table_some_column_enum CHECK (some_column = 'this' OR some_column = 'that'),
) PRIMARY KEY (my_key)
at com.google.cloud.spanner.connection.AbstractBaseUnitOfWork.executeStatementAsync(AbstractBaseUnitOfWork.java:233)
at com.google.cloud.spanner.connection.AbstractBaseUnitOfWork.executeStatementAsync(AbstractBaseUnitOfWork.java:145)
at com.google.cloud.spanner.connection.SingleUseTransaction.executeDdlAsync(SingleUseTransaction.java:281)
at com.google.cloud.spanner.connection.ConnectionImpl.executeDdlAsync(ConnectionImpl.java:1157)
at com.google.cloud.spanner.connection.ConnectionImpl.execute(ConnectionImpl.java:803)
at com.google.cloud.spanner.jdbc.AbstractJdbcStatement.execute(AbstractJdbcStatement.java:250)
at com.google.cloud.spanner.jdbc.JdbcStatement.executeStatement(JdbcStatement.java:110)
at com.google.cloud.spanner.jdbc.JdbcStatement.execute(JdbcStatement.java:106)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.execute(JDBCStatementImpl.java:330)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:130)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:513)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$0(SQLQueryJob.java:444)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:171)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:431)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:816)
at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsContainer.readData(SQLEditor.java:3440)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.lambda$0(ResultSetJobDataRead.java:118)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:171)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.run(ResultSetJobDataRead.java:116)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetViewer$ResultSetDataPumpJob.run(ResultSetViewer.java:4718)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:105)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Caused by: io.grpc.StatusRuntimeException: UNIMPLEMENTED: Check Constraint is not implemented.
at io.grpc.Status.asRuntimeException(Status.java:535)
at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:533)
at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
at com.google.cloud.spanner.spi.v1.SpannerErrorInterceptor$1$1.onClose(SpannerErrorInterceptor.java:100)
at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:557)
at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:69)
at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInternal(ClientCallImpl.java:738)
at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:717)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:133)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
... 3 more
Check constraints are available in the latest version of the emulator. Can you please check and get back to us if there are issues?
https://console.cloud.google.com/gcr/images/cloud-spanner-emulator/global/emulator#sha256:ef0fd2ec74bb17b6c31e5010fb699fd0009b9829721d5159e6a11b6a40f881f1/details
Well... check constraints still don't seem to work with my emulator. I tried the same DDL against a test db in the console and it worked. If anyone comes up with a solution to this, however, or if there's something I need to configure, please let me know! Thank you.
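If you are stuck on an emulator build without check-constraint support, one hypothetical stopgap (sketched below in Python; it assumes each constraint sits on its own line of the DDL, as in the statement above) is to strip CHECK constraints before submitting to the emulator and rely on the real service to enforce them:

import re

ddl = """CREATE TABLE my_table (
my_key STRING(36),
some_column STRING(100),
CONSTRAINT ck_my_table_some_column_enum CHECK (some_column = 'this' OR some_column = 'that'),
) PRIMARY KEY (my_key)"""

# Drop any line that declares a named CHECK constraint.
emulator_ddl = re.sub(r"(?m)^\s*CONSTRAINT\s+\w+\s+CHECK\s*\(.*\),?\s*\n", "", ddl)
print(emulator_ddl)  # same table, minus the check constraint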

How to execute Snowflake Stored Procedure from Python?

I have created a stored procedure in Snowflake which executes fine in the Snowflake UI and also from the server using SnowSQL. Now I want to execute the procedure from a Python program. Here are the steps I followed:
Established the connection to Snowflake (successfully able to connect).
cs = ctx.cursor()
Used the appropriate role, warehouse, database and schema.
Tried to execute the procedure like this:
cs.execute("call test_proc('value1', 'value2')")
x = cs.fetchall()
print(x)
But I am getting an error:
snowflake.connector.errors.ProgrammingError: 002140 (42601): SQL
compilation error: Unknown function test_proc
Can you please help me resolve this problem?
Thanks,
When connecting to Snowflake using the Python connector, you can specify the DATABASE/SCHEMA in the connect call:
conn = snowflake.connector.connect(
    user=USER,
    password=PASSWORD,
    account=ACCOUNT,
    warehouse=WAREHOUSE,
    database=DATABASE,
    schema=SCHEMA,
)
Once you have that set up, you can call your stored procedure without using the fully-qualified name:
cs.execute("call test_proc('value1', 'value2')")
An alternative is to select the context explicitly:
Using the Database, Schema, and Warehouse
Specify the database and schema in which you want to create tables, and the warehouse that will provide resources for executing DML statements and queries.
For example, to use the warehouse tiny_warehouse_mg, database testdb_mg, and schema testdb_mg.testschema_mg (created earlier):
conn.cursor().execute("USE WAREHOUSE tiny_warehouse_mg")
conn.cursor().execute("USE DATABASE testdb_mg")
conn.cursor().execute("USE SCHEMA testdb_mg.testschema_mg")
Actually, I had to call the procedure with its fully-qualified name, like this:
cs.execute("call yourdbname.schemaname.test_proc('value1', 'value2')")
and it is working as expected.
Thanks
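Putting the two answers together, a minimal end-to-end sketch (the credentials and the yourdbname.schemaname qualifier below are placeholders, not values from the original post) looks like this:

import snowflake.connector

ctx = snowflake.connector.connect(
    user="USER",          # placeholder credentials
    password="PASSWORD",
    account="ACCOUNT",
    warehouse="WAREHOUSE",
)
cs = ctx.cursor()
try:
    # Fully qualifying the procedure avoids the "Unknown function"
    # error when no database/schema is set on the connection.
    cs.execute("call yourdbname.schemaname.test_proc('value1', 'value2')")
    print(cs.fetchall())
finally:
    cs.close()
    ctx.close()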

AnalysisException when dropping a hive database using spark

Environment
spark 3.0.0
hive metastore (standalone) 3.0.0
mysql 8 as the metastore db
Problem
Every time I try to drop a database in the metastore via Spark, I get an AnalysisException, and I don't know what is causing it or whether the drop operation is succeeding in its entirety.
Example
spark.sql(f"CREATE DATABASE IF NOT EXISTS myDb LOCATION 'shared-metastore-location/myDb.db'")
################
# DB creation succeeds and I can view the db in the metastore, add tables etc
################
spark.sql(f"DROP DATABASE myDb CASCADE")
################
AnalysisException: 'org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to clean up java.sql.SQLException: The table does not comply with the requirements by an external plugin.
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:129)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
at com.mysql.cj.jdbc.StatementImpl.executeUpdateInternal(StatementImpl.java:1335)
at com.mysql.cj.jdbc.StatementImpl.executeLargeUpdate(StatementImpl.java:2108)
at com.mysql.cj.jdbc.StatementImpl.executeUpdate(StatementImpl.java:1245)
at com.zaxxer.hikari.pool.ProxyStatement.executeUpdate(ProxyStatement.java:117)
at com.zaxxer.hikari.pool.HikariProxyStatement.executeUpdate(HikariProxyStatement.java)
at org.apache.hadoop.hive.metastore.txn.TxnHandler.cleanupRecords(TxnHandler.java:2741)
at org.apache.hadoop.hive.metastore.AcidEventListener.onDropDatabase(AcidEventListener.java:52)
at org.apache.hadoop.hive.metastore.MetaStoreListenerNotifier$21.notify(MetaStoreListenerNotifier.java:85)
at org.apache.hadoop.hive.metastore.MetaStoreListenerNotifier.notifyEvent(MetaStoreListenerNotifier.java:264)
at org.apache.hadoop.hive.metastore.MetaStoreListenerNotifier.notifyEvent(MetaStoreListenerNotifier.java:326)
at org.apache.hadoop.hive.metastore.MetaStoreListenerNotifier.notifyEvent(MetaStoreListenerNotifier.java:364)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_database_core(HiveMetaStore.java:1537)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_database(HiveMetaStore.java:1575)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108)
at com.sun.proxy.$Proxy32.drop_database(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_database.getResult(ThriftHiveMetastore.java:14352)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_database.getResult(ThriftHiveMetastore.java:14336)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:111)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:107)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:119)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
);'
Despite the exception, the database does disappear after I run this code. And if I try to run the drop command a second time, I get a different exception saying that the database doesn't exist. But I have no idea whether the operation has succeeded in its entirety or whether it's leaving a mess behind. I'm not familiar enough with Hive to know what should be deleted in the metastore to completely delete a db.
I seem to get the same result whether I have tables in the db or not. I've also tried dropping the database without cascade. Same result.

Hive: create database fail with 'database already exists'

I have a test suite which runs several Spark unit tests. Each of these tests shares the same underlying Spark context.
While these tests run, I check whether a db exists and, if not, create it:
def dbExists(db: String) = spark.sql(s"show databases like '$db'").count > 0
if (!dbExists(db)) spark.sql(s"create database $db")
For some reason, one of the tests fails. Debugging, I saw that for a certain db, dbExists(db) returns false and the creation command fails with
ERROR RetryingHMSHandler:159 - AlreadyExistsException(message:Database db already exists)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database(HiveMetaStore.java:891)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
Every time a test starts, I clean the environment by running drop database db cascade for each db that is not the default one.
The only explanation I can come up with is that some corrupted metadata is left in the catalogue, so the metastore still thinks the db exists even though it is no longer listed.
The problem also happens inside a container with a fresh git clone of the project, so it cannot be a previous run of the application polluting the environment.
I run with Hive support enabled.
Try this:
You are right that it is important to check for the database's existence before creating it, but it is simpler to let Hive do the check itself with create database if not exists:
def dbExists(db: String) = spark.sql(s"show databases like '$db'").count > 0
spark.sql(s"create database if not exists $db")
This should work for you.
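In the same spirit, here is a sketch of an idempotent per-test setup in Python/PySpark (assuming a SparkSession named spark with Hive support enabled, as in the question; mydb is a hypothetical database name) that sidesteps stale catalogue state entirely, since both statements are no-ops when the state already matches:

db = "mydb"  # hypothetical test database name
# Both statements tolerate a stale or missing db in the metastore.
spark.sql(f"DROP DATABASE IF EXISTS {db} CASCADE")
spark.sql(f"CREATE DATABASE IF NOT EXISTS {db}")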

Presto connect to Netezza

Is there any way I can connect to a Netezza database using Presto? When I try to use the PostgreSQL connector against Netezza, I get the following error:
2018-08-08T13:42:06.124-0400 ERROR Query-20180808_174205_00000_qwr8d-165 org.postgresql.Driver Connection error:
org.postgresql.util.PSQLException: A connection could not be made using the requested protocol null.
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:57)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:194)
at org.postgresql.Driver.makeConnection(Driver.java:450)
at org.postgresql.Driver.connect(Driver.java:252)
at com.facebook.presto.plugin.jdbc.DriverConnectionFactory.openConnection(DriverConnectionFactory.java:59)
at com.facebook.presto.plugin.jdbc.BaseJdbcClient.getSchemaNames(BaseJdbcClient.java:119)
at com.facebook.presto.plugin.jdbc.JdbcMetadata.listSchemaNames(JdbcMetadata.java:72)
at com.facebook.presto.metadata.MetadataManager.listSchemaNames(MetadataManager.java:290)
at com.facebook.presto.connector.informationSchema.InformationSchemaMetadata.calculatePrefixesWithSchemaName(InformationSchemaMetadata.java:257)
at com.facebook.presto.connector.informationSchema.InformationSchemaMetadata.getTableLayouts(InformationSchemaMetadata.java:222)
at com.facebook.presto.metadata.MetadataManager.getLayouts(MetadataManager.java:350)
at com.facebook.presto.sql.planner.iterative.rule.PickTableLayout.planTableScan(PickTableLayout.java:203)
at com.facebook.presto.sql.planner.iterative.rule.PickTableLayout.access$200(PickTableLayout.java:61)
at com.facebook.presto.sql.planner.iterative.rule.PickTableLayout$PickTableLayoutWithoutPredicate.apply(PickTableLayout.java:186)
at com.facebook.presto.sql.planner.iterative.rule.PickTableLayout$PickTableLayoutWithoutPredicate.apply(PickTableLayout.java:153)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.transform(IterativeOptimizer.java:165)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreNode(IterativeOptimizer.java:138)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreGroup(IterativeOptimizer.java:103)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreChildren(IterativeOptimizer.java:185)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreGroup(IterativeOptimizer.java:105)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreChildren(IterativeOptimizer.java:185)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreGroup(IterativeOptimizer.java:105)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.optimize(IterativeOptimizer.java:94)
at com.facebook.presto.sql.planner.LogicalPlanner.plan(LogicalPlanner.java:140)
at com.facebook.presto.sql.planner.LogicalPlanner.plan(LogicalPlanner.java:129)
at com.facebook.presto.execution.SqlQueryExecution.doAnalyzeQuery(SqlQueryExecution.java:341)
at com.facebook.presto.execution.SqlQueryExecution.analyzeQuery(SqlQueryExecution.java:326)
at com.facebook.presto.execution.SqlQueryExecution.start(SqlQueryExecution.java:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
If I try to connect using the Netezza driver instead, with the following in etc/config.properties:
plugin.bundles=com.facebook.presto:presto-netezza:0.117
I get this error:
2018-08-08T13:15:37.716-0400 ERROR main com.facebook.presto.server.PrestoServer Could not resolve artifact: io.airlift.resolver.DefaultArtifact#558127d2
java.lang.RuntimeException: Could not resolve artifact: io.airlift.resolver.DefaultArtifact#558127d2
at com.facebook.presto.server.PluginManager.createClassLoader(PluginManager.java:284)
at com.facebook.presto.server.PluginManager.buildClassLoaderFromCoordinates(PluginManager.java:274)
at com.facebook.presto.server.PluginManager.buildClassLoader(PluginManager.java:239)
at com.facebook.presto.server.PluginManager.loadPlugin(PluginManager.java:154)
at com.facebook.presto.server.PluginManager.loadPlugins(PluginManager.java:142)
at com.facebook.presto.server.PrestoServer.run(PrestoServer.java:117)
at com.facebook.presto.server.PrestoServer.main(PrestoServer.java:67)
There is no (official) Presto plugin for Netezza yet (that I know of).
You can write one, or pay someone else to write one.
