What kind of SQL does IgniteRDD.sql support - apache-spark

It looks like IgniteRDD.sql only supports ANSI SQL, not Spark SQL or HiveQL.
When I call IgniteRDD.sql(sqlText) with a bad query, it throws an exception that traces to org.h2.jdbc.JdbcSQLException, which suggests the H2 parser is the one parsing the SQL.
Is my understanding correct?
Exception in thread "main" javax.cache.CacheException: Failed to parse query: select __VAL from Integer
at org.apache.ignite.internal.processors.query.h2.IgniteH2Indexing.queryTwoStep(IgniteH2Indexing.java:1137)
at org.apache.ignite.internal.processors.query.GridQueryProcessor$2.applyx(GridQueryProcessor.java:732)
at org.apache.ignite.internal.processors.query.GridQueryProcessor$2.applyx(GridQueryProcessor.java:730)
at org.apache.ignite.internal.util.lang.IgniteOutClosureX.apply(IgniteOutClosureX.java:36)
at org.apache.ignite.internal.processors.query.GridQueryProcessor.executeQuery(GridQueryProcessor.java:1666)
at org.apache.ignite.internal.processors.query.GridQueryProcessor.queryTwoStep(GridQueryProcessor.java:730)
at org.apache.ignite.internal.processors.cache.IgniteCacheProxy.query(IgniteCacheProxy.java:700)
at org.apache.ignite.spark.IgniteRDD.sql(IgniteRDD.scala:147)
at com.xyz.ignite.spark.IgniteSparkTest$.main(IgniteSparkTest.scala:33)
at com.xyz.ignite.spark.IgniteSparkTest.main(IgniteSparkTest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:88)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
at java.lang.reflect.Method.invoke(Method.java:613)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: org.h2.jdbc.JdbcSQLException: Column "__VAL" not found; SQL statement:
select __VAL from Integer [42122-191]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:345)
at org.h2.message.DbException.get(DbException.java:179)
at org.h2.message.DbException.get(DbException.java:155)
at org.h2.expression.ExpressionColumn.optimize(ExpressionColumn.java:147)
at org.h2.command.dml.Select.prepare(Select.java:852)
at org.h2.command.Parser.prepareCommand(Parser.java:257)

IgniteRDD.sql and objectSql do not run on top of Spark SQL; they call Ignite's own SQL engine, which is built on H2. That is why a malformed query surfaces as an org.h2.jdbc.JdbcSQLException. Ignite's SQL dialect is ANSI-based as implemented by H2; see the Ignite documentation for instructions on what is supported.
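For illustration, here is a minimal sketch of querying through Ignite's engine from Spark, assuming an existing SparkContext and an Ignite cache of Integer key/value pairs named "partitioned" (the cache name and configuration are illustrative, not from the question). Note that Ignite's H2 layer exposes an entry's key and value as the _key and _val columns (single underscore), which is why the select __VAL query above fails:

import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spark.IgniteContext

// Build an Ignite context over the existing SparkContext (default config for this sketch).
val ic = new IgniteContext(sparkContext, () => new IgniteConfiguration())
val cacheRdd = ic.fromCache[Integer, Integer]("partitioned")

// This SQL goes to Ignite's H2-based engine, not to Spark SQL or HiveQL.
val df = cacheRdd.sql("select _val from Integer where _val > 10")
df.show()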

Related

Databricks: I met with an issue when I was trying to use autoloader to read json files from Azure ADLS Gen2

I met with an issue when I was trying to use Auto Loader to read JSON files from Azure ADLS Gen2. I am getting this issue for specific files only. I checked the files; they are good and not corrupted.
Following is the issue:
com.databricks.sql.io.FileReadException: Error while reading file /mnt/Source/kafka/customer_raw/filtered_data/year=2022/month=11/day=9/hour=15/part-00000-31413bcf-0a8f-480f-8d45-6970f4c4c9f7.c000.json.
at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1$$anon$2.logFileNameAndThrow(FileScanRDD.scala:598)
at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.hasNext(FileScanRDD.scala:422)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(null:-1)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:759)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.IllegalArgumentException: requirement failed: Literal must have a corresponding value to string, but class Integer found.
at scala.Predef$.require(Predef.scala:281)
at org.apache.spark.sql.catalyst.expressions.Literal$.validateLiteralValue(literals.scala:274)
...
I am using a Delta Live Tables pipeline. Here is the code:
@dlt.table(
    name=tablename,
    comment="Create Bronze Table",
    table_properties={"quality": "bronze"},
)
def Bronze_Table_Create():
    return (
        spark.readStream
        .schema(schemapath)
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", schemalocation)
        .option("cloudFiles.inferColumnTypes", "false")
        .option("cloudFiles.schemaEvolutionMode", "rescue")
        .load(sourcelocation)
    )
I got the issue resolved. The cause was that, by mistake, we had duplicate columns in the schema files; because of that it was showing this error. However, the error is totally misleading, which is why I wasn't able to rectify it.
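Since the root cause was duplicate column names in the schema file, a small guard can catch this before the stream starts. A minimal sketch in Scala, assuming the schema is stored as Spark's JSON schema representation (the file path is illustrative; the same check is easy to port into the Python pipeline above):

import org.apache.spark.sql.types.{DataType, StructType}
import scala.io.Source

// Load the schema JSON the stream will use (path is illustrative).
val schemaJson = Source.fromFile("/mnt/schemas/customer_raw.json").mkString
val schema = DataType.fromJson(schemaJson).asInstanceOf[StructType]

// Fail fast if any column name (case-insensitive) appears more than once.
val duplicated = schema.fieldNames
  .groupBy(_.toLowerCase)
  .collect { case (name, hits) if hits.length > 1 => name }
require(duplicated.isEmpty, s"Duplicate columns in schema: ${duplicated.mkString(", ")}")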

AWS Glue reading data from Sybase table

While loading data from Sybase DB in AWS Glue I encounter an error:
Py4JJavaError: An error occurred while calling o261.load.
: java.sql.SQLException: The identifier that starts with '__SPARK_GEN_JDBC_SUBQUERY_NAME' is too long. Maximum length is 30.
The code I use is:
df = (spark.read.format("jdbc")
    .option("driver", "net.sourceforge.jtds.jdbc.Driver")
    .option("url", jdbc_url)
    .option("query", query)
    .option("user", db_username)
    .option("password", db_password)
    .load())
Is there any way to set this identifier to a custom, shorter one? Interestingly, I am able to load all the data from a particular table by replacing the query option with option("dbtable", table), but invoking a custom query that way is impossible.
Best Regards
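Since dbtable works, one workaround that may help is to pass the custom query as a derived table with your own short alias via dbtable; Spark then uses your alias instead of generating the long __SPARK_GEN_JDBC_SUBQUERY_NAME identifier. A sketch in Scala (the options are identical from PySpark; variable names are adapted from the question, the alias q is illustrative, and this is untested against Sybase/jTDS):

val df = spark.read.format("jdbc")
  .option("driver", "net.sourceforge.jtds.jdbc.Driver")
  .option("url", jdbcUrl)
  .option("dbtable", s"($query) q")  // your own short alias instead of the generated one
  .option("user", dbUsername)
  .option("password", dbPassword)
  .load()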

Force removal of External Table in Synapse Analytics

I have created an external table in my Synapse workspace with a wrong location set.
I am trying to remove it with:
DROP TABLE IF EXISTS nameTable
but I am getting this error:
Error: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: java.net.UnknownHostException: "name_of_host"
org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
org.apache.spark.sql.hive.HiveExternalCatalog.dropTable(HiveExternalCatalog.scala:507)
org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropTable(ExternalCatalogWithListener.scala:104)
org.apache.spark.sql.catalyst.catalog.SessionCatalog.dropTable(SessionCatalog.scala:671)
org.apache.spark.sql.execution.command.DropTableCommand.run(ddl.scala:216)
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:202)
org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:202)
org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3377)
org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:90)
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:144)
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:80)
org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$withAction(Dataset.scala:3376)
org.apache.spark.sql.Dataset.<init>(Dataset.scala:202)
org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:87)
org.apache.spark.sql.SparkSession.sql(SparkSession.scala:656)
org.apache.livy.repl.SQLInterpreter.execute(SQLInterpreter.scala:129)
org.apache.livy.repl.Session$$anonfun$7.apply(Session.scala:380)
org.apache.livy.repl.Session$$anonfun$7.apply(Session.scala:378)
scala.Option.map(Option.scala:146)
org.apache.livy.repl.Session.org$apache$livy$repl$Session$$executeCode(Session.scala:378)
org.apache.livy.repl.Session$$anonfun$execute$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(Session.scala:244)
org.apache.livy.repl.Session.org$apache$livy$repl$Session$$withRealtimeOutputSupport(Session.scala:518)
org.apache.livy.repl.Session$$anonfun$execute$1.apply$mcV$sp(Session.scala:243)
org.apache.livy.repl.Session$$anonfun$execute$1.apply(Session.scala:233)
org.apache.livy.repl.Session$$anonfun$execute$1.apply(Session.scala:233)
scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
I have tried a few things:
Altering the location -> ALTER TABLE nameTable SET LOCATION yyy
Renaming the table -> ALTER TABLE nameTable RENAME TO nameTable2
But I always get the same error...
EDIT
I accidentally set the location to an HDFS host from our on-prem environment.
Is there any way to force a rename, a relocation, or a drop of the table from the Synapse workspace?
Thanks in advance.
The posted error is not particularly helpful and this may not work. But, you can try dropping the external table:
DROP EXTERNAL TABLE { database_name.schema_name.table_name | schema_name.table_name | table_name }
[;]
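For example, assuming the table lives in the default dbo schema (the schema name is an assumption):

DROP EXTERNAL TABLE dbo.nameTable;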

Google Cloud Spanner not implementing check constraint

When trying to create a table with the gcloud CLI (using the Spanner emulator):
gcloud spanner databases ddl update db_name --instance=instance_name --ddl=$data
I'm attempting to add a check constraint, where $data evaluates to:
CREATE TABLE my_table (
  my_key STRING(36),
  some_column STRING(100),
  CONSTRAINT ck_my_table_some_column_enum CHECK (some_column = 'this' OR some_column = 'that'),
) PRIMARY KEY (my_key)
and keep getting the following error:
ERROR: (gcloud.spanner.databases.ddl.update) HttpError accessing <http://localhost:9020/v1/projects/spanner-emulator-test/instances/instance_name/databases/db_name/ddl?alt=json>: response: <{'content-type': 'application/json', 'trailer': 'Grpc-Trailer-Content-Type', 'date': 'Tue, 31 Aug 2021 22:53:52 GMT', 'transfer-encoding': 'chunked', 'status': 501}>, content <{"error":"Check Constraint is not implemented.","code":12,"message":"Check Constraint is not implemented."}>
This may be due to network connectivity issues. Please check your network settings, and the status of the service you are trying to reach.
A first inclination might be to verify that the emulator is running, given the last part of the message. It is. I've also tried immediately executing the same DDL without the check constraint, and it works perfectly: the table is created.
DBeaver
I've also tried creating this table using the DBeaver db tool and get the following error:
org.jkiss.dbeaver.model.sql.DBSQLException: SQL Error [12]: UNIMPLEMENTED: io.grpc.StatusRuntimeException: UNIMPLEMENTED: Check Constraint is not implemented.
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:133)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:513)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$0(SQLQueryJob.java:444)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:171)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:431)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:816)
at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsContainer.readData(SQLEditor.java:3440)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.lambda$0(ResultSetJobDataRead.java:118)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:171)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.run(ResultSetJobDataRead.java:116)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetViewer$ResultSetDataPumpJob.run(ResultSetViewer.java:4718)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:105)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Caused by: com.google.cloud.spanner.jdbc.JdbcSqlExceptionFactory$JdbcSqlExceptionImpl: UNIMPLEMENTED: io.grpc.StatusRuntimeException: UNIMPLEMENTED: Check Constraint is not implemented.
at com.google.cloud.spanner.jdbc.JdbcSqlExceptionFactory.of(JdbcSqlExceptionFactory.java:222)
at com.google.cloud.spanner.jdbc.AbstractJdbcStatement.execute(AbstractJdbcStatement.java:259)
at com.google.cloud.spanner.jdbc.JdbcStatement.executeStatement(JdbcStatement.java:110)
at com.google.cloud.spanner.jdbc.JdbcStatement.execute(JdbcStatement.java:106)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.execute(JDBCStatementImpl.java:330)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:130)
... 12 more
Caused by: com.google.cloud.spanner.SpannerException: UNIMPLEMENTED: io.grpc.StatusRuntimeException: UNIMPLEMENTED: Check Constraint is not implemented.
at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerExceptionPreformatted(SpannerExceptionFactory.java:284)
at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:61)
at com.google.cloud.spanner.SpannerExceptionFactory.fromApiException(SpannerExceptionFactory.java:299)
at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:174)
at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:110)
at com.google.cloud.spanner.DatabaseAdminClientImpl.lambda$updateDatabaseDdl$7(DatabaseAdminClientImpl.java:337)
at com.google.api.core.ApiFutures$ApiFunctionToGuavaFunction.apply(ApiFutures.java:240)
at com.google.common.util.concurrent.AbstractCatchingFuture$CatchingFuture.doFallback(AbstractCatchingFuture.java:223)
at com.google.common.util.concurrent.AbstractCatchingFuture$CatchingFuture.doFallback(AbstractCatchingFuture.java:211)
at com.google.common.util.concurrent.AbstractCatchingFuture.run(AbstractCatchingFuture.java:124)
at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30)
at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1213)
at com.google.common.util.concurrent.AbstractFuture.addListener(AbstractFuture.java:724)
at com.google.common.util.concurrent.FluentFuture$TrustedFuture.addListener(FluentFuture.java:110)
at com.google.common.util.concurrent.ForwardingListenableFuture.addListener(ForwardingListenableFuture.java:45)
at com.google.api.core.ApiFutureToListenableFuture.addListener(ApiFutureToListenableFuture.java:52)
at com.google.common.util.concurrent.AbstractCatchingFuture.create(AbstractCatchingFuture.java:41)
at com.google.common.util.concurrent.Futures.catching(Futures.java:297)
at com.google.api.core.ApiFutures.catching(ApiFutures.java:99)
at com.google.api.gax.longrunning.OperationFutureImpl.<init>(OperationFutureImpl.java:97)
at com.google.cloud.spanner.DatabaseAdminClientImpl.updateDatabaseDdl(DatabaseAdminClientImpl.java:335)
at com.google.cloud.spanner.connection.DdlClient.executeDdl(DdlClient.java:89)
at com.google.cloud.spanner.connection.DdlClient.executeDdl(DdlClient.java:84)
at com.google.cloud.spanner.connection.SingleUseTransaction.lambda$executeDdlAsync$1(SingleUseTransaction.java:272)
at io.grpc.Context$2.call(Context.java:596)
at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)
at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Suppressed: com.google.cloud.spanner.connection.AbstractBaseUnitOfWork$SpannerAsyncExecutionException: Execution failed for statement: CREATE TABLE my_table (
my_key STRING(36),
some_column STRING(100),
CONSTRAINT ck_my_table_some_column_enum CHECK (some_column = 'this' OR some_column = 'that'),
) PRIMARY KEY (my_key)
at com.google.cloud.spanner.connection.AbstractBaseUnitOfWork.executeStatementAsync(AbstractBaseUnitOfWork.java:233)
at com.google.cloud.spanner.connection.AbstractBaseUnitOfWork.executeStatementAsync(AbstractBaseUnitOfWork.java:145)
at com.google.cloud.spanner.connection.SingleUseTransaction.executeDdlAsync(SingleUseTransaction.java:281)
at com.google.cloud.spanner.connection.ConnectionImpl.executeDdlAsync(ConnectionImpl.java:1157)
at com.google.cloud.spanner.connection.ConnectionImpl.execute(ConnectionImpl.java:803)
at com.google.cloud.spanner.jdbc.AbstractJdbcStatement.execute(AbstractJdbcStatement.java:250)
at com.google.cloud.spanner.jdbc.JdbcStatement.executeStatement(JdbcStatement.java:110)
at com.google.cloud.spanner.jdbc.JdbcStatement.execute(JdbcStatement.java:106)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.execute(JDBCStatementImpl.java:330)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:130)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:513)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$0(SQLQueryJob.java:444)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:171)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:431)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:816)
at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsContainer.readData(SQLEditor.java:3440)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.lambda$0(ResultSetJobDataRead.java:118)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:171)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.run(ResultSetJobDataRead.java:116)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetViewer$ResultSetDataPumpJob.run(ResultSetViewer.java:4718)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:105)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Caused by: io.grpc.StatusRuntimeException: UNIMPLEMENTED: Check Constraint is not implemented.
at io.grpc.Status.asRuntimeException(Status.java:535)
at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:533)
at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
at com.google.cloud.spanner.spi.v1.SpannerErrorInterceptor$1$1.onClose(SpannerErrorInterceptor.java:100)
at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:557)
at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:69)
at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInternal(ClientCallImpl.java:738)
at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:717)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:133)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
... 3 more
Check constraints are available in the latest version of the emulator. Please check and get back to us if there are still issues.
https://console.cloud.google.com/gcr/images/cloud-spanner-emulator/global/emulator#sha256:ef0fd2ec74bb17b6c31e5010fb699fd0009b9829721d5159e6a11b6a40f881f1/details
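If the emulator runs as a Docker container, pulling the latest image should pick up the feature; for a gcloud-managed emulator, updating the installed components should do the same (both commands are standard, but whether they resolve this particular issue is an assumption):

docker pull gcr.io/cloud-spanner-emulator/emulator
gcloud components update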
Well... it looks like check constraints don't seem to work with the emulator. I tried this with a test db in the console and it worked. If anyone comes up with a solution to this, however, or if there's something I need to configure, please let me know! Thank you.

Presto connect to Netezza

Is there any way I can connect to a Netezza database using Presto?
2018-08-08T13:42:06.124-0400 ERROR Query-20180808_174205_00000_qwr8d-165 org.postgresql.Driver Connection error:
org.postgresql.util.PSQLException: A connection could not be made using the requested protocol null.
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:57)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:194)
at org.postgresql.Driver.makeConnection(Driver.java:450)
at org.postgresql.Driver.connect(Driver.java:252)
at com.facebook.presto.plugin.jdbc.DriverConnectionFactory.openConnection(DriverConnectionFactory.java:59)
at com.facebook.presto.plugin.jdbc.BaseJdbcClient.getSchemaNames(BaseJdbcClient.java:119)
at com.facebook.presto.plugin.jdbc.JdbcMetadata.listSchemaNames(JdbcMetadata.java:72)
at com.facebook.presto.metadata.MetadataManager.listSchemaNames(MetadataManager.java:290)
at com.facebook.presto.connector.informationSchema.InformationSchemaMetadata.calculatePrefixesWithSchemaName(InformationSchemaMetadata.java:257)
at com.facebook.presto.connector.informationSchema.InformationSchemaMetadata.getTableLayouts(InformationSchemaMetadata.java:222)
at com.facebook.presto.metadata.MetadataManager.getLayouts(MetadataManager.java:350)
at com.facebook.presto.sql.planner.iterative.rule.PickTableLayout.planTableScan(PickTableLayout.java:203)
at com.facebook.presto.sql.planner.iterative.rule.PickTableLayout.access$200(PickTableLayout.java:61)
at com.facebook.presto.sql.planner.iterative.rule.PickTableLayout$PickTableLayoutWithoutPredicate.apply(PickTableLayout.java:186)
at com.facebook.presto.sql.planner.iterative.rule.PickTableLayout$PickTableLayoutWithoutPredicate.apply(PickTableLayout.java:153)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.transform(IterativeOptimizer.java:165)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreNode(IterativeOptimizer.java:138)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreGroup(IterativeOptimizer.java:103)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreChildren(IterativeOptimizer.java:185)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreGroup(IterativeOptimizer.java:105)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreChildren(IterativeOptimizer.java:185)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.exploreGroup(IterativeOptimizer.java:105)
at com.facebook.presto.sql.planner.iterative.IterativeOptimizer.optimize(IterativeOptimizer.java:94)
at com.facebook.presto.sql.planner.LogicalPlanner.plan(LogicalPlanner.java:140)
at com.facebook.presto.sql.planner.LogicalPlanner.plan(LogicalPlanner.java:129)
at com.facebook.presto.execution.SqlQueryExecution.doAnalyzeQuery(SqlQueryExecution.java:341)
at com.facebook.presto.execution.SqlQueryExecution.analyzeQuery(SqlQueryExecution.java:326)
at com.facebook.presto.execution.SqlQueryExecution.start(SqlQueryExecution.java:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
If I try to connect using the Netezza driver, I get the error below. My etc/config.properties contains:
plugin.bundles=com.facebook.presto:presto-netezza:0.117
2018-08-08T13:15:37.716-0400 ERROR main com.facebook.presto.server.PrestoServer Could not resolve artifact: io.airlift.resolver.DefaultArtifact#558127d2
java.lang.RuntimeException: Could not resolve artifact: io.airlift.resolver.DefaultArtifact#558127d2
at com.facebook.presto.server.PluginManager.createClassLoader(PluginManager.java:284)
at com.facebook.presto.server.PluginManager.buildClassLoaderFromCoordinates(PluginManager.java:274)
at com.facebook.presto.server.PluginManager.buildClassLoader(PluginManager.java:239)
at com.facebook.presto.server.PluginManager.loadPlugin(PluginManager.java:154)
at com.facebook.presto.server.PluginManager.loadPlugins(PluginManager.java:142)
at com.facebook.presto.server.PrestoServer.run(PrestoServer.java:117)
at com.facebook.presto.server.PrestoServer.main(PrestoServer.java:67)
There is no (official) Presto plugin for Netezza yet (that I know of).
You can write one, or pay someone else to write one.
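For reference, Presto connectors are wired up through a catalog properties file once a plugin exists. A purely hypothetical sketch of what a Netezza catalog might look like if someone wrote such a plugin on top of Presto's base JDBC module (the connector name, port, and properties are all assumptions, since no such plugin exists):

# etc/catalog/netezza.properties (hypothetical plugin)
connector.name=netezza
connection-url=jdbc:netezza://netezza-host:5480/mydb
connection-user=admin
connection-password=secret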
