Does Presto have the equivalent of Hive's SET command - presto

It's very convenient to be able to set script variables. For example,
SET start_date = 20151201;
SELECT * FROM some_table where date = ${hiveconf:start_date};
Does Presto have this capability?

You can do this with a CTE and a CROSS JOIN, e.g. using the start_date from the question:
WITH variables AS (SELECT 20151201 AS start_date)
SELECT t.*
FROM some_table t
CROSS JOIN variables v
WHERE t.date = v.start_date

Not yet.
Presto only has a SET SESSION command for setting some Presto properties during the current session.
For example:
SET SESSION distributed_join=true;
But Presto cannot set a variable and then use it in a following SQL statement the way Hive does.
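For illustration, what is available today is limited to listing and changing session properties; a minimal sketch (the property name is only an example):
-- list the session properties this Presto version exposes
SHOW SESSION;
-- change one of them for the current session only
SET SESSION query_max_run_time = '1h';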

Not quite a script variable per se, but some UIs like DataGrip (and probably DBeaver as well) emulate passing variables, e.g.:
where date = ${start_date}

Related

How to change format of date using sqlplus

I have a problem with sqlplus, because SQL Developer gives me data like this:
22/03/09 52345
22/03/10 53462
22/03/11 26436
and sqlplus gives me data in this format
09-MAR-22 52345
10-MAR-22 53462
11-MAR-22 26436
Is there any way to change the month name to a number using an argument or a SET ... command?
LIKE:
sqlplus [arg] Login/pass#key << EOD
The quick answer is to set the NLS_DATE_FORMAT session variable:
alter session set nls_date_format = 'YYYY-MM-DD';
This will only work for dates. There are other settings for timestamps (NLS_TIMESTAMP_FORMAT) and timestamps with timezones (NLS_TIMESTAMP_TZ_FORMAT).
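For example, those can be set the same way (the format masks here are just one possible choice):
alter session set nls_timestamp_format = 'YYYY-MM-DD HH24:MI:SS.FF';
alter session set nls_timestamp_tz_format = 'YYYY-MM-DD HH24:MI:SS.FF TZR';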
You can also set these as environment variables, and sqlplus should pick them up. The same defaults can be configured in SQL Developer under Tools > Preferences > Database > NLS.
Check the Oracle documentation for more information.

Presto - static date and timestamp in where clause

I'm pretty sure the following query used to work for me on Presto:
select segment, sum(count)
from modeling_trends
where segment='2557172' and date = '2016-06-23' and count_time between '2016-06-23 14:00:00.000' and '2016-06-23 14:59:59.000'
group by 1;
Now when I run it (on Presto 0.147 on EMR) I get an error about assigning a varchar to a date/timestamp.
I can make it work using:
select segment, sum(count)
from modeling_trends
where segment='2557172' and date = cast('2016-06-23' as date) and count_time between cast('2016-06-23 14:00:00.000' as TIMESTAMP) and cast('2016-06-23 14:59:59.000' as TIMESTAMP)
group by segment;
but it feels dirty...
is there a better way to do this?
Unlike some other databases, Presto doesn't automatically convert between varchar and other types, even for constants. The cast works, but a simpler way is to use the type constructors:
WHERE segment = '2557172'
AND date = date '2016-06-23'
AND count_time BETWEEN timestamp '2016-06-23 14:00:00.000' AND timestamp '2016-06-23 14:59:59.000'
You can see examples for various types here: https://prestosql.io/docs/current/language/types.html
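If the value only becomes available at runtime as a varchar (for example from another column), the parsing functions are an alternative; a sketch, not the only option:
WHERE segment = '2557172'
AND date = from_iso8601_date('2016-06-23')
AND count_time BETWEEN date_parse('2016-06-23 14:00:00.000', '%Y-%m-%d %H:%i:%s.%f')
AND date_parse('2016-06-23 14:59:59.000', '%Y-%m-%d %H:%i:%s.%f')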
Just a quick thought: have you tried omitting the dashes in your date? Try 20160623 instead of 2016-06-23.
I encountered something similar with SQL Server, but I have not used Presto.

Concatenation of variable value and string in apache hive

In the Apache Hive CLI or Beeline CLI, I need to concatenate the value of a variable with a string. Is it possible to do so?
Example:
set path_on_hdfs="/apps/hive/warehouse/my_db.db";
How do I get something like '${hivevar:path_on_hdfs}/myTableName'?
You can try something like this:
set path_on_hdfs=/test1/test2;
create external table test(id int)
location '${hiveconf:path_on_hdfs}/myTable';
It should work...
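If the variable is defined in the hivevar namespace instead, as in the question, the same text substitution applies; a sketch with the question's path (note that the double quotes in the question's set would likely end up as part of the substituted value, so they are dropped here):
set hivevar:path_on_hdfs=/apps/hive/warehouse/my_db.db;
create external table test(id int)
location '${hivevar:path_on_hdfs}/myTableName';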

Spark SQL version of EXEC()

Does anyone know of a way in Spark SQL to execute a string variable like the following?
INSERT TableA (Col1,Col2) SELECT Col1,Col2 FROM TableB
I understand that I can obviously write this statement directly. However, I am using a workflow engine where my Insert/Select statement is in a String variable. If not, I assume I should use spark-submit. I was looking for other options.
I'm not sure what environment you're in. If this is a Spark application or a Spark shell you always provide queries as strings:
val query = "INSERT TableA (Col1,Col2) SELECT Col1,Col2 FROM TableB"
sqlContext.sql(query)
(See http://spark.apache.org/docs/latest/sql-programming-guide.html#running-sql-queries-programmatically.)
Spark SQL also supports Hive queries:
insert overwrite table usautomobiles select * from sourcedata
Go through this link

How can I describe a table in a Cassandra database?

$describe = new Cassandra\SimpleStatement(<<<EOD
describe keyspace.tablename
EOD
);
$session->execute($describe);
I used the above code but it is not working.
How can I fetch the field names and their data types from a Cassandra table?
Refer to the CQL documentation. DESCRIBE expects a table/schema/keyspace:
describe table keyspace.tablename
It is also a cqlsh command, not an actual CQL command. To get this information, query the system tables. Try
select * from system.schema_columns;
or, for more recent versions,
select * from system_schema.columns;
If you are using the PHP driver, you may want to check out http://datastax.github.io/php-driver/features/#schema-metadata
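To narrow the result to a single table, a query along these lines should work (the keyspace and table names below are placeholders):
select column_name, type
from system_schema.columns
where keyspace_name = 'mykeyspace' and table_name = 'tablename';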
Try desc table keyspace.tablename;
