In the Apache Hive CLI or Beeline CLI, I need to concatenate the value of a variable with a string literal. Is it possible to do so?
Example:
set path_on_hdfs="/apps/hive/warehouse/my_db.db";
How do I get something like '${hivevar:path_on_hdfs}/myTableName'?
You can try something like this:
set path_on_hdfs=/test1/test2;
create external table test(id int)
location '${hiveconf:path_on_hdfs}/myTable';
It should work. Note that a plain set puts the variable in the hiveconf namespace; to reference it as ${hivevar:path_on_hdfs} as in the question, set it with set hivevar:path_on_hdfs=... instead.
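For the hivevar namespace, a minimal sketch driven from the shell (the path and table name are illustrative):

# Define the variable in Hive's hivevar namespace from the command line.
# The backslash stops the shell from expanding ${hivevar:...}, so Hive
# performs the substitution itself.
hive --hivevar path_on_hdfs=/apps/hive/warehouse/my_db.db \
  -e "create external table myTableName (id int) location '\${hivevar:path_on_hdfs}/myTableName';"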
Related
I am calling HQL from a shell script and passing it a variable whose value comes from querying another table. I pass the variable as follows:
$$A1=('123','124')
I can see the variable $$A1 correctly in the shell script with an echo statement; it displays as ('123','124').
But when I use this variable in the query, the single quotes are missing: it is passed as (123,124).
I pass $$A1 as follows:
select * from table1 where cd in $$A1
The query is interpreted as: select * from table1 where cd in (123,124)
Why are the single quotes missing when the value is passed to the query?
I'd appreciate any help on this.
Thanks,
Babu
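One likely cause is the shell's quote removal: the inner single quotes survive only if the assignment and every later expansion are themselves quoted. A minimal sketch using the hive CLI's --hivevar option (table and column names follow the question):

# Quote the assignment so the inner single quotes are kept literally.
A1="('123','124')"
echo "$A1"    # prints ('123','124')
# Single-quote the HQL so the shell leaves ${hivevar:A1} for Hive to expand.
hive --hivevar A1="$A1" -e 'select * from table1 where cd in ${hivevar:A1}'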
Is there any way to save a query result to a file using pure SQL?
I understand that this is easy with the Java or Scala API, but I am looking for a pure SQL solution that can be executed via the spark-sql CLI directly.
Create table is the closest thing:
CREATE TABLE table
USING format -- some format
LOCATION '/path/to/location'
AS SELECT * FROM some_view -- some query
It is not fully equivalent to the plain save methods, as it registers the table in the metastore.
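That same statement can be run straight through the spark-sql CLI. A concrete sketch, assuming Parquet output (the path and view name are placeholders):

# Write the query result to /tmp/my_result as Parquet files.
spark-sql -e "
CREATE TABLE my_result
USING parquet
LOCATION '/tmp/my_result'
AS SELECT * FROM some_view
"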
It's very convenient to be able to set script variables. For example,
SET start_date = 20151201;
SELECT * FROM some_table where date = ${hiveconf:start_date};
Does Presto have this capability?
You can do this:
WITH variables AS (SELECT 'some_value' AS var1, 'other_value' AS var2)
SELECT *
FROM some_table CROSS JOIN variables
WHERE some_column = var1
Not yet.
Presto only has the SET SESSION command for setting some Presto properties during the current session. For example:
SET SESSION distributed_join=true;
But Presto cannot set a variable and use it in a following SQL statement the way Hive does.
Not quite a script variable per se, but some UIs like DataGrip (and probably DBeaver as well) emulate passing variables, e.g.:
where date = ${start_date}
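In the same spirit, when queries are launched from a shell script, the shell can substitute the variable before Presto ever sees the query. A minimal sketch with the Presto CLI (server, catalog, and table names are hypothetical):

# The shell expands $START_DATE before the query reaches Presto.
START_DATE=20151201
presto --server localhost:8080 --catalog hive --schema default \
  --execute "SELECT * FROM some_table WHERE date = $START_DATE"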
Does anyone know of a way in Spark SQL to execute a string variable like the following?
INSERT TableA (Col1,Col2) SELECT Col1,Col2 FROM TableB
I understand that I can obviously write this statement directly. However, I am using a workflow engine where my INSERT/SELECT statement is held in a string variable. If that's not possible, I assume I should use spark-submit. I was looking for other options.
I'm not sure what environment you're in. If this is a Spark application or the Spark shell, you always provide queries as strings:
val query = "INSERT TableA (Col1,Col2) SELECT Col1,Col2 FROM TableB"
sqlContext.sql(query)
(See http://spark.apache.org/docs/latest/sql-programming-guide.html#running-sql-queries-programmatically.)
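If the workflow engine can shell out, the string can also be handed to the spark-sql CLI without writing application code. A sketch, with the statement rewritten in the INSERT INTO form Spark SQL accepts (table and column names follow the question):

# The statement lives in a shell variable and is executed as-is.
QUERY="INSERT INTO TableA SELECT Col1, Col2 FROM TableB"
spark-sql -e "$QUERY"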
Spark SQL also supports Hive queries:
insert overwrite table usautomobiles select * from sourcedata
I am writing a shell script which involves BigQuery commands to query an existing table and save the results to a destination table.
However, since my script will be run periodically, I have a parameter for the date for which the query should run.
For example, my script looks like this:
DATE_FORMATTED=$(date +%Y%m%d)
bq query --destination_table=Desttables.abc_$DATE_FORMATTED "select hits_eventInfo_eventLabel from TABLE_DATE_RANGE([mydata.table_],TIMESTAMP($DATE_FORMATTED),TIMESTAMP($DATE_FORMATTED)) where customDimensions_index = 4"
I get the following error:
Error in query string: Error processing job 'pro-cn:bqjob_r5437894379_1': FROM clause with table wildcards matches no table
How else can I pass the variable $DATE_FORMATTED to the TABLE_DATE_RANGE function from BigQuery in order to help execute my query?
Use double quotes "" for the SQL string combined with single quotes '' around the shell variable. For example, in your case:
TIMESTAMP("'$DATE_FORMATTED'")
OR
select "'$variable'" as dummy from your_table
You are probably missing the single quotes around the $DATE_FORMATTED value inside the TIMESTAMP functions. Without the quotes, it defaults to the epoch time.
Try with:
TIMESTAMP('$DATE_FORMATTED'),TIMESTAMP('$DATE_FORMATTED')
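Applying that fix to the command from the question (the value of $DATE_FORMATTED must still be in a format TIMESTAMP accepts):

DATE_FORMATTED=$(date +%Y%m%d)
bq query --destination_table=Desttables.abc_$DATE_FORMATTED \
  "select hits_eventInfo_eventLabel
   from TABLE_DATE_RANGE([mydata.table_],
     TIMESTAMP('$DATE_FORMATTED'), TIMESTAMP('$DATE_FORMATTED'))
   where customDimensions_index = 4"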