Running plain SQL in Slick 3.1.0

I'm trying to run plain SQL with Slick 3.1.0.
The following works:
val q = sql"select name from users".as[String]
However, if my SQL is in a variable:
val string2 : String = "select name from users"
how do I execute string2 using the sql prefix? This doesn't work:
sql+string2

Use interpolation within the string:
val q = sql"#$string2".as[String]
The #$ interpolator splices the literal string you're interpolating straight into the statement, so don't use it for user input: nothing is quoted or escaped.
See this section of the docs for more details.
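For example, here is a minimal sketch contrasting the two interpolators (assuming Slick 3.1's H2 driver, a users table, and an existing Database instance db):

import slick.driver.H2Driver.api._

val string2: String = "select name from users"

// #$ splices the literal string into the statement, with no quoting or escaping.
// Fine for trusted fragments such as table or column names; never for user input.
val q = sql"#$string2".as[String]

// For values, prefer the plain $ interpolator: it produces a bind parameter,
// so the value is passed to the database safely.
def byName(name: String) = sql"select id from users where name = $name".as[Int]

// db.run(q) then yields a Future[Vector[String]]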

Related

Spark SQL - How do I set a variable within the query, to re-use throughout?

I'm trying to convert a query from T-SQL to Spark's SQL.
I've gotten 99% of the way, but we've made heavy use of the DECLARE statement in T-SQL.
I can't seem to find an alternative in Spark SQL that behaves the same - that is, allows me to declare variables in the query itself, to be re-used in that query.
Example in T-SQL:
DECLARE @varA int
SET @varA = 4
SELECT * FROM tblName where id = @varA;
How do I declare such a variable in Spark SQL? (I don't want to use string interpolation unless necessary.)
You can try this:
sqlContext.sql("set id_value = 3")
sqlContext.sql("select * from country where id = ${id_value}").show()

How to use the LIKE operator in Presto with a string that contains a dot?

I have a string column with values like this:
"test.123.test"
"something"
And I want to run a query to find strings like "test.*.test". In PostgreSQL I use this query:
select * from table where string_column like 'test.%.test'
I ran this query in Presto but got nothing. It must be related to the dot in my string, because when I replace the pattern with something like 'test.1%1.test' it works, but that's not the result I want.
For a Presto query
string_column like 'test.%.test'
the predicate pushed down into the PostgreSQL connector is similar to:
string_column BETWEEN 'test.' AND 'test/'
however, string comparisons are subject to collation, and trailing punctuation hits an edge case of Presto/PostgreSQL incompatibility: https://github.com/trinodb/trino/issues/3645
You can work around this by preventing predicate pushdown into the connector, for example by adding OR rand() = 42 to your query:
string_column like 'test.%.test' OR rand() = 42
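A hedged sketch of the full workaround over JDBC (the connection URL, catalog, and my_table name are assumptions, and the Presto JDBC driver is assumed to be on the classpath):

import java.sql.DriverManager

// rand() returns a value in [0, 1), so the extra clause never matches; but since
// rand() is non-deterministic, Presto cannot push the predicate down to PostgreSQL,
// and the LIKE is evaluated by Presto itself, avoiding the collation edge case.
val conn = DriverManager.getConnection(
  "jdbc:presto://localhost:8080/postgresql/public", "user", null)
val rs = conn.createStatement().executeQuery(
  "SELECT * FROM my_table WHERE string_column LIKE 'test.%.test' OR rand() = 42")
while (rs.next()) println(rs.getString("string_column"))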

Azure Kusto Java SDK get all columns of query results

I'm using the Azure Kusto Java SDK v2.0.1 with Scala over Java 8.
I'm executing some query:
val query = " ... "
val tenantId = " ... "
val queryResponse = client.execute(tenantId, query)
val queryResponseResults = queryResponse.getPrimaryResults
I want to convert the given data structure to JSON eventually, so I want to get all columns, but I can't seem to find some kind of getColumns.
While debugging I see the object (KustoResultSetTable) has fields columnsAsArray (which is exactly what I want) and columns - but they are private and I didn't find any getters.
A getter will be added in the next version.
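Until then, one fragile workaround is to read the private field via reflection. This is only a sketch against v2.0.1: the field name comes from the debugger view above, and the element type and KustoResultColumn.getColumnName accessor are assumptions.

import com.microsoft.azure.kusto.data.{KustoResultColumn, KustoResultSetTable}

// Reflection workaround: columnsAsArray is private and has no getter in v2.0.1.
// Drop this as soon as the official getter ships.
val field = classOf[KustoResultSetTable].getDeclaredField("columnsAsArray")
field.setAccessible(true)
val columns = field.get(queryResponseResults).asInstanceOf[Array[KustoResultColumn]]
columns.foreach(col => println(col.getColumnName))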

How to use variables in a SQL statement in Databricks?

I want to use two variables within a WHERE clause. I've done some research, looking at How to use variables in SQL statements in Databricks and Inserting Variables Using Python, Not Working. I've tried to implement the solutions provided, but it's not working.
a= 17091990
b = 30091990
df = spark.sql(' SELECT * FROM table WHERE date between "a" AND "b" ')
You can use Python's formatted string literals (f-strings):
df = spark.sql(f"SELECT * FROM table WHERE date between {a} AND {b} ")
For more about formatted string literals you can refer to https://docs.python.org/3/whatsnew/3.6.html#whatsnew36-pep498
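If the notebook cell is Scala rather than Python, the same idea works with Scala's s-interpolator (a sketch assuming the same a, b, and table):

val a = 17091990
val b = 30091990
// s"..." substitutes the variable values into the SQL text,
// the Scala analogue of a Python f-string.
val df = spark.sql(s"SELECT * FROM table WHERE date BETWEEN $a AND $b")

As with the #$ interpolator above, the values are spliced in as literal text, so don't build queries this way from untrusted input.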

Spark magic - enter SQL context as a string

Connecting to Spark over Livy works fine in Jupyter,
as does the following spark magic:
%%spark -c sql
select * from some_table
Now how can I use string variables to query tables?
The following does not work:
query = 'select * from some_table'
Next cell:
%%spark -c sql
query
Nor does the following work:
%%spark -c sql
'select * from some_table'
Any ideas? Is it possible to "echo" the content of a string variable into a cell?
It seems I found a solution.
There is a function that turns a string into a cell magic command:
%%local
from IPython import get_ipython

ipython = get_ipython()
line = '-c sql -o df'                  # arguments for the magic: SQL context, output to df
query = 'select * from some_table'     # the cell body: the query to run
# Equivalent to running a %%spark cell with `line` as its arguments and `query` as its body
ipython.run_cell_magic(magic_name='spark', line=line, cell=query)
After this, the query result is in the pandas DataFrame df.
