Databricks SQL: error when running UPDATE with JOIN

I am trying to run an update on a Delta table and I am getting the following error.
Error in SQL statement: ParseException:
mismatched input 'from' expecting <EOF>(line 3, pos 0)
I really can't figure out why I get this error. Can anyone help me?
update eff
set eff.ACP = ia.COSTVALUE
from views.test_acp_effect eff
left join source_tables_db.ia_master_items ia
on eff.CODE = ia.IMM_CODE
where eff.DXN_Period = (
select td.MY_FISCAL_PERIOD_ABBR
from timedelta td
where current_date() between td.MIN_P_DATE
and td.MAX_P_DATE
)
and eff.CODE = source_tables_db.ia_master_items.IMM_CODE
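The parser stops at from because Delta tables in Databricks SQL do not accept the UPDATE ... FROM join syntax; the usual alternative is MERGE INTO. A sketch under that assumption, with the fiscal-period lookup folded into the USING source (since MERGE conditions generally cannot contain subqueries); untested against the actual tables:

```sql
MERGE INTO views.test_acp_effect AS eff
USING (
  -- join the cost source with the current fiscal period up front
  SELECT ia.IMM_CODE,
         ia.COSTVALUE,
         td.MY_FISCAL_PERIOD_ABBR AS FISCAL_PERIOD
  FROM source_tables_db.ia_master_items ia
  CROSS JOIN timedelta td
  WHERE current_date() BETWEEN td.MIN_P_DATE AND td.MAX_P_DATE
) src
ON eff.CODE = src.IMM_CODE AND eff.DXN_Period = src.FISCAL_PERIOD
WHEN MATCHED THEN UPDATE SET eff.ACP = src.COSTVALUE
```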

Related

EOF in multi-line string error in PySpark

I was running the following query in PySpark; the same SQL runs fine on Hive.
spark.sql(f"""
create table DEFAULT.TMP_TABLE as
select b.customer_id, prob
from
bi_prod.TMP_score_1st_day a,
(select customer_id, prob from BI_prod.Tbl_brand1_scoring where insert_date = 20230101
union all
select customer_id, prob from BI_prod.Tbl_brand2_scoring where insert_date = 20230101)  b
where a.customer_id = b.customer_id
""")
This produces the following error
ERROR:root:An unexpected error occurred while tokenizing input
The following traceback may be corrupted or invalid
The error message is: ('EOF in multi-line string', (1, 0))
I need to fix this error, but I can't figure out why it is occurring.
I recommend rewriting the code in a more Pythonic way.
from pyspark.sql.functions import col

# score table (alias a in the original query)
df1 = spark.table('bi_prod.TMP_score_1st_day').select('customer_id')

# union of the two brand scoring tables (alias b in the original query)
df2 = (
    spark.table('bi_prod.Tbl_brand1_scoring')
    .filter(col('insert_date') == 20230101)
    .select('customer_id', 'prob')
    .unionByName(
        spark.table('bi_prod.Tbl_brand2_scoring')
        .filter(col('insert_date') == 20230101)
        .select('customer_id', 'prob')
    )
)

df = df1.join(df2, 'customer_id')
df.show(1, vertical=True)
Let me know how this works for you and if you still get the same error.

Dask read_sql_query did not execute sql that I put in

Hi all, I'm new to Dask.
I faced an error when I tried using read_sql_query to get data from an Oracle database.
Here is my python script:
con_str = "oracle+cx_oracle://{UserID}:{Password}#{Domain}/?service_name={Servicename}"
sql = """
column_a, column_b
from
database.tablename
where
mydatetime >= to_date('1997-01-01 00:00:00','YYYY-MM-DD HH24:MI:SS')
"""
from sqlalchemy.sql import select, text
from dask.dataframe import read_sql_query
sa_query= select(text(sql))
ddf = read_sql_query(sql=sa_query, con=con, index_col="index", head_rows=5)
I referred to this post: Reading an SQL query into a Dask DataFrame
and removed the "select" string from my query.
I got a cx_Oracle.DatabaseError: missing expression [SQL: SELECT FROM DUAL WHERE ROWNUM <= 5]
But I don't understand where that query came from.
It seems like Dask didn't execute the SQL code I provided.
I'm not sure which part I did not configure correctly.
*Note: using pandas.read_sql works fine; it only fails with dask.dataframe.read_sql_query.
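A likely explanation (hedged, since it can't be verified without the Oracle instance): dask.dataframe.read_sql_query wants a SQLAlchemy selectable whose columns it can enumerate, and select(text(sql)) produces a SELECT with no column expressions at all, which appears to be where the bare SELECT FROM DUAL WHERE ROWNUM <= 5 head-probe comes from. A sketch of building the selectable with explicit column() objects (SQLAlchemy 1.4+ style; table and column names come from the question):

```python
from sqlalchemy import column, select, text

# Explicit column list plus textual FROM/WHERE fragments, so Dask can
# see which columns the query returns.
sa_query = (
    select(column("column_a"), column("column_b"))
    .select_from(text("database.tablename"))
    .where(text("mydatetime >= to_date('1997-01-01 00:00:00','YYYY-MM-DD HH24:MI:SS')"))
)
print(sa_query)  # inspect the generated SELECT statement
```

Then pass it on, e.g. ddf = read_sql_query(sql=sa_query, con=con_str, index_col="column_a", head_rows=5). Here index_col="column_a" is an assumption; note that the original index_col="index" could not work anyway, because read_sql_query's index column must be one of the selected columns.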

Getting Databricks error SparkUnsupportedOperationException: [INTERNAL_ERROR] Cannot generate code for expression: outer

In Databricks SQL, executing a query with a NOT EXISTS operator (using a correlated subquery) is not working. I am getting the error SparkUnsupportedOperationException: [INTERNAL_ERROR] Cannot generate code for expression: outer.
Below is the SQL query:
SELECT in_cs.COMM_ID AS CUSTOMER_SERVICE_EPIC_ID,
Data.CUR_VALUE_DATETIME AS VALUE_INSTANT,
FROM hive_metastore.RAW_CLARITY.SMRTDTA_ELEM_DATA Data
INNER JOIN hive_metastore.RAW_CLARITY.SMRTDTA_ELEM_VALUE Value
ON Data.HLV_ID = Value.HLV_ID
INNER JOIN hive_metastore.RAW_CLARITY.CLARITY_CONCEPT SmartDataElement
ON Data.ELEMENT_ID = SmartDataElement.CONCEPT_ID
INNER JOIN hive_metastore.RAW_CLARITY.CUST_SERVICE in_cs
ON Data.RECORD_ID_NUMERIC = in_cs.COMM_ID AND NOT EXISTS
( SELECT 1 FROM hive_metastore.RAW_CLARITY.CUST_SERVICE AS cs
LEFT JOIN hive_metastore.RAW_CLARITY.CAL_REFERENCE_CRM AS crc
ON cs.COMM_ID = crc.REF_CRM_ID
LEFT JOIN hive_metastore.RAW_CLARITY.CAL_COMM_TRACKING AS cct
ON crc.COMM_ID = cct.COMM_ID
WHERE cct.COMM_ID IS NULL AND in_cs.COMM_ID = cs.COMM_ID)
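Spark generally cannot evaluate a correlated subquery inside a join's ON condition, which is what appears to trigger this internal error; moving the NOT EXISTS into a WHERE clause is the usual workaround. A sketch under that assumption, keeping the same tables and aliases:

```sql
SELECT in_cs.COMM_ID AS CUSTOMER_SERVICE_EPIC_ID,
       Data.CUR_VALUE_DATETIME AS VALUE_INSTANT
FROM hive_metastore.RAW_CLARITY.SMRTDTA_ELEM_DATA Data
INNER JOIN hive_metastore.RAW_CLARITY.SMRTDTA_ELEM_VALUE Value
  ON Data.HLV_ID = Value.HLV_ID
INNER JOIN hive_metastore.RAW_CLARITY.CLARITY_CONCEPT SmartDataElement
  ON Data.ELEMENT_ID = SmartDataElement.CONCEPT_ID
INNER JOIN hive_metastore.RAW_CLARITY.CUST_SERVICE in_cs
  ON Data.RECORD_ID_NUMERIC = in_cs.COMM_ID
-- correlated subquery moved out of the ON clause
WHERE NOT EXISTS (
  SELECT 1
  FROM hive_metastore.RAW_CLARITY.CUST_SERVICE AS cs
  LEFT JOIN hive_metastore.RAW_CLARITY.CAL_REFERENCE_CRM AS crc
    ON cs.COMM_ID = crc.REF_CRM_ID
  LEFT JOIN hive_metastore.RAW_CLARITY.CAL_COMM_TRACKING AS cct
    ON crc.COMM_ID = cct.COMM_ID
  WHERE cct.COMM_ID IS NULL AND in_cs.COMM_ID = cs.COMM_ID
)
```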

jpql query with date literal causes IllegalArgumentException

I am trying to write a query with a date literal:
SELECT r
FROM Restaurant r
LEFT JOIN r.dishes dh ON dh.date = {d '2019-12-31'}
GROUP BY r
But when running I get an error
java.lang.IllegalArgumentException: org.hibernate.QueryException: unexpected char: '{'
What's wrong?
Try using just the date in ISO format:
SELECT r
FROM Restaurant r
LEFT JOIN r.dishes dh ON dh.date = '2019-12-31'
GROUP BY r

OperationalError: near "u": syntax error, while trying to delete rows from two inner-joined tables

I am trying to delete rows from two tables with inner join. I don't really understand why this error pops up.
import sqlite3
login = 'uzytkownik6'
conn = sqlite3.connect('fiszki.db')
c = conn.cursor()
c.execute("DELETE u.*, t.* FROM users u INNER JOIN translations t ON "
          "u.user_id=t.user_id WHERE u.user_name='{}'".format(login))
conn.commit()
But I get error:
OperationalError: near "u": syntax error
You should never use normal Python string formatting when executing SQL commands; pass parameters instead, for example db.execute("DELETE FROM users WHERE userId = (?)", [userId]). Also, you don't really need to call the db.cursor() method after connecting. See the sqlite3 API documentation for Python 3.
