## Description

### What

`ParamEscaper`'s `escape_string()` gives incorrect behavior on Databricks SQL and in Databricks notebooks. It replaces a single quote `'` with `''`, but the correct way to escape `'` is with a backslash, like `\'`.
You can verify this in PySpark:

```python
assert spark.sql("select 'cat''s meow' as my_col").head(1)[0]['my_col'] == "cats meow"
assert spark.sql("select 'cat\\'s meow' as my_col").head(1)[0]['my_col'] == "cat's meow"
```
Note that because the query starts as a Python string literal, we need two backslashes (`\\`): Python first reduces `\\'` to `\'`, and Spark then unescapes `\'` to `'`.
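To make the layering concrete, here is a quick way to see from Python what each layer receives:

```python
query = "select 'cat\\'s meow' as my_col"  # Python source contains two backslashes
print(query)
# -> select 'cat\'s meow' as my_col   (the text Spark's parser receives)
# Spark then unescapes \' to ', so the resulting value is: cat's meow
```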
I don't know what the motivation for this implementation was, but the result appears to be string-literal concatenation rather than an escaped quote character.
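As an illustration, here is a minimal sketch approximating that behavior (`escape_string_current` is my name for it, not the library's actual source):

```python
# Hypothetical approximation of the quote handling in ParamEscaper.escape_string
def escape_string_current(value: str) -> str:
    return "'{}'".format(value.replace("'", "''"))

print(escape_string_current("cat's meow"))  # -> 'cat''s meow'
# Spark reads 'cat''s meow' as the adjacent literals 'cat' and 's meow'
# and concatenates them, returning "cats meow".
```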
## Reproduction in databricks-sql-python

The following demonstrates the issue in version 2.0.5 of databricks-sql-python against a serverless SQL warehouse in Azure (v2022.30), alongside an implementation without parameter substitution that shows an escape treatment that does work:
```python
from typing import List
import os

from databricks import sql
from databricks.sql import ServerOperationError

server_hostname = os.environ.get('DBT_DATABRICKS_HOST')
http_path = f'/sql/1.0/endpoints/{os.environ.get("DBT_DATABRICKS_ENDPOINT")}'
access_token = os.environ.get('DBT_DATABRICKS_TOKEN')
user_agent_entry = "dbt-databricks/1.2.2"

connection = sql.connect(
    server_hostname=server_hostname,
    http_path=http_path,
    access_token=access_token,
    _user_agent_entry=user_agent_entry,
)
cursor = connection.cursor()


def get_result_using_parameter_bindings(p: List[str]):
    try:
        cursor.execute('select %s as my_col', p)
        result = list(cursor.fetchall())[0]['my_col']
    except ServerOperationError as exc:
        result = exc.message.strip()[:20] + '...'
    return result


def get_result_using_fstring(p: List[str]):
    try:
        # escape backslashes first, then single quotes
        escaped = p[0].replace('\\', '\\\\').replace("'", "\\'")
        cursor.execute(f"select '{escaped}' as my_col")
        result = list(cursor.fetchall())[0]['my_col']
    except ServerOperationError as exc:
        result = exc.message.strip()[:20] + '...'
    return result


params = [
    ["cat's meow"],
    ["cat\'s meow"],   # identical to the previous entry once Python parses it
    ["cat\\'s meow"],
    ["cat''s meow"],
]

for p in params:
    # using databricks-sql-python's parameter substitution
    param_binding_result = get_result_using_parameter_bindings(p)
    # using a manually built and escaped query
    f_string_result = get_result_using_fstring(p)
    print('\nparameter value:', p[0],
          'parameter-binding result:', param_binding_result,
          'round-trip ok?', p[0] == param_binding_result)
    print('parameter value:', p[0],
          'f-string result:', f_string_result,
          'round-trip ok?', p[0] == f_string_result)
    assert p[0] == f_string_result

cursor.close()
connection.close()
```
The output is:

```text
bash_1 | parameter value: cat's meow parameter-binding result: cats meow round-trip ok? False
bash_1 | parameter value: cat's meow f-string result: cat's meow round-trip ok? True
bash_1 |
bash_1 | parameter value: cat's meow parameter-binding result: cats meow round-trip ok? False
bash_1 | parameter value: cat's meow f-string result: cat's meow round-trip ok? True
bash_1 |
bash_1 | parameter value: cat\'s meow parameter-binding result: [PARSE_SYNTAX_ERROR]... round-trip ok? False
bash_1 | parameter value: cat\'s meow f-string result: cat\'s meow round-trip ok? True
bash_1 |
bash_1 | parameter value: cat''s meow parameter-binding result: cats meow round-trip ok? False
bash_1 | parameter value: cat''s meow f-string result: cat''s meow round-trip ok? True
```
## Expected results

String parameters containing single quotes and backslashes should be reproduced faithfully:

- `"cat's meow"` would be escaped as `"cat\\'s meow"`, and the resulting SQL would return `cat's meow`
- `"cat\\'s meow"` would escape to `"cat\\\\\\'s meow"`, and the SQL would return `cat\'s meow`
## Suggested fix

I'm not sure how this is usually implemented, but in my example simply doing `param.replace('\\', '\\\\').replace("'", "\\'")` at least preserves single quotes and backslashes, which are probably the most common cases. It would also leave escaped unicode literals like `\U0001F44D` alone.
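A self-contained check of that replacement against the expected-results cases above (the `escape` helper name is mine, for illustration):

```python
def escape(value: str) -> str:
    # escape backslashes first so the backslash added for quotes isn't doubled
    return value.replace('\\', '\\\\').replace("'", "\\'")

# Python literals matching the expected-results examples
assert escape("cat's meow") == "cat\\'s meow"
assert escape("cat\\'s meow") == "cat\\\\\\'s meow"
```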
## How I encountered it

I'm using dbt with Databricks and noticed, on upgrading from dbt-databricks 1.0 to 1.2.2, that single quotes started disappearing from our "seeds" (CSV files loaded as Delta tables). dbt-databricks had changed to use the new parameter-binding functionality in this library, whereas (I assume) it previously injected the values as literals into the SQL.