
Query information_schema not working (Exception thrown in awaitResult) #443

gfallerBB opened this issue Aug 29, 2019 · 4 comments

@gfallerBB
When I try to query the list of tables in a schema from a Redshift DB, I get the following error. I have tried both the query and dbtable options, with the same result. When I query the DB with, say, DBeaver, I can extract the list of tables with no problem. If I use the script below with a "real" table, it works fine.
Databricks: 5.3 (includes Apache Spark 2.4.0, Scala 2.11)

```
java.sql.SQLException: Exception thrown in awaitResult:

/databricks/spark/python/pyspark/sql/dataframe.py in show(self, n, truncate, vertical)
    377         """
    378         if isinstance(truncate, bool) and truncate:
--> 379             print(self._jdf.showString(n, 20, vertical))
    380         else:
    381             print(self._jdf.showString(n, int(truncate), vertical))

/databricks/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1255         answer = self.gateway_client.send_command(command)
   1256         return_value = get_return_value(
-> 1257             answer, self.gateway_client, self.target_id, self.name)
   1258
   1259         for temp_arg in temp_args:
```

This is the script I use:

```python
JDBC_URL = "jdbc:redshift://xyz.redshift.amazonaws.com:5439/xyz?user=user&password=pwd"
SQL_QUERY = "SELECT * FROM information_schema.tables t WHERE t.table_schema = 'schema_name' AND t.table_type = 'BASE TABLE'"
REDSHIFT_S3_TEMP_FOLDER = "s3a://xyz"

# Chained calls across lines need parentheses (or backslashes) in Python.
df = (
    spark.read
    .format("com.databricks.spark.redshift")
    .option("url", JDBC_URL)
    .option("query", SQL_QUERY)
    .option("tempdir", REDSHIFT_S3_TEMP_FOLDER)
    .option("forward_spark_s3_credentials", "true")
    .load()
)

df.show()
```
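
Note: the spark-redshift connector reads by issuing an UNLOAD to S3, and Redshift's UNLOAD cannot run queries that reference leader-node-only catalog objects such as the information_schema views, which would explain why a "real" table works but this query does not. A minimal sketch of a possible workaround, assuming the Redshift JDBC driver is attached to the cluster, is to read the catalog query through Spark's built-in JDBC source instead, so the query runs on the leader node directly:

```python
# Sketch of a workaround: use Spark's plain JDBC source so the query executes
# on the Redshift leader node instead of being UNLOADed to S3.
# Assumes the Redshift JDBC 4.2 driver is on the cluster classpath; the
# driver class name below is its standard name.
df_catalog = (
    spark.read
    .format("jdbc")
    .option("url", JDBC_URL)
    .option("driver", "com.amazon.redshift.jdbc42.Driver")
    .option("query", SQL_QUERY)  # the "query" option requires Spark 2.4+
    .load()
)

df_catalog.show()
```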

@xavier-rigau

Exact same issue!

@udaypatel-psi

Hi, do we have a solution for this? I am having exactly the same issue: I am unable to write to Redshift from Databricks, but from DBeaver it works well. I can read from Redshift using Databricks (FYI).

@toaryangupta

Is any solution available?

@udaypatel-psi

udaypatel-psi commented Apr 13, 2023 via email
