
I get the following error when I try to execute Spark SQL:

Caused by: org.apache.spark.sql.AnalysisException: [NOT_SUPPORTED_COMMAND_WITHOUT_HIVE_SUPPORT] CREATE Hive TABLE (AS SELECT) is not supported, if you want to enable it, please set "spark.sql.catalogImplementation" to "hive".;
'CreateTable `spark_catalog`.`raw`.`raw_customers`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, ErrorIfExists
	at org.apache.spark.sql.errors.QueryCompilationErrors$.ddlWithoutHiveSupportEnabledError(QueryCompilationErrors.scala:1806)
	at org.apache.spark.sql.execution.datasources.HiveOnlyCheck$.$anonfun$apply4ドル(rules.scala:466)
aran
asked Apr 18, 2025 at 15:54

1 Answer

NOT_SUPPORTED_COMMAND_WITHOUT_HIVE_SUPPORT

tells you the root cause: Hive support is not enabled in your Spark session.

You have to enable Hive support for Spark. Note that spark.sql.catalogImplementation is a static configuration, so setting it with spark.conf.set() on an already-running session will not take effect; pass it when the SparkSession is created:

SparkSession.builder.config("spark.sql.catalogImplementation", "hive").getOrCreate()
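Equivalently, PySpark exposes enableHiveSupport() on the builder, which sets this config for you. A minimal sketch (the app name and the CTAS statement are placeholders based on the table name in the question):

```python
from pyspark.sql import SparkSession

# Hive support must be enabled before the session is created;
# enableHiveSupport() sets spark.sql.catalogImplementation to "hive".
spark = (
    SparkSession.builder
    .appName("hive-ctas-example")  # placeholder app name
    .enableHiveSupport()
    .getOrCreate()
)

# With Hive support on, CREATE TABLE ... AS SELECT goes through the Hive
# catalog instead of raising NOT_SUPPORTED_COMMAND_WITHOUT_HIVE_SUPPORT.
spark.sql("CREATE TABLE raw.raw_customers AS SELECT * FROM source_customers")
```

This requires a Spark installation built with Hive classes on the classpath; if getOrCreate() has already been called elsewhere without Hive support, the existing session is reused and the setting is ignored.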
answered Apr 18, 2025 at 16:00