Configuring the Hive metastore for Spark
Hi folks,
This post covers how to configure the Hive metastore so that you can access your Hive tables from Spark SQL.
By default, Spark uses its own embedded metastore, which is local to the Spark session, so your Hive tables are not visible.
1) In spark-env.sh, add the path to your MySQL connector JAR to the classpath and set the Hive home variable.
This lets Spark reach your Hive metastore. I am using MySQL as the metastore database, hence the MySQL connector JAR.
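As a sketch, the spark-env.sh additions could look like the following; the install paths and the connector JAR filename are assumptions, so adjust them to your own setup:

```shell
# Sketch of spark-env.sh entries; HIVE_HOME path and the
# mysql-connector JAR name/version are placeholders for your install.
export HIVE_HOME=/usr/local/hive
export SPARK_CLASSPATH="$SPARK_CLASSPATH:$HIVE_HOME/lib/mysql-connector-java-5.1.47.jar"
```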
2) Copy hive-site.xml into the $SPARK_HOME/conf directory so Spark can read your metastore connection settings.
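For reference, here is a sketch of the metastore-related properties that hive-site.xml typically carries for a MySQL-backed metastore; the host, database name, user, and password below are placeholders, not values from this post:

```xml
<!-- Sketch of a MySQL-backed Hive metastore configuration.
     Connection URL, user, and password are placeholders. -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepassword</value>
  </property>
</configuration>
```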
Note: Spark 2.x only supports Hive metastore versions from 0.12.0 to 2.3.3 (as given in the documentation: https://spark.apache.org/docs/latest/sql-data-sources-hive-tables.html). The default is 1.2.1, but I am going to use 1.2.0. Hive 2.x did not work for me when I tried it.
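One way to pin the metastore version is through spark-defaults.conf, using Spark's `spark.sql.hive.metastore.version` and `spark.sql.hive.metastore.jars` settings. A minimal sketch, assuming 1.2.0 as in this post and a Hive install at /usr/local/hive (both of which you should adjust to your environment):

```properties
# Sketch of spark-defaults.conf entries pinning the Hive metastore version.
# The jars path is an assumed install location, not from this post.
spark.sql.hive.metastore.version  1.2.0
spark.sql.hive.metastore.jars     /usr/local/hive/lib/*
```

After restarting spark-shell, running `spark.sql("show databases").show()` should list the databases from your Hive metastore rather than only Spark's session-local `default`.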