hiveql - Hive on Spark issue -


I have been trying to configure Hive on Spark for 5 days and have not found a solution.

Steps followed:

1. After installing Spark, I go into the Hive console and set the properties below:

set hive.execution.engine=spark;
set spark.master=spark://inbbrdssvm294:7077;
set spark.executor.memory=2g;
set spark.serializer=org.apache.spark.serializer.KryoSerializer;

2. Added the spark-assembly jar to Hive's lib directory.

3. When running select count(*) from table_name, I get the error below:

2016-08-08 15:17:30,207 ERROR [main]: spark.SparkTask (SparkTask.java:execute(131)) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException (Failed to create spark client.)'

Hive version: 1.2.1
Spark version: tried 1.6.1, 1.3.1, and 2.0.0
I would appreciate it if anyone could suggest something.

You can download the Spark 1.3.1 source from the Spark download website and build Spark 1.3.1 without the Hive profile using:

./make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.4" -Dhadoop.version=2.7.1 -Dyarn.version=2.7.1 -DskipTests

Then copy spark-assembly-1.3.1-hadoop2.7.1.jar into Hive's lib folder.

And follow https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started#HiveonSpark:GettingStarted-SparkInstallation to set the necessary properties.
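For reference, the properties from that page can be persisted in hive-site.xml instead of being set per session. A minimal sketch, assuming the master URL from the question and a Spark install at /opt/spark (both the host and the paths are assumptions; adjust them to your cluster):

```xml
<!-- Sketch of minimal Hive-on-Spark settings for hive-site.xml.
     Host, Spark location, and event-log path below are assumptions. -->
<property>
  <name>hive.execution.engine</name>
  <value>spark</value>
</property>
<property>
  <name>spark.master</name>
  <value>spark://inbbrdssvm294:7077</value>
</property>
<property>
  <!-- Assumption: location of the "without hive" Spark build -->
  <name>spark.home</name>
  <value>/opt/spark</value>
</property>
<property>
  <name>spark.eventLog.enabled</name>
  <value>true</value>
</property>
<property>
  <!-- Assumption: this directory must exist and be writable -->
  <name>spark.eventLog.dir</name>
  <value>/tmp/spark-events</value>
</property>
<property>
  <name>spark.serializer</name>
  <value>org.apache.spark.serializer.KryoSerializer</value>
</property>
```

With these in place, the per-session set commands from the question become unnecessary.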

