PySpark - Hive metastore_db error in Spark 2.0
I am a new Spark user, working on a remote Linux machine over PuTTY. The purpose of the job is to create a Hive table in Spark so I can run SQL queries against it. After my PuTTY session closed and I reconnected to the Linux machine, creating the table gives this error:
java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@79ac37cd, see the next exception for details.
Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /home/ubuntu/spark-2.0.0-bin-hadoop2.7/metastore_db.
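ERROR XSDB6 usually means another process is holding, or has left behind, Derby's lock files inside the `metastore_db` directory; a Spark shell orphaned when the PuTTY session dropped is a common cause. A first step, assuming no other Spark process is actually still running, is to remove the stale lock files. The sketch below uses a temporary directory to stay safe to run anywhere; on the real machine, substitute the path from the error message (/home/ubuntu/spark-2.0.0-bin-hadoop2.7):

```shell
# Before deleting anything, confirm no Spark/Derby process is still alive:
#   ps aux | grep -i spark
# If one is, kill it (or reuse it) rather than removing its live locks.

# Stand-in for the real directory from the error message:
SPARK_DIR=$(mktemp -d)
mkdir -p "$SPARK_DIR/metastore_db"
touch "$SPARK_DIR/metastore_db/db.lck" "$SPARK_DIR/metastore_db/dbex.lck"

# Derby guards the database with db.lck/dbex.lck; if the owning JVM died
# without cleaning up, these go stale and block every new session.
rm -f "$SPARK_DIR/metastore_db/db.lck" "$SPARK_DIR/metastore_db/dbex.lck"

ls "$SPARK_DIR/metastore_db"
```

After the stale locks are gone, restarting the shell from the same directory should let Derby boot the metastore again.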
The query I ran:
spark.sql("CREATE TABLE IF NOT EXISTS fact_cmdoubtfulaccount (entityid STRING, leaseid STRING, suiteid STRING, txndate DATE, txndateint INT, period STRING, baddebtamt INT)")
This was working before the PuTTY session closed; now it gives the error above. I think something is wrong with the metastore. Kindly guide me on how to solve this problem. I am using Spark 2.0.
In your case, since you are running in the shell, you should not declare a new SQLContext: the shell already provides one, and creating another tries to open a second connection to the embedded Derby metastore, which only allows one.
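A minimal sketch of the point above. The PySpark 2.0 shell already exposes a `SparkSession` named `spark` (and a `SQLContext` as `sqlContext`); reuse it instead of constructing a new one. In a standalone script, `SparkSession.builder.getOrCreate()` likewise returns an existing session rather than booting a fresh Derby metastore:

```python
from pyspark.sql import SparkSession

# Do NOT do this in the shell -- it opens a second Derby connection:
#   from pyspark.sql import SQLContext
#   sqlContext = SQLContext(sc)
#
# getOrCreate() reuses an already-running session if there is one,
# so only a single embedded Derby metastore is ever booted.
spark = SparkSession.builder.enableHiveSupport().getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_cmdoubtfulaccount (
        entityid STRING, leaseid STRING, suiteid STRING,
        txndate DATE, txndateint INT, period STRING, baddebtamt INT
    )
""")
```

Note also that the embedded Derby metastore is created in whatever directory you launch the shell from, so always starting `pyspark` from the same directory keeps you on the same `metastore_db`.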