Starting Spark with either the Scala shell or pyspark fails — does it look like a path problem?

Source: 3-6 Basic Spark Usage

Beebop

2017-12-22

17/12/21 17:55:40 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0

17/12/21 17:55:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

17/12/21 17:55:41 WARN Utils: Your hostname, hadoop001 resolves to a loopback address: 127.0.0.1; using 10.41.13.210 instead (on interface eth1)

17/12/21 17:55:41 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address

17/12/21 17:55:44 WARN HiveMetaStore: Retrying creating default database after error: Error creating transactional connection factory

javax.jdo.JDOFatalInternalException: Error creating transactional connection factory

at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)

at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)




3 Answers

Beebop

Original poster

2017-12-22

Running start-all.sh also seems to have issues:


starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop001.out

localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-hadoop001.out



Beebop

Original poster

2017-12-22

sc never gets initialized.


Beebop

Original poster

2017-12-22


Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.


Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.


Caused by: java.lang.reflect.InvocationTargetException: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':


Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':

 

Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient


Caused by: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient


Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient



Caused by: java.lang.reflect.InvocationTargetException: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory


Caused by: java.lang.reflect.InvocationTargetException: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.

Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.

  

<console>:14: error: not found: value spark

       import spark.implicits._

              ^

<console>:14: error: not found: value spark

       import spark.sql


Michael_PK
replied to Beebop
Get HDFS up and running first; when launching spark-shell, add --jars.
2017-12-22
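For reference, the "com.mysql.jdbc.Driver ... was not found in the CLASSPATH" lines above are the root cause: spark-shell cannot load the MySQL JDBC connector that the Hive metastore connection needs, so the spark/sc values are never created. A minimal sketch of the suggested fix, assuming HDFS is managed from $HADOOP_HOME and the connector jar sits under /home/hadoop/software (both the path and the 5.1.27 version are examples, not taken from the course):

# 1. bring HDFS up first
$HADOOP_HOME/sbin/start-dfs.sh

# 2. launch spark-shell with the MySQL JDBC driver added to the classpath
#    (jar location/version below is an assumption; point it at your own connector jar)
$SPARK_HOME/bin/spark-shell --jars /home/hadoop/software/mysql-connector-java-5.1.27-bin.jar

The same --jars argument works when starting pyspark as well.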
