Error message

Source: 10-3 - Loading Data with Spark SQL

慕运维7479159

2017-12-25

[hadoop@hadoop001 bin]$ ./spark-shell --master local[2] --jars ~/software/mysql-connector-java-5.1.27-bin.jar 

Setting default log level to "WARN".

To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

17/12/24 21:04:36 WARN spark.SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0

17/12/24 21:04:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

17/12/24 21:04:47 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0

17/12/24 21:04:47 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':

  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)

  at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)

  at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)

  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)

  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)

  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)

  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)

  at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)

  at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)

  at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)

  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)

  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)

  ... 47 elided

Caused by: java.lang.reflect.InvocationTargetException: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':

  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)

  ... 58 more

Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':

  at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)

  at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)

  at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)

  at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)

  at scala.Option.getOrElse(Option.scala:121)

  at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)

  at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)

  at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)

  at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)

  ... 63 more

Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.net.ConnectException: Call From hadoop001/192.168.140.128 to hadoop001:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

  at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)

  ... 71 more

Caused by: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.net.ConnectException: Call From hadoop001/192.168.140.128 to hadoop001:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

  at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)

  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)

  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)

  at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)

  ... 76 more

Caused by: java.lang.RuntimeException: java.net.ConnectException: Call From hadoop001/192.168.140.128 to hadoop001:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)

  at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)

  ... 84 more

Caused by: java.net.ConnectException: Call From hadoop001/192.168.140.128 to hadoop001:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

  at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)

  at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)

  at org.apache.hadoop.ipc.Client.call(Client.java:1475)

  at org.apache.hadoop.ipc.Client.call(Client.java:1408)

  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)

  at com.sun.proxy.$Proxy25.getFileInfo(Unknown Source)

  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:757)

  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

  at java.lang.reflect.Method.invoke(Method.java:606)

  at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256)

  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)

  at com.sun.proxy.$Proxy26.getFileInfo(Unknown Source)

  at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2102)

  at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1215)

  at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1211)

  at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)

  at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1211)

  at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1412)

  at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:596)

  at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)

  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)

  ... 85 more

Caused by: java.net.ConnectException: Connection refused

  at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)

  at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)

  at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)

  at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)

  at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)

  at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)

  at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:713)

  at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)

  at org.apache.hadoop.ipc.Client.getConnection(Client.java:1524)

  at org.apache.hadoop.ipc.Client.call(Client.java:1447)

  ... 105 more

<console>:14: error: not found: value spark

       import spark.implicits._

              ^

<console>:14: error: not found: value spark

       import spark.sql

              ^

Welcome to

      ____              __

     / __/__  ___ _____/ /__

    _\ \/ _ \/ _ `/ __/  '_/

   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0

      /_/

         

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_51)

Type in expressions to have them evaluated.

Type :help for more information.


scala> 



1 Answer

Michael_PK

2017-12-25

Call From hadoop001/192.168.140.128 to hadoop001:8020 failed on connection exception: java.net.ConnectException: Connection refused

Think about what is causing this.
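The decisive line is the last `Caused by`: the connection from hadoop001 to `hadoop001:8020` (the HDFS NameNode RPC port) was refused, which usually means HDFS is simply not running. A minimal sketch for checking this, assuming a standard Hadoop layout — the `port_open` helper is mine, while the host and port come straight from the stack trace:

```shell
#!/usr/bin/env bash
# Probe whether a TCP port accepts connections, using bash's
# /dev/tcp pseudo-device (no extra tools needed).
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

# Host and port taken from "Call From ... to hadoop001:8020" in the trace.
if port_open hadoop001 8020; then
  echo "NameNode RPC port is reachable"
else
  echo "Connection refused -- HDFS is likely not started. Check:"
  echo "  jps                            # should list NameNode and DataNode"
  echo "  \$HADOOP_HOME/sbin/start-dfs.sh  # start HDFS, then retry spark-shell"
fi
```

Once the NameNode is up and `jps` shows it, restarting spark-shell should let `HiveExternalCatalog` initialize and the `spark` session will be created normally.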

