Spark 2.4.0: the Master starts fine, but the Worker node fails with the following error:

Source: 3-5 Spark Standalone Mode Environment Setup

花开北海

2019-01-16

19/01/16 11:21:54 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
19/01/16 11:21:55 INFO Utils: Successfully started service 'sparkWorker' on port 42143.
19/01/16 11:21:55 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[main,5,main]
org.apache.spark.SparkException: Invalid master URL: spark://VM_0_8_centos:7077
    at org.apache.spark.util.Utils$.extractHostPortFromSparkUrl(Utils.scala:2407)
    at org.apache.spark.rpc.RpcAddress$.fromSparkURL(RpcAddress.scala:47)


2 Answers

Michael_PK

2019-01-16

Invalid master URL. That hostname was probably auto-generated by the cloud host, right?
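A minimal sketch of why such a hostname breaks here, assuming the usual behavior of java.net.URI (which Spark's Utils.extractHostPortFromSparkUrl uses to parse the master URL): a host containing underscores is not a valid URI host, so getHost() returns null for VM_0_8_centos and the Worker throws "Invalid master URL". The object name UriCheck below is only illustrative, not part of Spark:

object UriCheck {
  def main(args: Array[String]): Unit = {
    // A host containing '_' cannot be parsed as a server authority,
    // so getHost() comes back null -- the condition Spark rejects.
    val bad = new java.net.URI("spark://VM_0_8_centos:7077")
    println(bad.getHost)   // null

    // The same URL with a hyphenated hostname parses normally.
    val good = new java.net.URI("spark://vm-0-8-centos:7077")
    println(good.getHost)  // vm-0-8-centos
  }
}

So the practical fix is to give the machine a hostname without underscores (or map a valid alias or the IP shown in the log, 172.27.0.8, in /etc/hosts and point the Spark config at that) and then restart the Master and Worker.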

花开北海
That was indeed the problem.
2019-01-16

花开北海

(Original poster)

2019-01-16

The question field does not allow long posts, so I'm putting the full log here:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/01/16 11:21:53 INFO Worker: Started daemon with process name: 11509@VM_0_8_centos
19/01/16 11:21:53 INFO SignalUtils: Registered signal handler for TERM
19/01/16 11:21:53 INFO SignalUtils: Registered signal handler for HUP
19/01/16 11:21:53 INFO SignalUtils: Registered signal handler for INT
19/01/16 11:21:53 WARN Utils: Your hostname, VM_0_8_centos resolves to a loopback address: 127.0.0.1; using 172.27.0.8 instead (on interface eth0)
19/01/16 11:21:53 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/01/16 11:21:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/01/16 11:21:54 INFO SecurityManager: Changing view acls to: hadoop
19/01/16 11:21:54 INFO SecurityManager: Changing modify acls to: hadoop
19/01/16 11:21:54 INFO SecurityManager: Changing view acls groups to: 
19/01/16 11:21:54 INFO SecurityManager: Changing modify acls groups to: 
19/01/16 11:21:54 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
19/01/16 11:21:55 INFO Utils: Successfully started service 'sparkWorker' on port 42143.
19/01/16 11:21:55 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[main,5,main]
org.apache.spark.SparkException: Invalid master URL: spark://VM_0_8_centos:7077
    at org.apache.spark.util.Utils$.extractHostPortFromSparkUrl(Utils.scala:2407)
    at org.apache.spark.rpc.RpcAddress$.fromSparkURL(RpcAddress.scala:47)
    at org.apache.spark.deploy.worker.Worker$$anonfun$16.apply(Worker.scala:800)
    at org.apache.spark.deploy.worker.Worker$$anonfun$16.apply(Worker.scala:800)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
    at org.apache.spark.deploy.worker.Worker$.startRpcEnvAndEndpoint(Worker.scala:800)
    at org.apache.spark.deploy.worker.Worker$.main(Worker.scala:769)
    at org.apache.spark.deploy.worker.Worker.main(Worker.scala)


