Spark 2.1.0 standalone mode: running the start-all.sh script under sbin reports "JAVA_HOME is not set"

Source: 3-5 Setting Up the Spark Standalone Environment

慕神816625

2018-06-04

After running the start-all.sh script under sbin of Spark 2.1.0, the following is reported:

starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop001.out

localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-hadoop001.out

localhost: failed to launch: nice -n 0 /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://hadoop001:7077

localhost:   JAVA_HOME is not set

localhost: full log in /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-hadoop001.out


The master's log contains the following:

[hadoop@hadoop001 ~]$ tail -100f /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop001.out

Spark Command: /usr/lib/jvm/jdk1.8.0_161/bin/java -cp /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/conf/:/home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/jars/* -Xmx1g org.apache.spark.deploy.master.Master --host hadoop001 --port 7077 --webui-port 8080

========================================

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

18/06/04 09:57:06 INFO Master: Started daemon with process name: 2321@hadoop001

18/06/04 09:57:06 INFO SignalUtils: Registered signal handler for TERM

18/06/04 09:57:06 INFO SignalUtils: Registered signal handler for HUP

18/06/04 09:57:06 INFO SignalUtils: Registered signal handler for INT

18/06/04 09:57:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

18/06/04 09:57:07 INFO SecurityManager: Changing view acls to: hadoop

18/06/04 09:57:07 INFO SecurityManager: Changing modify acls to: hadoop

18/06/04 09:57:07 INFO SecurityManager: Changing view acls groups to: 

18/06/04 09:57:07 INFO SecurityManager: Changing modify acls groups to: 

18/06/04 09:57:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(hadoop); groups with view permissions: Set(); users  with modify permissions: Set(hadoop); groups with modify permissions: Set()

18/06/04 09:57:09 INFO Utils: Successfully started service 'sparkMaster' on port 7077.

18/06/04 09:57:09 INFO Master: Starting Spark master at spark://hadoop001:7077

18/06/04 09:57:09 INFO Master: Running Spark version 2.1.0

18/06/04 09:57:10 INFO Utils: Successfully started service 'MasterUI' on port 8080.

18/06/04 09:57:10 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://192.168.222.200:8080

18/06/04 09:57:10 INFO Utils: Successfully started service on port 6066.

18/06/04 09:57:10 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066

18/06/04 09:57:11 INFO Master: I have been elected leader! New state: ALIVE

However, the master log shows no messages about workers registering with their memory and cores.


The worker log is simpler; it just says: JAVA_HOME is not set

I checked the environment variables, and ~/.bash_profile does contain JAVA_HOME:

export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_161

export PATH=$PATH:$JAVA_HOME/bin

After making this change, the "JAVA_HOME is not set" error is no longer reported. The last few lines of the log are:

18/06/04 09:57:09 INFO Utils: Successfully started service 'sparkMaster' on port 7077.

18/06/04 09:57:09 INFO Master: Starting Spark master at spark://hadoop001:7077

18/06/04 09:57:09 INFO Master: Running Spark version 2.1.0

18/06/04 09:57:10 INFO Utils: Successfully started service 'MasterUI' on port 8080.

18/06/04 09:57:10 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://192.168.222.200:8080

18/06/04 09:57:10 INFO Utils: Successfully started service on port 6066.

18/06/04 09:57:10 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066

18/06/04 09:57:11 INFO Master: I have been elected leader! New state: ALIVE

18/06/04 10:15:19 ERROR Master: RECEIVED SIGNAL TERM


2 Answers

Michael_PK

2018-06-04

Set JAVA_HOME in spark-env.sh, stop all the processes, start over from scratch, and then check the logs again. As for Standalone mode, I suggest using it only for experimentation; it is rarely used in production.
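A minimal sketch of that fix, using the JDK and Spark paths from the question. One likely reason ~/.bash_profile is not enough: start-all.sh launches each worker over ssh, and the non-interactive shell ssh spawns does not source ~/.bash_profile, so Spark's own conf/spark-env.sh is the reliable place to export JAVA_HOME:

cd /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0
# create spark-env.sh from the bundled template if it does not exist yet
cp -n conf/spark-env.sh.template conf/spark-env.sh
# export the JDK path shown in the question
echo 'export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_161' >> conf/spark-env.sh
# stop all daemons, start over, then re-check the worker log
sbin/stop-all.sh
sbin/start-all.sh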

starkpan
Teacher, I got this error too. A method I found online was to add JAVA_HOME to spark-config.sh, and after that startup worked normally. Do spark-env.sh and spark-config.sh differ in scope? export JAVA_HOME=<local JAVA_HOME path>
2018-08-18

Michael_PK

2018-08-18

Add JAVA_HOME in spark-env.sh.
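On the scope question above, a note on the Spark 2.1.0 layout: sbin/spark-config.sh is an internal helper sourced only by the launch scripts under sbin/, while conf/spark-env.sh is the documented per-machine configuration file, loaded through bin/load-spark-env.sh by both the daemon scripts and the bin/ client scripts. Exporting JAVA_HOME in spark-config.sh happens to work for the sbin launch scripts, but conf/spark-env.sh is the supported place and also covers spark-submit. A quick way to check which scripts source each file, run from the Spark install directory:

# scripts that source the internal sbin helper
grep -rl "spark-config.sh" sbin/ bin/
# scripts that load conf/spark-env.sh via load-spark-env.sh
grep -rl "load-spark-env.sh" sbin/ bin/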

