java.lang.ClassNotFoundException: com.imooc.bigdata.SparkWordCountApp
Source: 2-17 Submitting a Spark Application in YARN Mode

慕娘1366764
2020-12-06
The specific problem is:
java.lang.ClassNotFoundException: com.imooc.bigdata.SparkWordCountApp
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:810)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I have searched for this problem many times and tried different ways to solve it, but nothing works.
My command is:
./bin/spark-submit --class com.imooc.bigdata.SparkWordCountApp\
--master yarn\
--name SparkWordCountApp\
/home/hadoop/lib/sparksql-train-1.0.jar\
hdfs://hadoop000:8020/pk/wc.data hdfs://hadoop000:8020/pk/out
I have also checked my Scala file name and the contents of the file; both are SparkWordCountApp.
1 Answer
Michael_PK
2020-12-06
Three possibilities: first, check whether the fully qualified class name is correct; second, check whether the path to the jar in the script is correct; third, check whether the class has actually been packaged into the jar.
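To make these checks concrete (a sketch; the jar path is taken from the command above, and the expected entries assume a standard Scala object compiled under that package):

jar tf /home/hadoop/lib/sparksql-train-1.0.jar | grep SparkWordCountApp

If the class was packaged, this should list entries such as com/imooc/bigdata/SparkWordCountApp.class and com/imooc/bigdata/SparkWordCountApp$.class; if nothing is listed, the build did not include the class. For the first point, the source file's first line must be package com.imooc.bigdata so that the fully qualified name matches the --class argument exactly. One more thing worth ruling out: in the command as posted there is no space before each trailing backslash, so in bash the arguments would run together (for example --class com.imooc.bigdata.SparkWordCountApp--master); this may only be a paste artifact, but adding a space before each \ is a quick check.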