Startup error with the chapter 14 source code and data: is it that the model data can't be loaded on Windows?

Source: 14-1 Introduction to the click-through-rate (CTR) prediction model

master_D

2020-07-10

Caused by: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: (null) entry in command string: null ls -F C:\Users\86188\Desktop\dianping\data\lrmode\metadata\part-00000
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:762)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:859)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:842)
	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:587)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:562)
	at org.apache.hadoop.fs.LocatedFileStatus.<init>(LocatedFileStatus.java:47)
	at org.apache.hadoop.fs.FileSystem$4.next(FileSystem.java:1701)
	at org.apache.hadoop.fs.FileSystem$4.next(FileSystem.java:1681)
	at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:268)
	at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
	at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
	at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:204)
	at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:253)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:49)
	at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:253)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
	at org.apache.spark.rdd.RDD.$anonfun$take$1(RDD.scala:1343)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
	at org.apache.spark.rdd.RDD.take(RDD.scala:1337)
	at org.apache.spark.rdd.RDD.$anonfun$first$1(RDD.scala:1378)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
	at org.apache.spark.rdd.RDD.first(RDD.scala:1378)
	at org.apache.spark.ml.util.DefaultParamsReader$.loadMetadata(ReadWrite.scala:615)
	at org.apache.spark.ml.classification.LogisticRegressionModel$LogisticRegressionModelReader.load(LogisticRegression.scala:1251)
	at org.apache.spark.ml.classification.LogisticRegressionModel$LogisticRegressionModelReader.load(LogisticRegression.scala:1245)
	at org.apache.spark.ml.util.MLReadable.load(ReadWrite.scala:380)
	at org.apache.spark.ml.util.MLReadable.load$(ReadWrite.scala:380)
	at org.apache.spark.ml.classification.LogisticRegressionModel$.load(LogisticRegression.scala:1220)
	at org.apache.spark.ml.classification.LogisticRegressionModel.load(LogisticRegression.scala)
	at com.imooc.dianping.recommend.RecommendSortService.init(RecommendSortService.java:31)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:363)
	at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:307)
	at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:136)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:414)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1770)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:593)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1251)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1171)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:593)
	at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:90)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessProperties(AutowiredAnnotationBeanPostProcessor.java:374)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1411)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:592)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1251)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1171)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:593)
	at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:90)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessProperties(AutowiredAnnotationBeanPostProcessor.java:374)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1411)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:592)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:845)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:877)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:549)
	at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140)
	at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:742)
	at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:389)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:311)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1213)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1202)
	at com.imooc.dianping.DianpingApplication.main(DianpingApplication.java:16)


From the exception, it is the load call at line 31 of RecommendSortService.java that fails.
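For context, here is a minimal sketch of what that init method likely looks like; the field name lrModel and the Spring annotations are assumptions, and the model path is the one from the error message (point it at wherever you extracted the chapter 14 data):

import javax.annotation.PostConstruct;
import org.apache.spark.ml.classification.LogisticRegressionModel;
import org.springframework.stereotype.Service;

@Service
public class RecommendSortService {
    private LogisticRegressionModel lrModel;

    @PostConstruct
    public void init() {
        // Line 31 in the trace. On Windows, Hadoop's RawLocalFileSystem shells out
        // to read file permissions while listing the model's metadata directory;
        // without winutils.exe and a valid hadoop.home.dir, the command string
        // contains a null and the load fails with the IOException shown above.
        lrModel = LogisticRegressionModel.load(
                "C:\\Users\\86188\\Desktop\\dianping\\data\\lrmode");
    }
}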



3 Answers

慕设计7056922

2024-06-25

I ran into the same problem; the following steps fixed it completely:

1. Install the matching version of Hadoop locally and configure the environment variables.

2. Download winutils for the same Hadoop version and copy the entire contents of its bin folder into the hadoop/bin/ directory, overwriting any duplicate files.

3. In IDEA, set hadoop.home.dir on the first line of the main method:

System.setProperty("hadoop.home.dir", "D:\\soft\\hadoop-2.6.5");
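
For reference, a minimal sketch of how step 3 fits into the application entry point; DianpingApplication.main is the class from the stack trace, and D:\soft\hadoop-2.6.5 is just this answer's example path (use your own Hadoop install directory, the one containing bin\winutils.exe):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class DianpingApplication {
    public static void main(String[] args) {
        // Set this before any Spark/Hadoop class is loaded, so Hadoop can
        // locate %HADOOP_HOME%\bin\winutils.exe when it checks file permissions.
        System.setProperty("hadoop.home.dir", "D:\\soft\\hadoop-2.6.5");
        SpringApplication.run(DianpingApplication.class, args);
    }
}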



liuyang_v

2021-06-05

Has this been solved? I'm running into the same problem too.


龙虾三少

2020-07-10

You need to change the path to your own local one.

龙虾三少 replied to master_D (2020-07-21):
There is a null in the error message; check whether the path and the files actually exist.
