Question: Unable to download the required Spark and Scala dependencies
Source: 2-16 Submitting a Spark application with spark-submit in local mode

weixin_慕前端2537682
2020-09-14
I modified pom.xml to match the Spark and Scala versions I am using myself, and now the related Spark and Scala dependencies cannot be downloaded:
Failed to execute goal on project sparksql-train: Could not resolve dependencies for project com.imooc.bigdata:sparksql-train:jar:1.0: The following artifacts could not be resolved: org.apache.spark:spark-sql_2.11:jar:3.0.1, org.apache.spark:spark-hive_2.11:jar:3.0.1, org.apache.spark:spark-hive-thriftserver_2.11:jar:3.0.1: Failure to find org.apache.spark:spark-sql_2.11:jar:3.0.1 in https://repository.cloudera.com/artifactory/cloudera-repos was cached in the local repository, resolution will not be reattempted until the update interval of cloudera has elapsed or updates are forced -> [Help 1]
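For reference: Spark 3.0.1 binaries are published only for Scala 2.12, so the _2.11 artifacts named in the error do not exist in any remote repository, which is why every repository lookup fails. A minimal pom.xml sketch with consistent versions might look like the following (the property names are illustrative; the coordinates mirror the error message):

```xml
<properties>
  <!-- Assumption: Spark 3.x ships only Scala 2.12 builds, so the
       artifact suffix must be _2.12 rather than _2.11 -->
  <scala.binary.version>2.12</scala.binary.version>
  <spark.version>3.0.1</spark.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive-thriftserver_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```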
2 Answers
Michael_PK
2020-09-14
org.apache.spark:spark-sql_2.11:jar:3.0.1, org.apache.spark:spark-hive_2.11:jar:3.0.1, org.apache.spark:spark-hive-thriftserver_2.11:jar:3.0.1: Failure to find org.apache.spark:spark-sql_2.11:jar:3.0.1
These directories do exist in your local repository, but they contain leftover files with "update" in their names (the *.lastUpdated stubs Maven writes after a failed download). Delete those files, then re-import the project.
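A rough sketch of that cleanup, assuming the default local repository at ~/.m2/repository (adjust the path if your settings.xml points localRepository elsewhere):

```bash
# Delete the stale *.lastUpdated stubs under the Spark group directory,
# then re-import the Maven project in your IDE
find ~/.m2/repository/org/apache/spark -type f -name "*.lastUpdated" -delete
```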
Michael_PK
2020-09-14
in https://repository.cloudera.com/artifactory/cloudera-repos was cached in the local repository, resolution will not be reattempted until the update interval of cloudera has elapsed or updates are forced
The key point is here. Go into your local Maven repository and locate the directories for org.apache.spark:spark-sql_2.11, org.apache.spark:spark-hive_2.11, and org.apache.spark:spark-hive-thriftserver_2.11. Those directories are bound to contain half-finished download files with "update" in their names; delete them and re-import.
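The "updates are forced" part of the message can also be triggered from the command line; a sketch (the goals shown are the usual defaults, not taken from this thread):

```bash
# -U forces Maven to re-check the remote repositories instead of
# honoring the cached resolution failure until the update interval expires
mvn clean package -U
```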