Question: Unable to download the required Spark and Scala dependencies

Source: 2-16 Submitting a Spark application with spark-submit in local mode

weixin_慕前端2537682

2020-09-14

The Spark and Scala versions I am using, set by modifying the pom.xml file:
[screenshot of the modified pom.xml]
Problem: the Spark and Scala dependencies cannot be downloaded. The build fails with:
Failed to execute goal on project sparksql-train: Could not resolve dependencies for project com.imooc.bigdata:sparksql-train:jar:1.0: The following artifacts could not be resolved: org.apache.spark:spark-sql_2.11:jar:3.0.1, org.apache.spark:spark-hive_2.11:jar:3.0.1, org.apache.spark:spark-hive-thriftserver_2.11:jar:3.0.1: Failure to find org.apache.spark:spark-sql_2.11:jar:3.0.1 in https://repository.cloudera.com/artifactory/cloudera-repos was cached in the local repository, resolution will not be reattempted until the update interval of cloudera has elapsed or updates are forced -> [Help 1]

2 Answers

Michael_PK

2020-09-14

 org.apache.spark:spark-sql_2.11:jar:3.0.1, org.apache.spark:spark-hive_2.11:jar:3.0.1, org.apache.spark:spark-hive-thriftserver_2.11:jar:3.0.1: Failure to find org.apache.spark:spark-sql_2.11:jar:3.0.1


These directories do exist in your local repository, but the partially-downloaded *.lastUpdated marker files are still in them. Delete the *.lastUpdated files, then re-import the project.
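
For reference, a minimal sketch of that cleanup, assuming the default local repository at ~/.m2/repository (adjust the path if your settings.xml sets a different localRepository):

# Delete the stale *.lastUpdated marker files so Maven will retry the downloads
find ~/.m2/repository/org/apache/spark -name "*.lastUpdated" -delete

# Then force Maven to re-resolve (-U forces a re-check of previously failed downloads),
# or simply re-import the project in the IDE
mvn -U clean compile

The *.lastUpdated markers are what the error message means by "was cached in the local repository"; until they are removed or updates are forced, Maven will not retry the download.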

Michael_PK replied to weixin_慕前端2537682 (2020-09-15): Did it work after changing the Scala version?
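
Changing the Scala version is indeed the root-cause fix here: Spark 3.0.1 is published only for Scala 2.12, so an artifact such as spark-sql_2.11:jar:3.0.1 does not exist in any repository (the _2.11 builds end with the Spark 2.4.x line). A minimal pom.xml sketch with matching versions (the property names are illustrative, not from the original post):

<properties>
    <scala.binary.version>2.12</scala.binary.version>
    <spark.version>3.0.1</spark.version>
</properties>

<dependencies>
    <!-- The artifact ID suffix is the Scala binary version; it must match the Spark build -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive-thriftserver_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>

Alternatively, keep the _2.11 suffix and use a Spark 2.4.x version, the last release line published for Scala 2.11.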

Michael_PK

2020-09-14

in https://repository.cloudera.com/artifactory/cloudera-repos was cached in the local repository, resolution will not be reattempted until the update interval of cloudera has elapsed or updates are forced 

The key point is here. Go into your local Maven repository and find the directories for org.apache.spark:spark-sql_2.11:jar:3.0.1, org.apache.spark:spark-hive_2.11:jar:3.0.1, and org.apache.spark:spark-hive-thriftserver_2.11:jar:3.0.1.


Those local directories are bound to contain incomplete *.lastUpdated files left over from the failed downloads. Delete them and re-import.
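
The "update interval of cloudera" in the message refers to the repository's updatePolicy: once a download fails, Maven caches the failure and will not retry until the interval elapses or updates are forced (mvn -U). If immediate retries are wanted, the policy can be shortened in the repository definition; a sketch, assuming the Cloudera repository is declared in the project's pom.xml (it may instead live in settings.xml):

<repositories>
    <repository>
        <id>cloudera</id>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos</url>
        <releases>
            <!-- 'always' makes Maven re-check on every build instead of caching the failure -->
            <updatePolicy>always</updatePolicy>
        </releases>
    </repository>
</repositories>

Note, though, that no policy setting will make spark-sql_2.11:jar:3.0.1 downloadable, since that artifact was never published; the Scala/Spark version mismatch still has to be fixed in the pom.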
