Dependency conflict problem

Source: 8-6 Requirement 1: Parsing the IP rules library

慕运维8677934

2021-05-23

Hi PK, I was previously on Scala 2.12. To install Kudu 1.4 I downgraded Scala to 2.11.8, but after importing the pom file I get this error at runtime:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

The line it points to is:

val spark: SparkSession = SparkSession.builder().master("local[2]").appName("logETLApp").getOrCreate()
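
For context: scala.Predef.refArrayOps changed its bytecode signature between Scala 2.11 and 2.12, so this NoSuchMethodError almost always means code compiled against one of the two versions is running with the other version's scala-library on the classpath. A quick sanity check (a minimal sketch; the object name is just illustrative) is to print the Scala version the JVM actually loaded:

// Minimal sanity-check sketch: prints the Scala version on the runtime
// classpath. If this reports 2.12.x while the pom pins 2.11.8, a stale
// 2.12 scala-library is still leaking in from somewhere.
object VersionCheck extends App {
  println(scala.util.Properties.versionString) // e.g. "version 2.11.8"
}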

Here is my pom file:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>like</groupId>
<artifactId>sparkSql</artifactId>
<version>1.0-SNAPSHOT</version>
<inceptionYear>2008</inceptionYear>

<properties>
  <maven.compiler.source>1.8</maven.compiler.source>
  <maven.compiler.target>1.8</maven.compiler.target>
  <encoding>UTF-8</encoding>
  <scala.tools.version>2.11</scala.tools.version>
  <scala.version>2.11.8</scala.version>
  <spark.version>2.4.3</spark.version>
  <hadoop.version>2.6.0-cdh5.15.1</hadoop.version>
</properties>

<repositories>
  <repository>
    <id>cloudera</id>
    <url>https://repository.cloudera.com/artifactory/cloudera-repos</url>
  </repository>
</repositories>

<dependencies>
<!-- Scala -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>${scala.version}</version>
</dependency>

<!-- Spark SQL dependencies -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>${spark.version}</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive_2.11</artifactId>
  <version>${spark.version}</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive-thriftserver_2.11</artifactId>
  <version>${spark.version}</version>
</dependency>

<!-- Hadoop-related dependencies -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>${hadoop.version}</version>
</dependency>

<dependency>
  <groupId>mysql</groupId>
  <artifactId>mysql-connector-java</artifactId>
  <version>5.1.47</version>
</dependency>

<dependency>
  <groupId>com.typesafe</groupId>
  <artifactId>config</artifactId>
  <version>1.3.3</version>
</dependency>

<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>1.2.1</version>
</dependency>

<dependency>
  <groupId>org.apache.kudu</groupId>
  <artifactId>kudu-client</artifactId>
  <version>1.4.0</version>
</dependency>

<dependency>
  <groupId>org.apache.kudu</groupId>
  <artifactId>kudu-spark2_2.11</artifactId>
  <version>1.4.0</version>
</dependency>

<!-- Test -->
<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>4.11</version>
  <scope>test</scope>
</dependency>
</dependencies>

<build>
  <sourceDirectory>src/main/scala</sourceDirectory>
  <testSourceDirectory>src/test/scala</testSourceDirectory>
  <plugins>
    <plugin>
      <groupId>org.scala-tools</groupId>
      <artifactId>maven-scala-plugin</artifactId>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>testCompile</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <scalaVersion>${scala.version}</scalaVersion>
        <args>
          <arg>-target:jvm-1.8</arg>
        </args>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-eclipse-plugin</artifactId>
      <configuration>
        <downloadSources>true</downloadSources>
        <buildcommands>
          <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
        </buildcommands>
        <additionalProjectnatures>
          <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
        </additionalProjectnatures>
        <classpathContainers>
          <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
          <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
        </classpathContainers>
      </configuration>
    </plugin>
  </plugins>
</build>

<reporting>
  <plugins>
    <plugin>
      <groupId>org.scala-tools</groupId>
      <artifactId>maven-scala-plugin</artifactId>
      <configuration>
        <scalaVersion>${scala.version}</scalaVersion>
      </configuration>
    </plugin>
  </plugins>
</reporting>
</project>
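
Note that the pom above pins everything to 2.11 consistently, so the stray 2.12 classes must be coming from outside this file, typically the IDE's module dependencies or a stale local cache; running mvn dependency:tree -Dincludes=org.scala-lang shows which scala-library actually wins. As a longer-term guard, a sketch using the Maven Enforcer plugin can fail the build whenever a _2.12 artifact sneaks in (the plugin version and exclude patterns here are assumptions to verify against the enforcer docs):

<!-- Sketch: fail the build if any Scala 2.12 artifact appears.
     Plugin version and exclude patterns are assumptions to adapt. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.0.0-M3</version>
  <executions>
    <execution>
      <id>ban-scala-2.12</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <bannedDependencies>
            <excludes>
              <exclude>org.scala-lang:scala-library:[2.12,)</exclude>
              <exclude>*:*_2.12</exclude>
            </excludes>
          </bannedDependencies>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>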

1 Answer

Michael_PK

2021-05-23

1) In the project you created, check whether there is anything inside src/test/scala. If so, just delete it.

2) Open the project's Project Structure, select your project, and look at the Dependencies list on the right; delete everything that belongs to 2.12.
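
To see why a plain SparkSession.builder() line can trigger this: any collection-style method on an Array of reference types compiles down to a call to scala.Predef.refArrayOps, and Spark's 2.11-built classes make such calls internally. A minimal repro sketch (illustrative only, assuming a 2.11.8 compile followed by a run against a 2.12 scala-library):

// Compile with Scala 2.11.8, then run with a 2.12 scala-library on the
// classpath to reproduce the same NoSuchMethodError: mkString on an
// Array[String] desugars to Predef.refArrayOps(parts).mkString(","),
// and refArrayOps has different bytecode signatures in 2.11 and 2.12.
object ReproSketch extends App {
  val parts: Array[String] = Array("a", "b", "c")
  println(parts.mkString(","))
}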

慕运维8677934 replied to Michael_PK:

It's fixed now, teacher. I deleted the Spark and Scala packages from my local Maven repository, then did a reimport, and it worked.

2021-05-29
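
For anyone hitting the same thing: Maven's local repository defaults to ~/.m2/repository, so the artifacts deleted here live under ~/.m2/repository/org/scala-lang and ~/.m2/repository/org/apache/spark (assuming the default repository location). Removing them and reimporting the Maven project forces a fresh download resolved strictly against the 2.11 coordinates declared in the pom.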
