[Posted]: 2021-12-09 01:13:00
[Problem description]:
My classpath is missing classes such as Serializable and Cloneable, and I don't know how to resolve it.
I have an sbt project that looks like this:
name := "realtime-spark-streaming"
version := "0.1"
resolvers += "confluent" at "https://packages.confluent.io/maven/"
resolvers += "Public Maven Repository" at "https://repository.com/content/repositories/pangaea_releases"
val sparkVersion = "3.2.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.2.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.2.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.0"
libraryDependencies += "com.walmart.grcaml" % "us-aml-commons" % "latest.release"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
//libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "3.2.0" % "2.1.3"
//libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.7.12"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka
libraryDependencies += "org.apache.kafka" %% "kafka" % "6.1.0-ccs"
resolvers += Resolver.mavenLocal
scalaVersion := "2.13.6"
When I run an sbt build, I get:
Symbol 'type scala.package.Serializable' is missing from the classpath.
This symbol is required by 'class org.apache.spark.sql.SparkSession'.
Make sure that type Serializable is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'SparkSession.class' was compiled against an incompatible version of scala.package.
import org.apache.spark.sql.{DataFrame, SparkSession}
Symbol 'type scala.package.Serializable' is missing from the classpath.
This symbol is required by 'class org.apache.spark.sql.Dataset'.
Make sure that type Serializable is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'Dataset.class' was compiled against an incompatible version of scala.package.
def extractData(spark: SparkSession, configDetails: ReadProperties, pcSql: String, query: String): DataFrame = {
My dependency tree only shows the jars, but this looks like a class/package conflict or a missing dependency.
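The "Symbol 'type scala.package.Serializable' is missing from the classpath" error usually means that some jar on the classpath was compiled against a different Scala binary version than the one set by `scalaVersion` (here 2.13.6). A likely suspect is a dependency that is only published for Scala 2.12, for example the internal `us-aml-commons` artifact if it bundles or depends on 2.12 builds of Spark. A minimal sketch of a build.sbt that keeps every Scala dependency on a single binary version; this assumes all of these artifacts are actually published for that version, which should be verified on mvnrepository.com first:

```scala
// build.sbt sketch -- assumes every dependency below is published for Scala 2.12.
// sbt cannot mix Scala binary versions (_2.12 and _2.13 jars) on one classpath,
// so if one transitive dependency is 2.12-only, the whole build must be 2.12.
name := "realtime-spark-streaming"
version := "0.1"

scalaVersion := "2.12.15"

val sparkVersion = "3.2.0"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.12) automatically,
  // keeping all Spark modules on the same binary version.
  "org.apache.spark" %% "spark-core"                 % sparkVersion,
  "org.apache.spark" %% "spark-streaming"            % sparkVersion,
  "org.apache.spark" %% "spark-sql"                  % sparkVersion,
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
)
```

Running `sbt evicted` (and `sbt dependencyTree`, available via `addDependencyTreePlugin` on sbt 1.4+) lists the resolved jars; any artifact carrying a `_2.12` suffix while `scalaVersion` is 2.13 (or vice versa) reproduces exactly this compiler error.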
[Discussion]:
- How did you get this error? `sbt clean compile`?
- Just building the project from IntelliJ.
- Have you tried building with sbt directly, to rule out IntelliJ not having picked up a dependency change?
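Following the suggestion in the compiler output itself, the classpath that produces the error can be dumped by enabling `-Ylog-classpath` and then compiling from the sbt shell rather than from IntelliJ (a sketch; the flag only affects diagnostics, not the build result):

```scala
// Add to build.sbt: make scalac print the full compile classpath,
// so mixed _2.12/_2.13 jars become visible in the build log.
scalacOptions += "-Ylog-classpath"
```

Then run `sbt clean compile` from a terminal, which also rules out stale IntelliJ caches as the cause.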
Tags: java scala maven apache-spark sbt