【Title】: Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class [duplicate]
【Posted】: 2019-05-16 19:54:27
【Description】:

build.sbt

name := "BigData"

version := "0.1"

scalaVersion := "2.12.7"

libraryDependencies += "com.github.tototoshi" %% "scala-csv" % "1.3.5"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0"
// https://mvnrepository.com/artifact/com.microsoft.sqlserver/mssql-jdbc
libraryDependencies += "com.microsoft.sqlserver" % "mssql-jdbc" % "6.1.0.jre8"
libraryDependencies += "com.databricks" % "spark-xml_2.11" % "0.4.1"

// https://mvnrepository.com/artifact/com.typesafe.akka/akka-actor
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.5.19"
// https://mvnrepository.com/artifact/com.typesafe.akka/akka-http
libraryDependencies += "com.typesafe.akka" %% "akka-http" % "10.1.5"
// https://mvnrepository.com/artifact/com.typesafe.akka/akka-stream
libraryDependencies += "com.typesafe.akka" %% "akka-stream" % "2.5.19"

// https://mvnrepository.com/artifact/org.apache.livy/livy-core
libraryDependencies += "org.apache.livy" %% "livy-core" % "0.5.0-incubating"

Code written with Scala and Spark:

import org.apache.spark.sql.SparkSession

object sparkXml {
  def main(args: Array[String]): Unit = {

    val spark = SparkSession
      .builder
      .master("local[*]")
      //.config("spark.debug.maxToStringFields", "100")
      .appName("Insight Application Big Data")
      .getOrCreate()

    val df = spark.read
      .format("com.databricks.spark.xml")
      .option("rowTag", "book")
      .load("src/main/resources/in/books.xml")
    df.printSchema()

  }
}

Error message:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
    at com.databricks.spark.xml.XmlRelation.<init>(XmlRelation.scala:35)
    at com.databricks.spark.xml.DefaultSource.createRelation(DefaultSource.scala:65)
    at com.databricks.spark.xml.DefaultSource.createRelation(DefaultSource.scala:43)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
    at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
    at com.amkcambodia.insight.app.components.sparkXml$.main(sparkXml.scala:16)
    at com.amkcambodia.insight.app.components.sparkXml.main(sparkXml.scala)
Caused by: java.lang.ClassNotFoundException: scala.Product$class
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 9 more
18/12/16 07:15:17 INFO SparkContext: Invoking stop() from shutdown hook

【Comments】:

  • In particular, you are using Scala 2.12 everywhere, but for spark-xml you pin 2.11. Using "com.databricks" %% "spark-xml" % "0.4.1" should fix the problem.
  • Still the same problem after that change.
  • How are you running it? sbt run?
  • From the IntelliJ IDE.
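The mismatch the first comment points at comes from the two sbt dependency operators: `%%` appends the project's Scala binary version to the artifact name, while `%` uses the artifact name verbatim. A minimal sketch, using the artifact coordinates from the question's build.sbt:

```scala
// build.sbt fragment, assuming scalaVersion := "2.12.7"

// `%%` expands the artifact name with the Scala binary version,
// so this line resolves the artifact "spark-xml_2.12":
libraryDependencies += "com.databricks" %% "spark-xml" % "0.4.1"

// `%` takes the artifact name literally, so this line pins "_2.11"
// no matter what scalaVersion is set to -- the mismatch in the question:
libraryDependencies += "com.databricks" % "spark-xml_2.11" % "0.4.1"
```

Because the rest of the dependencies use `%%`, they all resolve against Scala 2.12, and mixing in a `_2.11` artifact produces exactly the `scala/Product$class` error at runtime.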

Tags: scala apache-spark


【Solution 1】:

A com.databricks spark-xml build for Scala 2.12 is not currently available in the Maven repository: https://mvnrepository.com/artifact/com.databricks/spark-xml

Downgrading to Scala 2.11 should solve the problem. Try the following version changes:

scalaVersion := "2.11.12"
libraryDependencies += "com.databricks" % "spark-xml_2.11" % "0.4.1"
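Putting the answer together, a build.sbt where every dependency consistently targets Scala 2.11 might look like the sketch below (versions taken from the question; with `%%`, sbt resolves the `_2.11` artifacts automatically, so the `_2.11` suffix no longer needs to be hard-coded):

```scala
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-sql"  % "2.4.0",
  // `%%` now resolves spark-xml_2.11 to match scalaVersion
  "com.databricks"   %% "spark-xml"  % "0.4.1"
)
```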

【Discussion】:

  • After the change I get Exception in thread "main" java.lang.ExceptionInInitializerError
  • Scala 2.12 can be used now.