【Title】: How to run scala code in spark container using docker?
【Posted】: 2020-01-04 17:32:55
【Question】:

I created a Spark container using the following Dockerfile:

FROM ubuntu:16.04

RUN apt-get update -y && apt-get install -y \
default-jdk \
nano \
wget && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

RUN useradd --create-home --shell /bin/bash ubuntu

ENV HOME /home/ubuntu
ENV SPARK_VERSION 2.4.3
ENV HADOOP_VERSION 2.6
ENV MONGO_SPARK_VERSION 2.2.0
ENV SCALA_VERSION 2.11

WORKDIR ${HOME}

ENV SPARK_HOME ${HOME}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}
ENV PATH ${PATH}:${SPARK_HOME}/bin

COPY files/times.json /home/ubuntu/times.json
COPY files/README.md /home/ubuntu/README.md
COPY files/examples.scala /home/ubuntu/examples.scala
COPY files/initDocuments.scala /home/ubuntu/initDocuments.scala

RUN chown -R ubuntu:ubuntu /home/ubuntu/*
USER ubuntu

# get spark
RUN wget http://apache.mirror.digitalpacific.com.au/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz && \
tar xvf spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz

RUN rm -fv spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz

I also have two files written in the Scala programming language, which is quite new to me. The problem is that the container only has Java installed and nothing else. Is there any way to run the Scala code without installing additional programs?

The file names are examples.scala and initDocuments.scala. Here is the initDocuments.scala file:

import com.mongodb.spark._
import com.mongodb.spark.config._
import org.bson.Document

// Load the collection configured via spark.mongodb.input.uri
val rdd = MongoSpark.load(sc)

if (rdd.count < 1) {
    // Parse each line of times.json into a BSON Document and save it
    val t = sc.textFile("times.json")
    val converted = t.map(line => Document.parse(line))
    converted.saveToMongoDB(WriteConfig(Map("uri" -> "mongodb://mongodb/spark.times")))
    println("Documents inserted.")
} else {
    println("Database 'spark' collection 'times' is not empty. Maybe you've already loaded data into it? Skipping.")
}
System.exit(0)

I also tried the following command, but it did not work.

spark-shell --conf "spark.mongodb.input.uri=mongodb://mongodb:27017/spark.times" --conf "spark.mongodb.output.uri=mongodb://mongodb/spark.output" --packages org.mongodb.spark:mongo-spark-connector_${SCALA_VERSION}:${MONGO_SPARK_VERSION} -i ./initDocuments.scala

Error:

Ivy Default Cache set to: /home/ubuntu/.ivy2/cache
The jars for the packages stored in: /home/ubuntu/.ivy2/jars
:: loading settings :: url = jar:file:/home/ubuntu/spark-2.4.3-bin-hadoop2.6/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.mongodb.spark#mongo-spark-connector_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-d0f95242-e9b9-4d49-8dde-42afc7c55e9a;1.0
        confs: [default]
You probably access the destination server through a proxy server that is not well configured.
You probably access the destination server through a proxy server that is not well configured.
You probably access the destination server through a proxy server that is not well configured.
You probably access the destination server through a proxy server that is not well configured.
:: resolution report :: resolve 40879ms :: artifacts dl 0ms
        :: modules in use:
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
        Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.11/2.2.0/mongo-spark-connector_2.11-2.2.0.pom

        Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.11/2.2.0/mongo-spark-connector_2.11-2.2.0.jar

        Host dl.bintray.com not found. url=https://dl.bintray.com/spark-packages/maven/org/mongodb/spark/mongo-spark-connector_2.11/2.2.0/mongo-spark-connector_2.11-2.2.0.pom

        Host dl.bintray.com not found. url=https://dl.bintray.com/spark-packages/maven/org/mongodb/spark/mongo-spark-connector_2.11/2.2.0/mongo-spark-connector_2.11-2.2.0.jar

                module not found: org.mongodb.spark#mongo-spark-connector_2.11;2.2.0

        ==== local-m2-cache: tried

          file:/home/ubuntu/.m2/repository/org/mongodb/spark/mongo-spark-connector_2.11/2.2.0/mongo-spark-connector_2.11-2.2.0.pom

          -- artifact org.mongodb.spark#mongo-spark-connector_2.11;2.2.0!mongo-spark-connector_2.11.jar:

          file:/home/ubuntu/.m2/repository/org/mongodb/spark/mongo-spark-connector_2.11/2.2.0/mongo-spark-connector_2.11-2.2.0.jar

        ==== local-ivy-cache: tried

          /home/ubuntu/.ivy2/local/org.mongodb.spark/mongo-spark-connector_2.11/2.2.0/ivys/ivy.xml

          -- artifact org.mongodb.spark#mongo-spark-connector_2.11;2.2.0!mongo-spark-connector_2.11.jar:

          /home/ubuntu/.ivy2/local/org.mongodb.spark/mongo-spark-connector_2.11/2.2.0/jars/mongo-spark-connector_2.11.jar

        ==== central: tried

          https://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.11/2.2.0/mongo-spark-connector_2.11-2.2.0.pom

          -- artifact org.mongodb.spark#mongo-spark-connector_2.11;2.2.0!mongo-spark-connector_2.11.jar:

          https://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.11/2.2.0/mongo-spark-connector_2.11-2.2.0.jar

        ==== spark-packages: tried

          https://dl.bintray.com/spark-packages/maven/org/mongodb/spark/mongo-spark-connector_2.11/2.2.0/mongo-spark-connector_2.11-2.2.0.pom

          -- artifact org.mongodb.spark#mongo-spark-connector_2.11;2.2.0!mongo-spark-connector_2.11.jar:

          https://dl.bintray.com/spark-packages/maven/org/mongodb/spark/mongo-spark-connector_2.11/2.2.0/mongo-spark-connector_2.11-2.2.0.jar

                ::::::::::::::::::::::::::::::::::::::::::::::

                ::          UNRESOLVED DEPENDENCIES         ::

                ::::::::::::::::::::::::::::::::::::::::::::::

                :: org.mongodb.spark#mongo-spark-connector_2.11;2.2.0: not found

                ::::::::::::::::::::::::::::::::::::::::::::::



:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.mongodb.spark#mongo-spark-connector_2.11;2.2.0: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1306)
        at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:54)
        at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:315)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

PS: I tried to change the proxy address with the following command, but I don't think I have a working proxy available. I would appreciate it if someone could help me set up a properly configured proxy to solve the download problem.

export JAVA_OPTS="$JAVA_OPTS -Dhttp.proxyHost=yourserver -Dhttp.proxyPort=8080 -Dhttp.proxyUser=username -Dhttp.proxyPassword=password"
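
A note on this: the Spark launch scripts do not appear to read JAVA_OPTS. In client mode the Ivy resolution runs in the driver JVM, so proxy properties can instead be handed over with --driver-java-options. A minimal sketch, assuming a reachable proxy at proxy.example.com:8080 (host, port, and the connector version are placeholders to adapt):

spark-shell \
  --driver-java-options "-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080" \
  --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.6 \
  -i ./initDocuments.scala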

【Comments】:

  • Build and run a jar.
  • @Lamanus Could you elaborate? I'm new to Scala.
  • It's similar to Java: you can't run Scala source directly, it needs to be compiled first.
  • @Lamanus I'm not familiar with Java either. sbt, Maven, etc. are not installed; I only installed Java in that container. Could you explain the steps as an answer?
  • For the proxy configuration, you may want to look at the answers to this question.

Tags: scala docker apache-spark ubuntu


【Solution 1】:

The error message below:

:: org.mongodb.spark#mongo-spark-connector_2.11;2.2.0: not found

indicates that the package is missing. Check the currently available MongoDB Connector for Spark packages and you can confirm that v2.2.0 is no longer published (it was replaced by the patched v2.2.6).
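
For example, the same invocation pinned to a version that is still published should resolve. A sketch assuming v2.2.6; check the connector's release list for the match to your Spark and Scala versions:

spark-shell \
  --conf "spark.mongodb.input.uri=mongodb://mongodb:27017/spark.times" \
  --conf "spark.mongodb.output.uri=mongodb://mongodb/spark.output" \
  --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.6 \
  -i ./initDocuments.scala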

You can see an updated example of the MongoDB Spark connector with Docker at sindbach/mongodb-spark-docker.

Additional information: spark-shell is a REPL (Read-Eval-Print Loop) tool, an interactive shell that programmers use to interact with the framework. You do not need an explicit build step to execute code with it. When you specify the --packages argument to spark-shell, it automatically fetches the package and includes it in your shell environment.
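
To see the no-build workflow in isolation, you can feed any Scala statements to the shell with -i (the file name here is just an illustration):

echo 'println("Running on Spark " + sc.version)' > check.scala
spark-shell -i check.scala

The source file is compiled and evaluated inside the REPL session, so a plain .scala file is all you need; no jar is produced.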

【Discussion】:

  • Hi Wan, thanks for updating the repository and for the good support. I still can't get it to work as expected and now hit a new error like org.mongodb.spark#mongo-spark-connector_2.11;2.4.1: not found. I'd better add the required files manually, since the possible solutions I've checked didn't help. For example, here is a link to see
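
If the repositories stay unreachable from the container, a workaround is to download the jars once on a machine with network access, copy them into the image, and point spark-shell at them with --jars instead of --packages. A sketch assuming the v2.2.6 connector; the mongo-java-driver version is illustrative and should match what that connector release declares:

wget https://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.11/2.2.6/mongo-spark-connector_2.11-2.2.6.jar
wget https://repo1.maven.org/maven2/org/mongodb/mongo-java-driver/3.12.5/mongo-java-driver-3.12.5.jar

spark-shell \
  --conf "spark.mongodb.input.uri=mongodb://mongodb:27017/spark.times" \
  --conf "spark.mongodb.output.uri=mongodb://mongodb/spark.output" \
  --jars mongo-spark-connector_2.11-2.2.6.jar,mongo-java-driver-3.12.5.jar \
  -i ./initDocuments.scala

With --jars nothing is resolved through Ivy, so no proxy is needed at run time.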