【Question Title】: When I try and run pyspark.cmd I get the error message "find: 'version': No such file or directory"
【Posted】: 2015-02-03 15:56:11
【Question Description】:

I am trying to get started with Apache Spark, and I want to use it from Python. However, when I run pyspark from the command line, I get the following error message:

C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\bin>pyspark.cmd
Running python with PYTHONPATH=C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\bin\..\python\lib\py4j-0.8.2.1-src.zip;C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\bin\..\python;
Python 2.7.8 |Anaconda 2.1.0 (32-bit)| (default, Jul  2 2014, 15:13:35) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://binstar.org
find: 'version': No such file or directory
else was unexpected at this time.
Traceback (most recent call last):
  File "C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\bin\..\python\pyspark\shell.py", line 45, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\python\pyspark\context.py", line 102, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\python\pyspark\context.py", line 211, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\python\pyspark\java_gateway.py", line 73, in launch_gateway
    raise Exception(error_msg)
Exception: Launching GatewayServer failed with exit code 255!
Warning: Expected GatewayServer to output a port, but found no output.

When I try to run the Scala shell with spark-shell, I get the following message:

find: 'version': No such file or directory
else was unexpected at this time.

I could not find any information about this error online except https://issues.apache.org/jira/browse/SPARK-3808, which turned out to be a dead end. Please help!

【Question Comments】:

    Tags: python apache-spark anaconda


    【Solution 1】:

    I ran into the same problem with Spark 1.2.0, but not with Spark 1.0.2. In my case the cause was that Cygwin was on my DOS PATH. Spark uses the find command in the file "spark-class2.cmd", so the Cygwin find command was being run instead of the DOS find command, and the two behave differently. I removed Cygwin from the DOS PATH, which solved the problem.
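
    To confirm this is what is happening before editing the system PATH, you can check which find binary wins on the PATH and try a session-only workaround. This is a minimal sketch assuming a default Windows install; the Spark path is the one from the question and may differ on your machine:

    :: List every find.exe reachable via the PATH. The first entry printed is
    :: the one spark-class2.cmd will actually invoke. If a Cygwin (or Git /
    :: GnuWin32) directory appears before C:\Windows\System32, that is the culprit.
    where find

    :: Session-only workaround: put System32 first so the native find.exe is
    :: resolved before Cygwin's. This only affects the current cmd window.
    set PATH=C:\Windows\System32;%PATH%

    :: Relaunch the shell (path taken from the question above).
    C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\bin\pyspark.cmd

    The permanent fix is the one described above: edit the PATH in the system environment variables so the Cygwin bin directory no longer shadows the native Windows find, or remove it entirely.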

    Regards, Felix

    【Comments】:

    • Thanks - in my case it was a different program on the PATH that ships its own find command, but this was exactly the problem.