[Posted]: 2015-04-19 06:41:36
[Problem Description]:
I have installed Scala, sbt, and Hadoop 1.0.3 on an Ubuntu 12.04 guest OS. Following this guide - http://docs.sigmoidanalytics.com/index.php/How_to_Install_Spark_on_Ubuntu-12.04 - I tried to build Spark and got an error about reserving heap space.
This is what I am running:
hduser@vignesh-desktop:/usr/local/spark-1.1.0$ SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly
It fails with the following error:
Using /usr/lib/jvm/java-6-openjdk-i386/ as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
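The error means the 32-bit OpenJDK cannot reserve the heap that sbt requests by default, which typically happens when the VM has little RAM or a constrained address space. A common workaround is to cap the JVM heap before re-running the build via the `_JAVA_OPTIONS` environment variable, which every JVM launched afterwards picks up. The sizes below are illustrative assumptions; tune them to fit your machine:

```shell
# Cap the heap so the 32-bit JVM can reserve it
# (128 MB initial / 512 MB max are illustrative values)
export _JAVA_OPTIONS="-Xms128m -Xmx512m"

# Then re-run the build in the same shell:
# SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly

echo "$_JAVA_OPTIONS"
```

When the variable is set, the JVM prints a `Picked up _JAVA_OPTIONS` notice at startup, confirming the limits took effect.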
[Discussion]:
Tags: java scala sbt apache-spark