【Title】: Hadoop error: cannot run start-all.sh
【Posted】: 2012-11-01 18:17:40
【Description】:

I have set up Hadoop in single-node mode on my laptop. Environment: Ubuntu 12.10, Oracle JDK 1.7, Hadoop installed from the .deb package. Locations: /etc/hadoop and /usr/share/hadoop.

In /usr/share/hadoop/templates/conf/core-site.xml I added two properties:

<property>
  <name>hadoop.tmp.dir</name>
  <value>/app/hadoop/tmp</value>
  <description>A base for other temporary directories.</description>
</property>

<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
  <description>The name of the default file system.  A URI whose
  scheme and authority determine the FileSystem implementation.  The
  uri's scheme determines the config property (fs.SCHEME.impl) naming
  the FileSystem implementation class.  The uri's authority is used to
  determine the host, port, etc. for a filesystem.</description>
</property>

In hdfs-site.xml:

<property>
  <name>dfs.replication</name>
  <value>1</value>
  <description>Default block replication.
  The actual number of replications can be specified when the file is created.
  The default is used if replication is not specified in create time.
  </description>
</property>

In mapred-site.xml:

<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
  <description>The host and port that the MapReduce job tracker runs
  at.  If "local", then jobs are run in-process as a single map
  and reduce task.
  </description>
</property>

When I start it from the command line with hduser@sepdau:~$ start-all.sh:

starting namenode, logging to /var/log/hadoop/hduser/hadoop-hduser-namenode-sepdau.com.out
localhost: starting datanode, logging to /var/log/hadoop/hduser/hadoop-hduser-datanode-sepdau.com.out
localhost: starting secondarynamenode, logging to /var/log/hadoop/hduser/hadoop-hduser-secondarynamenode-sepdau.com.out
starting jobtracker, logging to /var/log/hadoop/hduser/hadoop-hduser-jobtracker-sepdau.com.out
localhost: starting tasktracker, logging to /var/log/hadoop/hduser/hadoop-hduser-tasktracker-sepdau.com.out

But when I look at the processes with jps:

hduser@sepdau:~$ jps
13725 Jps

More:

 root@sepdau:/home/sepdau# netstat -plten | grep java
tcp6       0      0 :::8080                 :::*                    LISTEN      117        9953        1316/java       
tcp6       0      0 :::53976                :::*                    LISTEN      117        16755       1316/java       
tcp6       0      0 127.0.0.1:8700          :::*                    LISTEN      1000       786271      8323/java       
tcp6       0      0 :::59012                :::*                    LISTEN      117        16756       1316/java  

And when I run stop-all.sh:

hduser@sepdau:~$ stop-all.sh
no jobtracker to stop
localhost: no tasktracker to stop
no namenode to stop
localhost: no datanode to stop
localhost: no secondarynamenode to stop

In my hosts file:

hduser@sepdau:~$ cat /etc/hosts

127.0.0.1       localhost
127.0.1.1   sepdau.com



# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

In the slaves file: localhost. In the masters file: localhost.

Here are some logs:

hduser@sepdau:/home/sepdau$ start-all.sh
mkdir: cannot create directory `/var/run/hadoop': Permission denied
starting namenode, logging to /var/log/hadoop/hduser/hadoop-hduser-namenode-sepdau.com.out
/usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-namenode.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting datanode, logging to /var/log/hadoop/hduser/hadoop-hduser-datanode-sepdau.com.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-datanode.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting secondarynamenode, logging to /var/log/hadoop/hduser/hadoop-hduser-secondarynamenode-sepdau.com.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-secondarynamenode.pid: No such file or directory
mkdir: cannot create directory `/var/run/hadoop': Permission denied
starting jobtracker, logging to /var/log/hadoop/hduser/hadoop-hduser-jobtracker-sepdau.com.out
/usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-jobtracker.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting tasktracker, logging to /var/log/hadoop/hduser/hadoop-hduser-tasktracker-sepdau.com.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-tasktracker.pid: No such file or directory

I also tried as the root user, but it has the same problem.

What am I doing wrong here? And how do I connect from Eclipse using the Hadoop plugin? Thanks in advance.

【Comments】:

  • It helps to check the logs for errors to see what is happening. Look for the corresponding log files under your hadoop/logs directory and post the errors from there.
  • chmod does not add permissions for other users; try chown -R username:usergroup /var. But I don't understand why it is using /var when you have set hadoop.tmp.dir; can you double-check that?
  • I used the root user and it does start, but when I check the processes with jps there are none, and stop-all.sh says there is no process to stop.
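The chown suggestion above can be made more targeted: the startup log complains specifically about /var/run/hadoop, so it is enough to fix that one directory rather than all of /var. A minimal check, assuming hduser/hadoop are the daemon user and group from this setup:

```shell
# hadoop-daemon.sh writes its .pid files under /var/run/hadoop by default.
# Check whether that directory is writable by the daemon user before starting;
# if not, it has to be created and handed over as root.
PID_DIR=/var/run/hadoop
if [ ! -w "$PID_DIR" ]; then
    echo "No write access to $PID_DIR; as root run:"
    echo "  mkdir -p $PID_DIR && chown -R hduser:hadoop $PID_DIR"
fi
```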

Tags: hadoop


【Solution 1】:

Try adding

<property>
  <name>dfs.name.dir</name>
  <value>/home/abhinav/hdfs</value>
</property>

to hdfs-site.xml, and make sure that this directory exists.
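A rough sketch of that step, assuming the daemons run as your own user (the /home/abhinav/hdfs path in the snippet is just the answerer's example; $HOME/hdfs stands in for it here):

```shell
# Stand-in for the dfs.name.dir value above; substitute the path you configured.
HDFS_DIR="$HOME/hdfs"

# Create the directory and make sure the daemon user can write to it.
mkdir -p "$HDFS_DIR"
chmod 755 "$HDFS_DIR"

# The NameNode has to be formatted once before its first start
# (run this manually; it wipes existing HDFS metadata):
# hadoop namenode -format
```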

I wrote a small tutorial for this; see if it helps: http://blog.abhinavmathur.net/2013/01/experience-with-setting-multinode.html

【Discussion】:

【Solution 2】:

You can set the paths where the PID files and the logs are created by editing the file hadoop-env.sh, which is stored in the conf folder.

    export HADOOP_LOG_DIR=/home/username/hadoop-1x/logs
    
    export HADOOP_PID_DIR=/home/username/pids
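For those exports to take effect, the directories must exist and be writable by the user that starts the daemons. A small sketch, with $HOME standing in for the /home/username placeholders above:

```shell
# Mirror of the two export lines above, with $HOME standing in for /home/username.
LOG_DIR="$HOME/hadoop-1x/logs"
PID_DIR="$HOME/pids"

# Create both directories so hadoop-daemon.sh can write its .out and .pid files.
mkdir -p "$LOG_DIR" "$PID_DIR"
```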
    

【Discussion】:

【Solution 3】:

Modify your hdfs-site.xml:

<property>
  <name>dfs.name.dir</name>
  <value>/home/user_to_run_hadoop/hdfs/name</value>
</property>

<property>
  <name>dfs.data.dir</name>
  <value>/home/user_to_run_hadoop/hdfs/data</value>
</property>
      

Make sure to create the directory hdfs under /home/user_to_run_hadoop, and then create the two directories name and data under hdfs.

After that, run chmod -R 755 ./hdfs/ and then path_to_hadoop_home/bin/hadoop namenode -format.
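The steps above can be sketched as a short shell sequence, run as the user that will start the daemons ($HOME stands in for /home/user_to_run_hadoop):

```shell
# Root of the HDFS directories configured in hdfs-site.xml above.
HDFS_ROOT="$HOME/hdfs"

# Create the directories for dfs.name.dir and dfs.data.dir.
mkdir -p "$HDFS_ROOT/name" "$HDFS_ROOT/data"

# Make the tree readable and traversable by the daemon user.
chmod -R 755 "$HDFS_ROOT"

# Finally, format the NameNode once (run manually; destroys existing metadata):
# path_to_hadoop_home/bin/hadoop namenode -format
```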

【Discussion】:

【Solution 4】:

Restart the terminal and format the NameNode first.

In some rare cases someone has modified the start-all.sh file in Hadoop's bin folder; check that as well.

Also check whether your .bashrc configuration is correct.

【Discussion】:
