I often get a problem where http://localhost:50070/dfshealth.jsp crashes and doesn't show anything.
I am running a pseudo-distributed configuration.
One temporary solution I found online was to format the DFS again, but that is very frustrating.
Also, on the JobTracker history page (http://localhost:50030/jobhistoryhome.jsp, linked from http://localhost:50030/jobtracker.jsp) I get the following message:
HTTP ERROR 500
Problem accessing /jobhistoryhome.jsp. Reason:
INTERNAL_SERVER_ERROR
A similar problem was also reported here:
http://grokbase.com/p/hadoop/common-user/10383vj1gn/namenode-problem
Solution
If you look carefully at the namenode log, you will see the following error:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /tmp/hadoop-hadoop/dfs/name is in an inconsistent state: storage directory does not exist or is not accessible.
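To confirm this is the cause, you can search the namenode log for the exception. A minimal sketch, assuming a default $HADOOP_HOME/logs layout (the log directory and filename pattern vary by installation):

```shell
# Look for the inconsistent-state exception in the namenode logs.
# The log directory below is an assumption; adjust it for your install.
LOG_DIR="${HADOOP_HOME:-/usr/local/hadoop}/logs"
MATCHES=$(grep -l "InconsistentFSStateException" "$LOG_DIR"/hadoop-*-namenode-*.log 2>/dev/null || true)
if [ -n "$MATCHES" ]; then
  echo "Inconsistent FS state reported in: $MATCHES"
else
  echo "No inconsistent-state errors found under $LOG_DIR"
fi
```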
This means the storage directory properties are not set correctly.
Normally this is due to the machine having been rebooted and /tmp being cleared out. You do not want to leave the Hadoop name node or data node storage in /tmp for this reason. Make sure you properly configure dfs.name.dir and dfs.data.dir to point to directories
outside of /tmp and other directories that may be cleared on boot.
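For example, you can create the replacement directories up front so the namenode can use them. A sketch, where HADOOP_STORE is a hypothetical base path (any location that survives a reboot will do):

```shell
# Create persistent storage directories outside /tmp.
# HADOOP_STORE is a hypothetical base path; pick any reboot-safe location.
HADOOP_STORE="${HADOOP_STORE:-$HOME/hadoop_space}"
mkdir -p "$HADOOP_STORE/name_dir" "$HADOOP_STORE/data_dir"
# The user running the Hadoop daemons must be able to read and write these.
chmod 755 "$HADOOP_STORE/name_dir" "$HADOOP_STORE/data_dir"
echo "Created: $HADOOP_STORE/name_dir and $HADOOP_STORE/data_dir"
```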
The quick setup guide is really just to help you start experimenting with Hadoop. For setting up a cluster for any real use, you'll want to
follow the next guide - Cluster Setup -
http://hadoop.apache.org/common/docs/current/cluster_setup.html
So here is what I did: in hadoop-site.xml I added the following two properties, and now it is working fine.
<property>
  <name>dfs.name.dir</name>
  <value>/home/hadoop/workspace/hadoop_space/name_dir</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>/home/hadoop/workspace/hadoop_space/data_dir</value>
</property>
Source: http://lucene.472066.n3.nabble.com/Directory-tmp-hadoop-root-dfs-name-is-in-an-inconsistent-state-storage-directory-DOES-NOT-exist-or-ie-td812243.html
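Note that on newer Hadoop releases (2.x and later) these properties were renamed and moved to hdfs-site.xml; a sketch using the same example paths:

```xml
<!-- hdfs-site.xml on Hadoop 2.x and later; same example paths as above -->
<property>
  <name>dfs.namenode.name.dir</name>
  <value>/home/hadoop/workspace/hadoop_space/name_dir</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/home/hadoop/workspace/hadoop_space/data_dir</value>
</property>
```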
Have you found another solution to this problem? Please share below, thanks.
Thanks, re-formatting helped.
In my case, I believe the value for dfs.name.dir was too long. I shortened it and it works.