Hadoop shows only one live datanode


Hadoop shows only one live datanode

Jérémy C

Hello everyone,


I installed Hadoop 3.1.1 on 3 virtual machines with VMware on Ubuntu. When I run hdfs namenode -format and start-all.sh, jps shows the expected processes on my master and two slave nodes.

However, with the command hdfs dfsadmin -report, I can see only one live datanode (I get the same result when I check on master:50070 or 8088).


I tried disabling the firewall with ufw disable, but it didn't solve the problem. The 3 machines can reach each other (without a password) via ping and ssh. I also deleted the Hadoop tmp folder along with the datanode and namenode folders, but that didn't work either. The log files show no issues.


Do you have any suggestions for getting three live datanodes instead of one? Thanks.


You will find my configuration files attached.




---------------------------------------------------------------------
To unsubscribe, e-mail: [hidden email]
For additional commands, e-mail: [hidden email]

core-site.xml (1K)
hadoop-env.sh (22K)
hdfs-site.xml (1K)
mapred-site.xml (1K)
workers (30 bytes)
yarn-site.xml (1K)

Re: Hadoop shows only one live datanode

Akira Ajisaka
Hi Jérémy,

Would you set "dfs.namenode.rpc-address" to "master:9000" in
hdfs-site.xml? The NameNode RPC address defaults to "localhost:8020",
which is why only the DataNode running on the master is registered.
The DataNodes on slave1/slave2 try to connect to "localhost:8020" and
cannot find the NameNode, because no NameNode is running on slave1
or slave2.
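[Editor's note: a minimal hdfs-site.xml fragment along the lines Akira suggests, assuming the NameNode host resolves as "master" on every node, might look like this.]

```xml
<configuration>
    <!-- Point every DataNode at the NameNode RPC endpoint on the master host,
         instead of the localhost:8020 default -->
    <property>
        <name>dfs.namenode.rpc-address</name>
        <value>master:9000</value>
    </property>
</configuration>
```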

-Akira

On Mon, Dec 24, 2018 at 0:13, Jérémy C <[hidden email]> wrote:

> [...]



Re: Hadoop shows only one live datanode

Gurmukh Singh
In reply to this post by Jérémy C
Your core-site.xml is wrong.

The property name is "fs.defaultFS", not "fs.default.FS". Also remove
the trailing "/" after the port. ("fs.default.name" is the deprecated
pre-2.x name for the same setting, so a single property is enough.)
The corrected file would be:

<configuration>
     <property>
         <name>fs.defaultFS</name>
         <value>hdfs://master:9000</value>
     </property>
</configuration>

Also, since you are running YARN, you do not need the following:

     <property>
         <name>mapreduce.job.tracker</name>
         <value>master:5431</value>
     </property>
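[Editor's note: on a YARN cluster, mapred-site.xml typically just declares the execution framework. A minimal sketch using the standard Hadoop property:]

```xml
<configuration>
    <!-- Run MapReduce jobs on YARN rather than the legacy JobTracker -->
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
```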

On 24/12/18 1:07 am, Jérémy C wrote:

> <configuration>
>      <property>
>          <name>fs.default.name</name>
>          <value>hdfs://master:9000/</value>
>      </property>
>      <property>
>          <name>fs.default.FS</name>
>          <value>hdfs://master:9000/</value>
>      </property>
> </configuration>
