Hadoop and HBase ports


The following ports are required for Hadoop/HBase to function properly:

8020 - NameNode (fs.defaultFS)
Comments: Open this port to all DataNodes and clients. Once the DFS service starts, DataNodes begin sending block reports to the NameNode on this port, which is configured via the fs.defaultFS property. Used for filesystem metadata operations.
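For reference, this property lives in core-site.xml; a minimal sketch (the hostname is a placeholder):

```xml
<!-- core-site.xml: NameNode RPC endpoint (hostname is a placeholder) -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://namenode.example.com:8020</value>
</property>
```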

50010 - DataNode (dfs.datanode.address)
Comments: For all HDFS read/write operations, this port has to be open to clients (including the NameNode). It is used for DFS data transfer.

50020 - DataNode (dfs.datanode.ipc.address)
Comments: Used for block metadata operations and block recovery.
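Both DataNode ports are set in hdfs-site.xml; a sketch with the default bind addresses:

```xml
<!-- hdfs-site.xml: DataNode data-transfer and IPC ports -->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:50010</value>
</property>
<property>
  <name>dfs.datanode.ipc.address</name>
  <value>0.0.0.0:50020</value>
</property>
```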


8021 - JobTracker (mapred.job.tracker)
Comments: Open this port to all TaskTrackers, since once the MapReduce service starts, the TaskTrackers communicate with the JobTracker through this port.
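This property goes in mapred-site.xml; a minimal sketch (the hostname is a placeholder):

```xml
<!-- mapred-site.xml: JobTracker RPC endpoint (hostname is a placeholder) -->
<property>
  <name>mapred.job.tracker</name>
  <value>jobtracker.example.com:8021</value>
</property>
```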

2181 - ZooKeeper (hbase.zookeeper.property.clientPort)
Comments: This ZooKeeper client port has to be open to the HBase Master, the HRegionServers, and clients.
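When HBase manages ZooKeeper, this is set in hbase-site.xml; a sketch with the default value:

```xml
<!-- hbase-site.xml: ZooKeeper client port used by HBase -->
<property>
  <name>hbase.zookeeper.property.clientPort</name>
  <value>2181</value>
</property>
```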

60000 - HBase Master (hbase.master.port)
Comments: The RPC port the HBase Master binds to and listens on for client requests. Open it to the RegionServers.

60020 - HBase RegionServer (hbase.regionserver.port)
Comments: The port the HBase RegionServer binds to for client requests. Open it for Master-to-RegionServer and RegionServer-to-RegionServer traffic.
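Both HBase RPC ports are set in hbase-site.xml; a sketch with the default values:

```xml
<!-- hbase-site.xml: HBase Master and RegionServer RPC ports -->
<property>
  <name>hbase.master.port</name>
  <value>60000</value>
</property>
<property>
  <name>hbase.regionserver.port</name>
  <value>60020</value>
</property>
```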

50070 - NameNode HTTP web port
Comments: Open this port to the SecondaryNameNode (SNN), since the SNN fetches the filesystem image from http://namenode:50070.

50090 - SecondaryNameNode
Comments: The SecondaryNameNode HTTP server address and port. The NameNode fetches the merged image from http://snn:50090.

2888, 3888 - ZooKeeper peer ports. These are required when there is a ZooKeeper quorum (more than one ZooKeeper node): 2888 for peer-to-peer communication with the leader and 3888 for leader election.
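In a quorum, these ports are declared per server in zoo.cfg; a sketch of a three-node ensemble (hostnames are placeholders):

```
# zoo.cfg: three-node ZooKeeper quorum (hostnames are placeholders)
# 2888 = follower-to-leader connections, 3888 = leader election
server.1=zk1.example.com:2888:3888
server.2=zk2.example.com:2888:3888
server.3=zk3.example.com:2888:3888
```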



Web UI ports

The ports below serve the web UIs:

50070 - NameNode
50075 - DataNode
50030 - JobTracker
50060 - TaskTracker
60010 - HMaster
60030 - HRegionServer
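To confirm that a firewall actually allows these connections, a quick TCP check can be scripted. This is not part of the original post, just a sketch using Python's standard socket module; the demo opens a local listener to stand in for a daemon, since real hostnames depend on your cluster:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo: a local listener stands in for a daemon such as the NameNode on 8020.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # port 0 = let the OS pick a free port
srv.listen(1)
host, port = srv.getsockname()
print(port_open(host, port))  # prints True: the listening port is reachable
srv.close()
```

In practice you would loop over each host and port from the lists above, e.g. `port_open("namenode", 8020)`.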

