
HADOOP - HDFS performance for small files




1388 views · asked by experts May 11, 2015 04:01 AM

Processing a large set of small files with Hadoop


           

3 Answers



 
answered by Akshay Singh
HDFS is designed for storing large files in big blocks; a large number of small files drives up RAM consumption on the NameNode, which keeps every file, directory, and block object in memory.
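To make the RAM cost concrete, here is a back-of-the-envelope sketch in Java. It assumes the commonly cited rule of thumb of roughly 150 bytes of NameNode heap per namespace object (file, directory, or block); the exact figure varies by Hadoop version, so treat the numbers as illustrative.

public class NameNodeHeapEstimate {
    // Rough rule of thumb; actual per-object overhead varies by version.
    static final long BYTES_PER_OBJECT = 150;

    // Each file costs one inode object plus one object per block.
    static long estimateHeapBytes(long files, long blocksPerFile) {
        return (files + files * blocksPerFile) * BYTES_PER_OBJECT;
    }

    public static void main(String[] args) {
        // The same ~1 TB of data stored two ways:
        // 10 million 100 KB files (one block each) vs. 64 large files
        // split into 128 MB blocks (128 blocks per 16 GB file).
        System.out.printf("10M small files: ~%d MB of NameNode heap%n",
                estimateHeapBytes(10_000_000, 1) / (1024 * 1024));
        System.out.printf("64 large files:  ~%d MB of NameNode heap%n",
                estimateHeapBytes(64, 128) / (1024 * 1024));
    }
}

Under this estimate, packing the same data into large files shrinks the namespace cost from roughly 2.9 GB of heap to about a megabyte, which is why the small-files pattern hurts the NameNode long before it hurts the DataNodes.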

 
answered by experts
A detailed explanation can be found at the link below:
http://blog.cloudera.com/blog/20...-files-problem/

 
answered by amruta_Allundi
Hadoop is not designed to work with huge numbers of small files. However, if we do have them, we can pack them into a SequenceFile and work with that; see the sketch below.
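A minimal sketch of that packing step, assuming Hadoop's SequenceFile writer API. The local small-files directory and the packed.seq output path are illustrative names; each small file becomes one record, with the file name as key and its raw bytes as value.

import java.io.File;
import java.nio.file.Files;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class PackSmallFiles {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path out = new Path("packed.seq"); // illustrative output path

        SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(out),
                SequenceFile.Writer.keyClass(Text.class),
                SequenceFile.Writer.valueClass(BytesWritable.class));
        try {
            File[] inputs = new File("small-files").listFiles(); // illustrative dir
            if (inputs == null) return;
            for (File f : inputs) {
                byte[] data = Files.readAllBytes(f.toPath());
                // One record per small file: name as key, raw bytes as value.
                writer.append(new Text(f.getName()), new BytesWritable(data));
            }
        } finally {
            IOUtils.closeStream(writer);
        }
    }
}

Downstream MapReduce jobs can then read packed.seq through SequenceFileInputFormat, so the NameNode tracks one file instead of millions, and the records remain splittable for parallel processing.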
