24

How can I view how many blocks a file has been broken into in a Hadoop file system?


4 Answers

44

We can use the Hadoop file system check command (fsck) to find the blocks for a specific file.

Below is the command:

hadoop fsck [path] [options]

To view the blocks for a specific file:

hadoop fsck /path/to/file -files -blocks
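
If you just want a count of the blocks, one rough approach (a sketch; /path/to/file is a placeholder, and the exact fsck output format can vary between Hadoop releases) is to filter the output for block identifiers:

# Each allocated block appears in the -blocks output as a line containing a "blk_" id;
# counting those lines gives the number of blocks backing the file.
hadoop fsck /path/to/file -files -blocks | grep -c blk_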
4

hadoop fsck filetopath

I used the above command in CDH 5 and got the error below.

hadoop-hdfs/bin/hdfs: line 262: exec: : not found

The command below worked instead:

hdfs fsck filetopath

2

It is always a good idea to use hdfs instead of hadoop, since running HDFS commands through the 'hadoop' script is deprecated.

Here is the command with hdfs: to find the details of a file named 'test.txt' in the root directory, you would write

hdfs fsck /test.txt -files -blocks -locations
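
If you only need the total rather than the per-block listing, a rough sketch is to filter the fsck summary (the exact summary wording, e.g. "Total blocks (validated)", may differ between Hadoop versions):

# The fsck summary for the path includes a total-block line; keep only that line.
hdfs fsck /test.txt -files -blocks -locations | grep -i "Total blocks"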

-3

This should work:

hadoop fs -stat "%o" /path/to/file
  • `%o` is the block size, not the number of blocks, per http://hadoop.apache.org/docs/r2.7.0/hadoop-project-dist/hadoop-common/FileSystemShell.html#stat – Nickolay May 25 '15 at 02:34
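
Given that correction, a rough way to derive the block count from the block size in the shell (a sketch; /path/to/file is a placeholder, and %b should report the file length in bytes):

# Number of blocks is the file size divided by the block size, rounded up.
size=$(hadoop fs -stat "%b" /path/to/file)
bsize=$(hadoop fs -stat "%o" /path/to/file)
echo $(( (size + bsize - 1) / bsize ))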