
I have full permission on the folder I am trying to list, but I still can't list its contents. Please help:

scala> new File("hdfs://mapdigidev/apps/hive/warehouse/da_ai.db/t_fact_ai_pi_ww").listFiles
res0: Array[java.io.File] = null
Sudha
  • What tools allow you to list files and see there's something actually there? – 9000 May 01 '17 at 20:15
  • 1
    See http://stackoverflow.com/questions/27023766/spark-iterate-hdfs-directory. You need to use the libraries that know how to handle HDFS, and java.io does not. – Don Branson May 01 '17 at 20:18

1 Answer


You can use the Hadoop libraries to list files in HDFS:

import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Connect to the HDFS namenode, then list the directory (non-recursively)
val fs = FileSystem.get(new URI("hdfs://mapdigidev"), new Configuration())
val files = fs.listFiles(new Path("/apps/hive/warehouse/da_ai.db/t_fact_ai_pi_ww"), false)

But java.io doesn't know about Hadoop/HDFS, which is why listFiles on a java.io.File returns null for an hdfs:// path.
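Note that listFiles here returns a Hadoop RemoteIterator[LocatedFileStatus] rather than a Scala collection, so you have to drain it explicitly. A minimal sketch, assuming the files val from the snippet above:

// `files` is the RemoteIterator[LocatedFileStatus] returned by fs.listFiles
while (files.hasNext) {
  val status = files.next()
  // Print the full HDFS path and size of each file in the directory
  println(status.getPath + " (" + status.getLen + " bytes)")
}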

Don Branson