
Hi, I am trying to fetch a big table from Hive and then write a CSV file. Is there any way to write the whole result set to a file in one go, since iterating over the rows of a table containing several million records takes time?

I don't want to do `while(rs.next())`.

Aman Mittal
  • Some DBMSes support writing to a csv file in their SQL language, but if you have a `ResultSet`, you will have to iterate it. – Maurice Perry Jun 12 '18 at 08:17
  • Maybe this can help you https://stackoverflow.com/questions/17086642/how-to-export-a-hive-table-into-a-csv-file – YCF_L Jun 12 '18 at 08:18

1 Answer


You didn't mention the SQL environment, but usually you can just use SQL like so:

SELECT * FROM [TABLENAME]
  INTO OUTFILE '[FILENAME]'
  FIELDS TERMINATED BY ','
  ENCLOSED BY '"'
  LINES TERMINATED BY '\n'

For Hive, it should be similar to the following:

INSERT OVERWRITE LOCAL DIRECTORY '[FILENAME]' 
SELECT * FROM [TABLENAME]

I can't be more specific on how to exactly separate fields since I have no access to Hive, but the documentation on this should help:

Writing data into the filesystem from queries
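Based on that documentation, a hedged sketch of what field separation might look like (the directory path and table name are placeholders, and the `ROW FORMAT` clause requires Hive 0.11.0 or later):

```sql
-- Export a table to a local directory as comma-separated files.
-- '/tmp/csv_export' and my_table are placeholders; adjust for your setup.
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/csv_export'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
SELECT * FROM my_table;
```

Note that Hive writes one or more files into the directory (e.g. `000000_0`), so you may need to concatenate them afterwards to get a single CSV.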

CannedMoose