
I was asked in an interview how to read a very large file in Java. Suppose the available RAM is 2 GB but the file I need to read is 5 GB; how can this be done effectively? I searched on Google and most of the solutions suggest using a BufferedReader. Can anybody help me?

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class BufferedReaderExample {

    public static void main(String[] args) {
        // Try-with-resources closes the reader automatically.
        try (BufferedReader br = new BufferedReader(new FileReader("C:\\testing.txt"))) {
            String sCurrentLine;
            // Read one line at a time; only the current line is held in memory.
            while ((sCurrentLine = br.readLine()) != null) {
                System.out.println(sCurrentLine);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Will this be effective, or does Java provide some other built-in functionality to do this?
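On the "other built-in functionality" question: since Java 8 the standard library offers `java.nio.file.Files.lines`, which returns a lazily populated `Stream<String>`, so lines are read on demand and memory use stays flat regardless of file size. A minimal sketch (the temp-file setup is just for demonstration; in the real case the path would point at the multi-gigabyte file):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.stream.Stream;

public class StreamLinesExample {

    // Count non-empty lines without ever loading the whole file:
    // Files.lines reads lazily, so only one line is buffered at a time.
    static long countNonEmpty(Path path) throws IOException {
        try (Stream<String> lines = Files.lines(path)) {
            return lines.filter(l -> !l.isEmpty()).count();
        }
    }

    public static void main(String[] args) throws IOException {
        // Small sample file for demonstration only.
        Path path = Files.createTempFile("sample", ".txt");
        Files.write(path, Arrays.asList("line 1", "", "line 2", "line 3"));

        System.out.println("non-empty lines: " + countNonEmpty(path));
        Files.delete(path);
    }
}
```

The `Stream` must be closed (here via try-with-resources) so the underlying file handle is released.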

  • read (FileInputStream) and GZip (GZIPInputStream) on the fly? – Pierre Mar 29 '14 at 17:20
  • read (FileInputStream) – ZohebSiddiqui Mar 29 '14 at 18:04
  • 1
    The way you read a huge file is exactly the same as the way you read a small one. You just have to be careful not to keep everything you read in memory. – JB Nizet Mar 29 '14 at 18:13
  • OK, so you mean to keep only the data currently being worked on in memory, then discard it and take the next set of data. Can you please explain which function or API we can use for this? – ZohebSiddiqui Mar 29 '14 at 18:32
  • There is no API for such a thing. You already do this in your code: each line is assigned to a variable whose scope is the loop body, and the only thing you do with it is print it out. After each iteration the variable holding the last line becomes unreachable and the string is eligible for garbage collection. What you should avoid is storing all the lines in a collection such as a list. – vanje Mar 29 '14 at 22:33
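A minimal sketch of the point made in the last comment (not from the thread; the `StringReader` stands in for a `FileReader` over a huge file, since the reading pattern is identical): keep only an aggregate per iteration, never a collection of all lines.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class LineByLineAggregate {

    // Keep only an aggregate (the longest line length seen so far);
    // each line string becomes garbage-collectable as soon as the
    // variable is reassigned on the next iteration.
    static int longestLineLength(Reader source) throws IOException {
        int longest = 0;
        try (BufferedReader br = new BufferedReader(source)) {
            String line;
            while ((line = br.readLine()) != null) {
                longest = Math.max(longest, line.length());
            }
        }
        return longest;
    }

    public static void main(String[] args) throws IOException {
        // StringReader stands in for a FileReader over a 5 GB file.
        String data = "short\na much longer line\nmid\n";
        System.out.println("longest line: " + longestLineLength(new StringReader(data)));
    }
}
```

Contrast this with `List<String> all = new ArrayList<>(); all.add(line);` inside the loop, which would eventually exhaust a 2 GB heap on a 5 GB file.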
