java - How to handle processing a large CSV file or read a large CSV file in chunks


I have large CSV files that I'm trying to iterate through. I'm using openCSV, and I'd like to use CsvToBean so I can dynamically set the column mappings from the database. The question I have is how to do this without grabbing the entire file and throwing it into a List; I'm trying to prevent out-of-memory errors.

i'm passing entire result set list so.

List<MyObject> myObjects = csv.parse(strat, getReader("file.txt"));
for (MyObject myObject : myObjects) {
    System.out.println(myObject);
}

But then I found the iterator method, and I'm wondering if it will iterate through each row rather than the entire file at once?

Iterator myObjects = csv.parse(strat, getReader("file.txt")).iterator();
while (myObjects.hasNext()) {
    MyObject myObject = (MyObject) myObjects.next();
    System.out.println(myObject);
}

So my question is: what is the difference between using the iterator and the list?

The enhanced for loop (for (MyObject myObject : myObjects)) is itself implemented using an Iterator (it requires that the instance returned by csv.parse(strat, getReader("file.txt")) implements the Iterable interface, which contains an iterator() method returning an Iterator), so there's no performance difference between the two code snippets.
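To see that the two traversals really are the same mechanism, here is a minimal stdlib-only sketch (it uses List<String> in place of the parsed beans, since the opencsv types aren't needed to show the point):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class IterationDemo {

    // Enhanced for loop: the compiler rewrites this into the iterator form below.
    static List<String> viaEnhancedFor(List<String> rows) {
        List<String> seen = new ArrayList<>();
        for (String row : rows) {
            seen.add(row);
        }
        return seen;
    }

    // Explicit iterator: what the enhanced for loop desugars to.
    static List<String> viaIterator(List<String> rows) {
        List<String> seen = new ArrayList<>();
        Iterator<String> it = rows.iterator();
        while (it.hasNext()) {
            seen.add(it.next());
        }
        return seen;
    }

    public static void main(String[] args) {
        List<String> rows = List.of("row1", "row2", "row3");
        // Both variants visit the same elements in the same order.
        System.out.println(viaEnhancedFor(rows).equals(viaIterator(rows)));
    }
}
```

Either way, the List returned by parse() is already fully built before iteration starts, so the choice of loop form changes neither speed nor memory use.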

P.S.

In the second snippet, don't use the raw Iterator type; use Iterator<MyObject> instead:

Iterator<MyObject> myObjects = csv.parse(strat, getReader("file.txt")).iterator();
while (myObjects.hasNext()) {
    MyObject myObject = myObjects.next();
    System.out.println(myObject);
}
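Note that neither snippet actually saves memory: csv.parse(...) returns a fully populated List before iteration begins, and calling .iterator() on that List doesn't change this. To process a large file without holding every row at once, you can read it line by line yourself. A minimal stdlib-only sketch, where MyObject and the toMyObject mapping are hypothetical stand-ins for your bean and column-mapping strategy (a real CSV parser must also handle quoting and embedded commas):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamRows {

    // Hypothetical stand-in for the bean your mapping strategy would produce.
    record MyObject(String first, String second) {}

    // Naive column split; illustration only, not a full CSV parser.
    static MyObject toMyObject(String line) {
        String[] cols = line.split(",", -1);
        return new MyObject(cols[0], cols.length > 1 ? cols[1] : "");
    }

    // Reads one line at a time; only the current row is held in memory.
    static long process(Path file) throws IOException {
        long count = 0;
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            String line;
            while ((line = reader.readLine()) != null) {
                MyObject obj = toMyObject(line);
                System.out.println(obj);
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("rows", ".csv");
        Files.writeString(tmp, "a,1\nb,2\nc,3\n");
        System.out.println("rows processed: " + process(tmp));
        Files.delete(tmp);
    }
}
```

Newer opencsv releases (4.x, if available to you) also let you iterate a CsvToBean lazily via CsvToBeanBuilder, which is the library-native route to the same effect.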
