My company has lots of very big files: text files, CSV tables and so forth. I want to write a program that reads the contents of a file, line by line, and writes them into a SQLite database.
The program works exactly like it should, but it is too slow: it writes 50,000 entries in almost 2 minutes. The files it will have to deal with in production have over 10,000,000 entries, some of them twice as many.
Which means I need to speed things up. Here is the program:
require 'sqlite3'
require 'benchmark'

system("clear")

db = SQLite3::Database.open "./database/BenchmarkBase.db"
counter = 0

bench = Benchmark.measure {
  File.open("/root/Downloads/test/cellphoneNumbers.txt", 'r').each do |line|
    begin
      db.execute "INSERT INTO PhoneN(cellPhone) VALUES ('#{line}')"
      print "Entry: "
      print counter += 1
      print " .. has been written into database." + "\r"
    rescue SQLite3::Exception => e
      puts "Exception occurred"
      puts e
    end
  end
}

puts "time consumption: #{bench.real}"
db.close if db
Forking this process gave literally no performance boost (or maybe I did it wrong), so if anyone has a good idea how I can make this faster, please don't hesitate to tell me.
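For reference, here is a rough, untested sketch of the direction I was considering trying next: wrapping all inserts in a single transaction and reusing a prepared statement instead of building the SQL string for every row. It assumes the same table, column and file as above, and I have not benchmarked it yet, so I don't know whether it is the right approach:

require 'sqlite3'

db = SQLite3::Database.open "./database/BenchmarkBase.db"

# Untested sketch: prepare the statement once instead of re-parsing the SQL for every row.
stmt = db.prepare "INSERT INTO PhoneN(cellPhone) VALUES (?)"

# A single transaction batches all inserts into one commit
# instead of one commit per row.
db.transaction do
  File.foreach("/root/Downloads/test/cellphoneNumbers.txt") do |line|
    stmt.execute line.chomp
  end
end

stmt.close
db.close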
Thank you in advance, VB