
I read this: Importing a CSV file into a sqlite3 database table using Python

and it seems that everyone suggests using line-by-line reading instead of using bulk .import from SQLite. However, that will make the insertion really slow if you have millions of rows of data. Is there any other way to circumvent this?

Update: I tried the following code to insert line by line, but the speed is not as good as I expected. Is there any way to improve it?

import codecs

# connection, output (a cursor), and allLogFilesName are defined earlier
for logFileName in allLogFilesName:
    logFile = codecs.open(logFileName, 'rb', encoding='utf-8')
    for logLine in logFile:
        logLineAsList = logLine.rstrip('\n').split('\t')
        output.execute('INSERT INTO log VALUES (?, ?, ?, ?)', logLineAsList)
    logFile.close()
connection.commit()
connection.close()
Shar

3 Answers


Since this is the top result on a Google search, I thought it might be nice to update this question.

From the Python sqlite3 docs, you can use executemany():

import sqlite3

persons = [
    ("Hugo", "Boss"),
    ("Calvin", "Klein")
]

con = sqlite3.connect(":memory:")

# Create the table
con.execute("create table person(firstname, lastname)")

# Fill the table
con.executemany("insert into person(firstname, lastname) values (?,?)", persons)

I have used this method to commit over 50k row inserts at a time and it's lightning fast.
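If you need the rows to persist, the same pattern works against a file-backed database; here is a small variant (the file name is a placeholder), with an explicit commit so the inserts are actually written:

import sqlite3

persons = [
    ("Hugo", "Boss"),
    ("Calvin", "Klein")
]

# Placeholder path; unlike ":memory:", the data survives after the program exits
con = sqlite3.connect("persons.db")

con.execute("create table if not exists person(firstname, lastname)")
con.executemany("insert into person(firstname, lastname) values (?,?)", persons)

con.commit()  # without this, the pending transaction is rolled back on close()
con.close()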

Fred
  • It is lightning fast also because you are using sqlite as an in-memory database... – ant1g Nov 17 '17 at 14:54
  • decreased a 2 min creation task to less than a second! and it's a file database, so it is lightning anyway – Math Nov 23 '17 at 22:37
  • how do I read back from this? What should I select or .open in sqlite3 to retrieve or read values in this database? .schema does not show me this table when i open sqlite3 cli – VIshu Kamble Sep 26 '19 at 14:43
  • @VIshuKamble this creates in memory db, which is lost after you close the program. If you want to persist the data, use `sqlite3.connect("path/to/dbFile")`. – jnovacho Oct 05 '19 at 09:24
  • @jnovacho no I meant when the program is running. But thanks! – VIshu Kamble Oct 09 '19 at 19:01

Divide your data into chunks on the fly using generator expressions and make the inserts inside a transaction. Here's a quote from the SQLite optimization FAQ:

Unless already in a transaction, each SQL statement has a new transaction started for it. This is very expensive, since it requires reopening, writing to, and closing the journal file for each statement. This can be avoided by wrapping sequences of SQL statements with BEGIN TRANSACTION; and END TRANSACTION; statements. This speedup is also obtained for statements which don't alter the database.

Here's how your code might look.
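Below is a minimal sketch along those lines, assuming the same tab-separated, four-column log layout as in the question; the database path, table schema, and chunk size are placeholders:

import codecs
import itertools
import sqlite3

def chunks(iterable, size=10000):
    # Yield successive lists of at most `size` rows from any iterable.
    iterator = iter(iterable)
    while True:
        chunk = list(itertools.islice(iterator, size))
        if not chunk:
            return
        yield chunk

connection = sqlite3.connect('log.db')
connection.execute('CREATE TABLE IF NOT EXISTS log (a, b, c, d)')

for logFileName in allLogFilesName:  # allLogFilesName as in the question's code
    with codecs.open(logFileName, 'r', encoding='utf-8') as logFile:
        # Generator expression: rows are produced lazily, never all in memory at once.
        rows = (line.rstrip('\n').split('\t') for line in logFile)
        for chunk in chunks(rows):
            # All inserts in the chunk run inside one transaction, ended by commit().
            connection.executemany('INSERT INTO log VALUES (?, ?, ?, ?)', chunk)
            connection.commit()

connection.close()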

Also, the sqlite3 command-line shell can bulk-import CSV files with the .import command.
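One way to use that from Python (a sketch; it assumes the sqlite3 command-line shell is on PATH, and the database, CSV, and table names are placeholders):

import subprocess

# Feed dot-commands to the sqlite3 shell; .import bulk-loads the CSV into the table.
subprocess.run(
    ['sqlite3', 'log.db'],
    input='.mode csv\n.import data.csv log\n',
    text=True,    # Python 3.7+; pass the commands as a string
    check=True,
)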

alecxe

SQLite can do tens of thousands of inserts per second; just make sure to do all of them in a single transaction by surrounding the inserts with BEGIN and COMMIT. (executemany() does this automatically.)
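A minimal illustration of the point (the table, columns, and sample rows are made up):

import sqlite3

connection = sqlite3.connect(':memory:')  # use a file path for a real database
cursor = connection.cursor()
cursor.execute('CREATE TABLE log (a, b, c, d)')

rows = [('2013-08-13', 'INFO', 'worker-1', 'started')] * 100000  # sample data

# The first INSERT implicitly opens a transaction, so every row below is written
# inside a single BEGIN/COMMIT pair instead of one transaction per statement.
for row in rows:
    cursor.execute('INSERT INTO log VALUES (?, ?, ?, ?)', row)

connection.commit()  # one commit for all 100,000 rows
connection.close()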

As always, don't optimize before you know speed will be a problem. Test the easiest solution first, and only optimize if the speed is unacceptable.

davidfg4
  • Thanks! I am trying that right now and will report on the speed. – Shar Aug 13 '13 at 22:11
  • I just tried what you suggested by inserting line by line. The speed is not too bad, but it is still not as fast as I hope it to be. Maybe my code was not well-written enough. I updated it in the question above. Do you have any suggestions? – Shar Aug 13 '13 at 22:26