I have a multi-threaded Python process, and I am using SQLite to store data shared between the threads. The flow is as follows:
- The main thread creates a named in-memory database and a table within it
- Worker threads insert atomic data into the database
- When all worker threads have finished, the main thread does a SELECT * and transforms the data into JSON.
My connection string looks like file:memdb1?mode=memory&cache=shared. The insert flow in each worker thread looks like:
conn = sqlite3.connect(conStr, timeout=500000000, uri=True)
conn.execute("INSERT INTO ...", row)  # the actual INSERT statement
conn.commit()
conn.close()
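A minimal end-to-end sketch of the flow above (the table name, columns, and the hand-rolled retry loop are illustrative placeholders, not my real schema):

```python
import json
import sqlite3
import threading
import time

DB_URI = "file:memdb1?mode=memory&cache=shared"

def worker(row_id):
    # Each worker opens its own connection; isolation_level=None puts it
    # in autocommit mode, so no transaction lock is held between calls.
    conn = sqlite3.connect(DB_URI, uri=True, timeout=30, isolation_level=None)
    try:
        for _ in range(100):
            try:
                conn.execute("INSERT INTO results (id, value) VALUES (?, ?)",
                             (row_id, row_id * 2))
                break
            except sqlite3.OperationalError:
                # Shared-cache table locks are not retried by the busy
                # timeout, so retry briefly by hand.
                time.sleep(0.01)
    finally:
        conn.close()

# The main thread holds a connection for the whole run: a shared
# in-memory database is destroyed when its last connection closes.
main_conn = sqlite3.connect(DB_URI, uri=True)
main_conn.execute("CREATE TABLE results (id INTEGER, value INTEGER)")

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

rows = main_conn.execute("SELECT * FROM results ORDER BY id").fetchall()
payload = json.dumps([{"id": i, "value": v} for i, v in rows])
main_conn.close()
```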
At first, if I used more than one thread, I would eventually get "database is locked" errors regardless of my timeout. This was fixed by adding isolation_level=None to the connect call.
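In case it helps anyone reproduce this: isolation_level=None puts the connection in autocommit mode, so sqlite3 never opens an implicit transaction and each INSERT commits (and releases its lock) immediately. A minimal illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)  # autocommit mode
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("INSERT INTO t (x) VALUES (1)")
# In autocommit mode no implicit transaction was opened, so the INSERT
# is already committed and its write lock already released.
still_open = conn.in_transaction  # False in autocommit mode
count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
conn.close()
```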
Are there any other options I can use here to speed up SQLite? I need to be notified if an INSERT fails, so any kind of fire-and-forget option is out of the question. However, is there any sort of journaling that can be turned off, or anything like that?
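For reference, the kind of journaling and sync settings I mean are exposed as PRAGMAs; a sketch of what I have tried (note that, as far as I understand, an in-memory database already keeps its rollback journal in memory, so these may mainly matter for on-disk databases):

```python
import sqlite3

conn = sqlite3.connect("file:memdb1?mode=memory&cache=shared", uri=True)
# Skip fsync entirely; fastest, but unsafe for an on-disk database if
# the machine loses power mid-write.
conn.execute("PRAGMA synchronous = OFF")
# Keep the rollback journal in RAM (already the default for an
# in-memory database); the pragma returns the mode actually in effect.
mode = conn.execute("PRAGMA journal_mode = MEMORY").fetchone()[0]
conn.close()
```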