db.batch_commit() mode #1539
Something like that should be do-able, but it is probably cleaner to pass an iterator, e.g.:

```python
for row in db.batch_commit(list_of_data, 100):
    Model.create(**row)
```

Continuing that thought, the implementation might look like:

```python
from itertools import zip_longest  # izip_longest on Python 2

# Helper: yield lists of at most n items from the iterable.
def chunked(iterable, n):
    marker = object()
    for group in (list(g) for g in zip_longest(*[iter(iterable)] * n,
                                               fillvalue=marker)):
        if group[-1] is marker:
            del group[group.index(marker):]
        yield group

# Method on Database: wrap each group of n objects in a transaction.
def batch_commit(self, it, n):
    for obj_group in chunked(it, n):
        with self.atomic():
            for obj in obj_group:
                yield obj
```

I didn't test this yet, so I'm not sure if it works. Just thinking out loud.
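The chunking logic above can be exercised on its own with plain Python, no database required; this is a standalone copy of the helper, not peewee's shipped code:

```python
from itertools import zip_longest

def chunked(iterable, n):
    # Split an iterable into lists of n items; the last list may be shorter.
    marker = object()
    for group in (list(g) for g in zip_longest(*[iter(iterable)] * n,
                                               fillvalue=marker)):
        if group[-1] is marker:
            # Trim the fill values padding out the final, short group.
            del group[group.index(marker):]
        yield group

print(list(chunked(range(7), 3)))  # → [[0, 1, 2], [3, 4, 5], [6]]
```

Using a unique `object()` as the fill value means even `None` or falsy items in the input are grouped correctly.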
Added in 3d4e6e4. |
Having a `batch_commit` would make it easier/simpler to make batch inserts faster: every N'th model creation commits the transaction. Alternatively there is `Model.insert_many`, but to create multiple linked models at the same time the `batch_commit` method could be useful. Using `db.atomic()` works unless you have a large number of items to insert; in that case periodic commits are used to ensure data is synchronised to disk. It's probably easier to do this with a batch iterator instead of modifying peewee.
Feel free to close, I answered my own question...