diff test/db_test_base.py @ 7668:5b41018617f2
fix: out of memory error when importing under postgresql
If you try importing more than 20k items under postgresql you can run
out of memory:
psycopg2.errors.OutOfMemory: out of shared memory
HINT: You might need to increase max_locks_per_transaction.
Tuning memory may help, but that is unknown at this point.
This checkin forces a commit to the postgres database after 10,000
rows have been added. This clears out the savepoints for each row and
starts a new transaction.
back_postgresql.py:
Implement commit mechanism in checkpoint_data(). Add two class level
attributes for tracking the number of savepoints and the limit when
the commit should happen.
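The commit-every-N mechanism described above can be sketched as follows. The names `checkpoint_data` and `savepoint_limit` come from the commit message; `PostgresImporter` and `FakeDB` are stand-ins for illustration, not roundup's actual classes.

```python
class PostgresImporter:
    """Sketch of forcing a commit after N savepoints during import."""

    # class-level tracking, as described in the commit message
    savepoint_count = 0
    savepoint_limit = 10000  # default; overridable via pragma

    def __init__(self, db):
        self.db = db  # stand-in for the real backend connection

    def checkpoint_data(self):
        """Called once per imported row; commits when the limit is hit."""
        self.savepoint_count += 1
        if self.savepoint_count >= self.savepoint_limit:
            # the commit releases all accumulated savepoint locks
            # and starts a fresh transaction
            self.db.commit()
            self.savepoint_count = 0


class FakeDB:
    """Stand-in database that records commits for demonstration."""
    def __init__(self):
        self.commits = 0

    def commit(self):
        self.commits += 1


db = FakeDB()
imp = PostgresImporter(db)
imp.savepoint_limit = 5   # the test suite lowers the limit like this
for _ in range(12):       # "import" twelve rows
    imp.checkpoint_data()
print(db.commits)         # 2 commits: after rows 5 and 10
```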
roundup_admin.py:
Implement the pragma and dynamically create the config item
RDBMS_SAVEPOINT_LIMIT used by checkpoint_data().
Also fixed formatting of descriptions when using pragma list in
verbose mode.
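A pragma of this shape parses a `name=value` pair and stores a validated integer setting. The following is a hypothetical sketch, not roundup-admin's actual code: `do_pragma` is an invented name, and only the `savepoint_limit` option name and its default of 10,000 come from the commit message.

```python
def do_pragma(arg, settings):
    """Parse "name=value" and apply it to the settings dict.

    Hypothetical sketch; the real roundup-admin pragma handler differs.
    """
    name, sep, value = arg.partition("=")
    if not sep:
        raise ValueError("pragma expects name=value, got %r" % arg)
    name = name.strip().lower()
    if name == "savepoint_limit":
        limit = int(value)
        if limit < 1:
            raise ValueError("savepoint_limit must be >= 1")
        settings["savepoint_limit"] = limit
    else:
        raise ValueError("unknown pragma: %s" % name)
    return settings


settings = {"savepoint_limit": 10000}  # default from the commit
do_pragma("savepoint_limit=5", settings)
print(settings["savepoint_limit"])  # 5
```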
admin_guide.txt, upgrading.txt:
Document change and use of pragma savepoint_limit in roundup-admin
for changing the default of 10,000.
test/db_test_base.py:
add some more asserts. In the existing testAdminImportExport, set the
savepoint limit to 5 to exercise the setting mechanism and ensure the
commit code runs during the existing tests. This provides coverage,
but does not actually verify that a commit happens every 5 savepoints
8-(. That verification was done manually using a pdb breakpoint just
before the commit.
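The manual pdb check described above could in principle be automated by spying on `commit` and counting calls. This is a hypothetical sketch, not the roundup test: `import_rows` is an invented stand-in for the import loop.

```python
from unittest import mock


class FakeDB:
    """Stand-in for the database handle used during import."""
    def commit(self):
        pass  # the real backend would end the transaction here


def import_rows(db, n_rows, limit):
    # Hypothetical stand-in for the import loop: one savepoint per
    # row, with a forced commit once `limit` savepoints accumulate.
    pending = 0
    for _ in range(n_rows):
        pending += 1
        if pending >= limit:
            db.commit()
            pending = 0


db = FakeDB()
with mock.patch.object(db, "commit", wraps=db.commit) as spy:
    import_rows(db, n_rows=12, limit=5)
print(spy.call_count)  # 2: commits fired after rows 5 and 10
```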
acknowledgements.txt:
Added 2.4.0 section mentioning Norbert as he has done a ton of
testing with much larger datasets than I can test with.
| author | John Rouillard <rouilj@ieee.org> |
|---|---|
| date | Thu, 19 Oct 2023 16:11:25 -0400 |
| parents | 027912a59f49 |
| children | 25a03f1a8159 |
```diff
--- a/test/db_test_base.py  Thu Oct 19 14:07:56 2023 -0400
+++ b/test/db_test_base.py  Thu Oct 19 16:11:25 2023 -0400
@@ -3061,6 +3061,7 @@
             self.db.commit()
             self.assertEqual(self.db.user.lookup("duplicate"),
                              active_dupe_id)
+            self.assertEqual(self.db.user.is_retired(retired_dupe_id), True)
         finally:
             shutil.rmtree('_test_export')
@@ -3151,12 +3152,25 @@
             self.assertRaises(csv.Error, tool.do_import, ['_test_export'])
             self.nukeAndCreate()
+
+            # make sure we have an empty db
+            with self.assertRaises(IndexError) as e:
+                # users 1 and 2 always are created on schema load.
+                # so don't use them.
+                self.db.user.getnode("5").values()
+
             self.db.config.CSV_FIELD_SIZE = 3200
             tool = roundup.admin.AdminTool()
             tool.tracker_home = home
             tool.db = self.db
+            # Force import code to commit when more than 5
+            # savepoints have been created.
+            tool.settings['savepoint_limit'] = 5
             tool.verbose = False
             tool.do_import(['_test_export'])
+
+            # verify the data is loaded.
+            self.db.user.getnode("5").values()
         finally:
             roundup.admin.sys = sys
             shutil.rmtree('_test_export')
```
