I've got a database on which I need to generate a meta table thingie. Problem is, the table I'm generating is about 300,000 rows, so I need to know the fastest way to insert a large number of rows into a blank table.
Currently, I'm building the inserts as a single transaction into a proxy table, dropping the main table, and then recreating the main table as a select of the proxy table, before finally re-creating the indexes on the main table. This allows lookups to continue on the main table, since it's only locked for a very short time.
The inserts into the proxy table are quite slow, and whilst this isn't necessarily causing any performance problems in the app, I want to be sure I'm doing it in the most efficient way.
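For reference, here's a minimal sketch of the approach I described, assuming SQLite via Python's `sqlite3` (I haven't named my engine, so treat the specifics as illustrative; the table and column names are made up). The key points are batching all inserts into one transaction with `executemany`, loading into an unindexed proxy table, and only building indexes after the swap:

```python
import sqlite3

# Assumption: SQLite used purely for illustration; schema is hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE main_table (id INTEGER PRIMARY KEY, value TEXT)")
cur.execute("CREATE TABLE proxy_table (id INTEGER PRIMARY KEY, value TEXT)")

rows = [(i, f"row-{i}") for i in range(300_000)]

# One transaction + executemany, rather than 300k individually committed INSERTs.
# The unindexed proxy table also avoids index maintenance during the bulk load.
with conn:
    cur.executemany("INSERT INTO proxy_table (id, value) VALUES (?, ?)", rows)

# Swap: drop the main table, rebuild it from the proxy, then recreate indexes.
# The main table is only unavailable for the duration of this short transaction.
with conn:
    cur.execute("DROP TABLE main_table")
    cur.execute("CREATE TABLE main_table AS SELECT * FROM proxy_table")
    cur.execute("DROP TABLE proxy_table")
    cur.execute("CREATE INDEX idx_main_value ON main_table (value)")
```

An alternative to `CREATE TABLE ... AS SELECT` would be renaming the proxy table into place (`ALTER TABLE proxy_table RENAME TO main_table`), which skips copying the 300,000 rows a second time.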