OK, I have about 400 CSV-type files (about 50 MB in total, and they're remote, i.e. each one is read from a URL). I read all of these files into memory, populate a couple of MySQL temporary tables from that data, then populate 8 permanent tables from those temporary tables.
Across all 8 tables there are roughly 192k rows.
All of this takes my Python code about 9.5 minutes when the database is empty, and about 7 minutes when it's only doing updates. I realise this is quite a high-level description, but does that sound like reasonable performance?
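In case it helps, the flow is roughly like the sketch below (heavily simplified; the URLs, table names and columns are placeholders rather than my real schema, and I'm assuming the mysql-connector-python driver here):

```python
# Simplified sketch of the pipeline described above. URLs, table and
# column names are placeholders, not the real schema.
import csv
import io
import urllib.request

import mysql.connector  # assuming the mysql-connector-python driver

# Placeholder list of remote CSV URLs (the real ones come from elsewhere).
FILE_URLS = [f"https://example.com/data/file{i}.csv" for i in range(400)]


def fetch_rows(url):
    """Download one remote CSV and yield its rows as tuples."""
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8")
    for row in csv.reader(io.StringIO(text)):
        yield tuple(row)


conn = mysql.connector.connect(
    host="localhost", user="user", password="pw", database="db"
)
cur = conn.cursor()

# Stage all the CSV data into a temporary table first.
cur.execute(
    "CREATE TEMPORARY TABLE staging (col_a VARCHAR(64), col_b VARCHAR(64), col_c INT)"
)
for url in FILE_URLS:
    cur.executemany(
        "INSERT INTO staging (col_a, col_b, col_c) VALUES (%s, %s, %s)",
        list(fetch_rows(url)),
    )

# Then populate the permanent tables from the staging data
# (one INSERT ... SELECT per permanent table, eight in total).
cur.execute(
    "INSERT INTO permanent_table_1 (col_a, col_b) "
    "SELECT col_a, col_b FROM staging "
    "ON DUPLICATE KEY UPDATE col_b = VALUES(col_b)"
)

conn.commit()
cur.close()
conn.close()
```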