Hello all,
Looking to restore one of my sites locally to do some profiling on performance issues that have grown as the site has expanded.
Looking at a 2.5GB mysqldump file, around 3.4GB on disk once restored. A few of the tables hold a few million rows each. Nothing is relational as it's all active record, and there are no funky foreign key constraints at the database level for cascading.
Having major issues restoring this locally to my machine. I used to do syncs with HeidiSQL and it worked fine, but the database now seems too large for that.
Anyone got experience working with large dumps? What's the quickest way to get them onto my rig? Started playing with Navicat tonight and that's going better with single transactions turned off for the server-to-local sync.
Any tips would be appreciated; the net doesn't yield much for this situation!
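For reference, the route I've been experimenting with is the plain mysql CLI rather than a GUI sync. A rough sketch of the idea, assuming an InnoDB target and made-up names (`site_dump.sql`, `site_db`); the buffer/log sizes are only placeholders to tune for your own machine:

```shell
# One-off my.cnf tuning for the bulk import (revert afterwards):
#   innodb_buffer_pool_size = 2G         # as much RAM as you can spare
#   innodb_flush_log_at_trx_commit = 2   # relax durability during the import
#   innodb_log_file_size = 512M

# Wrap the dump so checks and autocommit are off for the whole run.
# This lets InnoDB apply the inserts in big batches instead of
# paying per-row foreign-key/unique-check and commit overhead.
( echo "SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;"
  cat site_dump.sql
  echo "COMMIT;" ) | mysql -u root -p site_db
```

This is essentially what the "single transactions off" toggle in the GUI tools approximates; doing it from the shell just cuts the client overhead out entirely.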