Tools to migrate SQL instances to another server

Man of Honour
Joined
30 Oct 2003
Posts
13,249
Location
Essex
Thank you, I will check out Redgate. I'd not heard of them, and this topic seems difficult to Google.

As above, Redgate might be of use, but more to the point, why wouldn't you just back up the databases, restore them to the new server and then reconnect the clients? It hardly seems worth finding a "tool" when no tool will be able to give an end-to-end solution for SQL migrations.
 
Soldato
OP
Joined
31 Dec 2003
Posts
4,655
Location
Stoke on Trent
As above, Redgate might be of use, but more to the point, why wouldn't you just back up the databases, restore them to the new server and then reconnect the clients? It hardly seems worth finding a "tool" when no tool will be able to give an end-to-end solution for SQL migrations.

I'm trying to assess the options with no experience of having done this before. Thoughts going through my head include:

1) The storage is all on a SAN. Is it possible to simply detach the storage and re-attach it to a brand new server running a new SQL version?
2) Your approach of backing them up and then restoring them will cause a lot of downtime due to the sizes of the databases involved, but it is the "cleanest" way.
3) Clients reconnecting is an issue, because the clients are looking for NETWORKNAME\INSTANCENAME, and I believe you can't have two of the same network name online at the same time.
 
Associate
Joined
16 Jan 2003
Posts
1,913
A few options.
Not sure on 1), I really doubt it though, and it's high risk.

You can reduce downtime by doing a full backup, restoring it on the new server WITH NORECOVERY ahead of time, and then applying a differential during the downtime window.
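
Something like this, as a rough sketch (Invoke-Sqlcmd from the SqlServer module; server, database and path names are placeholders to swap for your own):

Code:
# Ahead of the cutover: restore the full backup on the new server, leaving the database
# in the restoring state so further backups can still be applied (placeholder names/paths).
Invoke-Sqlcmd -ServerInstance "NEWSERVER\INSTANCE" -Query @"
RESTORE DATABASE MyDb
FROM DISK = N'\\backupshare\MyDb_full.bak'
WITH NORECOVERY, STATS = 10;
"@

# During the downtime window: take a differential on the old server, copy it across,
# then apply it WITH RECOVERY to bring the database online on the new server.
Invoke-Sqlcmd -ServerInstance "NEWSERVER\INSTANCE" -Query @"
RESTORE DATABASE MyDb
FROM DISK = N'\\backupshare\MyDb_diff.bak'
WITH RECOVERY, STATS = 10;
"@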

You will want to check that logins are copied over and that the settings are broadly similar (hardware dependent). You can run sp_configure to get most of the info. For logins, Microsoft provide a helpful stored procedure called sp_help_revlogin, and I think you can now get a few more helpful PowerShell scripts to do this.
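
For instance, the dbatools module can copy the logins in one line, and the sp_configure output from both instances can be exported for a side-by-side comparison. An untested sketch with placeholder instance names and paths:

Code:
# dbatools: copy SQL logins (including SIDs and password hashes) from the old instance to the new one.
Copy-DbaLogin -Source "OLDSERVER\INSTANCE" -Destination "NEWSERVER\INSTANCE"

# Export sp_configure output from both instances so the settings can be compared side by side.
Invoke-Sqlcmd -ServerInstance "OLDSERVER\INSTANCE" -Query "EXEC sp_configure" |
    Export-Csv -Path C:\temp\old_config.csv -NoTypeInformation
Invoke-Sqlcmd -ServerInstance "NEWSERVER\INSTANCE" -Query "EXEC sp_configure" |
    Export-Csv -Path C:\temp\new_config.csv -NoTypeInformation
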
Clients reconnecting: the options are either to change the connection strings on the clients (depends on how many there are and how feasible that is), to change the server name/IP over during the downtime, or to configure the new connections with CNAMEs etc., which might be a better idea going forward. Essentially it will be like using DNS, so future changes become easier.

And test test test
 
Soldato
OP
Joined
31 Dec 2003
Posts
4,655
Location
Stoke on Trent
I didn't know about restoring WITH NORECOVERY, so thanks for that tip.

I saw some scripts which can copy the DB permissions. I was rather hoping I could chuck some cash at a tool which will restore the differentials right up to the cutover point, meaning next to no downtime.
 
Soldato
Joined
30 May 2007
Posts
4,845
Location
Glasgow, Scotland
I didn't know about restoring WITH NORECOVERY, so thanks for that tip.

I saw some scripts which can copy the DB permissions. I was rather hoping I could chuck some cash at a tool which will restore the differentials right up to the cutover point, meaning next to no downtime.

These might be the tools you have seen, but dbatools and Ola Hallengren's scripts can be a fantastic check, backup and restore companion.

The instance backup is good as it scripts out everything in the DB: stored procedures, roles, etc.
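
dbatools also has a wrapper that does the backup/restore and copies logins, agent jobs and so on in one go. A rough sketch with placeholder names (the parameter names are from recent dbatools versions, so check them against whatever version you install, and test on something non-production first):

Code:
# dbatools: migrate databases via backup/restore over a share both instances can reach,
# plus logins, agent jobs, linked servers etc. Placeholder names - dry run on a test box first.
Start-DbaMigration -Source "OLDSERVER\INSTANCE" -Destination "NEWSERVER\INSTANCE" `
    -BackupRestore -SharedPath "\\backupshare\migration"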
 
Associate
Joined
3 Oct 2007
Posts
795
A little late to the thread, but you haven't mentioned which SQL versions are involved. Just in case it's reeeeeealy old, keep in mind there is no direct upgrade path from SQL 2000 to SQL 2012 and up - you have to go via SQL 2008.
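
If you want to confirm exactly what you are starting from, a quick version and compatibility-level check is worth doing first (sketch using Invoke-Sqlcmd from the SqlServer module; the instance name is a placeholder):

Code:
# Check the instance version/edition and each database's compatibility level (placeholder instance name).
Invoke-Sqlcmd -ServerInstance "OLDSERVER\INSTANCE" -Query @"
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('Edition')        AS Edition;
SELECT name, compatibility_level FROM sys.databases;
"@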

1) The storage is all on a SAN, is it possible to simply detach the storage then re-attach the storage to a brand new server with a new SQL version?

I've used broadly this approach for a SQL Server hardware migration where time was a factor. What you describe could work, but it won't give you a quick and easy back-out plan, other than restoring from backup.

The plan with an EMC VNX5600 (YMMV):
> Set up a clone of the storage beforehand
> Shut down SQL on the live server
> Create marker text files on the database drives (if these files are there when you mount the clone to the new box then you can be sure that your clone was in sync and you have consistent DBs - see the sketch after this list)
> Shut down the live server
> Fracture the clone so that the clone is now independent
> Attach the new server to the fractured clone
> Power on the new server
> Attach the DBs
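
The marker-file step can be as simple as the following (a sketch - the drive letters and file name are placeholders), with the matching check run on the new server once the fractured clone is mounted:

Code:
# On the live server, after stopping SQL: drop a timestamped marker file on each database drive.
$stamp = Get-Date -Format "yyyyMMdd_HHmmss"
foreach ($drive in "E:\", "F:\", "G:\")
{
    New-Item -Path (Join-Path $drive "MIGRATION_MARKER_$stamp.txt") -ItemType File | Out-Null
}

# On the new server, after mounting the fractured clone: if the markers are present,
# the clone captured the final, consistent state of the database drives.
foreach ($drive in "E:\", "F:\", "G:\")
{
    Get-ChildItem -Path $drive -Filter "MIGRATION_MARKER_*.txt"
}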

You can carry out dry runs of this process (without the SQL shutdown) until you have a solid plan you can follow when doing it for real.

One final tip: regardless of your approach, document every step of the process beforehand in a notepad file - which commands to type, or exactly where in the GUI to go and which options are being set.

Having a printed copy you can tick off as you complete each step is an excellent aid, and a digital version to copy and paste from will save you from silly mistakes and typos - if you've tested the process thoroughly you'll have given yourself the best chance of a successful migration.

###Edit: I found my own cribsheet for the process and updated the broad outline of steps above ###
 
Soldato
Joined
4 Dec 2002
Posts
3,941
Location
Bourne, Lincs
If you are getting rid of your old server, you could CNAME the new one. With servers like this where we have clients connecting, we always CNAME the server so we can change it without having to reconfigure the clients.
 
Soldato
OP
Joined
31 Dec 2003
Posts
4,655
Location
Stoke on Trent
A little late to the thread
Thanks, it's good to hear that the "storage migration" route is possibly an option.

It's 2008 we're moving from, moving to 2017 where possible.

If you are getting rid of your old server, you could CNAME the new one. With servers like this where we have clients connecting, we always CNAME the server so we can change it without having to reconfigure the clients.

Can I assume you mean set up a DNS record pointing the old server/network instance name to the new server?
 
Soldato
Joined
4 Dec 2002
Posts
3,941
Location
Bourne, Lincs
Can I assume you mean set up a DNS record pointing the old server/network instance name to the new server?

Yes

So the hostname of your original server might be something like sqlserver_01. You create a CNAME called sqlserver and point it at sqlserver_01, and when you set your clients up you connect to \\sqlserver.

When you move over to a new server, you just change the CNAME to point at sqlserver_02, for instance, and you don't need to change any client settings.
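
On a Windows DNS server the repoint at cutover is a couple of lines (sketch using the DnsServer module; the zone name is a placeholder for your own domain):

Code:
# Repoint the "sqlserver" alias from the old host to the new one (placeholder zone/host names).
Remove-DnsServerResourceRecord -ZoneName "example.local" -RRType "CName" -Name "sqlserver" -Force
Add-DnsServerResourceRecordCName -ZoneName "example.local" -Name "sqlserver" `
    -HostNameAlias "sqlserver_02.example.local"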

Kimbie
 
Man of Honour
Joined
17 Feb 2003
Posts
29,640
Location
Chelmsford
I have just migrated an iSeries DB2 database to a SQL database, but the same technique could be used for anything.

1. Created a program on the source server to build the CREATE TABLE scripts based on the metadata (if it's the same platform, i.e. SQL DB to SQL DB, you won't need to do this).
2. The new table scripts and the exported table data are made available using a share and mapped to a network drive on the server.
3. Created a PowerShell script to build the tables in the new SQL database.
4. Created a second PowerShell script (below) to copy the data from each table on the share into the target SQL database using a bulk copy function I downloaded from Microsoft TechNet:

https://gallery.technet.microsoft.com/scriptcenter/Import-Large-CSVs-into-SQL-fa339046

The PowerShell script below requires three parameters: SQL server, SQL database and the source path. The key part of this solution is to make the source data available through a share.

Code:
# Copies each exported table file on the iSeries share into the equivalent table in the target SQL DB.
# Usage: ./CopyData.ps1 -SVR 'servername' -DB 'sqldatabase' -drive 'MappedShare';

Param([String]$SVR, [String]$DB, [String]$drive)

Write-Host "Starting bulk copy of data from $drive to $DB on $SVR..."
Start-Sleep -s 2

# One exported .txt file per table on the mapped share
$lines = (Get-ChildItem -Path $drive\*.txt).Name
Import-Module .\CsvSqlimport.psm1

# Start a fresh log file for this run
$out = "F:\IBISDataConversion\log\CopyData.log"
if (Test-Path $out)
{
    Remove-Item -Path $out
}

foreach ($line in $lines)
{
    # The table name is the file name without the .txt extension
    $a, $b = $line.Split('.')
    $c = "$drive\$line"

    # Get the first 3 rows of the file...
    # NB - I only need to determine if there is at least one row to copy, so to speed up
    # processing I've only selected the first 3 rows.
    $test = Get-Content -Path $c -TotalCount 3

    "" | Out-File -FilePath $out -Append

    # If there is at least one record to copy, then bulk copy the data, otherwise skip...
    If ($test.Count -gt 0)
    {
        "Copying file $line to $a..." | Out-File -FilePath $out -Append
        try
        {
            Import-CsvToSql -Csv $c -SqlServer $SVR -Database $DB -Delimiter "|" -Table $a -Truncate |
                Out-File -FilePath $out -Append

            Start-Sleep -s 2

            # Remove the source file once it has been loaded successfully
            Remove-Item -Path $c
        }
        catch
        {
            "** Error while copying file $line to $a..." | Out-File -FilePath $out -Append
        }
    }
    else
    {
        "No records in file $line. Copy skipped" | Out-File -FilePath $out -Append
    }
}

Hope this helps.
 