In this month's issue we are going to build a simple solution to automate the migration of databases across SQL Server versions using a mix of PowerShell and T-SQL scripting.
Migration of SQL Server databases is quite a broad topic, usually involving many scenarios and critical aspects that must be carefully planned. Just to give an idea, the official SQL Server 2008 R2 Upgrade Technical Reference Guide runs to almost 500 pages.
Preparing and testing a solid migration procedure is mandatory, and scripting techniques can be very helpful to automate part of the process. In some cases, for example when you want to minimize downtime, scripting is actually the only practical way to prepare, test, and execute a migration.
Predictability is just one of the advantages: being able to automate the process can also help in other scenarios, such as when you have dozens or even hundreds of databases to migrate, or when you need to regularly refresh a copy in a test environment.
Simplified migration scenario
For the sake of the discussion, let’s assume a simplified migration scenario where activities are divided into these main groups:
- Inventory of the databases that need to be migrated.
- Static analysis of database code and traced workload (this can be assisted with SQL Server Upgrade Advisor).
- Development of migration procedures.
- Execution of migration procedure tests.
- Execution of application tests.
First of all, when you are doing the inventory it may turn out that for some databases you will have to follow well-known procedures or practices.
For example, this often happens with databases used by centralized backup solutions, centralized anti-virus solutions, ERP solutions, CMS solutions, and so on. These databases are usually bad candidates for a generalized migration solution because they may have to follow a very different set of actions.
In some cases you may find that several databases can be migrated using a common procedure or at least a set of common actions before further applying another set of customized actions. These are the cases where you may want to develop generic migration scripts and build a solution to automate their execution. Such a generic approach can be especially effective if you are going to iterate through several test sessions and want to migrate a fresh copy before each session.
The following is a set of actions for such a simplified migration procedure:
- Back up the database to be migrated from the original instance.
- Restore the database to be migrated on the destination instance.
- Update allocation usage.
- Check database consistency with data purity.
- Set the page verify level (e.g. CHECKSUM).
- Set the database compatibility level to match the destination instance version (e.g. 100).
- Drop orphaned schemas (schemas automatically created for each database user to maintain compatibility, but with no objects belonging to them).
Please note that some of these actions strictly apply only to databases migrated from SQL Server 2000 (yes, there are still people with databases running on SQL Server 2000).
For example, checking consistency using the DATA_PURITY option of DBCC CHECKDB needs to be run just once when migrating from 2000. The fact that it has completed successfully is then recorded internally in the database and, beginning with SQL Server 2005, such checks are performed automatically as part of regular DBCC CHECKDB runs.
If you want to know more about data purity checks, we suggest you read Knowledge Base article 923247, titled "Troubleshooting DBCC error 2570 in SQL Server 2005".
Another example is using DBCC UPDATEUSAGE to correct page and row count inaccuracies, which mainly affect databases coming from SQL Server 2000.
A mixed approach
In most cases migration scripts can be completely developed either with PowerShell or with T-SQL; however both have their strengths and weaknesses.
For example: in T-SQL we may need to fiddle with string concatenation to build a restore command if we need to move the physical files to different disks/directories; in PowerShell, SMO does not directly expose a consistency check with the data purity option.
Moreover, you may already have tested T-SQL scripts and just want to automate the remaining actions through PowerShell. Whatever the reason, let's see how you can leverage both scripting environments to accomplish our goal.
The idea is to build a PowerShell script that, for each database you want to migrate, performs the backup/restore operations and then executes a T-SQL script to finalize the migration.
The PowerShell migration script
First of all, you should define which assemblies need to be loaded and a generic error handler to globally trap unhandled exceptions:
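A minimal sketch of this step might look like the following, assuming the classic SMO assembly loading via LoadWithPartialName, with a simple trap block serving as the global error handler:

```powershell
# Load the SMO assemblies needed for the Backup/Restore objects
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo")

# Global error handler: trap any unhandled exception, report it and stop
trap {
    Write-Host "ERROR: $($_.Exception.Message)" -ForegroundColor Red
    exit 1
}
```

On recent systems the same result can be obtained by importing the SqlServer module instead of loading assemblies by name.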
Then, you can proceed defining some handlers to give some feedback about the backup and restore operations progress:
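A minimal sketch of these handlers, with illustrative variable names, could be:

```powershell
# Script blocks to be attached to the SMO Backup/Restore objects
# via their add_PercentComplete() and add_Complete() methods
$percentHandler = {
    param($sender, $e)
    # $e is a PercentCompleteEventArgs exposing the Percent property
    Write-Host "$($e.Percent) percent processed."
}
$completionHandler = {
    param($sender, $e)
    # On success, $e.Error still carries the familiar
    # "successfully processed" server message
    Write-Host $e.Error.Message
}
```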
These handlers are called back directly by SMO, and you are going to implement a simple progress notification like the one provided by SSMS or sqlcmd when you issue the backup and restore commands in T-SQL. Please note that the completion handler simply outputs the server message: if no errors occurred, it contains the familiar message that indicates success, as shown in these examples:
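On success, the notification resembles the standard server messages (page counts and timings here are illustrative):

```
10 percent processed.
20 percent processed.
...
100 percent processed.
BACKUP DATABASE successfully processed 370 pages in 0.382 seconds (7.570 MB/sec).
RESTORE DATABASE successfully processed 370 pages in 0.295 seconds (9.803 MB/sec).
```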
The next step is to write a generic function to connect to a SQL Server instance:
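A sketch of such a function, with an illustrative name, could be the following; when no credentials are passed it falls back to Windows authentication:

```powershell
# Connect to an instance using either Windows or SQL Server authentication
function Get-SqlServer {
    param(
        [string]$InstanceName,
        [string]$UserName,
        [string]$Password
    )
    $conn = New-Object Microsoft.SqlServer.Management.Common.ServerConnection
    $conn.ServerInstance = $InstanceName
    if ($UserName -and $Password) {
        # SQL Server authentication
        $conn.LoginSecure = $false
        $conn.Login = $UserName
        $conn.Password = $Password
    }
    $server = New-Object Microsoft.SqlServer.Management.Smo.Server($conn)
    $server.ConnectionContext.Connect()
    return $server
}
```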
You will use this function to connect to both the origin and destination instances. Please note that it supports both Windows and SQL Server authentication methods.
Another generalized function that you will need is one that will back up the databases to be migrated:
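A minimal sketch of the backup function, using the SMO Backup object (function and parameter names are illustrative), could be:

```powershell
# Perform a copy-only full backup of one database to the given path
function Invoke-DatabaseBackup {
    param(
        [Microsoft.SqlServer.Management.Smo.Server]$Server,
        [string]$DatabaseName,
        [string]$BackupPath
    )
    $backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
    $backup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
    $backup.Database = $DatabaseName
    # Copy-only, so existing backup maintenance procedures are not affected
    $backup.CopyOnly = $true
    # Backup set name convention: database name plus ".bak"
    $backup.Devices.AddDevice((Join-Path $BackupPath "$DatabaseName.bak"),
        [Microsoft.SqlServer.Management.Smo.DeviceType]::File)
    # The progress/completion script blocks described earlier would be
    # attached here via add_PercentComplete() and add_Complete()
    $backup.SqlBackup($Server)
}
```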
This function accepts three parameters: the SQL Server instance hosting the database to be migrated, the database name and a path to store the backup set. This path can either be a path to a local disk/directory or a UNC path to a shared directory. The important thing to remember is that the SQL Server engine, which is running the backup command, needs to have proper permissions to access the path.
As a simple convention, the backup set name is derived from the database name by appending the ".bak" suffix.
Furthermore, you should note that we are performing a copy-only backup so that we don't break any existing backup maintenance procedure.
Finally, you need another generalized function to restore the database:
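A sketch of the restore function follows; it reads the file list from the backup set and relocates the physical files to the destination instance's default data and log directories (function name and relocation strategy are illustrative):

```powershell
# Restore a database from its backup set, relocating the physical files
function Invoke-DatabaseRestore {
    param(
        [Microsoft.SqlServer.Management.Smo.Server]$Server,
        [string]$DatabaseName,
        [string]$BackupPath
    )
    $restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
    $restore.Action = [Microsoft.SqlServer.Management.Smo.RestoreActionType]::Database
    $restore.Database = $DatabaseName
    $restore.ReplaceDatabase = $true
    $restore.Devices.AddDevice((Join-Path $BackupPath "$DatabaseName.bak"),
        [Microsoft.SqlServer.Management.Smo.DeviceType]::File)
    # Default locations may be empty when not explicitly configured,
    # in which case we fall back to the master database directories
    $dataDir = $Server.Settings.DefaultFile
    if (-not $dataDir) { $dataDir = $Server.MasterDBPath }
    $logDir = $Server.Settings.DefaultLog
    if (-not $logDir) { $logDir = $Server.MasterDBLogPath }
    # Relocate every file in the backup set ("L" marks log files)
    foreach ($file in $restore.ReadFileList($Server).Rows) {
        $relocate = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile
        $relocate.LogicalFileName = $file.LogicalName
        $target = if ($file.Type -eq "L") { $logDir } else { $dataDir }
        $relocate.PhysicalFileName =
            Join-Path $target (Split-Path $file.PhysicalName -Leaf)
        [void]$restore.RelocateFiles.Add($relocate)
    }
    $restore.SqlRestore($Server)
}
```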
The script should be pretty much self-explanatory, but there are some things worth noting.
The main script invokes the sqlcmd.exe command-line utility to execute a T-SQL script containing the post-migration actions. Since we want the T-SQL script to be generic, i.e. not tied to a particular database, we are going to use sqlcmd scripting variables inside it. One way to assign values to sqlcmd scripting variables is to use the -v parameter, as shown in the main script.
Assuming the T-SQL script is in the same directory as the PowerShell one, we use the $MyInvocation.MyCommand.Path property and the Split-Path cmdlet to derive the path to prepend to the script name before passing it to the sqlcmd.exe utility. The same path is also used for the log file generated by the utility and passed with the -o parameter.
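Putting the pieces together, the main script might be sketched as follows, assuming helper functions like the ones described above (all instance names, paths, script names, and database names here are illustrative):

```powershell
# Main driver: migrate each database, then run the post-migration
# T-SQL script against it via sqlcmd.exe
$scriptDir  = Split-Path $MyInvocation.MyCommand.Path
$backupPath = "\\fileserver\migration"          # shared backup directory
$source     = Get-SqlServer -InstanceName "SRV2000"    # origin
$dest       = Get-SqlServer -InstanceName "SRV2008R2"  # destination

foreach ($dbName in @("AppDB1", "AppDB2")) {
    Invoke-DatabaseBackup  -Server $source -DatabaseName $dbName -BackupPath $backupPath
    Invoke-DatabaseRestore -Server $dest   -DatabaseName $dbName -BackupPath $backupPath
    # -v assigns the sqlcmd scripting variable, -o writes the log file
    & sqlcmd.exe -S $dest.Name -E `
        -i (Join-Path $scriptDir "migrate.sql") `
        -o (Join-Path $scriptDir "$dbName.log") `
        -v DatabaseName=$dbName
}
```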
It may happen, for some reason, that the sqlcmd.exe utility directory is not in the PATH environment variable, forcing you to specify the full path to the command. Instead, you can uncomment the relevant line in the script and add the directory to the PATH on the fly, as follows:
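The one-liner below shows the idea; the directory shown is the standard one for SQL Server 2008/2008 R2 tools and varies by version and edition:

```powershell
# Append the sqlcmd.exe directory to the PATH for this session only
$env:Path += ";C:\Program Files\Microsoft SQL Server\100\Tools\Binn"
```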
You can find the complete solution in powershell-201102.
The T-SQL migration script
As we said initially, we want to keep some post-migration activities in a T-SQL script. For this simplified scenario, we want to execute the following actions: checking database consistency, updating allocation usage, setting the page verify level, setting the database compatibility level, updating statistics, and dropping orphaned schemas.
Following is the T-SQL code you may want to use to carry out these activities:
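A sketch of such a script is shown below; $(DatabaseName) is the sqlcmd scripting variable assigned at execution time, and the orphaned-schema cleanup uses dynamic SQL that you should review before running in your own environment:

```sql
USE [$(DatabaseName)];
GO

-- Check consistency, including column-value (data purity) checks
DBCC CHECKDB ([$(DatabaseName)]) WITH DATA_PURITY;
GO

-- Correct page and row count inaccuracies in the catalog views
DBCC UPDATEUSAGE ([$(DatabaseName)]);
GO

ALTER DATABASE [$(DatabaseName)] SET PAGE_VERIFY CHECKSUM;
GO

ALTER DATABASE [$(DatabaseName)] SET COMPATIBILITY_LEVEL = 100;
GO

-- Update statistics on all user tables in the database
EXEC sp_updatestats;
GO

-- Drop schemas auto-created for database users that own no objects
DECLARE @sql nvarchar(max) = N'';
SELECT @sql += N'DROP SCHEMA ' + QUOTENAME(s.name) + N'; '
FROM sys.schemas AS s
JOIN sys.database_principals AS p ON s.principal_id = p.principal_id
WHERE p.type IN ('S', 'U', 'G')      -- users, not roles
  AND s.schema_id > 4                -- skip dbo, guest, sys, INFORMATION_SCHEMA
  AND s.name = p.name                -- schema auto-created for the user
  AND NOT EXISTS (SELECT * FROM sys.objects AS o
                  WHERE o.schema_id = s.schema_id);
EXEC (@sql);
GO
```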
As you can see, the code uses the sqlcmd scripting variable $(DatabaseName) so that it can be reused for multiple databases by simply assigning a database name to the variable before executing it, for example with the -v parameter of the sqlcmd.exe utility, as you saw in the previous section.
You can find a more complete T-SQL script, that includes commands to print progress, in powershell-201102.
Checking the migration logs
DBCC commands and the sp_updatestats system stored procedure produce a very verbose output. This is a good thing because you want to be sure to get all the information you may need.
However, when automating a migration across several test iterations, it can become tedious to go through the T-SQL execution logs after each run to check for errors.
To ease this step, you can build a quick PowerShell script to parse the logs based on some well-known patterns as in the following examples:
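A sketch of such a parser follows; the patterns match the standard DBCC CHECKDB summary line and the "Msg N, Level N" prefix of SQL Server error messages (directory layout and output format are illustrative):

```powershell
# Scan every .log file in the script directory for migration errors
$logDir = Split-Path $MyInvocation.MyCommand.Path
foreach ($log in Get-ChildItem -Path $logDir -Filter "*.log") {
    $content  = Get-Content $log.FullName
    # DBCC CHECKDB summary, e.g. "CHECKDB found 0 allocation errors and
    # 0 consistency errors in database 'X'."
    $checkdb  = $content | Select-String "CHECKDB found" | Select-Object -First 1
    # Generic SQL Server errors reported by sqlcmd
    $sqlError = $content | Select-String "^Msg \d+, Level \d+"
    if (($checkdb -and
         $checkdb.Line -notmatch "0 allocation errors and 0 consistency errors") -or
        $sqlError) {
        Write-Host "$($log.Name): ERRORS FOUND" -ForegroundColor Red
        $sqlError | ForEach-Object { Write-Host "  $($_.Line)" }
    } else {
        Write-Host "$($log.Name): OK" -ForegroundColor Green
    }
}
```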
The script loops through all files in a given directory with the ".log" suffix in the filename. For each file, it first searches for the second-to-last line output by DBCC CHECKDB to see whether it reported any allocation and/or consistency errors, and it then searches for generic SQL Server errors.
If errors are found, they are reported like in the following example:
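For instance, a database failing data purity checks (DBCC error 2570, discussed in the Knowledge Base article cited earlier) might produce a report similar to this (file names and error details are illustrative):

```
AppDB1.log: ERRORS FOUND
  Msg 2570, Level 16, State 3, Line 1
```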
Otherwise the output will be similar to the following:
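With clean logs, the report reduces to one line per database, for example (file names are illustrative):

```
AppDB1.log: OK
AppDB2.log: OK
```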
You can find the script in powershell-201102.
This month we have seen how to use a mix of PowerShell and T-SQL scripting to help automate the migration of databases from older SQL Server instances to newer ones.
Always remember that the code presented in this article is just an example to show how you can leverage PowerShell and it is not suitable for production or general purpose usage. As usual, we had to introduce some simplifications to keep the article to a reasonable size.
For example, in parsing the logs, we are relying on fixed error message formats. If, for any reason, this changes, you risk missing some errors. Furthermore, we purposely ignore Service Broker messages. So, you may want to build a more robust solution for production usage.
Another example is with the set of post-migration actions. In a real migration scenario, you usually need to carry out more actions some of which also depend on conditional logic.
Migration is a very interesting topic, and there are several areas where PowerShell can be used to automate things, so we may come back to it in the future.
Until then, we’d love to hear from you—especially about which topics you’d like us to cover next.
Post co-authors: Gianluca Hotz and Herbert Albert