Next Steps with SQL Azure

Following on from my previous posts (check them out before continuing), this article assumes you have followed the steps outlined there to create the various models, accounts, and so on.


So now that we’ve had a look at some of the basics of SQL Azure, Microsoft’s database in the cloud, let’s take a look at something more practical.  For our next example, I’m going to refer to a schema I designed and populated last year – my vinyl record collection.  You can have a look at the details of the schema in a previous entry – Project Sneak Peek.

The project was previously defined in this entry – New Project Announcement: SqlAzure Database and Azure Application (though we may skip the Azure application part, depending on time).

Given that the schema is for SQL Server 2008, one of our first tasks will be to script the database and its existing data, and then convert the script to conform with SQL Azure’s T-SQL restrictions.

Schema from the Previous Post

Generating Scripts with SQL Management Studio 2008 R2 (November 2009 CTP)

Let us take a quick look at the first step – scripting the existing database. 
You may skip this step and use the script I’ve attached (at the end of this post) if you wish, or you can download the pre-migration script and follow along. 
There’s nothing tricky here except that some of the options have changed.

Once you’ve created the database on a SQL Server 2008 instance (using the script provided at the end of this post), right-click the database and select “Generate Scripts…”.


When the “Generate and Publish Scripts” wizard loads, you’ll notice that the initial screen has changed from the 2008 RTM version.
This is a “cut down” version which heavily simplifies the process of generating a script.  Click “Next”.

On the next screen, change the radio button from the default setting to “Select specific database objects”, and then check the checkbox next to the Tables item in the tree view below.  Note that the “Script Data” option is no longer here; it has been merged into the “Types of data to script” option.

[Screenshots: choosing database objects in the wizard]

On the next screen, do not click “Next” just yet – click the “Advanced” button first.
Change the defaults to the values specified in the screenshot below.

[Screenshots: advanced scripting options]

Once everything is set, you may click OK and then proceed to generate the script.
When the script has been generated, you’ll need to make some changes to the T-SQL, as it will not be compatible with SQL Azure by default.
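To give you a feel for the kind of change involved, here is a hedged before-and-after sketch (the [Album] table and its columns are illustrative, not the actual schema from my collection): SQL Azure rejects filegroup references such as ON [PRIMARY] and several index options, and it requires every table to have a clustered index.

```sql
-- As typically generated by SSMS against SQL Server 2008 (illustrative names):
CREATE TABLE [dbo].[Album](
    [AlbumId] [int] IDENTITY(1,1) NOT NULL,
    [Title] [nvarchar](200) NOT NULL,
 CONSTRAINT [PK_Album] PRIMARY KEY CLUSTERED ([AlbumId] ASC)
  WITH (PAD_INDEX = OFF, ALLOW_ROW_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

-- SQL Azure-compatible equivalent: the ON [PRIMARY] filegroup clauses and
-- unsupported index options are removed.  The clustered primary key stays,
-- since SQL Azure requires a clustered index on every table.
CREATE TABLE [dbo].[Album](
    [AlbumId] [int] IDENTITY(1,1) NOT NULL,
    [Title] [nvarchar](200) NOT NULL,
 CONSTRAINT [PK_Album] PRIMARY KEY CLUSTERED ([AlbumId] ASC)
)
```

The migration tool described below makes these edits for you, which is exactly why I recommend it over hand-editing a large script.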

Making a SQL Azure Compatible Script

For this purpose, I recommend downloading the following tool from CodePlex – SQL Azure Migration Wizard v3.1.4.

The other option is to connect to SQL Azure via SQL Management Studio R2 (Beta) and parse the script whilst connected to SQL Azure.  The tool is the better option, IMHO.

If you use the tool, once it is running, select the “Analyze and Migrate” option, then “TSQL File”.  Note that you could also migrate an existing schema directly (but where would be the fun in that?).

SQL Migration Assistant

Select the source file and click “Next”.  You’ll see a very busy window and a lot of generated T-SQL.  Once it is finished, you can select the contents and save them to a new file, or simply hit the Save button.  Either way, you should now have T-SQL that is valid for SQL Azure.

Following the steps from the previous post, you can now connect to SQL Azure and run the script to generate the database.  A short list of the steps follows:

  • Through the Azure Developer Portal, create a new Database
  • Open SQL Management Studio 2008 R2 and connect to the SQL Azure Database
  • Open the script with the active connection
  • Parse the script (to check for any issues)
  • Execute the script (ensure that the first statement is Use [Vinyl])
  • Verify no errors, and that all data is added

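To verify the last step, a couple of quick queries run against the new database will do (a sketch; the [Album] table name is illustrative, so substitute your own table names):

```sql
-- List the tables created in the new database.
SELECT name FROM sys.tables ORDER BY name;

-- Spot-check row counts against the source database.
SELECT COUNT(*) AS AlbumRows FROM [dbo].[Album];
```

If the table list and row counts match what you see on the local SQL Server 2008 instance, the migration is complete.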
Stay tuned for the next article, as we’ll discover what we can do with our new Database in the Cloud.

Database Scripts

SQL 2008 T-SQL Script

SQL Azure T-SQL Script

Further Reading

Tools and References
