SQL Server

Database Creation with DbUp

This is going to be a quick post showing how to configure DbUp to create a new database on top of the migration functionality we have covered previously. Use the following posts to catch up on the DbUp topics covered so far.

Database Migrations with DbUp
Code-based Database Migrations with DbUp
Always Run Migrations with DbUp
Logging Script Output with DbUp
Migrations in Transactions with DbUp

Database Creation

The code for database creation is a one-liner that just needs a connection string and should be run before any migrations. The following is part of the Main function from the sample application we have been working with; the new EnsureDatabase line is the code that takes care of database creation.

static int Main(string[] args)
{
    var connectionString =
        args.FirstOrDefault()
        ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

    EnsureDatabase.For.SqlDatabase(connectionString);

    var upgrader =
        DeployChanges.To
                     .SqlDatabase(connectionString)
                     .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                           f => !f.Contains(".AlwaysRun."))
                     .LogToConsole()
                     .WithTransaction()
                     .Build();

    var result = upgrader.PerformUpgrade();

    // ... result handling is unchanged from the previous posts
}

Wrapping Up

As you can see it is simple to add the ability to create databases in your DbUp applications. Database creation is just a note in the official DbUp docs and is easy to miss, so I wanted to do a post to call it out since it is a powerful feature when you need it. In our example, the application will always attempt to create the database if it doesn’t exist, but if that doesn’t fit your use case it might be worth gating database creation behind an argument so the user can decide whether the database should be created, as sketched below.
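A minimal sketch of that idea; the --create-database flag is purely illustrative and not something DbUp provides:

static int Main(string[] args)
{
    // Hypothetical opt-in flag; any argument convention would work.
    var createDatabase = args.Contains("--create-database");

    var connectionString =
        args.FirstOrDefault(a => !a.StartsWith("--"))
        ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

    if (createDatabase)
    {
        EnsureDatabase.For.SqlDatabase(connectionString);
    }

    // ... build and run the upgraders as before
}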


Migrations in Transactions with DbUp

This post will be continuing our exploration of DbUp with the addition of transactions as part of migration execution. If you are new to DbUp check out the following posts to catch up on this series.

Database Migrations with DbUp
Code-based Database Migrations with DbUp
Always Run Migrations with DbUp
Logging Script Output with DbUp

Transaction Types and Restrictions

DbUp has three different options for transactions. The default is no transaction, which is what we have been using so far. The other two options are a transaction per migration and a single transaction for all the migrations in a single upgrader. As far as I can tell, there is no way to share a transaction across multiple upgraders, which means in our sample application the normal migrations could run and get committed while the always run migrations hit an issue. The other thing to keep in mind is that some database providers don’t support transactions when changing the structure of the database. As pointed out by the DbUp docs, the Postgres docs have a good summary of transactional DDL support across database providers.

Adding Transactions

In the following example, we have changed the upgrader to use a single transaction. This is the full code for the upgrader; the WithTransaction() call is the only change.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                       f => !f.Contains(".AlwaysRun."))
                 .LogToConsole()
                 .WithTransaction()
                 .Build();

The other options are WithTransactionPerScript for, surprise, a transaction per migration, and WithoutTransaction for no transactions (or just leave it out since this is the default). The following is a sample of the output with transactions on and two different upgraders in play; note the two Beginning transaction lines.

Beginning transaction
Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning transaction
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.01-EverytimeNoPrints.sql'
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.02-EverytimeNoPrintsNoCountOn.sql'
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.03-EverytimePrints.sql'
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.04-EverytimePrintNoCountOn.sql'
Upgrade successful
Success!
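For comparison, opting into a transaction per migration is just a different call on the same upgrader; a minimal sketch based on the upgrader above:

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                       f => !f.Contains(".AlwaysRun."))
                 .LogToConsole()
                 .WithTransactionPerScript() // commit or roll back each migration on its own
                 .Build();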

Wrapping Up

DbUp makes adding transactions to your migrations super simple so this was a really short post. It would be nice to have an option to use a transaction across multiple upgraders. The official DbUp docs on transactions can be found here.


Logging Script Output with DbUp

In today’s post, we will be going over how to log script output during a DbUp run. If you are new to this series of posts or DbUp in general you might find the following posts helpful to review.

Database Migrations with DbUp
Code-based Database Migrations with DbUp
Always Run Migrations with DbUp

Enable Script Output Logging

I will be using the always run migrations we covered previously to simplify testing, but the concepts are the same for normal migrations. The change to enable script output logging is a single line added to our upgrader setup, the LogScriptOutput() call in the following code.

var alwaysRunUpgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(), 
                                                       f => f.Contains(".AlwaysRun."))
                 .JournalTo(new NullJournal())
                 .LogToConsole()
                 .LogScriptOutput()
                 .Build();

Running the application will result in something like the following. Notice the RecordsAffected line, which is the output from our migration.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.Everytime.sql'
RecordsAffected: 0

Upgrade successful
Success!

Improve the Script Output

So the above is great for a simple migration that has a single statement, but for more complex migrations it might be helpful to use PRINT statements to provide more information. Let’s tweak our script to print the start and end times of the execution. The following is the sample script (and yes I know it would never actually update anything).

PRINT CONVERT(VarChar, GETDATE(), 121)  + ' Start every time';

UPDATE Application.Cities SET CityName = 'Nothing' WHERE 1 = 0;

PRINT CONVERT(VarChar, GETDATE(), 121) + ' End every time';

The above migration results in the following output.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.Everytime.sql'
2020-08-05 06:15:21.877 Start every time

2020-08-05 06:15:21.877 End every time

RecordsAffected: 0

Upgrade successful
Success!

The prints are helpful, but the RecordsAffected line is noise in this case, especially since it only reflects the last statement rather than printing per SQL statement. To suppress the records affected message we can use SET NOCOUNT ON in our script like the following.

SET NOCOUNT ON;

PRINT CONVERT(VarChar, GETDATE(), 121)  + ' Start every time with NoCount ON';

UPDATE Application.Cities SET CityName = 'Nothing' WHERE 1 = 0;

PRINT CONVERT(VarChar, GETDATE(), 121) + ' End every time with NoCount ON';

SET NOCOUNT OFF;

And as you can see in the following we got a cleaner output.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.Everytime.sql'
2020-08-05 06:38:18.863 Start every time

2020-08-05 06:38:18.863 End every time

Upgrade successful
Success!

Wrapping Up

Logging the output of scripts can be really helpful at times, but if you are going to use it, as we saw above, it is important to plan what that output is going to look like. Having 500 instances of the number of records affected isn’t necessarily helpful.


Always Run Migrations with DbUp

In this post, we are going to walk through how to use DbUp to run specific migrations on every run instead of just once. If you are new to DbUp check out the following posts to see how the sample project got to its current state.

Database Migrations with DbUp
Code-based Database Migrations with DbUp

Differentiating for Always Run

One of the keys to getting migrations that should run every time is being able to identify them separately from migrations that should only run once. This can be done at either the filename level or by placing all the always run files in a specific directory. This sample is going with the directory method, but the filename method would work basically the same way. In the sample project under the existing Scripts directory, I added an AlwaysRun directory. I also added an example script in this directory so we can verify that it is getting executed on every run. In the sample, the file is named Everytime.sql.

Based on the above when DbUp is processing it will see Everytime.sql as DbupTest.Scripts.AlwaysRun.Everytime.sql.

Filtering Out Always Run

Now that we have a way to identify the migrations we want to always run, we need to filter them out of the existing upgrade process so that they don’t get included in the log of already executed migrations. In DbUp this log is referred to as the journal. The extension methods that set up how migrations will be located (either WithScriptsEmbeddedInAssembly or WithScriptsAndCodeEmbeddedInAssembly in our examples) can be provided with a filter, which we will use to exclude migrations that are in the AlwaysRun directory. The following code is the upgrader with the filter in place.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                       f => !f.Contains(".AlwaysRun."))
                 .LogToConsole()
                 .Build();

var result = upgrader.PerformUpgrade();

Executing Always Run Migrations

Now that we have adjusted our original upgrader to exclude migrations in the AlwaysRun directory, we need something to execute the always run migrations. To accomplish this we add a second upgrader that is filtered to just the AlwaysRun scripts. The other really important bit about this second upgrader is that it uses a NullJournal. The NullJournal is what keeps the execution of the scripts from being logged, which results in them always getting run. The following code was inserted after the above code; note the filter and the NullJournal.

if (result.Successful)
{
    var alwaysRunUpgrader =
        DeployChanges.To
                     .SqlDatabase(connectionString)
                     .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                           f => f.Contains(".AlwaysRun."))
                     .JournalTo(new NullJournal())
                     .LogToConsole()
                     .Build();

    result = alwaysRunUpgrader.PerformUpgrade();
}

The following is the full Main function just in case the above code didn’t give enough context on how the code should fit together.

static int Main(string[] args)
{
    var connectionString =
        args.FirstOrDefault()
        ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

    var upgrader =
        DeployChanges.To
                     .SqlDatabase(connectionString)
                     .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                           f => !f.Contains(".AlwaysRun."))
                     .LogToConsole()
                     .Build();

    var result = upgrader.PerformUpgrade();

    if (result.Successful)
    {
        var alwaysRunUpgrader =
            DeployChanges.To
                         .SqlDatabase(connectionString)
                         .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(), 
                                                               f => f.Contains(".AlwaysRun."))
                         .JournalTo(new NullJournal())
                         .LogToConsole()
                         .Build();

        result = alwaysRunUpgrader.PerformUpgrade();
    }

    if (!result.Successful)
    {
        Console.ForegroundColor = ConsoleColor.Red;
        Console.WriteLine(result.Error);
        Console.ResetColor();
#if DEBUG
        Console.ReadLine();
#endif                
        return -1;
    }

    Console.ForegroundColor = ConsoleColor.Green;
    Console.WriteLine("Success!");
    Console.ResetColor();

    return 0;
}

Wrapping Up

With the above changes we now have a project that can run normal migrations, code-based migrations, and always run migrations. No matter what style of database change management you decide on, having the changes in some sort of source control system will be super helpful.

For more information on DbUp’s journalling check out the official docs.


Code-based Database Migrations with DbUp

In today’s post, we are going to check out the code-based migration feature of DbUp, which allows code to run as part of the migration process instead of just SQL-based scripts. The ability to run code as part of a migration provides a ton of flexibility. This builds on last week’s post, Database Migrations with DbUp, so make sure to check it out if you are new to DbUp.

Script Provider Change

In our example application from last week’s post in the Program class when setting up our upgrader we used the following setup.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                 .LogToConsole()
                 .Build();

The above only picks up embedded SQL script files. In order to also pick up code-based migrations we have to change from the WithScriptsEmbeddedInAssembly script provider to WithScriptsAndCodeEmbeddedInAssembly. The following is the upgrader with the new script provider in place.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                 .LogToConsole()
                 .Build();

Code-based Migrations

To add a code-based migration, add a new class that implements DbUp’s IScript interface. The script provider we changed to above will pick up any non-abstract class that implements IScript as well as SQL script files. The results are combined and sorted by name to determine execution order, so make sure whatever naming convention you pick sorts properly across both your SQL scripts and your code.

The IScript interface defines a single function, ProvideScript, which takes a factory function for creating database commands and returns a string. Depending on the need, you can return a string with the SQL to be executed, or create commands and execute them directly against the database, or some combination of both.
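For reference, the shape of the interface described above boils down to the following (paraphrased here rather than copied from the DbUp source):

using System;
using System.Data;

namespace DbUp.Engine
{
    public interface IScript
    {
        // Return SQL to execute, and/or use the factory to create
        // and run commands directly against the database.
        string ProvideScript(Func<IDbCommand> dbCommandFactory);
    }
}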

To test out code-based migrations, I added the following new class in the Scripts folder. It doesn’t actually change any data but instead logs some information to the console. Even without changing data it still conveys how code-based migrations work.

using System;
using System.Data;
using DbUp.Engine;

namespace DbupTest.Scripts
{
    public class Code0002Test : IScript
    {
        public string ProvideScript(Func<IDbCommand> dbCommandFactory)
        {
            using (var cmd = dbCommandFactory())
            {
                cmd.CommandText = "SELECT DeliveryMethodName FROM Application.DeliveryMethods";

                using (var dr = cmd.ExecuteReader())
                {
                    Console.WriteLine("  Delivery Methods");

                    while (dr.Read())
                    {
                        Console.WriteLine($"    {dr["DeliveryMethodName"]}");
                    }
                }
            }

            return string.Empty;
        }
    }
}

The following is the console output from running the application.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
  Delivery Methods
    Air Freight
    Chilled Van
    Courier
    Customer Collect
    Customer Courier to Collect
    Delivery Van
    Post
    Refrigerated Air Freight
    Refrigerated Road Freight
    Road Freight
Executing Database Server script 'DbupTest.Scripts.Code0002Test.cs'
Checking whether journal table exists..
Upgrade successful
Success!

Wrapping Up

The ability to run code as part of the database migration process is an awesome bit of functionality. I would advise sticking with SQL-based migrations as the default and only using code when SQL doesn’t provide a way to do what is needed, or when doing the migration in code is much more understandable. Make sure to check out the official DbUp docs on this feature for more information.


Database Migrations with DbUp

Managing database schemas can be a challenging problem. It is also an area where there is no one-size-fits-all solution (not sure that is ever really true for anything). Solutions range from manual script management, to database vendor-specific options (such as DACPAC for Microsoft SQL Server), to migrations. This post will be looking at one of the options for a migration-based approach using DbUp.

Sample Database

I needed a sample database to start off with so I dug up one of my posts for Getting a Sample SQL Server Database to guide me through getting Microsoft’s WideWorldImporters sample database downloaded and restored.

DbUp Console Application

There are quite a few ways DbUp can be used; for this example, we are going to use it from a .NET Core console application. From a terminal, use the following command to create a new console application in the directory where you want the application.

dotnet new console

Now we can use the following command to add the DbUp NuGet package to the sample project. In this case, we are using the SQL Server package, but there are packages for quite a few database providers so install the one that is appropriate for you.

dotnet add package dbup-sqlserver

Next, open the project in Visual Studio (or any editor, but part of how this example is set up is easier in Visual Studio). In the Program class replace all the code with the following. We will look at a couple of bits of this code below.

using System;
using System.Linq;
using System.Reflection;
using DbUp;

namespace DbupTest
{
    class Program
    {
        static int Main(string[] args)
        {
            var connectionString =
                args.FirstOrDefault()
                ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

            var upgrader =
                DeployChanges.To
                             .SqlDatabase(connectionString)
                             .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                             .LogToConsole()
                             .Build();

            var result = upgrader.PerformUpgrade();

            if (!result.Successful)
            {
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine(result.Error);
                Console.ResetColor();
#if DEBUG
                Console.ReadLine();
#endif                
                return -1;
            }

            Console.ForegroundColor = ConsoleColor.Green;
            Console.WriteLine("Success!");
            Console.ResetColor();

            return 0;
        }
    }
}

The bit below is trying to pull the connection string for SQL Server out of the first argument passed from the terminal to the application, and if no arguments were passed in it falls back to a hardcoded value. This works great for our sample, but I would advise against the fallback value for production use. It is always a bad day when you think your migrations have run successfully but they ran on the wrong database because of a fallback value.

var connectionString = 
    args.FirstOrDefault() 
    ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

This next section is where all the setup happens for which database to deploy to, where to find the scripts to run, and where to log. There are a lot of options provided by DbUp in this area and I recommend checking out the docs under the More Info section for the details.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                 .LogToConsole()
                 .Build();

Finally, the following is where the scripts are actually executed against the database.

var result = upgrader.PerformUpgrade();

The rest of the function is dealing with displaying to the console the results of the scripts running.

Adding Scripts

Keep in mind that we are using the Embedded Scripts option so DbUp is going to find all the embedded files that end with ‘.sql’. I have added a Scripts directory to the project so the scripts don’t clutter the root of the project, but that isn’t a requirement. Do note that how the files are named controls the order in which the scripts get executed, so make sure you establish a good naming convention upfront. For our sample, I added a file named 0001-TestScript.sql to the Scripts directory. Since we are using the embedded scripts provider it is critical that after adding the file we go to its properties and set the Build Action to Embedded resource.

The script file itself doesn’t do anything in this case except select a value.
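If you want something concrete to put in the file, any harmless statement will do; mine was along the lines of the following (the exact contents aren’t important):

SELECT 1 AS TestValue;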

Trying it out

At this point, we can hit Run in Visual Studio and it will execute our new script. The output will look something like the following.
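Roughly, matching the log output DbUp produces in the other posts in this series (treat the exact lines as approximate):

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
Executing Database Server script 'DbupTest.Scripts.0001-TestScript.sql'
Checking whether journal table exists..
Upgrade successful
Success!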

If we were to run the application again it would tell us that no new scripts need to be executed. DbUp, like all migration-based solutions I have encountered, uses a table in the database to keep a record of which migrations have been run. The default table for DbUp is SchemaVersions and it consists of Id, ScriptName, and Applied columns. Given this structure, it is important to keep in mind that if you rename a script that has already been executed on a database, DbUp will see it as a different script and execute it again.
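If you are curious what got recorded, a quick query against the journal will show it (assuming the default dbo schema):

SELECT Id, ScriptName, Applied
FROM dbo.SchemaVersions
ORDER BY Applied;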

Wrapping Up

In the docs for DbUp there is a philosophy section and one of the points is that a database is a result of a set of transitions, not just its new state vs. its old state. This point is key to why I am a fan of a migration based approach.

Check out the DbUp docs and GitHub repo for more information.


Azure App Service with On-premises Connection

Imagine you get a new project with the ability to use whatever cloud services you want. You jump at the chance and dream up an amazing architecture using Azure. Next, you present your plan to the rest of your team and discover a new requirement of having to use an existing on-premises database.

Does this mean your grand plans are shot? Thankfully not, as Azure has multiple solutions that allow you to connect to your existing on-premises resources to enable hybrid cloud strategies.

In this post, we are going to use one of the options, Hybrid Connections, to connect from a web site hosted in an App Service to an on-premises database.

Sample Application

The base sample application we will be using for this post is a new Razor Pages App targeting .NET Core 3. We will walk through actually connecting to the database later in this post. To get started you need the new app created and running in an Azure App Service.

I have multiple walkthroughs of creating a new application and publishing them to Azure so I’m not going to rehash that here. If you need help getting your app created and published to Azure you can check out my Deploying an ASP.NET Core Application to Microsoft Azure post and instead of using the Razor template use the Web App template like the following.

dotnet new webapp

Also, note that Hybrid Connections aren’t available on the free or shared hosting plans so when you are setting up your publish profile avoid those options.

Add a Hybrid Connection to an App Service

From the Azure Portal, open the App Service you created above and under the Settings section of the menu click Networking.

In the networking details, we want to click Configure your hybrid connection endpoints.

I’m going to point out again that Hybrid Connections aren’t available at free and shared scales levels. If your App Service is on a free or shared scale use the Scale up menu option to switch to a scale level that will support Hybrid Connections.

From the Hybrid connections detail page click Download connection manager. When done this will need to be installed on the machine that is running the on-premises database you want to connect to.

Next, click Refresh and then click Add hybrid connection.

Now on the Add hybrid connection page click Create new hybrid connection.

In order to create a new hybrid connection, Azure will require some information. The key parts here are the Endpoint Host which is the name of the machine that is hosting the database you wish to communicate with and the Endpoint Port which will need to be the port that your database is configured to communicate over.

Hybrid Connection Manager on the host machine

Now that the Azure side is configured, we need to use the Hybrid Connection Manager application that we installed on the target machine earlier to allow it to talk to our App Service.

After opening the Hybrid Connection Manager on the target machine click Add a new Hybrid Connection.

Now select the Subscription the App Service is a part of. After the list of available connections loads, select the one created above and finally click Save.

After making the above changes my hybrid connection continued to show offline in Azure. After some searching, I found a blog post that suggested restarting the Azure Hybrid Connection Manager Service, which cleared the problem up for me.

Sample Application Changes to Connect to On-Premises Database

This is a very raw test, but it gets the point across. First, add a reference to the System.Data.SqlClient NuGet package. Next, in Index.cshtml.cs, delete the OnGet function and replace it with the following.

public async Task OnGetAsync()
{
    Tables.Clear();

    // Placeholder connection string; swap in a server, user, and password
    // that are valid for your environment.
    var connectionString = "data source=Server;initial catalog=master;User Id=User;Password=Password";

    await using var connection = new SqlConnection(connectionString);
    await connection.OpenAsync();
    await using var command =
                    new SqlCommand("SELECT name FROM sys.tables", connection);
    await using var reader = await command.ExecuteReaderAsync();

    while (await reader.ReadAsync())
    {
        Tables.Add(reader["name"].ToString());
    }
}

The above connects to the master database on the specified server and pulls a list of tables. Note that you will need to modify the connection string to something valid for your system. It is also important to know that you can’t use integrated security with this setup, so you will need to specify a user and password that exist on your SQL Server. Add the following property in the same file.

public List<string> Tables { get; set; } = new List<string>();

Add the following to the bottom of the Index.cshtml which will output the table list we pulled above to the page.

@foreach (var table in Model.Tables)
{
    @table <br/>
}

After the changes are done, republish the application to your Azure App Service. After the publish is done, your site should show a list of the tables that exist in the master database of your SQL Server.

Wrapping Up

Hybrid Connections are a great way to take a workload to the cloud without having to actually move all your data. It is also one of those features of Azure that I had no idea existed until a week ago.

If you need a lot of hybrid connections, look closely at the pricing as the number you can use is tied to the App Service scale you are using. The number available starts at 5 and goes up to 200 with the more expensive App Service scales.


Entity Framework Core: SQL Server Concurrency Checks

Over the last couple of weeks, I have covered concurrency checks in SQLite and Postgres. This week I’m expanding my Entity Framework sample repo to include a SQL Server based example project in order to demo concurrency checks for SQL Server. The repo with the SQL Server project added, but before the concurrency changes, can be found here.

Note that all the changes in this post will be made under the SqlServer folder.

Context Changes and Data Migration

SQL Server has a built-in column type of rowversion which is automatically generated with a unique binary number when a row is inserted or updated. In order to utilize the rowversion column type via Entity Framework, we need to add a new column to our model with the Timestamp attribute like the following from the sample code in the Contact model.

[Timestamp] 
public byte[] Timestamp { get; set; }
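If you prefer EF Core’s fluent API over attributes, the same mapping can be expressed in the context’s OnModelCreating; a minimal sketch, assuming the ContactsDbContext from the sample:

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Equivalent to the [Timestamp] attribute: maps the property to a
    // SQL Server rowversion column used for concurrency checks.
    modelBuilder.Entity<Contact>()
                .Property(c => c.Timestamp)
                .IsRowVersion();
}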

Next, from a command prompt in the same directory as your project file using the following .NET CLI command to generate a migration for the above change.

dotnet ef migrations add ContactTimestamp --context ContactsDbContext

Then, use the following command to apply the migration to your database.

dotnet ef database update --context ContactsDbContext

Testing it out

For a quick test, add a ConcurrencyTest Razor Page under the Contacts directory. This function is going to ensure a specific contact exists, pull that contact from two different DbContexts, mutate both resulting contact objects, and then attempt to save each one. The first save will work and the second should fail. Please note that this function isn’t an example of how things should be done, just a quick and dirty way to prove that the concurrency check is happening.

public void OnGet()
{
    var context1 = new ContactsDbContext(new DbContextOptionsBuilder<ContactsDbContext>()
                                         .UseSqlServer("yourConnectionString")
                                         .Options);
    var contactFromContext1 = context1.Contacts
                              .FirstOrDefault(c => c.Name == "Test");

    if (contactFromContext1 == null)
    {
        contactFromContext1 = new Contact
                              {
                                  Name = "Test"
                              };

        context1.Add(contactFromContext1);
        context1.SaveChanges();
    }

    var context2 = new ContactsDbContext(new DbContextOptionsBuilder<ContactsDbContext>()
                                         .UseSqlServer("yourConnectionString")
                                         .Options);
    var contactFromContext2 = context2.Contacts
                              .FirstOrDefault(c => c.Name == "Test");

    contactFromContext1.Address = DateTime.Now.ToString();
    contactFromContext2.Address = DateTime.UtcNow.ToString();

    context1.SaveChanges();
    context2.SaveChanges();
}

Run the application and hit the ConcurrencyTest route, which is https://localhost:44324/Contacts/ConcurrencyTest for my test. The following is the resulting exception.

An unhandled exception occurred while processing the request.

DbUpdateConcurrencyException: Database operation expected to affect 1 row(s) but actually affected 0 row(s). Data may have been modified or deleted since entities were loaded. See http://go.microsoft.com/fwlink/?LinkId=527962 for information on understanding and handling optimistic concurrency exceptions.
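In real code you would catch this exception and decide how to resolve the conflict rather than letting it bubble up. A minimal sketch of the “client wins” approach from the optimistic concurrency docs linked in the exception message:

try
{
    context2.SaveChanges();
}
catch (DbUpdateConcurrencyException ex)
{
    foreach (var entry in ex.Entries)
    {
        // Refresh the tracked original values (including the row version)
        // to match what is currently in the database.
        var databaseValues = entry.GetDatabaseValues();
        entry.OriginalValues.SetValues(databaseValues);
    }

    context2.SaveChanges(); // retry; our values win this time
}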

Wrapping Up

Of the three providers, Postgres seems to have the simplest implementation, not that concurrency is going to be how you choose your database provider. I don’t plan on adding any more providers at the moment, but if you are using one that hasn’t been covered, leave a comment and I will look at adding another example.

The code in its final state can be found here.


.NET Parameterized Queries Issues with SQL Server Temp Tables

In the last few weeks at work, multiple people have hit issues using a parameterized query in .NET that involved a temp table. It took a little bit of digging, but we finally tracked down the issue. This post is going to cover the cause of the issue as well as a couple of ways to fix it. The database used in this post is the Wide World Importers sample database from Microsoft. For instructions on setting it up check out my Getting a Sample SQL Server Database post from last week.

Sample Project Creation

To keep things as simple as possible I am using a console application created using the following .NET CLI command.

dotnet new console

Followed by this command to add the SQL client package from NuGet.

dotnet add package System.Data.SqlClient

The following is the full Program class with the sample code that will result in the exception this post is dealing with. Yes, I am aware this isn’t the best way to structure this type of code, so please don’t judge it from that aspect. It is meant to be simple to demonstrate the issue.

class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine("Running sample");

        using (var connection = 
                  new SqlConnection(@"Data Source=YourServer;
                                      Initial Catalog=YourDatabase;
                                      Integrated Security=SSPI;"))
        {
            connection.Open();

            using (var command = connection.CreateCommand())
            {
                SqlTest(command);
            }
        }

        Console.ReadLine();
    }

    private static void SqlTest(SqlCommand command)
    {
        command.CommandText = @"SELECT OrderId
                                      ,CustomerId
                                      ,SalespersonPersonID
                                      ,BackorderOrderId
                                      ,OrderDate
                                INTO #backorders
                                FROM Sales.Orders
                                WHERE BackorderOrderID IS NOT NULL
                                  AND OrderDate > @OrderDateFilter";

        command.Parameters.Add("@OrderDateFilter", 
                                SqlDbType.DateTime)
                          .Value = DateTime.Now.AddYears(-1);
        command.ExecuteNonQuery();

        command.CommandText = "SELECT OrderId FROM #backorders";

        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                Console.WriteLine(reader["OrderId"]);
            }
        }
    }
}

The Error

Running the application as it exists above will result in the following error.

Invalid object name ‘#backorders’.

Strange error, since we just created the #backorders temp table. Let’s give it a try without the filter. The query now looks like the following.

command.CommandText = @"SELECT OrderId
                              ,CustomerId
                              ,SalespersonPersonID
                              ,BackorderOrderId
                              ,OrderDate
                        INTO #backorders
                        FROM Sales.Orders
                        WHERE BackorderOrderID IS NOT NULL";

command.ExecuteNonQuery();

Now the application runs without any issues. What if we try adding back the filter, but without using the command parameter?

command.CommandText = @"SELECT OrderId
                              ,CustomerId
                              ,SalespersonPersonID
                              ,BackorderOrderId
                              ,OrderDate
                        INTO #backorders
                        FROM Sales.Orders
                        WHERE BackorderOrderID IS NOT NULL
                          AND OrderDate > '2018-01-01'";

command.ExecuteNonQuery();

Again the application runs without any issues.

The Reason

Why is it that adding a command parameter is causing our temp table to disappear? I discovered the issue by using SQL Server Profiler (in SQL Server Management Studio it can be found in Tools > SQL Server Profiler). With the code back to the original version with the command parameter, and Profiler connected to the same server as the sample application, running the application shows the following command received by SQL Server.

exec sp_executesql N'SELECT OrderId 
                     FROM #backorders',
                   N'@OrderDateFilter datetime',
                   @OrderDateFilter='2017-08-28 06:41:37.457'

It turns out that when you use command parameters in .NET, the query gets executed on SQL Server via the sp_executesql stored procedure. This was the key bit of information I was missing. Since parameterized queries are executed in the scope of a stored procedure, the temp table created by our first query only exists within that scope and is gone by the time the second query runs.
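You can demonstrate the same scoping behavior directly in SQL Server Management Studio without any .NET involved; a quick sketch:

EXEC sp_executesql N'SELECT 1 AS Value INTO #t;';

-- Fails with "Invalid object name '#t'" because the temp table
-- was dropped when the sp_executesql scope ended.
SELECT * FROM #t;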

Options to Fix

The first option is to not use parameters on your initial data pull. I don’t recommend this option. Parameters provide a level of protection that we don’t want to lose.

The second option, and the way we addressed this issue, is to create the temp table first. Since the temp table is now created outside of a stored procedure it is scoped to the connection, which then allows us to insert the data using parameters. The following code is our sample using this strategy.

command.CommandText = @"CREATE TABLE #backorders
                        (
                           OrderId int
                          ,CustomerId int
                          ,SalespersonPersonID int
                          ,BackorderOrderID int
                          ,OrderDate date
                        )";

command.ExecuteNonQuery();

command.CommandText = @"INSERT INTO #backorders
                        SELECT OrderId
                              ,CustomerId
                              ,SalespersonPersonID
                              ,BackorderOrderId
                              ,OrderDate
                        FROM Sales.Orders
                        WHERE BackorderOrderID IS NOT NULL
                          AND OrderDate > @OrderDateFilter";

command.Parameters.Add("@OrderDateFilter", 
                       SqlDbType.DateTime)
                  .Value = DateTime.Now.AddYears(-1);
command.ExecuteNonQuery();

Wrapping Up

I hope this saves someone some time. Once I understood what was going on, the issue made sense. How .NET handles a SQL command with parameters is one of those things that had always just worked, and I never had a need to dig into it until now. There is always something new to learn, which is one of the reasons I love what I do.


Getting a Sample SQL Server Database

As tends to happen, this isn’t the post I set out to write today. We hit an issue at work the other day that I want to write about, but to do so I need a sample database (I can’t use the data from work). I wanted something more fleshed out than my normal single-table contact database. I googled for AdventureWorks, since that is the sample database I have always seen in examples.

Options

Turns out that AdventureWorks isn’t the only SQL Server sample database option in town these days. Microsoft has a SQL Server Samples repo that has three different options depending on your needs. The following is a description of, and links to, all three options as they stand right now.

wide-world-importers

The new sample database for SQL Server 2016 and Azure SQL Database. It illustrates the core capabilities of SQL Server 2016 and Azure SQL Database, for transaction processing (OLTP), data warehousing and analytics (OLAP) workloads, as well as hybrid transaction and analytics processing (HTAP) workloads.

contoso-data-warehouse

Sample data warehouse that illustrates loading data into Azure SQL Data Warehouse.

AdventureWorks

Sample databases and Analysis Services models for use with SQL Server.

Since I’m not dealing with a data warehouse the middle option is out. AdventureWorks is still a valid option, but I just can’t pass up WideWorldImporters since it is the newest sample, used to show off SQL Server 2016.

Getting Started

Make sure you have both SQL Server and SQL Server Management Studio installed. For SQL Server we will be using the on-premises version. I recommend grabbing the Developer edition as it provides the full set of features for free as long as it isn’t used in production. Once you have both of the above installed it is time to download the database-related files.

As of this writing, this GitHub release page contains the latest bits to get started with, and yes it is from June of 2016. There are a lot of options listed, but the file we are interested in is WideWorldImporters-Full.bak. If you are looking to explore some of the new features of SQL Server 2016 you should also grab sample-scripts.zip to play with, but this isn’t going to be covered in this post.

Restoring the Database

Now that we have the backup of the database we are going to restore, it is time to open up SQL Server Management Studio. In the connection dialog, connect to the server you want the database to end up on. In this case, I’m connecting to a SQL Server instance running on my local PC. When you are all set hit the Connect button.

Once connected you should see the selected server in the Object Explorer window.

Right-click on the Databases folder (or node if you prefer) and click Restore Database.

This will launch the Restore Database dialog. This dialog is full of stuff, but we are going to focus on the minimum needed to restore a database from a backup file. First, select the Device option and then click the ellipsis (…) button.

This will show the Select backup devices dialog. We want to use the File type and then click the Add button.

On the next screen, you need to enter the path to your backup file. I’m not sure why, but this isn’t the easiest dialog for finding a file. I found it easier to use Windows Explorer to find the directory the backup file is in and copy it to the Backup File Location. Once you have the right directory select the backup file and click OK.

Back on the Select backup devices dialog click the OK button to continue. This will land you back on the Restore Database dialog which will now display information from the backup that was selected. Click the OK button to start the restore process.

After a minute or so the restore process should complete. If you expand the Databases node back in the Object Explorer window you should see the restored database listed.

Data Generation

The WideWorldImporters database comes with a stored procedure that will generate current data for you to work with. This isn’t a fast process, so be prepared to wait if you decide to run it. To start, open a new query window for the WideWorldImporters database by right-clicking it and selecting New Query.

If you run the following query it will start the data generation process.

EXECUTE DataLoadSimulation.PopulateDataToCurrentDate
    @AverageNumberOfCustomerOrdersPerDay = 60,
    @SaturdayPercentageOfNormalWorkDay = 50,
    @SundayPercentageOfNormalWorkDay = 0,
    @IsSilentMode = 1,
    @AreDatesPrinted = 1;

The official docs have more details on data generation. The docs say that data generation takes about 10 minutes per year of data, starting from 2016, so you are looking at a 30-minute minimum runtime (and based on my test, that 10 minutes per year number is much too low). Do note that the backup comes with data, so this process is only needed if you want recent (date-wise) data.

Wrapping Up

The above process wasn’t hard, but if you are like me and haven’t worked much with database backups, hopefully you found this post helpful. I’m sure you will find the sample database helpful for trying out some of the newer features of SQL Server or as a source of data that is OK to use publicly.

Make sure to check out the official docs pages for Wide World Importers. They have a lot more information as well as instructions for other workload use cases such as data warehousing.
