GitHub and Azure Pipelines: Build Triggers

In response to my post on GitHub and Azure Pipelines, I got the following question on Reddit.

Does this automatically detect branches? From your screenshot, you’re building master. If you copy to feature-A, does a new pipeline automatically get created and built?

When I initially answered this question I didn’t go deep enough. As I replied, the answer to “does a new pipeline automatically get created and built” is no, but I think the intent of the question was “do I have to set up a new pipeline every time I create a new branch” and the answer to that is also no. By default, the existing pipeline will be triggered when a change is checked in on any branch. Do note that it won’t be triggered when the branch is created, only when a change is checked in.
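
If you were to spell the default out in the YAML, it would look roughly like the following. This is a sketch based on my reading of the docs; you don’t need to add it to get the default behavior.

trigger:
  branches:
    include:
    - '*'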

Limiting Builds

There are a couple of ways to control which branches trigger continuous integration builds. The first is by editing the azure-pipelines.yml file in the repo and the second is via an override in the Azure Pipeline.

YAML

The official Build pipeline triggers docs are really good, but I will cover the basics here for including and excluding branches. Check the docs for information on path includes/excludes as well as how to control PR validation. As an example, here is the YAML file used to define a build in this repo.

pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'

steps:
- script: dotnet build Sqlite --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'

In order to control which branches get built, we need to add a trigger section. The simplest example is to list the branches you want to build. Ending wildcards are allowed. See the following example (trigger section taken from the official docs).

pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'

steps:
- script: dotnet build Sqlite --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'

trigger:
- master
- releases/*

This would build master and all branches under releases, but nothing else. The following shows how to use includes and excludes together. Again, the trigger section is taken from the official docs.

pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'

steps:
- script: dotnet build Sqlite --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'

trigger:
  branches:
    include:
    - master
    - releases/*
    exclude:
    - releases/old*

This would build master and everything under releases that does not start with old. Really, go read the official docs on this one to see all the ins and outs.
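
The docs also cover path filters, which use the same include/exclude pattern to limit builds based on which files changed. The following is a sketch with hypothetical paths rather than something from this repo.

trigger:
  branches:
    include:
    - master
  paths:
    include:
    - src/*
    exclude:
    - docs/*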

Azure Pipelines

To override the CI build from Azure DevOps go to the build in question and click Edit.

Next, select Triggers and Continuous integration and check Override YAML.

After checking the override you will see a lot more options light up. As you can see in the following screenshot the same include and exclude options are available with the same options for wildcards.

Wrapping Up

As you can see Azure Pipelines provides a lot of flexibility in how a build gets triggered. On top of what I covered here, there are also options for setting up scheduled builds as well as triggering a build when another build completes. If you hit a scenario that isn’t covered I would love to hear about it in the comments.
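
As a taste of the scheduled option, a schedules section can be added to the YAML. The following is a minimal sketch of a nightly build of master, assuming your version of Azure Pipelines supports YAML schedules; the cron expression and display name are just examples.

schedules:
- cron: '0 3 * * *'
  displayName: Nightly build
  branches:
    include:
    - master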

Visual Studio Missing Scaffolding for ASP.NET Core

This morning I set out to try the Scaffold Identity option that was added as part of the ASP.NET Core 2.1 release. The docs for this feature are really good so I didn’t think I would have any issues, but I was wrong.

Sample Application

To start I used the .NET CLI to create a new application using the following command.

dotnet new razor

I then opened the project in Visual Studio. Following the docs I right-clicked on the project and expanded the Add option, but the New Scaffolded Item option is missing.

Sample Application 2

Since the project created using the CLI didn’t have the option I thought I would try creating a new project in Visual Studio using File > New > Project.

Under Visual C# > Web select ASP.NET Core Web Application and click OK.

Select Web Application and then click OK.

After the creation process finishes I right-clicked on the project and expanded the Add option and the New Scaffolded Item option is there.

What’s the difference?

After trying a lot of different things I finally got to the point of looking at the csproj files for both projects. Here is the csproj from the Visual Studio created project.

<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.1</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.App" />
    <PackageReference Include="Microsoft.AspNetCore.Razor.Design" Version="2.1.2" PrivateAssets="All" />
  </ItemGroup>
</Project>

And the csproj from the CLI created project.

<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.2</TargetFramework>
  </PropertyGroup>


  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.App" />
    <PackageReference Include="Microsoft.AspNetCore.Razor.Design" Version="2.2.0-preview3-35497" PrivateAssets="All" />
  </ItemGroup>
</Project>

Notice that the CLI-created project is targeting .NET Core 2.2, which is in preview at the time of this writing. Not surprisingly, Visual Studio has some limits on what can be done using preview bits.

Wrapping Up

I can’t believe that I let having preview bits installed cause me issues again. I’m guessing that most people aren’t installing the previews, so hopefully this isn’t an issue most of you will have to deal with.

If you do want to use the CLI to create an application targeting a specific version of .NET Core it can be done using a global.json file. Check out Controlling .NET Core’s SDK Version for more information.
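
As a quick sketch, a global.json placed at the root of where you run the CLI pins the SDK version used. The version number below is just an example; substitute an SDK version you have installed.

{
  "sdk": {
    "version": "2.1.403"
  }
}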

Azure DevOps Project

After last week’s post on Azure Pipelines: Release to Azure App Service I came across a much easier way to get started using Azure DevOps and Azure App Services. This post is going to walk through this process, which is started from the Azure Portal side instead of Azure DevOps.

The names here are going to be a bit confusing. When I say Azure DevOps I am talking about the rebrand of Visual Studio Team Services which includes services for boards, repos, pipelines, etc. When I say Azure DevOps Project, or just DevOps Project, I am referring to the project type this post is going to be using from the Azure Portal side.

Create DevOps Project

From the Azure Portal click the Create a resource button.

For me, DevOps Project was in the list of popular items. If you don’t see it listed you can use the search at the top to find it. Click on the DevOps Project to start the process.

On the next page, you have options to start a new application with a lot of different languages or to deploy an existing application. For this example, we are going to select .NET and click the Next button. This screen is a great example of how Microsoft is working hard to support more than just .NET.

For .NET the next choice is ASP.NET or ASP.NET Core. No surprise I’m sure that we are going to go with ASP.NET Core. There is also an option for adding a database, but we aren’t going to use it for this example. Click Next to continue.

The next step is to select which Azure Service the new application should run on. We are going to use a Linux Web App to match what last week’s sample was running on. Click Next to continue.

There are quite a few settings on this next screen, but they are all pretty clear. The first group of settings is for the Azure DevOps project and you can either use an existing account or the process will create one for you. A Project name is required.

The next group of settings is for the Azure App Service that will be created. Subscription should default in if you only have one. Web app name is going to control the URL Azure provides as well as play into the naming of the resources that are created. Click Done to start the creation process.

Deployment

After clicking Done above, the deployment of all the needed resources will start. This process takes a while. The portal will redirect you to a status page similar to the following while deployment is in progress. It took a little over 4 minutes for mine to complete.

When the deployment is complete click the Go to resource button.

Results

The Overview page for the project gives a great summary of the whole CI/CD pipeline that was created with links to the associated Azure DevOps pages to manage each step. The Azure resources section will have the URL you can use to access the running application.

The resulting application should look similar to the following.

Wrapping Up

This process is a much easier way to get started if you are going all in with Azure. If you read last week’s post you know there is a lot that goes into creating something close to this setup manually and even then it was missing the nice overview provided by this setup.

Azure Pipelines: Release to Azure App Service

This post is going to use the same Azure DevOps project used in last week’s Azure Repos and Azure Pipelines post which had a build pipeline and add a release pipeline that deploys to an Azure App Service.

This walkthrough is going to assume you have an Azure account already set up using the same email address as Azure DevOps. If you don’t have an Azure account, sign up for an Azure Free Account.

Build Changes

The build set up in last week’s post proved that the code built, but it didn’t actually do anything with the results of that build. In order to use the results of the build, we need to publish the resulting files. This will be needed later when setting up the release pipeline.

In order to get the results we are looking for, a few steps must be added to our build. All the changes are being made in the azure-pipelines.yml file. The following is my full YAML file with the new tasks.

pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'

steps:
- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    projects: '**/EfSqlite.csproj'
    arguments: '--configuration $(BuildConfiguration)'
- task: DotNetCoreCLI@2
  displayName: Publish
  inputs:
    command: publish
    publishWebProjects: True
    arguments: '--configuration $(BuildConfiguration) --output $(build.artifactstagingdirectory)'
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'

As you can see in the above YAML, this build now has three different steps. The build (this is equivalent to what the post from last week was doing) and publish (this gets all the files in the right places) tasks are both handled using the DotNetCoreCLI@2 task. Finally, the PublishBuildArtifacts@1 task takes the zipped output of the publish step from the artifact staging directory and publishes it as a build artifact that can be used in a release pipeline.

Create an Azure App Service

Feel free to skip this section if you have an existing App Service to use. To create a new App Service open the Azure Portal and select App Services from the left navigation.

Next, click the Add button.

On the next page, we are going to select Web App.

Now hit the Create button.

On the next page you will need to enter an App name and select the OS and Runtime Stack before hitting the Create button. The OS and Runtime Stack should match the target of your application.

Create a Release Pipeline

On your project from the left navigation select Pipelines > Releases and then click the New pipeline button. If you already have a release pipeline setup this page will look different.

The next page has a list of templates you can select from. In this example, we will be selecting Azure App Service deployment and then clicking the Apply button.

Artifact Set Up

After clicking Apply you will hit the pipeline overview page with two sets of information. The first is the Artifacts section, which for us is going to be the results of the build pipeline we set up in last week’s post. Click the Add an artifact box.

The Add an artifact dialog is where you will select where the release will get its build artifact from. We will be using the source type of Build and selecting our existing build pipeline.

Once you select your build pipeline as the Source a few more controls will show up. I took the defaults on all of them. Take note of the box highlighted in the screenshot as it will give you a warning if your build is missing artifacts. Click the Add button to complete.

Stage Setup

Above we selected the Azure App Service template which is now in our pipeline as Stage 1. Notice the red exclamation, which means the stage has some issues that need to be addressed before it can be used. Click on the stage box to open it.

 

As you can see in the following screenshot, the settings that are missing are highlighted in red on the Deploy Azure App Service task. On Stage 1 click the Unlink all button and confirm. Doing this means there is more setup on the Deploy Azure App Service task, but it is the only way to use .NET Core 2.1 at the time of this writing. For some reason, the highest version available at the stage level for Linux is .NET Core 2.0.

After clicking Unlink all, the options other than the name of the stage will be removed. Next, click on the Deploy Azure App Service task, which handles the bulk of the work for this pipeline. There are a lot of settings on this task. Here is a screenshot of my setup and I will call out the important bits after.

First, select your Azure subscription. You may be required to authorize your account, so if you see an Authorize button click it and go through the sign-in steps for your Azure account.

Take special note of App type. In this sample, we are using Linux so it is important to select Linux App from the drop-down instead of just Web App.

With your Azure subscription and App type selected, the App Service name drop-down should only let you select Linux-based App Services that exist on your subscription.

For Image Source, I went with the Built-in Image, but there is also the option to use a container from a registry if the built-in one doesn’t meet your needs.

For Package or folder, the default should work if you only have a single project. Since I have two projects, I used the file browser (the … button) to select the specific zip file I want to deploy.

Runtime Stack needs to be .NET Core 2.1 for this application.

Startup command needs to be set up to tell the .NET CLI to run the assembly that is the entry point for the application. In the example, this works out to be dotnet EfSqlite.dll.

After all the settings have been entered hit the Save button in the top right of the screen.

Execute Release Pipeline

Navigate back to Pipelines > Releases and select the release you want to run. Then click the Create a release button.

On the next page, all you have to do is select the Artifact you want to deploy and then click the Create button.

Create will start the release process and send you back to the main release page. There will be a link at the top of the release page that you can click to see the status of the release.

The following is the results of my release after it has completed.

Wrapping Up

It took me more trial and error to get this setup going than I would have hoped, but once all the pieces are set up the results are awesome. At the very minimum, I recommend taking the time to at least set up a build that is triggered when code is checked into your repos. Having the feedback that a build is broken as soon as the code hits the repo, versus finding out when you need a deliverable, will do wonders for your peace of mind.

Negative Responses

After 3.5 years I finally had a post that resulted in a negative reaction from a fair number of people. The morning after the .NET Parameterized Queries Issues with SQL Server Temp Tables post went live I woke up with 4 comments waiting for approval. Before I read them this was exciting. I thought that I had hit on an issue that a lot of people had faced. I was wrong.

The Negative

Three of the four comments were about that specific person’s view of best practices and how my post was a poor example and something that no seasoned developer would ever dare use.

I replied to all the comments and tried to clarify the points of misunderstanding. The post was based on a real issue we faced, one where the cause was not at all clear. The post was meant to be simple in order to show the point, and I did my best to communicate this to the commenters. To say the least, this was a hard day for me.

The post now has something like 33 comments, which is way more than any of my other posts. I think that for the most part the commenters are now clearer on what I was going for. The straight-up negativity around the post in question is something I hope to avoid for the most part.

The Positive

On the plus side, on the day in question I hit a new record for single-day views on my blog. The previous record was from 3 years ago when I had a post featured on the ASP.NET homepage.

While less visible on my blog, I also got a lot of support from people letting me know that they understood my point and thought it was clear. The post has also appeared in a blog and newsletter that I highly respect. I really appreciate everyone who took the time to reach out and remind me that this type of thing will happen and to stay positive.

Future Strategies

If something like this happens again I will do my best to keep it in perspective. I want honest feedback on my posts. I welcome people posting issues and us working through them. It results in better information for my readers which at the end of the day is why I spend so much time doing this blog every week.

I think it is human nature for negative comments to stand out. It will be important for me to temper that initial negative reaction with the positive side of things.

Azure Repos and Azure Pipelines

Last week’s post looked at using GitHub with Azure Pipelines. This week I’m going to take the same project and walk through adding it to Azure Repos and setting up a build using Azure Pipelines. I will be using the code from this GitHub repo minus the azure-pipelines.yml file that was added in last week’s post.

Creating a Project

I’m not going to walk you through the sign-up process, but if you don’t have an account you can sign up for Azure DevOps here. Click the Create project button in the upper right corner.

On the dialog that shows, enter a project name. I’m using the same name that was on the GitHub repo that the code came from. Click Create to continue.

Adding to the Repo

After the project finishes the creation process use the menu on the left and select Repos.

Since the project doesn’t currently have any files, the Repos page lists a number of options for getting files added. I’m going to use the Clone in VS Code option and then copy the files from my GitHub repo. If I weren’t trying to avoid including the pipeline YAML file, an easier option would be to use the Import function and clone the repo directly.

I’m not going to go through the details of using one of the above methods to get the sample code into Azure Repos. It is a Git-based repo and the options for getting code uploaded are outlined on the page above.

Set up a build

Now that we have code in a repo you should see the view change to something close to the following screenshot. Hit the Set up build button to start the process of creating a build pipeline for the new repo. This should be pretty close to the last half of last week’s post, but I want to include it here so this post can stand alone.

On the next page select Azure Repos as the source of code.

Next, select the repo that needs to be built for the new pipeline.

Template selection is next. Based on your code a template will be suggested, but don’t just take the default. For whatever reason, it suggested a .NET Desktop template for my sample which is actually ASP.NET Core based. Select your template to move on to the next step.

The next screen will show you the YAML that will be used to build your code. My repo contains two projects, so I had to tweak the YAML to tell it which project to build (see the sketch below), but otherwise the default would have worked. After you have made any changes that your project needs click Save and run.
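
For reference, my tweak amounted to pointing the build at a specific project file. The following is a minimal sketch using a hypothetical project path, not the exact file from my repo.

pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'

steps:
- script: dotnet build src/MyApp/MyApp.csproj --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'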

The last step before the actual build is to commit the build YAML file to your Azure Repo. Make any changes you need on the dialog and then click Save and run to start the first build of your project.

The next page will show you the status of the build in real time. When the build is complete you should see something like the following with the results.

Wrapping Up

As expected, using Azure Repos with Azure Pipelines works great. If you haven’t yet, give Azure DevOps a try. Microsoft has a vast offering with this set of products that is consistently getting better and better.

GitHub and Azure Pipelines

A few weeks ago Microsoft announced that Visual Studio Team Services was being replaced/rebranded by a collection of services under the brand Azure DevOps. One of the services that make up Azure DevOps is Azure Pipelines which provides a platform for continuous integration and continuous delivery for a huge number of languages on Windows, Linux, and Mac.

As part of this change, Azure Pipelines is now available on the GitHub marketplace. In this post, I am going to pick one of my existing repos and see if I can get it building from GitHub using Azure Pipelines. I’m sure Microsoft or GitHub has documentation, but I’m attempting this without outside sources.

GitHub Marketplace

Make sure you have a GitHub account with a repo you want to build. For this post, I’m going to be using my ASP.NET Core Entity Framework repo. Now that you have the basic prep out of the way head over to the GitHub Marketplace and search for Azure Pipelines or click here.

Scroll to the bottom of the page to the Pricing and setup section. There is a paid option, which is the default. Click the Free option and then click Install it for free.

On the next page, you will get a summary of your order. Click the Complete order and begin installation button.

On the next page, you can select which repos to apply the installation to. For this post, I’m going to select a single repo. After making your choice on repos click the Install button.

Azure DevOps

After clicking install you will be thrown into the account authorization/creation process with Microsoft. After getting authorized you will get to the first step in the setup process with Azure. You will need to select an organization and a project to continue. If you don’t have these set up yet there are options to create them.

After the process completes you will land on the New pipeline creation process where you need to select the repo to use. Clicking the repo you want to use will move you to the next step.

The next step is template selection. My sample is an ASP.NET Core application so I selected the ASP.NET Core template. Selecting a template will move you to the next step.

The next page will show you a YAML file based on the template you selected. Make any changes your project requires (my repo has two projects so I had to change the build to point to the project I wanted built).

Next, you will be prompted to commit the yaml file to source control. Select your options and click Save and run.

After your configuration gets saved a build will be queued. If all goes well you will see your app being built and end up on a build results page something like this one.

Wrapping Up

GitHub and Microsoft have done a great job on this integration. I was surprised at how smooth the setup was. It was also neat to see a project that I created on Windows being built on Linux.

If you have a public repo on GitHub and need a way to build give Azure Pipelines a try.

Entity Framework Core: Logging

The other day I was having to dig into some performance issues around a process that is using Entity Framework Core. As part of the process, I need to see the queries generated by Entity Framework Core to make sure they were not the source of the issue (they were not). I’m going to be making these changes using the Contacts project from my ASP.NET Core Basics repo if you want to see where I started from.

First, we will cover adding a logging provider. Next, I’m going to show you what I came up with and then I will show you the method suggested by the Microsoft docs (which I didn’t find until later).

Logging Providers

First, we need to pick how we want the information logged. A good starting place is one of the providers Microsoft supplies, which can be found on NuGet. Right-click on the project you want to add the logging to and click Manage NuGet Packages.

In the search box enter Microsoft.Extensions.Logging for a good list of logging options. For this post, we will be using the console logger provided by Microsoft. Select Microsoft.Extensions.Logging.Console and then click the Install button on the upper right side of the screen.
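
If you prefer the command line, the same package can be added using the .NET CLI from the project directory.

dotnet add package Microsoft.Extensions.Logging.Console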

First Go

For my first try at this, all the changes are in the ConfigureServices function of the Startup class. The following is the code I added at the end of the function that will log all the queries to the console window (if you are using IIS Express then use the Debug logger instead).

// Build a temporary service provider so we can create a scope and
// resolve an instance of the context
var scopeFactory = services.BuildServiceProvider()
                           .GetRequiredService<IServiceScopeFactory>();

using (var scope = scopeFactory.CreateScope())
{
    using (var context = scope.ServiceProvider
                              .GetRequiredService<ContactsContext>())
    {
        // Get the logger factory EF Core is using internally and add a
        // console logger that logs everything (the filter always returns true)
        var loggerFactory = context.GetInfrastructure()
                                   .GetService<ILoggerFactory>();
        loggerFactory.AddProvider(new ConsoleLoggerProvider((_, __) => true, true));
    }
}

This code is creating a scope to get an instance of the ContactsContext and then using the context to get its associated logger factory and add a console logger to it. This isn’t the cleanest approach in the world, but it gets the job done, especially if this is just for a quick debug session and not something that will stay.

Microsoft Way

While the above works, I ended up finding a logging page in the Entity Framework Core docs. After undoing the changes made above, open the ContactsContext (or whatever your DbContext is) and add a class-level static variable for a logger factory. This class-level variable will be used to prevent the memory and performance issues that would be caused by creating a new instance of the logging classes every time a context is created.

public static readonly LoggerFactory LoggerFactory = 
       new LoggerFactory(new[] {new ConsoleLoggerProvider((_, __) => true, true)});

Next, add or update an override of the OnConfiguring method to use the logger factory defined above. The following is the full function in my case.

protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    base.OnConfiguring(optionsBuilder);

    optionsBuilder.UseLoggerFactory(LoggerFactory);
}

The Output

Either way, the following is an example of the output you will get with logging on.

The query is highlighted in the red box above. As you can see there is a lot of output, but there are options for filtering, which are detailed in the docs.
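
As a sketch of what filtering can look like, the provider’s callback can be used to only log the SQL commands. The following is based on the pattern in the docs and assumes DbLoggerCategory and LogLevel are in scope; swap it in for the log-everything factory shown earlier.

public static readonly LoggerFactory FilteredLoggerFactory =
       new LoggerFactory(new[]
       {
           // Only pass through the category/level EF Core uses for
           // the SQL commands it sends to the database
           new ConsoleLoggerProvider((category, level) =>
               category == DbLoggerCategory.Database.Command.Name
               && level == LogLevel.Information, true)
       });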

Wrapping Up

Entity Framework Core does a great job, but the above gives you an option to check in on what it is doing. If you are using SQL Server you could also get the queries using SQL Server Profiler.

Back from Disney World

Today is my first day back from being at Disney World for a week with my family. We had a wonderful time. It was awesome to see our 6-year-old light up seeing all the sights and rides for the first time.

This trip was also the first full week of vacation I have had in years. There was a lot going on at work and I found it really hard to unplug. I can’t tell you how many times I checked my email and Teams while waiting in line for one thing or another. I’m sure this amounted to less than 1% of the time I was off, but I feel like even that level of work is more than enough to lose some of the mental recovery that vacation should provide.

Future Strategies

I think the biggest thing I could do in the future is different timing. This trip ended up being during a client go-live, which greatly added to the feeling that I needed to stay connected in case something went wrong.

Second, I need to take more time to ensure I’m not the single point of knowledge on any subject. The one piece of work I did end up having to do came about because I forgot to convey some information on how our QA team was running a service.

Third, I just need to take more time off. As I said above this is the first time in years that I have taken a full week off. This has been due to family medical issues and it has gotten me in the habit of saving all the time off I can to cover times I need to be out to help out the family. Now that things have calmed down I am going to have to adjust to being able to have time off and disconnecting.

Wrapping Up

I’m still in the process of getting back into the swing of things and this topic was on my mind. I would love to hear how you all deal with this issue.

.NET Parameterized Queries Issues with SQL Server Temp Tables

In the last few weeks at work, I have had multiple people have issues using a parameterized query in .NET that involved a temp table. It took a little bit of digging, but we finally tracked down the issue. This post is going to cover the cause of the issue as well as a couple of ways to fix it. The database used in this post is the Wide World Importers sample database from Microsoft. For instructions on setting it up check out my Getting a Sample SQL Server Database post from last week.

Sample Project Creation

To keep things as simple as possible I am using a console application created using the following .NET CLI command.

dotnet new console

Followed by this command to add in the SQL Client from NuGet.

dotnet add package System.Data.SqlClient

The following is the full Program class with the sample code that will result in the exception this post is dealing with. Yes, I am aware this isn’t the best way to structure this type of code, so please don’t judge it from that aspect. It is meant to be simple to demonstrate the issue.

using System;
using System.Data;
using System.Data.SqlClient;

class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine("Running sample");

        using (var connection = 
                  new SqlConnection(@"Data Source=YourServer;
                                      Initial Catalog=YourDatabase;
                                      Integrated Security=SSPI;"))
        {
            connection.Open();

            using (var command = connection.CreateCommand())
            {
                SqlTest(command);
            }
        }

        Console.ReadLine();
    }

    private static void SqlTest(SqlCommand command)
    {
        command.CommandText = @"SELECT OrderId
                                      ,CustomerId
                                      ,SalespersonPersonID
                                      ,BackorderOrderId
                                      ,OrderDate
                                INTO #backorders
                                FROM Sales.Orders
                                WHERE BackorderOrderID IS NOT NULL
                                  AND OrderDate > @OrderDateFilter";

        command.Parameters.Add("@OrderDateFilter", 
                                SqlDbType.DateTime)
                          .Value = DateTime.Now.AddYears(-1);
        command.ExecuteNonQuery();

        command.CommandText = "SELECT OrderId FROM #backorders";

        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                Console.WriteLine(reader["OrderId"]);
            }
        }
    }
}

The Error

Running the application as it exists above will result in the following error.

Invalid object name ‘#backorders’.

Strange error since we just created the #backorders temp table. Let’s give it a try without the filter. The query now looks like the following.

command.CommandText = @"SELECT OrderId
                              ,CustomerId
                              ,SalespersonPersonID
                              ,BackorderOrderId
                              ,OrderDate
                        INTO #backorders
                        FROM Sales.Orders
                        WHERE BackorderOrderID IS NOT NULL";

command.ExecuteNonQuery();

Now the application runs without any issues. What if we try adding back the filter, but without using the command parameter?

command.CommandText = @"SELECT OrderId
                              ,CustomerId
                              ,SalespersonPersonID
                              ,BackorderOrderId
                              ,OrderDate
                        INTO #backorders
                        FROM Sales.Orders
                        WHERE BackorderOrderID IS NOT NULL
                          AND OrderDate > '2018-01-01'";

command.ExecuteNonQuery();

Again the application runs without any issues.

The Reason

Why is it that adding a command parameter causes our temp table to disappear? I discovered the issue by using SQL Server Profiler (in SQL Server Management Studio it can be found under Tools > SQL Server Profiler). With the code back to the original version with the command parameter, and Profiler connected to the same server as the sample application, running the sample application shows the following command received by SQL Server.

exec sp_executesql N'SELECT OrderId 
                     FROM #backorders',
                   N'@OrderDateFilter datetime',
                   @OrderDateFilter='2017-08-28 06:41:37.457'

It turns out that when you use command parameters in .NET, the query gets executed on SQL Server using the sp_executesql stored procedure. This was the key bit of information I was missing. Since parameterized queries are executed in the scope of a stored procedure, the temp table created by our first query only exists within the stored procedure in which it was created and is gone by the time the second query runs.
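
You can see the scoping rule directly in SQL Server Management Studio with a snippet like the following, which is purely illustrative and not part of the sample project.

-- The temp table is created inside the scope of sp_executesql,
-- so it is dropped as soon as that scope exits
exec sp_executesql N'SELECT 1 AS Value INTO #scratch';

-- This fails with: Invalid object name '#scratch'
SELECT * FROM #scratch;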

Options to Fix

The first option is to not use parameters on your initial data pull. I don’t recommend this option. Parameters provide a level of protection that we don’t want to lose.

The second option, and the way we addressed this issue, is to create the temp table first. Since the temp table is created outside of a stored procedure, it is scoped to the connection, which then allows us to insert the data using parameters. The following code is our sample using this strategy.

command.CommandText = @"CREATE TABLE #backorders
                        (
                           OrderId int
                          ,CustomerId int
                          ,SalespersonPersonID int
                          ,BackorderOrderID int
                          ,OrderDate date
                        )";

command.ExecuteNonQuery();

command.CommandText = @"INSERT INTO #backorders
                        SELECT OrderId
                              ,CustomerId
                              ,SalespersonPersonID
                              ,BackorderOrderId
                              ,OrderDate
                        FROM Sales.Orders
                        WHERE BackorderOrderID IS NOT NULL
                          AND OrderDate > @OrderDateFilter";

command.Parameters.Add("@OrderDateFilter", 
                       SqlDbType.DateTime)
                  .Value = DateTime.Now.AddYears(-1);
command.ExecuteNonQuery();

Wrapping Up

I hope this saves someone some time. Once I understood what was going on, the issue made sense. How .NET deals with a SQL command with parameters is one of those things that just always worked and I never had the need to dig into until now. There is always something new to learn, which is one of the reasons I love what I do.