Entity Framework Core: Postgres Concurrency Checks

Last week’s post on SQLite Concurrency Checks used this repo, which contains examples of using Entity Framework Core with SQLite and Postgres. This post is going to tackle concurrency checks using Postgres so that both projects in the repo have the same level of functionality. You can grab the sample code before any changes here. This post will only touch files found in the Postgres folder/project.

Context Changes and Data Migration

Unlike SQLite, Postgres has solid built-in support for concurrency checks. If you read the official docs on Optimistic Concurrency and Concurrency Tokens, you will find that all tables have an implicit/hidden system column called xmin, which holds the ID of the latest updating transaction. This means it gets changed automatically every time a row is updated.

The Postgres Entity Framework Core provider contains an extension that makes it very simple to use the xmin column as a concurrency token. In the ContactsDbContext, add the following to the OnModelCreating function to enable concurrency checking on the specified entity, in this case, a Contact.

modelBuilder.Entity<Contact>().ForNpgsqlUseXminAsConcurrencyToken();
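
For context, here is roughly what the full function ends up looking like with that call in place; your OnModelCreating may contain other configuration as well.

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Map Postgres's xmin system column as the concurrency token for Contact
    modelBuilder.Entity<Contact>().ForNpgsqlUseXminAsConcurrencyToken();

    base.OnModelCreating(modelBuilder);
}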

Next, from a command prompt in the same directory as your project file, use the following .NET CLI command to generate a migration for the above change. This migration is a bit strange since the column technically already exists, but the migration still seemed to be needed.

dotnet ef migrations add Concurrency --context ContactsDbContext

Then, use the following command to apply the migration to your database.

dotnet ef database update --context ContactsDbContext

Testing it out

For a quick and dirty test, I added a ConcurrencyTest Razor Page under the Contacts directory. This function is going to ensure a specific contact exists, then pull the contact from two different DbContexts, make a mutation on the resulting contact objects, and then attempt to save both. The first save will work, and the second should fail. Please note that this function isn’t an example of how things should be done, just a quick and dirty way to prove that the concurrency check is happening.

public void OnGet()
{
    // Two separate contexts are used to simulate two concurrent users
    var context1 =
        new ContactsDbContext(new DbContextOptionsBuilder<ContactsDbContext>()
                              .UseNpgsql("yourConnectionString")
                              .Options);
    var contactFromContext1 = context1.Contacts
                                      .FirstOrDefault(c => c.Name == "Test");

    // Make sure the test contact exists
    if (contactFromContext1 == null)
    {
        contactFromContext1 = new Contact
                              {
                                  Name = "Test"
                              };

        context1.Add(contactFromContext1);
        context1.SaveChanges();
    }

    var context2 =
        new ContactsDbContext(new DbContextOptionsBuilder<ContactsDbContext>()
                              .UseNpgsql("yourConnectionString")
                              .Options);
    var contactFromContext2 = context2.Contacts
                                      .FirstOrDefault(c => c.Name == "Test");

    // Change the same contact through both contexts
    contactFromContext1.Address = DateTime.Now.ToString();
    contactFromContext2.Address = DateTime.UtcNow.ToString();

    // The first save succeeds and bumps xmin; the second save should
    // throw a DbUpdateConcurrencyException because its token is stale
    context1.SaveChanges();
    context2.SaveChanges();
}

Run the application and hit the ConcurrencyTest route, which is https://localhost:44324/Contacts/ConcurrencyTest for my test. The following is the resulting exception.

An unhandled exception occurred while processing the request.

DbUpdateConcurrencyException: Database operation expected to affect 1 row(s) but actually affected 0 row(s). Data may have been modified or deleted since entities were loaded. See http://go.microsoft.com/fwlink/?LinkId=527962 for information on understanding and handling optimistic concurrency exceptions.
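
In a real application, you would catch this exception and decide how to resolve the conflict instead of letting it bubble up. The following is a rough sketch of the pattern from the EF Core docs applied to the test above, resolving the conflict by letting the client’s values win; the right resolution strategy depends on your application.

try
{
    context2.SaveChanges();
}
catch (DbUpdateConcurrencyException ex)
{
    foreach (var entry in ex.Entries)
    {
        // Refresh the original values (including the xmin token) so the
        // retry succeeds; leaving CurrentValues alone means the client wins.
        // Copy the database values into CurrentValues instead if the
        // database should win.
        entry.OriginalValues.SetValues(entry.GetDatabaseValues());
    }

    context2.SaveChanges();
}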

Wrapping Up

This process was much simpler using Postgres than SQLite. Not that the SQLite version was hard, it just wasn’t as simple a path.

The code in its final state can be found here.

Welcome to 2019

Last year I did a Welcome to 2018 post due to a technical issue with my blog. This year I wanted to do the same type of post unprompted by technical issues. So to welcome 2019, let’s do a little review of the things that provided me value in 2018.

Newsletters

ASP.NET Weekly
Dev Tips Weekly
Shawn Wildermuth

Podcasts

I use these podcasts to make sure I have a good pulse on the industry and go deeper when a topic really sounds like something I want to dig into. My commute would be wasted time without these podcasts.

Software Related

.NET Rocks!
Complete Developer Podcast
Developer on Fire
Hanselminutes
Software Engineering Daily

Fun

Freakonomics
Radiolab

Other

Gary Vee
RunAsRadio

Books Read in 2018

Like last year, I pulled heavily from John Sonmez’s book reviews playlist as well as some reading suggestions from last year. You will notice there are more fiction books this year. I’m playing with my learn-something vs. entertainment ratio to see if I can help with my stress level. Without further ado, here is the list.

Blogging

From Zero to Thousands of Target Blog Subscribers in 60 Days

Personal Development

The 5 Second Rule
Algorithms to Live By
Antifragile
As a Man Thinketh
Be Obsessed Or Be Average
Breaking the Habit of Being Yourself
The Compound Effect
The New Kingmakers
The Power of Now
Presence
Principles
Psycho-Cybernetics
Secrets of the Millionaire Mind
Slipstream Time Hacking
So Good They Can’t Ignore You
Stop Being Lazy
The Subtle Art of Not Giving a F*ck
Unshakeable
Wait, What?
Willpower Doesn’t Work
Your One Word

Business

Crushing It!
The E-Myth Revisited
Zero to One

Fun

The Fellowship of the Ring
Fight Club
The Hobbit
The Sword of Shannara Trilogy
What If?

Health

The Plant Paradox

Software/Career

Clean Architecture
Curious Moon
Modern API Design with ASP.NET Core 2
Writing High-Performance .NET Code

Top Picks

The Compound Effect is my number one pick from this year’s books. It really brings home the fact that the little things you do can have huge effects on your life. It was also through this book that I found Darren Daily, which is basically free daily mentoring by Darren Hardy, the book’s author. Check it out; it is how I start every weekday.

Willpower Doesn’t Work did an awesome job of explaining willpower vs our environments. Our environments are where we need to make changes to truly set ourselves up for success.

Principles by Ray Dalio rounds out my top 3 for the year. This book is a great look into how Ray runs his life and how he ran his business. I have never seen someone as successful as Ray lay everything out the way he does in this book.

Entity Framework Core: SQLite Concurrency Checks

Most of the work I have done with SQLite has been on single-user systems, but recently I had to work on a SQLite project that was going to have a handful of concurrent users, and a subset of the user activities would need to deal with concurrency issues. In the past, in a situation like this, I would have been using SQL Server and its rowversion or timestamp column type, which places a unique value on the row on any update or insert.

There is a page in the official docs on Concurrency Tokens, but for me, it wasn’t super helpful. Thankfully after some searching, I came across the GitHub issue In ASP.Net Core 2.x with Entity Framework Core, Concurrency Control not working with SQLite which had a solid sample as one of the replies. This post is going to walk through an example implementation of that sample. The starting point of the code can be found in this GitHub repo.

Sample Background

The sample project being used is a simple web application to manage a contact list. The repo contains an implementation using Postgres and one using SQLite. This post will only touch files found in the Sqlite folder/project.

Model Changes and Data Migration

SQLite doesn’t have the concept of a timestamp column, but this solution is going to emulate one. To do this, we are going to change the Contact model found in the Models folder by adding a Timestamp property with a Timestamp data annotation. The following is the full model class with the new property at the bottom.

public class Contact
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Address { get; set; }
    public string City { get; set; }
    public string Subregion { get; set; }
    public string PostalCode { get; set; }
    public string Phone { get; set; }
    public string Email { get; set; }
    [Timestamp]
    public byte[] Timestamp { get; set; }
}
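
As a side note, if you prefer the Fluent API over data annotations, my understanding is the equivalent configuration in the ContactsDbContext’s OnModelCreating would look something like the following instead of the attribute.

// Fluent API alternative to the [Timestamp] data annotation
modelBuilder.Entity<Contact>()
            .Property(c => c.Timestamp)
            .IsRowVersion();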

Next, let’s create a new migration with the change to the model. I’m using the .NET CLI so from a command prompt in the project directory run the following command.

dotnet ef migrations add ContactTimestamp --context ContactsDbContext

In the Migrations directory, open the newly created migration. It should be named something like *_ContactTimestamp.cs. In the Up function, we are going to add a couple of triggers for the new Timestamp column. These triggers assign a random blob to the Timestamp column when a row is inserted or updated, which is how we simulate the function of SQL Server’s Timestamp data type. The following is the full Up function with the added triggers.

protected override void Up(MigrationBuilder migrationBuilder)
{
    migrationBuilder.AddColumn<byte[]>(
        name: "Timestamp",
        table: "Contacts",
        rowVersion: true,
        nullable: true);

    migrationBuilder.Sql(
        @"
        CREATE TRIGGER SetContactTimestampOnUpdate
        AFTER UPDATE ON Contacts
        BEGIN
            UPDATE Contacts
            SET Timestamp = randomblob(8)
            WHERE rowid = NEW.rowid;
        END
    ");

    migrationBuilder.Sql(
        @"
        CREATE TRIGGER SetContactTimestampOnInsert
        AFTER INSERT ON Contacts
        BEGIN
            UPDATE Contacts
            SET Timestamp = randomblob(8)
            WHERE rowid = NEW.rowid;
        END
    ");
}
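
One thing to keep in mind is that the scaffolded Down function won’t know anything about these triggers. If you want the migration to be cleanly reversible, a sketch of the extra cleanup might look like the following; the scaffolded code that reverses the column addition is omitted here.

protected override void Down(MigrationBuilder migrationBuilder)
{
    // Remove the triggers created in the Up function
    migrationBuilder.Sql("DROP TRIGGER IF EXISTS SetContactTimestampOnUpdate;");
    migrationBuilder.Sql("DROP TRIGGER IF EXISTS SetContactTimestampOnInsert;");
}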

To apply the migration to the database you can use the following command.

dotnet ef database update --context ContactsDbContext

Testing it out

Now for a quick and dirty test, we are going to add a ConcurrencyTest function to the existing ContactsController. This function is going to ensure a specific contact exists, then pull the contact from two different DbContexts, make a mutation on the resulting contact objects, and then attempt to save both. The first save will work, and the second should fail. Please note that this function isn’t an example of how things should be done, just a quick and dirty way to prove that the concurrency check is happening.

[Route("ConcurrencyTest")]
public void ConcurrencyTest()
{
    var context1 = new ContactsDbContext(new DbContextOptionsBuilder<ContactsDbContext>()
                                         .UseSqlite("Data Source=Database.db").Options);
    var context2 = new ContactsDbContext(new DbContextOptionsBuilder<ContactsDbContext>()
                                         .UseSqlite("Data Source=Database.db").Options);

    var contactFromContext1 = context1.Contacts.FirstOrDefault(c => c.Name == "Test");

    if (contactFromContext1 == null)
    {
        contactFromContext1 = new Contact
        {
            Name = "Test"
        };

        context1.Add(contactFromContext1);
        context1.SaveChanges();
    }

    var contactFromContext2 = context2.Contacts.FirstOrDefault(c => c.Name == "Test");

    contactFromContext1.Address = DateTime.Now.ToString();
    contactFromContext2.Address = DateTime.UtcNow.ToString();

    context1.SaveChanges();
    context2.SaveChanges();
}

Run the application and hit the ConcurrencyTest route, which is http://localhost:1842/ConcurrencyTest for my test. The following is the resulting exception.

An unhandled exception occurred while processing the request.

DbUpdateConcurrencyException: Database operation expected to affect 1 row(s) but actually affected 0 row(s). Data may have been modified or deleted since entities were loaded. See http://go.microsoft.com/fwlink/?LinkId=527962 for information on understanding and handling optimistic concurrency exceptions.
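
This exception is the concurrency check doing its job. Production code would catch it rather than crash; a minimal "database wins" sketch, where the losing context simply reloads the entity and discards its change, might look like the following.

try
{
    context2.SaveChanges();
}
catch (DbUpdateConcurrencyException ex)
{
    foreach (var entry in ex.Entries)
    {
        // Throw away the local change and refresh the entity,
        // including its Timestamp, from the database
        entry.Reload();
    }
}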

Wrapping Up

While the information wasn’t the easiest in the world to locate, as you can see, Entity Framework Core using SQLite has good support for concurrency control. The above is just one option for its implementation. I hope this saves you all some time.

The code in its final state can be found here.

Entity Framework Core: String Filter Tips

I have been working on an application that is backed by Entity Framework Core using SQLite, and I have hit a couple of things that were not super clear to me at first when dealing with string filters. This post is going to cover setting up a sample application and demoing a couple of things to keep in mind when working with strings in filters.

Sample Application

I’m using the Razor Pages template with individual auth as the base for this sample since it comes with Entity Framework Core already set up and ready to go. Using a command prompt in the directory where you want the sample project created, run the following command.

dotnet new razor --auth Individual

Open the resulting project in your editor of choice and add a Models folder. As usual, I’m going to be using a contact as my example, so inside the Models folder create a Contact class matching the following. The override of ToString is just to make it easy to see the results when debugging.

public class Contact
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Address { get; set; }
    public string City { get; set; }
    public string State { get; set; }
    public string Zip { get; set; }

    public override string ToString()
    {
        return $"{Id} - {LastName}, {FirstName}";
    }
}

Next, open the ApplicationDbContext found in the Data folder and add the following DbSet property to expose our contacts.

public DbSet<Contact> Contacts { get; set; }

Add the following OnModelCreating function to the ApplicationDbContext which will create some test data for us.

protected override void OnModelCreating(ModelBuilder builder)
{
    builder.Entity<Contact>()
           .HasData(new Contact
                    {
                        Id = 1,
                        FirstName = "Bob",
                        LastName = "Smith",
                        Address = "123 Main Street",
                        City = "Nashville",
                        State = "TN",
                        Zip = "35970"
                    },
                    new Contact
                    {
                        Id = 2,
                        FirstName = "Sam",
                        LastName = "Smith",
                        Address = "1 Sun Lane",
                        City = "Knoxville",
                        State = "TN",
                        Zip = "48909"
                    },
                    new Contact
                    {
                        Id = 3,
                        FirstName = "Clark",
                        LastName = "Swift",
                        Address = "750 10th Street",
                        City = "Chattanooga",
                        State = "TN",
                        Zip = "91590"
                    }
                   );

    base.OnModelCreating(builder);
}

Back in the command prompt run the following command in the same directory as the csproj file to create a migration for our new contact model.

dotnet ef migrations add Contacts

Finally, run the following command to apply the migration to your database.

dotnet ef database update

Data Execution

For this example, I don’t really care to display the results in a UI, so I am using the OnGetAsync of the Index page to run my queries. The following is my full Index page model with a query that returns all the contacts. The rest of the post will just show the LINQ statements that query the database and not the full page model.

public class IndexModel : PageModel
{
    private readonly ApplicationDbContext _context;

    public IndexModel(ApplicationDbContext context)
    {
        _context = context;
    }

    public async Task<IActionResult> OnGetAsync()
    {
        var contacts = await _context.Contacts.ToListAsync();

        return Page();
    }
}

The above results in all of the seeded contacts being returned.

1 - Smith, Bob
2 - Smith, Sam
3 - Swift, Clark

Like Queries

As part of the Entity Framework Core 2.0 release, EF.Functions.Like was added, which allows usage of wildcards that were not possible with the string function translation that was previously the only option. The following query is an example of using Like.

_context.Contacts
        .Where(c => EF.Functions.Like(c.LastName, "S_i%"))
        .ToListAsync();

While this may have been possible before, I would imagine the query would have been nasty and involved some level of client-side evaluation. The result of the query is the same as above.

Greater Than/Less Than

Using String.Compare or value.CompareTo will allow you to do greater than or less than comparisons on strings. For example, String.Compare(value1, value2) > 0 gives you greater than, and a result less than zero gives you less than. Here is a string comparison query along with the SQL that is generated.

_context.Contacts
        .Where(c => String.Compare(c.FirstName, "D") > 0 )
        .ToListAsync();

SELECT [c].[Id], [c].[Address], [c].[City], [c].[FirstName], [c].[LastName], [c].[State], [c].[Zip]
FROM [Contacts] AS [c]
WHERE [c].[FirstName] > N'D'

It is important not to use any of the overloads of String.Compare that take additional arguments, or you will end up with client-side evaluation of your query. The following is a query that uses one of the overloads and the SQL that is generated.

_context.Contacts
        .Where(c => String.Compare(c.FirstName, "D", StringComparison.Ordinal) > 0)
        .ToListAsync();

SELECT [c].[Id], [c].[Address], [c].[City], [c].[FirstName], [c].[LastName], [c].[State], [c].[Zip]
FROM [Contacts] AS [c]

Notice that the first query has a WHERE clause and the second one doesn’t. This means the second query will pull all the records to the client and then apply the filter. While this is fine for a small amount of data, please be careful with queries that are evaluated client-side as they can cause performance issues.

Both of the above queries return the following result.

2 - Smith, Sam
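
Client-side evaluation can be easy to miss since EF Core 2.x only logs a warning when it happens. If you would rather fail fast, EF Core can be configured to throw instead; the following is a minimal sketch, assuming the ApplicationDbContext from this sample (RelationalEventId lives in the Microsoft.EntityFrameworkCore.Diagnostics namespace).

protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    // Throw instead of just warning when part of a query
    // has to be evaluated on the client
    optionsBuilder.ConfigureWarnings(warnings =>
        warnings.Throw(RelationalEventId.QueryClientEvaluationWarning));
}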

Wrapping Up

This is one of those posts that will be a reminder for me as much as anything. I do hope that it saves you some time or clears up a bit how Entity Framework Core handles strings.

Migration from ASP.NET Core 2.1 to 2.2

On December 4th, .NET Core 2.2 was released, including ASP.NET Core 2.2 and Entity Framework Core 2.2. In this post, I will be taking one of the projects used in the ASP.NET Basics series and converting it from ASP.NET Core 2.1.x to the new 2.2 version of ASP.NET Core. This will all be based on the official 2.2 migration guide.

The code before any changes can be found here. In the sample solution, this guide will be working with the Contacts project only.

Installation

Head over to the .NET download page and download the new version of the .NET Core SDK for version 2.2, which is available for Windows, Linux, and Mac.

After installation is done, run the following command if you want to verify the SDK is installed.

dotnet --list-sdks

You should see 2.2.100 listed. If you are like me, you might also see a few preview versions that would be good to uninstall.

If you are using Visual Studio, make sure you are on at least version 15.9. If not, updates can be downloaded from here.

Project File Changes

Right-click on the project and select Edit projectName.csproj.

Change the TargetFramework to netcoreapp2.2.

Before:
<TargetFramework>netcoreapp2.1</TargetFramework>
After:
<TargetFramework>netcoreapp2.2</TargetFramework>

Update any Microsoft packages with a version number to 2.2.x. The following is an example.

Before:
<PackageReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Design" Version="2.1.1" PrivateAssets="All" />

After:
<PackageReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Design" Version="2.2.0" PrivateAssets="All" />

If you want to use the new IIS in-process hosting model you also need to add the following line to a property group.

<AspNetCoreHostingModel>InProcess</AspNetCoreHostingModel>

The following is my full csproj for reference.

<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp2.2</TargetFramework>
    <AspNetCoreHostingModel>InProcess</AspNetCoreHostingModel>
    <UserSecretsId>aspnet-Contacts-cd2c7b27-e79c-43c7-b3ef-1ecb04374b70</UserSecretsId>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.App" />
    <PackageReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Design" Version="2.2.0" PrivateAssets="All" />
    <PackageReference Include="Swashbuckle.AspNetCore" Version="4.0.1" />
  </ItemGroup>

</Project>

Startup Changes

In Startup.cs update the compatibility version to enable the new 2.2 features.

Before:
services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1);

After:
services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_2);

Wrapping Up

As with the migration to 2.1, the move to 2.2 is really easy. Make sure you check out the official migration guide for more details that may not have been covered by this project.

The code in its final state can be found here.

GitHub and Azure Pipelines: Build Triggers

In response to my post on GitHub and Azure Pipelines, I got the following question on Reddit.

Does this automatically detect branches? From your screenshot, you’re building master. If you copy to feature-A, does a new pipeline automatically get created and built?

When I initially answered this question, I didn’t go deep enough. The answer to "does a new pipeline automatically get created and built" is no, as I replied, but I think the intent of the question was "do I have to go set up a new pipeline every time I create a new branch" and the answer to that is also no. By default, the existing pipeline will be triggered when any change is checked in on any branch. Do note that it won’t be triggered when the branch is created, only when a change is checked in.

Limiting Builds

There are a couple of ways to control which branches trigger continuous integration builds. The first is by making edits to the azure-pipelines.yml file in the repo, and the second is via an override in the Azure Pipeline.

YAML

The official Build pipeline triggers docs are really good, but I will cover the basics here for including and excluding branches. Check the docs for information on path includes/excludes as well as how to control PR validation. As an example, here is the YAML file used to define a build in this repo.

pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'

steps:
- script: dotnet build Sqlite --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'

In order to control which branches get built, we need to add a trigger section. The simplest example is to list the branches you want to build. Ending wildcards are allowed. See the following example (trigger section taken from the official docs).

pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'

steps:
- script: dotnet build Sqlite --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'

trigger:
- master
- releases/*

This would build master and all branches under releases, but nothing else. The following shows how to use includes and excludes together. Again, the trigger section is taken from the official docs.

pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'

steps:
- script: dotnet build Sqlite --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'

trigger:
  branches:
    include:
    - master
    - releases/*
    exclude:
    - releases/old*

This would build master and everything under releases that does not start with old. Really, go read the official docs on this one to see all the ins and outs.

Azure Pipelines

To override the CI build from Azure DevOps, go to the build in question and click Edit.

Next, select Triggers and Continuous integration and check Override YAML.

After checking the override, you will see a lot more options light up. As you can see in the following screenshot, the same include and exclude options are available, with the same support for wildcards.

Wrapping Up

As you can see, Azure Pipelines provides a lot of flexibility in how a build gets triggered. On top of what I covered here, there are also options for setting up scheduled builds as well as triggering a build when another build completes. If you hit a scenario that couldn’t be covered, I would love to hear about it in the comments.

Visual Studio Missing Scaffolding for ASP.NET Core

This morning I set out to try the Scaffold Identity option that was added as part of the ASP.NET Core 2.1 release. The docs for this feature are really good, so I didn’t think I would have any issues, but I was wrong.

Sample Application

To start I used the .NET CLI to create a new application using the following command.

dotnet new razor

I then opened the project in Visual Studio. Following the docs, I right-clicked on the project and expanded the Add option, but the New Scaffolded Item option is missing.

Sample Application 2

Since the project created using the CLI didn’t have the option, I thought I would try creating a new project from inside Visual Studio using File > New > Project.

Under Visual C# > Web select ASP.NET Core Web Application and click OK.

Select Web Application and then click OK.

After the creation process finished, I right-clicked on the project and expanded the Add option, and this time the New Scaffolded Item option is there.

What’s the difference?

After trying a lot of different things, I finally got to the point of looking at the csproj files for both projects. Here is the csproj from the Visual Studio created project.

<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.1</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.App" />
    <PackageReference Include="Microsoft.AspNetCore.Razor.Design" Version="2.1.2" PrivateAssets="All" />
  </ItemGroup>
</Project>

And the csproj from the CLI created project.

<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.2</TargetFramework>
  </PropertyGroup>


  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.App" />
    <PackageReference Include="Microsoft.AspNetCore.Razor.Design" Version="2.2.0-preview3-35497" PrivateAssets="All" />
  </ItemGroup>
</Project>

Notice that the CLI created project is targeting .NET Core 2.2, which is in preview at the time of this writing. Not surprisingly, Visual Studio has some limits on what can be done using preview bits.

Wrapping Up

I can’t believe that I let having preview bits installed cause me issues again. I’m guessing that most people aren’t installing the previews, so hopefully this isn’t an issue most of you will have to deal with.

If you do want to use the CLI to create an application targeting a specific version of .NET Core it can be done using a global.json file. Check out Controlling .NET Core’s SDK Version for more information.

Azure DevOps Project

After last week’s post on Azure Pipelines: Release to Azure App Service, I came across a much easier way to get started using Azure DevOps and Azure App Services. This post is going to walk through this process, which is started from the Azure Portal side instead of Azure DevOps.

The names here are going to be a bit confusing. When I say Azure DevOps, I am talking about the rebrand of Visual Studio Team Services, which includes services for boards, repos, pipelines, etc. When I say Azure DevOps Project, or just DevOps Project, I am referring to the project type this post is going to be using from the Azure Portal side.

Create DevOps Project

From the Azure Portal click the Create a resource button.

For me, DevOps Project was in the list of popular items. If you don’t see it listed, you can use the search at the top to find it. Click on DevOps Project to start the process.

On the next page, you have options to start a new application with a lot of different languages or to deploy an existing application. For this example, we are going to select .NET and click the Next button. This screen is a great example of how Microsoft is working hard to support more than just .NET.

For .NET the next choice is ASP.NET or ASP.NET Core. No surprise I’m sure that we are going to go with ASP.NET Core. There is also an option for adding a database, but we aren’t going to use it for this example. Click Next to continue.

The next step is to select which Azure Service the new application should run on. We are going to use a Linux Web App to match what last week’s sample was running on. Click Next to continue.

There are quite a few settings on this next screen, but they are all pretty clear. The first group of settings is for the Azure DevOps project and you can either use an existing account or the process will create one for you. A Project name is required.

The next group of settings is for the Azure App Service that will be created. Subscription should default in if you only have one. Web app name is going to control the URL Azure provides as well as play in the naming of the resources that are created. Click Done to start the creation process.

Deployment

After clicking Done above, the deployment of all the needed resources will start. This process takes a while. The portal will redirect you to a status page similar to the following while deployment is in progress. It took a little over 4 minutes for mine to complete.

When the deployment is complete click the Go to resource button.

Results

The Overview page for the project gives a great summary of the whole CI/CD pipeline that was created with links to the associated Azure DevOps pages to manage each step. The Azure resources section will have the URL you can use to access the running application.

The resulting application should look similar to the following.

Wrapping Up

This process is a much easier way to get started if you are going all in with Azure. If you read last week’s post you know there is a lot that goes into creating something close to this setup manually and even then it was missing the nice overview provided by this setup.

Azure Pipelines: Release to Azure App Service

This post is going to use the same Azure DevOps project used in last week’s Azure Repos and Azure Pipelines post, which had a build pipeline, and add a release pipeline that deploys to an Azure App Service.

This walkthrough is going to assume you have an Azure account already set up using the same email address as Azure DevOps. If you don’t have an Azure account, sign up for an Azure Free Account.

Build Changes

The build set up in last week’s post proved that the code built, but it didn’t actually do anything with the results of that build. In order to use the results of the build, we need to publish the resulting files. This will be needed later when setting up the release pipeline.

In order to get the results we are looking for, a few steps must be added to our build. All the changes are being made in the azure-pipelines.yml file. The following is my full YAML file with the new tasks.

pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'

steps:
- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    projects: '**/EfSqlite.csproj'
    arguments: '--configuration $(BuildConfiguration)'
- task: DotNetCoreCLI@2
  displayName: Publish
  inputs:
    command: publish
    publishWebProjects: True
    arguments: '--configuration $(BuildConfiguration) --output $(build.artifactstagingdirectory)'
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'

As you can see in the above YAML, this build now has three different steps. The build (equivalent to what the post from last week was doing) and publish (which gets all the files in the right places) tasks are both handled using the DotNetCoreCLI@2 task. Finally, the PublishBuildArtifacts@1 task takes the results of the publish and zips them to the artifact staging directory where they can be used in a release pipeline.

Create an Azure App Service

Feel free to skip this section if you have an existing App Service to use. To create a new App Service open the Azure Portal and select App Services from the left navigation.

Next, click the Add button.

On the next page, we are going to select Web App.

Now hit the Create button.

On the next page, you will need to enter an App name and select the OS and Runtime Stack before hitting the Create button. The OS and Runtime Stack should match the target of your application.

Create a Release Pipeline

On your project, from the left navigation select Pipelines > Releases and then click the New pipeline button. If you already have a release pipeline set up, this page will look different.

The next page has a list of templates you can select from. In this example, we will be selecting Azure App Service deployment and then clicking the Apply button.

Artifact Set Up

After clicking Apply, you will hit the pipeline overview page with two sets of information. The first is the Artifacts section, which for us is going to be the results of the build pipeline we set up in last week’s post. Click the Add an artifact box.

The Add an artifact dialog is where you will select where the release will get its build artifact from. We will be using the source type of Build and selecting our existing build pipeline.

Once you select your build pipeline as the Source, a few more controls will show up. I took the defaults on all of them. Take note of the box highlighted in the screenshot, as it will give you a warning if your build is missing artifacts. Click the Add button to complete.

Stage Setup

Above we selected the Azure App Service deployment template, which is now in our pipeline as Stage 1. Notice the red exclamation point, which means the stage has some issues that need to be addressed before it can be used. Click on the stage box to open it.

As you can see in the following screenshot, the settings that are missing are highlighted in red on the Deploy Azure App Service task. On Stage 1, click the Unlink all button and confirm. Doing this means there is more setup on the Deploy Azure App Service task, but it is the only way to use .NET Core 2.1 at the time of this writing. For some reason, the highest version available at the stage level for Linux is .NET Core 2.0.

After clicking Unlink all, the options other than the name of the stage will be removed. Next, click on the Deploy Azure App Service task, which handles the bulk of the work for this pipeline. There are a lot of settings on this task. Here is a screenshot of my setup, and I will call out the important bits after.

First, select your Azure subscription. You may be required to authorize your account, so if you see an Authorize button, click it and go through the sign-in steps for your Azure account.

Take special note of App type. In this sample, we are using Linux, so it is important to select Linux App from the drop-down instead of just using Web App.

With your Azure subscription and App type selected, the App Service name drop-down should only let you select Linux-based App Services that exist on your subscription.

For Image Source, I went with the Built-in Image, but there is the option to use a container from a registry if the built-in image doesn’t meet your needs.

For Package or folder, the default should work if you only have a single project. Since I have two projects, I used the file browser (the … button) to select the specific zip file I want to deploy.

Runtime Stack needs to be .NET Core 2.1 for this application.

Startup command needs to be set to tell the .NET CLI to run the assembly that is the entry point for the application. In this example, that works out to be dotnet EfSqlite.dll.

After all the settings have been entered hit the Save button in the top right of the screen.

Execute Release Pipeline

Navigate back to Pipelines > Releases and select the release pipeline you want to run. Then click the Create a release button.

On the next page, all you have to do is select the Artifact you want to deploy and then click the Create button.

Create will start the release process and send you back to the main release page. There will be a link at the top of the release page that you can click to see the status of the release.

The following is the results of my release after it has completed.

Wrapping Up

It took me more trial and error to get this setup going than I would have hoped, but once all the pieces are set up, the results are awesome. At the very minimum, I recommend taking the time to at least set up a build that is triggered when code is checked into your repo. Having the feedback that a build is broken as soon as the code hits the repo, versus finding out when you need a deliverable, will do wonders for your peace of mind.

Negative Responses

After 3.5 years, I finally had a post that resulted in a negative reaction from a fair number of people. The morning after the .NET Parameterized Queries Issues with SQL Server Temp Tables post went live, I woke up with 4 comments waiting for approval. Before I read them, this was exciting. I thought that I had hit on an issue that a lot of people had faced. I was wrong.

The Negative

Three of the four comments were about that specific person’s view of best practices, how my post was a poor example, and something that no seasoned developer would ever dare use.

I replied to all the comments and tried to clarify the points of misunderstanding. The post was based on a real issue we faced, one where the cause was not at all clear. The post was meant to be simple in order to show the point, and I did my best to communicate this to the commenters. To say the least, this was a hard day for me.

The post now has something like 33 comments, which is way more than any of my other posts. I think that, for the most part, the commenters are now clear on what I was going for. The straight-up negativity around the post in question is something I hope to avoid in the future.

The Positive

On the plus side, on the day in question I hit a new record for single-day views on my blog. The previous record was from 3 years ago when I had a post featured on the ASP.NET homepage.

While less visible on my blog, I also got a lot of support from people letting me know that they understood my point and thought it was clear. The post has also appeared in a blog and newsletter that I highly respect. I really appreciate everyone who took the time to reach out and remind me that this type of thing will happen and to stay positive.

Future Strategies

If something like this happens again I will do my best to keep it in perspective. I want honest feedback on my posts. I welcome people posting issues and us working through them. It results in better information for my readers which at the end of the day is why I spend so much time doing this blog every week.

I think it is human nature for negative comments to stand out. It will be important for me to temper that initial negative reaction with the positive side of things.