.NET Core

Create an Application with Web Template Studio

Creating new applications is something I do a fair amount of. Most of the time they are throwaway projects used to test something out or demo projects used for this blog. With all the project creation going on I try to keep an eye out for tools that make the process easier. This post is going to cover a tool I came across a few weeks ago from Microsoft called Web Template Studio (WebTS).

What is WebTS?

From the project’s GitHub README:

Microsoft Web Template Studio (WebTS) is a Visual Studio Code Extension that accelerates the creation of new web applications using a wizard-based experience. WebTS enables developers to generate boilerplate code for a web application by choosing between different front-end frameworks, back-end frameworks, pages and cloud services. The resulting web app is well-formed, readable code that incorporates cloud services on Azure while implementing proven patterns and best practices.

For the front-end, we have options of Angular, React, or Vue, and on the back-end, there are options for ASP.NET Core, Flask, Moleculer, or Node.

Installation

We are assuming you already have a recent version of Visual Studio Code (VSCode) installed, but if not you can download it from here. Open VSCode and from the left side of the screen select Extensions. In the search box at the top enter Web Template Studio and then from the list select Web Template Studio. In the detail page that opens click the green Install button at the top.
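
If you prefer the terminal, the extension can also be installed with the code CLI. A quick sketch; the marketplace identifier below is an assumption, so double-check it on the extension’s detail page.

code --install-extension WASTeamAccount.WebTemplateStudio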

Project Creation

Now that we have the extension installed press Ctrl + Shift + P to open the VSCode command palette and enter Web Template Studio: Launch (or as much as needed for the option to show) and then select Web Template Studio: Launch.

This will launch the Web Template Studio project creation process. The first page that shows gives you a very basic set of options, but enough for our example. To see the full array of options Web Template Studio provides, hit Next on this first screen instead of Create Project like we do in this example. Also note that if you want an ASP.NET Core back-end you need to select it on this summary screen, as it isn’t yet an option later in the process. Back to our example: on this page you will need to enter a Project Name, pick a Front-end Framework, and a Back-end Framework before finally clicking Create Project.

After the process finishes a dialog will show letting us know it is complete. Click Open Project to load the created project in a new instance of VSCode.

Running the Project

Before getting to a runnable state we need to run a couple of terminal commands. This can be done from VSCode’s built-in terminal or from an external terminal of your choice. In VSCode, if you don’t see a terminal you can use Ctrl + Shift + ` to open a new one, or from the menu select Terminal > New Terminal. The following two commands need to be run using npm, but they can be adjusted to use yarn instead if you prefer it. The second command may be specific to an ASP.NET Core back-end, so make sure and check the project’s README.md to verify.

npm install
npm run restore-packages

Now the project can be run with the following command.

npm start

The resulting application will look something like this.

Wrapping Up

WebTS seems to be a great tool for quickly getting a project up and running. I do recommend going through the full project creation wizard as it has options to set up Azure integration as well as the ability to add up to 20 different pages to the generated application. Also, keep in mind that the ASP.NET back-end framework option is pretty new so I wouldn’t be surprised to see some changes as it progresses through the preview stage. Make sure and check out the project’s GitHub repo.

Migrations in Transactions with DbUp

This post will be continuing our exploration of DbUp with the addition of transactions as part of migration execution. If you are new to DbUp check out the following posts to catch up on this series.

Database Migrations with DbUp
Code-based Database Migrations with DbUp
Always Run Migrations with DbUp
Logging Script Output with DbUp

Transaction Types and Restrictions

DbUp has three different options for transactions. The default is no transaction, which is what we have been using so far. The other two options are a transaction per migration and a single transaction for all the migrations in a single upgrader. As far as I can tell, there is no way to share a transaction across multiple upgraders, which means in our sample application our normal migrations could run correctly and get committed while a failure still happens in our always run migrations. The other thing to keep in mind is that some database providers don’t support transactions for statements that change the structure of the database. As pointed out by the DbUp docs, the Postgres docs have a good summary of transactional DDL support across database providers.

Adding Transactions

In the following example, we have changed the upgrader to use a single transaction. This is the full code for the upgrader; the change is the WithTransaction() call.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                       f => !f.Contains(".AlwaysRun."))
                 .LogToConsole()
                 .WithTransaction()
                 .Build();

The other options are WithTransactionPerScript for, surprise, a transaction per migration, and WithoutTransaction for no transactions (or just leave the call out since this is the default). A sketch of the per-script variant is shown below, followed by a sample of the output with transactions on and two different upgraders in play; note the two Beginning transaction lines.
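
This is only a sketch; everything except the transaction call matches the single-transaction upgrader above.

var perScriptUpgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                       f => !f.Contains(".AlwaysRun."))
                 .LogToConsole()
                 .WithTransactionPerScript()
                 .Build();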

Beginning transaction
Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning transaction
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.01-EverytimeNoPrints.sql'
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.02-EverytimeNoPrintsNoCountOn.sql'
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.03-EverytimePrints.sql'
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.04-EverytimePrintNoCountOn.sql'
Upgrade successful
Success!

Wrapping Up

DbUp makes adding transactions to your migrations super simple so this was a really short post. It would be nice to have an option to use a transaction across multiple upgraders. The official DbUp docs on transactions can be found here.

Logging Script Output with DbUp

In today’s post, we will be going over how to log script output during a DbUp run. If you are new to this series of posts or DbUp in general you might find the following posts helpful to review.

Database Migrations with DbUp
Code-based Database Migrations with DbUp
Always Run Migrations with DbUp

Enable Script Output Logging

I will be using the always run migrations we covered last week to simplify testing, but the concepts are the same for normal migrations. The change to enable script output logging is a single line added to our upgrader setup, the LogScriptOutput() call in the following code.

var alwaysRunUpgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(), 
                                                       f => f.Contains(".AlwaysRun."))
                 .JournalTo(new NullJournal())
                 .LogToConsole()
                 .LogScriptOutput()
                 .Build();

Running the application will result in something like the following. Notice the RecordsAffected line, which is the output from our migration.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.Everytime.sql'
RecordsAffected: 0

Upgrade successful
Success!

Improve the Script Output

So the above is great for a simple migration that has a single statement, but for more complex migrations it can be helpful to use PRINT statements to provide more information. Let’s tweak our script to print the start and end times of the execution. The following is the sample script (and yes, I know it would never actually update anything).

PRINT CONVERT(VarChar, GETDATE(), 121)  + ' Start every time';

UPDATE Application.Cities SET CityName = 'Nothing' WHERE 1 = 0;

PRINT CONVERT(VarChar, GETDATE(), 121) + ' End every time';

The above migration results in the following output.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.Everytime.sql'
2020-08-05 06:15:21.877 Start every time

2020-08-05 06:15:21.877 End every time

RecordsAffected: 0

Upgrade successful
Success!

The prints are helpful, but the records affected message is noise in this case, especially since it prints for each SQL statement and only tells us about that statement. To suppress the records affected messages we can use SET NOCOUNT ON in our script like the following.

SET NOCOUNT ON;

PRINT CONVERT(VarChar, GETDATE(), 121)  + ' Start every time with NoCount ON';

UPDATE Application.Cities SET CityName = 'Nothing' WHERE 1 = 0;

PRINT CONVERT(VarChar, GETDATE(), 121) + ' End every time with NoCount ON';

SET NOCOUNT OFF;

And as you can see in the following we got a cleaner output.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Beginning database upgrade
Executing Database Server script 'DbupTest.Scripts.AlwaysRun.Everytime.sql'
2020-08-05 06:38:18.863 Start every time

2020-08-05 06:38:18.863 End every time

Upgrade successful
Success!

Wrapping Up

Logging the output of scripts can be really helpful at times, but as we saw above, if you are going to use it, it is important to plan what that output is going to look like. Having 500 instances of the number of records affected isn’t necessarily helpful.

Always Run Migrations with DbUp

In this post, we are going to walk through how to use DbUp to run specific migrations on every run instead of just once. If you are new to DbUp check out the following posts to see how the sample project got to its current state.

Database Migrations with DbUp
Code-based Database Migrations with DbUp


Differentiating for Always Run

One of the keys to getting migrations that should run every time is being able to identify them separately from migrations that should only run once. This can be done either at the filename level or by placing all the always run files in a specific directory. This sample is going with the directory method, but the filename method would work basically the same way. In the sample project under the existing Scripts directory, I added an AlwaysRun directory. I also added an example script in this directory so we can verify that it is getting executed on every run. In the sample, the file is named Everytime.sql.

Based on the above when DbUp is processing it will see Everytime.sql as DbupTest.Scripts.AlwaysRun.Everytime.sql.
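
If you ever need to double-check how your embedded files are named, a quick sketch like the following lists every embedded resource name in the assembly:

// Print all embedded resource names so we can verify the
// ".AlwaysRun." segment shows up where we expect it.
foreach (var name in Assembly.GetExecutingAssembly().GetManifestResourceNames())
{
    Console.WriteLine(name);
}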

Filtering Out Always Run

Now that we have a way to identify the migrations we want to always run, we need to filter them out of the existing upgrade process so that they don’t get included in the log of already executed migrations. In DbUp this log is referred to as the journal. The extension methods that set up how migrations will be located (WithScriptsEmbeddedInAssembly or WithScriptsAndCodeEmbeddedInAssembly in our examples) can be provided with a filter, which we will use to exclude migrations that are in the AlwaysRun directory. The following code is the upgrader with the filter in place.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                       f => !f.Contains(".AlwaysRun."))
                 .LogToConsole()
                 .Build();

var result = upgrader.PerformUpgrade();

Executing Always Run Migrations

Now that we have adjusted our original upgrader to exclude migrations in the AlwaysRun directory, we need something to execute the always run migrations. To accomplish this we add a second upgrader that is filtered to just the AlwaysRun scripts. The other really important bit about this second upgrader is that it uses a NullJournal. The NullJournal is what keeps the execution of the scripts from being logged, which results in them always getting run. The following code was inserted after the above code; note the filter and the NullJournal.

if (result.Successful)
{
    var alwaysRunUpgrader =
        DeployChanges.To
                     .SqlDatabase(connectionString)
                     .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                           f => f.Contains(".AlwaysRun."))
                     .JournalTo(new NullJournal())
                     .LogToConsole()
                     .Build();

    result = alwaysRunUpgrader.PerformUpgrade();
}

The following is the full Main function just in case the above code didn’t give enough context on how the code should fit together.

static int Main(string[] args)
{
    var connectionString =
        args.FirstOrDefault()
        ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

    var upgrader =
        DeployChanges.To
                     .SqlDatabase(connectionString)
                     .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(),
                                                           f => !f.Contains(".AlwaysRun."))
                     .LogToConsole()
                     .Build();

    var result = upgrader.PerformUpgrade();

    if (result.Successful)
    {
        var alwaysRunUpgrader =
            DeployChanges.To
                         .SqlDatabase(connectionString)
                         .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly(), 
                                                               f => f.Contains(".AlwaysRun."))
                         .JournalTo(new NullJournal())
                         .LogToConsole()
                         .Build();

        result = alwaysRunUpgrader.PerformUpgrade();
    }

    if (!result.Successful)
    {
        Console.ForegroundColor = ConsoleColor.Red;
        Console.WriteLine(result.Error);
        Console.ResetColor();
#if DEBUG
        Console.ReadLine();
#endif                
        return -1;
    }

    Console.ForegroundColor = ConsoleColor.Green;
    Console.WriteLine("Success!");
    Console.ResetColor();

    return 0;
}

Wrapping Up

With the above changes we now have a project that can run normal migrations, code-based migrations, and always run migrations. No matter what style of database change management you decide on, having the changes in some sort of source control system will be super helpful.

For more information on DbUp’s journalling check out the official docs.

Code-based Database Migrations with DbUp

In today’s post, we are going to check out the code-based migration feature of DbUp which allows code to run as part of the migration process instead of just SQL based scripts. The ability to run code as part of the migration provides a ton of flexibility in the migration process. This builds on last week’s post, Database Migrations with DbUp, make sure and check it out if you are new to DbUp.

Script Provider Change

In our example application from last week’s post, we used the following setup in the Program class when configuring our upgrader.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                 .LogToConsole()
                 .Build();

The above only picks up embedded SQL files. In order to also pick up code-based migrations we have to change from the WithScriptsEmbeddedInAssembly script provider to WithScriptsAndCodeEmbeddedInAssembly. The following is the upgrader with the new script provider in place.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsAndCodeEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                 .LogToConsole()
                 .Build();

Code-based Migrations

To add a code-based migration, add a new class and set it to implement DbUp’s IScript interface. The script provider we changed to above will pick up any non-abstract class that implements IScript in addition to SQL script files. The results are combined and sorted by name to determine execution order, so make sure whatever naming convention you pick sorts properly across both your SQL scripts and your code classes, as the sketch below illustrates.
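
A minimal sketch of the pitfall: with plain ordinal string sorting, "Script10" sorts before "Script2", which is why zero-padded prefixes like 0001 are a safe choice.

// Without zero padding, string sorting runs scripts out of order:
// "Script10" compares less than "Script2" because '1' < '2'.
var names = new[] { "Script10.sql", "Script2.sql", "Script0003.sql" };
Array.Sort(names, StringComparer.Ordinal);
Console.WriteLine(string.Join(Environment.NewLine, names));
// Output:
// Script0003.sql
// Script10.sql
// Script2.sql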

The IScript interface defines a single function, ProvideScript, which takes a single parameter, a factory function for creating database commands, and returns a string. Depending on the need you can return a string with the SQL to be executed, create commands and execute them directly against the database, or some combination of both.
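
For reference, the interface is tiny; based on the description above and the sample class below, it boils down to this:

public interface IScript
{
    string ProvideScript(Func<IDbCommand> dbCommandFactory);
}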

To test out code-based migrations, I added the following new class in the Scripts folder. It doesn’t actually change any data but instead logs some information to the console. Even without changing data it still conveys how code-based migrations work.

using System;
using System.Data;
using DbUp.Engine;

namespace DbupTest.Scripts
{
    public class Code0002Test : IScript
    {
        public string ProvideScript(Func<IDbCommand> dbCommandFactory)
        {
            using (var cmd = dbCommandFactory())
            {
                cmd.CommandText = "SELECT DeliveryMethodName FROM Application.DeliveryMethods";

                using (var dr = cmd.ExecuteReader())
                {
                    Console.WriteLine("  Delivery Methods");

                    while (dr.Read())
                    {
                        Console.WriteLine($"    {dr["DeliveryMethodName"]}");
                    }
                }
            }

            return string.Empty;
        }
    }
}

The following is the console output from running the application.

Beginning database upgrade
Checking whether journal table exists..
Fetching list of already executed scripts.
  Delivery Methods
    Air Freight
    Chilled Van
    Courier
    Customer Collect
    Customer Courier to Collect
    Delivery Van
    Post
    Refrigerated Air Freight
    Refrigerated Road Freight
    Road Freight
Executing Database Server script 'DbupTest.Scripts.Code0002Test.cs'
Checking whether journal table exists..
Upgrade successful
Success!

Wrapping Up

The ability to run code as part of the database migration process is an awesome bit of functionality. I would advise sticking with SQL-based migrations as the default and only using code when SQL doesn’t provide a way to do what is needed or when doing the migration in code is much more understandable. Make sure and check out the official DbUp docs on this feature for more information.

Database Migrations with DbUp

Managing database schemas can be a challenging problem. It is also an area where there is no one-size-fits-all solution (not sure that is ever really true for anything). Solutions range from manual script management to database vendor-specific options (such as DACPACs for Microsoft SQL Server) to migrations. This post will be looking at one of the options for a migration-based approach using DbUp.

Sample Database

I needed a sample database to start off with so I dug up one of my posts for Getting a Sample SQL Server Database to guide me through getting Microsoft’s WideWorldImporters sample database downloaded and restored.

DbUp Console Application

There are quite a few ways DbUp can be used; for this example, we are going to be using it from a .NET Core console application. From a terminal, use the following command to create a new console application in the directory you want the application in.

dotnet new console

Now we can use the following command to add the DbUp NuGet package to the sample project. In this case, we are using the SQL Server package, but there are packages for quite a few database providers so install the one that is appropriate for you.

dotnet add package dbup-sqlserver

Next, open the project in Visual Studio (or any editor, but part of how this example is set up is easier in Visual Studio). In the Program class replace all the code with the following. We will look at a couple of bits of this code below.

using System;
using System.Linq;
using System.Reflection;
using DbUp;

namespace DbupTest
{
    class Program
    {
        static int Main(string[] args)
        {
            var connectionString =
                args.FirstOrDefault()
                ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

            var upgrader =
                DeployChanges.To
                             .SqlDatabase(connectionString)
                             .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                             .LogToConsole()
                             .Build();

            var result = upgrader.PerformUpgrade();

            if (!result.Successful)
            {
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine(result.Error);
                Console.ResetColor();
#if DEBUG
                Console.ReadLine();
#endif                
                return -1;
            }

            Console.ForegroundColor = ConsoleColor.Green;
            Console.WriteLine("Success!");
            Console.ResetColor();

            return 0;
        }
    }
}

The bit below is trying to pull the connection string for SQL Server out of the first argument passed from the terminal to the application, and if no arguments were passed in it falls back to a hardcoded value. This works great for our sample, but I would advise against a fallback value for production use. It is always a bad day when you think your migrations ran successfully only to find they ran against the wrong database because of a fallback value.

var connectionString = 
    args.FirstOrDefault() 
    ?? "Server=localhost; Database=WideWorldImporters; Trusted_connection=true";

This next section is where all the setup happens for which database to deploy to, where to find the scripts to run, and where to log. There are a lot of options provided by DbUp in this area and I recommend checking out the docs under the More Info section for the details.

var upgrader =
    DeployChanges.To
                 .SqlDatabase(connectionString)
                 .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                 .LogToConsole()
                 .Build();

Finally, the following is where the scripts are actually executed against the database.

var result = upgrader.PerformUpgrade();

The rest of the function is dealing with displaying to the console the results of the scripts running.

Adding Scripts

Keep in mind that we are using the Embedded Scripts option, so DbUp is going to find all the embedded files that end with ‘.sql’. I have added a Scripts directory to the project so the scripts don’t clutter the root of the project, but that isn’t a requirement. Do note that how the files are named controls the order in which the scripts get executed, so make sure you establish a good naming convention upfront. For our sample, I added a file named 0001-TestScript.sql to the Scripts directory. Since we are using the embedded scripts provider it is critical that after adding the file we go to its properties and set the Build Action to Embedded resource.
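
If you would rather not set the Build Action file by file, the same thing can be done once in the project file; a sketch, assuming an SDK-style .csproj:

<ItemGroup>
  <!-- Embed every SQL file under Scripts so DbUp can find them. -->
  <EmbeddedResource Include="Scripts\**\*.sql" />
</ItemGroup>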

The script file itself doesn’t do anything in this case except select a value.
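
For reference, a placeholder along these lines is all the sample script needs to contain (the exact statement is unimportant):

SELECT 1;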

Trying it out

At this point, we can hit Run in Visual Studio and it will execute our new script. The output will look something like the following.

If we were to run the application again it would tell us that no new scripts need to be executed. DbUp, like all migration-based solutions I have encountered, uses a table in the database to keep a record of which migrations have been run. The default table for DbUp is SchemaVersions and it consists of Id, ScriptName, and Applied columns. Given this structure, it is important to keep in mind that if you rename a script that has already been executed on a database, DbUp will see it as a different script and execute it again.
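
If you ever want to see exactly which scripts DbUp believes have run, the journal can be queried directly; a sketch, assuming the default table name:

SELECT ScriptName, Applied
FROM SchemaVersions
ORDER BY Applied;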

Wrapping Up

In the docs for DbUp there is a philosophy section and one of the points is that a database is a result of a set of transitions, not just its new state vs. its old state. This point is key to why I am a fan of a migration based approach.

Check out the DbUp docs and GitHub repo for more information.

GitHub: Use Actions to Run Multiple Jobs

In this post, we are going to take our sample Workflow that builds two ASP.NET Core web applications and split it so each web application is built individually. This post is using the repo and Workflow built in the following posts if you need to catch up.

GitHub: Import an Azure DevOps Repo
GitHub: Use Actions to build ASP.NET Core Application
GitHub: Use Actions to Publish Artifacts

Starting Point and the Plan

Our Workflow currently contains a single job that just happens to build two ASP.NET Core web applications because the .NET CLI picks up and builds both of them. The following is the YAML for our current Workflow.

name: .NET Core

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
      
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.101
 
    - name: Install dependencies
      run: dotnet restore
      
    - name: Build
      run: dotnet build --configuration Release --no-restore
      
    - name: Test
      run: dotnet test --no-restore --verbosity normal

    - name: Publish
      run: dotnet publish 
    
    - name: Upload WebApp1 Build Artifact
      uses: actions/upload-artifact@v2
      with:
        name: WebApp1
        path: /home/runner/work/Playground/Playground/src/WebApp1/bin/Debug/netcoreapp3.1/publish/
        
    - name: Upload WebApp2 Build Artifact
      uses: actions/upload-artifact@v2
      with:
        name: WebApp2
        path: /home/runner/work/Playground/Playground/src/WebApp2/bin/Debug/netcoreapp3.1/publish/

This post is going to take this Workflow and split the build and publish of the two web applications into two jobs. By splitting the Workflow into multiple jobs we open the possibility that the jobs can run in parallel. One reason to do this would be to speed up the total Workflow run time if you have parts of your build that are independent. Another reason you might need multiple jobs is that different parts of the build need to run on different operating systems, such as one needing Windows and another Linux.
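
As an aside, jobs in a Workflow run in parallel by default; if one job ever needs to wait on another, a needs key can be added to the dependent job. A minimal sketch with hypothetical job IDs (our two build jobs below stay independent on purpose):

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - run: echo "build things"
  deploy:
    needs: build
    runs-on: ubuntu-latest
    steps:
    - run: echo "runs only after build succeeds"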

Creating the Jobs

The following is the Workflow setup along with the job to build the first web application. The first change was the job ID, from build to build_web_app1, since each job has to have a unique ID. Most of the rest of the changes are related to the .NET CLI commands, which are now directed at a specific project. Also note that we changed from a hardcoded path to using an expression to get the workspace path, which is the ${{ github.workspace }} bit instead of /home/runner/work/Playground/Playground/. See the expression syntax docs for more info.

name: .NET Core

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build_web_app1:

    name: Build WebApp1
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
      
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.101
 
    - name: Install dependencies
      run: dotnet restore ${{ github.workspace }}/src/WebApp1/WebApp1.csproj
      
    - name: Build
      run: dotnet build ${{ github.workspace }}/src/WebApp1/WebApp1.csproj --configuration Release --no-restore
      
    - name: Test
      run: dotnet test ${{ github.workspace }}/src/WebApp1/WebApp1.csproj --no-restore --verbosity normal

    - name: Publish
      run: dotnet publish ${{ github.workspace }}/src/WebApp1/WebApp1.csproj
    
    - name: Upload Build Artifact
      uses: actions/upload-artifact@v2
      with:
        name: WebApp1
        path: ${{ github.workspace }}/src/WebApp1/bin/Debug/netcoreapp3.1/publish/

Here is the full file with both jobs defined. As you can see the second job is basically the same thing as the first one with a different ID, name, and project.

name: .NET Core

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build_web_app1:

    name: Build WebApp1
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
      
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.101
 
    - name: Install dependencies
      run: dotnet restore ${{ github.workspace }}/src/WebApp1/WebApp1.csproj
      
    - name: Build
      run: dotnet build ${{ github.workspace }}/src/WebApp1/WebApp1.csproj --configuration Release --no-restore
      
    - name: Test
      run: dotnet test ${{ github.workspace }}/src/WebApp1/WebApp1.csproj --no-restore --verbosity normal

    - name: Publish
      run: dotnet publish ${{ github.workspace }}/src/WebApp1/WebApp1.csproj
    
    - name: Upload Build Artifact
      uses: actions/upload-artifact@v2
      with:
        name: WebApp1
        path: ${{ github.workspace }}/src/WebApp1/bin/Debug/netcoreapp3.1/publish/
        
  build_web_app2:

    name: Build WebApp2
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
      
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.101
 
    - name: Install dependencies
      run: dotnet restore ${{ github.workspace }}/src/WebApp2/WebApp2.csproj
      
    - name: Build
      run: dotnet build ${{ github.workspace }}/src/WebApp2/WebApp2.csproj --configuration Release --no-restore
      
    - name: Test
      run: dotnet test ${{ github.workspace }}/src/WebApp2/WebApp2.csproj --no-restore --verbosity normal

    - name: Publish
      run: dotnet publish ${{ github.workspace }}/src/WebApp2/WebApp2.csproj
       
    - name: Upload Build Artifact
      uses: actions/upload-artifact@v2
      with:
        name: WebApp2
        path: ${{ github.workspace }}/src/WebApp2/bin/Debug/netcoreapp3.1/publish/

After all the edits are done, commit the changes to the repo to run the Workflow. From the results of the Workflow run, you will see that it now has two jobs and that we still get the artifacts for both applications as we had before.

Wrapping Up

If your applications warrant it, breaking the build up into different jobs can help not only with Workflow run times but also with your ability to reason about what each part of your build process is doing.

GitHub: Use Actions to Publish Artifacts

This post is going to take the GitHub Actions Workflow we set up in the last post and add a couple of steps that will provide us with access to our application’s binaries. If you are new to this series the following posts will catch you up if needed.

GitHub: Import an Azure DevOps Repo
GitHub: Use Actions to build ASP.NET Core Application

Edit the Workflow

Our first step is to get back to the Workflow we want to edit. At the top of the repo click Actions.

On the left side of the screen select the specific Workflow, .NET Core in this case. Now that the list is filtered to just the Workflow we are interested in select the three dots on the most recent run and then click View workflow file.

On the next screen click the pencil above the Workflow file to edit the Workflow.

At the end of the file add a call to the .NET CLI to publish. The following is the full file; the last two lines are the publish step.

name: .NET Core

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
      
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.101
 
    - name: Install dependencies
      run: dotnet restore
      
    - name: Build
      run: dotnet build --configuration Release --no-restore
      
    - name: Test
      run: dotnet test --no-restore --verbosity normal

    - name: Publish
      run: dotnet publish

Click the Start commit button and commit the change to the branch of your choice. I committed the change directly to master which triggered the Workflow to run. From the logs of the Workflow, you can see that the publish step executed successfully.

Publish Build Artifacts

Now that we have the two web applications publishing, we need a way to get the files for the applications. To do this we are going to use the Upload Artifacts Action. I used the output of the Publish step above to find the path to the publish folder for each application and then used an Upload Artifacts Action for each application. The following are the two steps added to the bottom of our existing Workflow.

- name: Upload WebApp1 Build Artifact
  uses: actions/upload-artifact@v2
  with:
    name: WebApp1
    path: /home/runner/work/Playground/Playground/src/WebApp1/bin/Debug/netcoreapp3.1/publish/
    
- name: Upload WebApp2 Build Artifact
  uses: actions/upload-artifact@v2
  with:
    name: WebApp2
    path: /home/runner/work/Playground/Playground/src/WebApp2/bin/Debug/netcoreapp3.1/publish/

After checking in the changes, let the Workflow run. Once it completes, if you click on the details of the Workflow run you will see it now has Artifacts that can be downloaded.

Wrapping Up

We now have a Workflow that results in files we could actually deploy, on top of verifying that the applications build and the tests pass. I hope this has given you a good jumping-off point to build your own Workflows. We have only scratched the surface of what can be done with GitHub Actions and I’m looking forward to seeing what else we can do.

GitHub: Use Actions to build ASP.NET Core Application

In this week’s post, we are going to use GitHub’s Actions to build one of the applications that we imported from an Azure DevOps Repo. The sample repo we are using can be found here.

Create an Action Workflow

From the repo in GitHub click the Actions option at the top center of the screen.

The Actions page will make suggestions based on the contents of the repo you are working with. In our case, the suggested .NET Core workflow is the one we are interested in. Click the Set up this workflow button.

The next screen that shows will be an editor loaded with the YAML for the .NET Core workflow we selected. For now, we are going to keep the default YAML and click the Start commit button. This workflow may or may not work for our repo; at this point we are still exploring and can change it as needed after we get a feel for how Actions work.

The next dialog is the commit details. For this initial change, we are going to commit directly to master with the default commit message. Click Commit new file to continue.

View Workflow Status

Now that we have a workflow set up click on the Actions tab of the repo again to view the list of workflows and their status. As you can see in this screenshot the commit queued our new workflow to run.

The workflow finished quickly so I didn’t get to see the details while it was running, but if you click on the commit title, Create dotnetcore.yml in this example, it will take you to the detail of this workflow run. From this view, you will see the jobs for the workflow listed on the left side of the screen; we only have one job, which is the build. When you click on a job you will see the logs from that job. The following screenshot is the sample build job with the details of the build step expanded to show that both WebApp1 and WebApp2 were built.

Wrapping Up

Hopefully, this post will give you a good jumping-off point to create your own GitHub Actions. I was impressed with how easy it was to get started and the wide variety of languages supported, especially for a feature set that has been out for less than a year. Check back in next week for more exploration of Actions.

Azure DevOps Pipelines: Reusable YAML

In this post, we are going to refactor our sample Azure DevOps Pipeline to move some of the redundant YAML to a new file and replace the redundant parts of our main YAML file. This post is going to build on the Azure DevOps project created in previous posts. If you are just joining this series check out the previous posts to find out how the project has progressed.

Getting Started with Azure DevOps
Pipeline Creation in Azure DevOps
Azure DevOps Publish Artifacts for ASP.NET Core
Azure DevOps Pipelines: Multiple Jobs in YAML

Starting YAML

The following is the YAML for our current pipeline that builds two different web applications using two different jobs. Looking at the two jobs you will notice that they both have the same steps. The only difference in the steps is which project to build (WebApp1.csproj or WebApp2.csproj) and what to call the published artifact (WebApp1 or WebApp2). When developing applications we would never stand for this level of duplication and the same should apply to our pipelines.

trigger: none

variables:
  buildConfiguration: 'Release'

jobs:
- job: WebApp1
  displayName: 'Build WebApp1'
  pool:
    vmImage: 'ubuntu-latest'

  steps:
  - task: UseDotNet@2
    displayName: 'Use .NET 3.1.x'
    inputs:
      packageType: 'sdk'
      version: '3.1.x'

  - task: DotNetCoreCLI@2
    displayName: 'Build'
    inputs:
      command: 'build'
      projects: '**/WebApp1.csproj'
      arguments: '--configuration $(buildConfiguration)' 
  
  - task: DotNetCoreCLI@2
    displayName: 'Publish Application'
    inputs:
      command: 'publish'
      publishWebProjects: false
      projects: '**/WebApp1.csproj'
      arguments: '--configuration $(buildConfiguration) --output $(Build.ArtifactStagingDirectory)'

  - task: PublishPipelineArtifact@1
    displayName: 'Publish Artifacts'
    inputs:
      targetPath: '$(Build.ArtifactStagingDirectory)'
      artifact: 'WebApp1'
      publishLocation: 'pipeline'

- job: WebApp2
  displayName: 'Build WebApp2'
  pool:
    vmImage: 'ubuntu-latest'

  steps:
  - task: UseDotNet@2
    displayName: 'Use .NET 3.1.x'
    inputs:
      packageType: 'sdk'
      version: '3.1.x'

  - task: DotNetCoreCLI@2
    displayName: 'Build'
    inputs:
      command: 'build'
      projects: '**/WebApp2.csproj'
      arguments: '--configuration $(buildConfiguration)' 
  
  - task: DotNetCoreCLI@2
    displayName: 'Publish Application'
    inputs:
      command: 'publish'
      publishWebProjects: false
      projects: '**/WebApp2.csproj'
      arguments: '--configuration $(buildConfiguration) --output $(Build.ArtifactStagingDirectory)'

  - task: PublishPipelineArtifact@1
    displayName: 'Publish Artifacts'
    inputs:
      targetPath: '$(Build.ArtifactStagingDirectory)'
      artifact: 'WebApp2'
      publishLocation: 'pipeline'

Add a New File

To attack the duplication above we need to take the shared steps from above and move them somewhere they can be reused. We will be walking through the steps using the Azure DevOps web site and committing directly to the master branch, but these same steps could be performed locally or on the web on any branch. First, from the Repos section of the site we need to add a new file by clicking the three dots at the level we want the file added. In this case, we are adding to the root of the repo but the same option is available on any folder.

A dialog will show where you can enter the New file name, we are going to use build.yml in this case. Next, click Create to continue.

Shared YAML

Now that we have a new file we can start building the new YAML that will handle the repeated steps from the original jobs. The first thing we are going to do is define a set of parameters that this set of steps can be called with. We are going to use these to pass which project to build, which build configuration to use, and what to name the published artifact. The following is the definition of our parameters.

parameters:
- name: buildConfiguration
  type: string
  default: 'Release'
- name: project
  type: string
  default: ''
- name: artifactName
  type: string
  default: ''

We can then use these parameters in the rest of the file using the ${{ parameters.parameterName }} syntax. Note that any pipeline variables are also available using the $(variableName) syntax. The following bit of YAML shows both types on the arguments line.

- task: DotNetCoreCLI@2
  displayName: 'Publish Application'
  inputs:
    command: 'publish'
    publishWebProjects: false
    projects: '**/${{ parameters.project }}'
    arguments: '--configuration ${{ parameters.buildConfiguration }} --output $(Build.ArtifactStagingDirectory)'

While you can use pipeline variables I recommend passing all the values you need via parameters for the same reason that we try to avoid global variables when doing general programming. I’m using both here to show the usage of each. The following is the full YAML in our new file.

parameters:
- name: buildConfiguration
  type: string
  default: 'Release'
- name: project
  type: string
  default: ''
- name: artifactName
  type: string
  default: ''

steps:
  - task: UseDotNet@2
    displayName: 'Use .NET 3.1.x'
    inputs:
      packageType: 'sdk'
      version: '3.1.x'

  - task: DotNetCoreCLI@2
    displayName: 'Build'
    inputs:
      command: 'build'
      projects: '**/${{ parameters.project }}'
      arguments: '--configuration ${{ parameters.buildConfiguration }}' 
  
  - task: DotNetCoreCLI@2
    displayName: 'Publish Application'
    inputs:
      command: 'publish'
      publishWebProjects: false
      projects: '**/${{ parameters.project }}'
      arguments: '--configuration ${{ parameters.buildConfiguration }} --output $(Build.ArtifactStagingDirectory)'

  - task: PublishPipelineArtifact@1
    displayName: 'Publish Artifacts'
    inputs:
      targetPath: '$(Build.ArtifactStagingDirectory)'
      artifact: ${{ parameters.artifactName }}
      publishLocation: 'pipeline'

Finally, commit the changes to the new file.

Using Shared YAML

Now that we have the YAML that is shared between our two build jobs, we can switch back to our main YAML file, azure-pipelines.yml in the sample, and remove the steps we are replacing. While both jobs will still have a steps section, the only thing left in them is a template call to our other YAML file, build.yml in the sample, passing the parameters it needs. The following is the resulting YAML file with the call to the shared file in both jobs.

trigger: none

variables:
  buildConfiguration: 'Release'

jobs:
- job: WebApp1
  displayName: 'Build WebApp1'
  pool:
    vmImage: 'ubuntu-latest'

  steps:
  - template: build.yml
    parameters:
      buildConfiguration: $(buildConfiguration)
      project: WebApp1.csproj
      artifactName: WebApp1

- job: WebApp2
  displayName: 'Build WebApp2'
  pool:
    vmImage: 'ubuntu-latest'

  steps:
  - template: build.yml
    parameters:
      buildConfiguration: $(buildConfiguration)
      project: WebApp2.csproj
      artifactName: WebApp2

Wrapping Up

Being able to remove duplication from your YAML files should help improve the maintainability of your pipelines. I know the samples don’t show it, but the template call is just a step, and you could have other steps before or after it just like you would with normal tasks.