PostgreSQL

Entity Framework Core: Postgres Concurrency Checks

Last week’s post on SQLite Concurrency Checks used this repo which contains examples of using Entity Framework Core with SQLite and Postgres. This post is going to tackle concurrency checks using Postgres so that the projects in the repo keep the same level of functionality. You can grab the sample code before any changes here. This whole post will only touch files found in the Postgres folder/project.

Context Changes and Data Migration

Unlike SQLite, Postgres has better built-in support for concurrency checks. If you read the official docs on Optimistic Concurrency and Concurrency Tokens you will find that all tables have an implicit/hidden system column called xmin, which holds the ID of the latest updating transaction and therefore changes automatically every time a row is updated.

The Postgres Entity Framework Core provider contains an extension method that makes it very simple to use the xmin column as a concurrency token. In the ContactsDbContext, add the following to the OnModelCreating function to enable concurrency checking on the specified entity, in this case, a Contact.

modelBuilder.Entity<Contact>().ForNpgsqlUseXminAsConcurrencyToken();
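
Note that newer versions of the Npgsql provider renamed this extension method; if the call above isn’t available on the provider version you have installed, the unprefixed form below should be the equivalent, but verify it against the version you are actually using.

modelBuilder.Entity<Contact>().UseXminAsConcurrencyToken();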

Next, from a command prompt in the same directory as your project file, use the following .NET CLI command to generate a migration for the above change. This migration is a bit strange since the column technically already exists, but the migration still seemed to be needed.

dotnet ef migrations add Concurrency --context ContactsDbContext

Then, use the following command to apply the migration to your database.

dotnet ef database update --context ContactsDbContext

Testing it out

For a quick and dirty test, I added a ConcurrencyTest razor page under the Contacts directory. The handler below ensures a specific contact exists, pulls that contact from two different DbContexts, mutates the resulting contact objects, and then attempts to save both. The first save will work and the second should fail. Please note that this function isn’t an example of how things should be done, just a quick and dirty way to prove that the concurrency check is happening.

public void OnGet()
{
    var context1 = 
        new ContactsDbContext(new DbContextOptionsBuilder<ContactsDbContext>()
                              .UseNpgsql("yourConnectionString")
                              .Options);
    var contactFromContext1 = context1.Contacts
                                      .FirstOrDefault(c => c.Name == "Test");

    if (contactFromContext1 == null)
    {
        contactFromContext1 = new Contact
                              {
                                  Name = "Test"
                              };

        context1.Add(contactFromContext1);
        context1.SaveChanges();
    }

    var context2 = 
        new ContactsDbContext(new DbContextOptionsBuilder<ContactsDbContext>()
                              .UseNpgsql("yourConnectionString")
                              .Options);
    var contactFromContext2 = context2.Contacts
                                      .FirstOrDefault(c => c.Name == "Test");

    contactFromContext1.Address = DateTime.Now.ToString();
    contactFromContext2.Address = DateTime.UtcNow.ToString();

    // The first save succeeds; the second should throw because the row's
    // xmin value changed when context1 saved.
    context1.SaveChanges();
    context2.SaveChanges();
}

Run the application and hit the ConcurrencyTest route, which is https://localhost:44324/Contacts/ConcurrencyTest for my test. The following is the resulting exception.

An unhandled exception occurred while processing the request.

DbUpdateConcurrencyException: Database operation expected to affect 1 row(s) but actually affected 0 row(s). Data may have been modified or deleted since entities were loaded. See http://go.microsoft.com/fwlink/?LinkId=527962 for information on understanding and handling optimistic concurrency exceptions.
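
In real code you would want to catch this exception and decide how to resolve the conflict instead of letting it bubble up. The following is a minimal sketch of one option, reusing context2 from the test above: it refreshes the original values from the database so the client’s values win on a retry. The resolution strategy (client wins, database wins, merge) is up to your application.

try
{
    context2.SaveChanges();
}
catch (DbUpdateConcurrencyException ex)
{
    foreach (var entry in ex.Entries)
    {
        // Reload the values currently in the database as the entry's original
        // values so the concurrency token matches on the next save.
        entry.OriginalValues.SetValues(entry.GetDatabaseValues());
    }

    // Retry; the client's changes overwrite what is in the database.
    context2.SaveChanges();
}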

Wrapping Up

This process was much simpler using Postgres than SQLite. Not that the SQLite version was hard, just not as simple a path.

The code in its final state can be found here.

Entity Framework Core with Postgres

With last week’s post, I now have Postgres running in a Docker container. This post is going to cover using Entity Framework Core with Postgres, which happens to be running in the Docker container from last week, but that bit isn’t a requirement for this post. This post is going to be very similar to this one which covered Entity Framework Core with SQLite.

Starting Point

Using Visual Studio 2017, I started with a new ASP.NET Core project using the Web Application template with Individual User Accounts for Authentication. Using Individual User Accounts is an easy way to get all the Entity Framework Core stuff set up.

Add Postgres NuGet Package

Right-click on the project file and click Manage NuGet Packages.

On the Browse tab, search for Npgsql.EntityFrameworkCore.PostgreSQL and then click Install.

Configuration Changes

Open the appsettings.json file and change the DefaultConnection in the ConnectionStrings section to a valid Postgres connection string, as in the following example.

Before:
"ConnectionStrings": {
  "DefaultConnection": "Server=(localdb)\\mssqllocaldb;Database=aspnet-Postgres-53bc9b9d-9d6a-45d4-8429-2a2761773502;Trusted_Connection=True;MultipleActiveResultSets=true"
}

After:
"ConnectionStrings": {
  "DefaultConnection": "Server=localhost;Database=aspnet-Postgres-53bc9b9d-9d6a-45d4-8429-2a2761773502;UserId=yourUserName;Password=yourPassword"
}

Note that a user ID and password are needed. If you are going to be checking your project into source control, I recommend that, in addition to the above, you only store the connection string with the real user ID and password in user secrets, which don’t get checked into source control. You can find more information on user secrets here.
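
As a rough sketch of what that could look like, the following commands, run from the project directory, store the real connection string in user secrets. The init command requires a newer SDK (on older SDKs you add a UserSecretsId to the project file by hand), and the key needs to match the configuration path the application reads.

dotnet user-secrets init
dotnet user-secrets set "ConnectionStrings:DefaultConnection" "Server=localhost;Database=aspnet-Postgres-53bc9b9d-9d6a-45d4-8429-2a2761773502;UserId=yourUserName;Password=yourPassword"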

Startup Changes

The final change needed before running the application is in the ConfigureServices function of the Startup class to switch out SQL Server for Postgres. The following shows an example of the change for the ApplicationDbContext.

Before:
services.AddDbContext<ApplicationDbContext>(options =>
    options
      .UseSqlServer(Configuration
                    .GetConnectionString("DefaultConnection")));

After:
services.AddDbContext<ApplicationDbContext>(options =>
    options
      .UseNpgsql(Configuration
                 .GetConnectionString("DefaultConnection")));

Run the App

Running the application at this point will work fine until you hit a function that wants to talk to the database. For example, attempting to register a user will result in an error page stating that a database operation failed and that there are migrations that have not been applied.

If you see this error, don’t panic; just follow the instructions and they will get you going. The simplest solution is to click the blue Apply Migrations button and continue your testing.
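
Alternatively, you can apply the pending migrations yourself from a command prompt in the project directory before hitting a database-backed page. Assuming the template’s ApplicationDbContext, something along these lines should do it.

dotnet ef database update --context ApplicationDbContext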

Wrapping Up

The application is now ready to go using Postgres. The Entity Framework Core team, as well as the Postgres Entity Framework Core Provider team, have done a great job making everything work with no pain.

The official docs for the provider can be found here. I would love to offer some tooling recommendations for working with Postgres, but I haven’t been working with it enough yet to provide any. If you have any Postgres tools you recommend please leave a comment.

The code in its final state, which has been expanded to include my standard contacts example, can be found here.

Setup PostgreSQL on Windows with Docker

Over the weekend I finally got the chance to start reading A Curious Moon by Rob Conery, which is a book on learning PostgreSQL by following the fictional Dee Yan as she is thrown into the database administrator role at an aerospace startup.

I have a lot of experience using Microsoft’s SQL Server, but up until now, I haven’t touched PostgreSQL. For personal projects, SQL Server’s cost can be prohibitive, and the release of Rob’s book added up to a good time to give PostgreSQL a try.

Install Directly or not?

On the download section of the official Postgres site, there is an option to download an installer. This is the route I was going to take at first, but in Rob’s book, he suggests using a VM for Postgres installation on Windows. This kicked off a lot of searching on my part, but I didn’t find a good definitive answer on why that is or isn’t the way to go.

In the end, I decided to try running the Postgres process using Docker instead of installing it directly on Windows or dealing with a full VM.

Installing Docker

Head to this link and click the Get Docker link to download the installer. After the install is complete you will have to log out and back in. When I logged back in I got a message about Hyper-V not being enabled.

After logging back in I then got the following message about hardware-assisted virtualization not being enabled.

After tweaking my BIOS settings and logging back in I was greeted by the Docker welcome screen.

Open a command prompt and run the following command.

docker run hello-world

You should see output that starts with the following if your installation is working.

Hello from Docker!
This message shows that your installation appears to be working correctly.

What about Postgres?

Getting up and going with a container running Postgres was pretty simple and could be done with the following command, which will create a container and expose the port used by Postgres so it can be accessed from the host.

docker run -p 5432:5432 --name yourContainerName -e POSTGRES_PASSWORD=yourPassword -d postgres

The problem with this approach is that if you ever need to rebuild the container for some reason, such as a new version of Postgres being released, your data will be lost. Thankfully I found this blog post which shows how to use a secondary container for the data, leaving the Postgres container able to be destroyed and recreated as needed. The following is the command I used to create my data container.

docker create -v /var/lib/postgresql/data --name PostgresData alpine

The above creates a container named PostgresData based on the Alpine image. It is important that the -v parameter matches the path that Postgres expects.

Now that we have a container that will keep our data safe let’s create the actual Postgres container with the following command.

docker run -p 5432:5432 --name yourContainerName -e POSTGRES_PASSWORD=yourPassword -d --volumes-from PostgresData postgres

The only difference from the first example run command is the addition of --volumes-from PostgresData, which tells the new container to use the volumes from the PostgresData container.

If you run the docker ps -a command it will show you all your containers.

As you can see in my example, I have two containers, only one of which is actually running. Make sure you don’t remove the data container just because it never shows as running.

Connect to Postgres

To verify all was working I downloaded pgAdmin from here. Run the installer and then open the application. Right-click on Server and click Create > Server.

On the Create Server dialog enter a Name for your server and then switch over to the Connection tab.

On the Connection tab, use localhost for Host, and in the Password field use the password you set for POSTGRES_PASSWORD in the docker run command.

Click Save to close the dialog and connect to the server. The following is an example screenshot of what you will see, showing the available databases on the server, which in this case is just the default database.
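
If you prefer the command line over installing pgAdmin, another quick way to check that the container is working is to run psql inside it. Assuming the container name used above, something like the following should list the databases on the server.

docker exec -it yourContainerName psql -U postgres -c "\l"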

Wrapping Up

Make sure to check out the official docs here for more information as needed.

Other than the storage portion, getting Postgres up and running in Docker was pretty simple. I hope that, like me, you find this a good jumping-off point to learn more about both Docker and Postgres.

If anyone has any alternate ways to deal with persistent storage please leave a comment.
