Over the weekend I finally got the chance to start reading A Curious Moon by Rob Conery, a book on learning PostgreSQL by following the fictional Dee Yan as she is thrown into the database administrator role at an aerospace startup.
I have a lot of experience using Microsoft’s SQL Server, but up until now, I hadn’t touched PostgreSQL. For personal projects, SQL Server’s cost can be prohibitive, and the release of Rob’s book made it a good time to give PostgreSQL a try.
Install Directly or Not?
On the download section of the official Postgres site, there is an option to download an installer. This is the route I was going to take at first, but in his book, Rob suggests using a VM for a Postgres installation on Windows. That kicked off a lot of searching on my part, and I didn’t find a definitive answer on why that is or isn’t the way to go.
In the end, I decided to try running the Postgres process using Docker instead of installing directly on Windows or dealing with a full VM.
Head to this link and click the Get Docker link to download the installer. After the install is complete you will have to log out and back in. When I logged back in I got a message about Hyper-V not being enabled.
After logging back in I then got the following message about hardware-assisted virtualization not being enabled.
After tweaking my BIOS settings and logging back in I was greeted by the Docker welcome screen.
Open a command prompt and run the following command.
docker run hello-world
You should see output that starts with the following if your installation is working.
Hello from Docker!
This message shows that your installation appears to be working correctly.
What about Postgres?
Getting up and going with a container running Postgres was pretty simple and can be done with the following command, which creates a container and exposes the port used by Postgres so it can be accessed from the host.
docker run -p 5432:5432 --name yourContainerName -e POSTGRES_PASSWORD=yourPassword -d postgres
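To confirm the container is up after running the command above, you can sketch a quick check like the following; the container name is the placeholder from the command, and docker exec is used so no Postgres client is needed on the host.

```shell
# Verify the container is running and the port is mapped.
docker ps --filter name=yourContainerName

# Open a psql session inside the container as the default postgres user.
docker exec -it yourContainerName psql -U postgres
```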
The problem with this approach is that if you ever need to rebuild the container for some reason, such as when a new version of Postgres is released, your data will be lost. Thankfully, I found this blog post, which shows how to use a secondary container for the data, leaving the Postgres container free to be destroyed and recreated as needed. The following is the command I used to create my data container.
docker create -v /var/lib/postgresql/data --name PostgresData alpine
The above creates a container named PostgresData based on the Alpine image. It is important that the -v parameter matches the path that Postgres expects.
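One way to double-check that path, as a quick sketch, is to inspect the data container and print the mount destinations with a Go template:

```shell
# Should print /var/lib/postgresql/data, the path Postgres expects.
docker inspect --format '{{ range .Mounts }}{{ .Destination }}{{ end }}' PostgresData
```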
Now that we have a container that will keep our data safe let’s create the actual Postgres container with the following command.
docker run -p 5432:5432 --name yourContainerName -e POSTGRES_PASSWORD=yourPassword -d --volumes-from PostgresData postgres
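The rebuild scenario mentioned earlier can be sketched like this, reusing the placeholder names from the commands above; because the data lives in the PostgresData container, the Postgres container itself is disposable.

```shell
# Grab the latest Postgres image.
docker pull postgres

# Throw away the old container; the data container is untouched.
docker stop yourContainerName
docker rm yourContainerName

# Recreate it against the same data container.
docker run -p 5432:5432 --name yourContainerName -e POSTGRES_PASSWORD=yourPassword -d --volumes-from PostgresData postgres
```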
The only difference from the first example run command is the addition of --volumes-from PostgresData, which tells the container to use the volumes from the PostgresData container.
If you run the docker ps -a command it will show you all your containers.
As you can see in my example, I have two containers, only one of which is actually running. Make sure you don’t remove the data container just because it never shows as running.
Connect to Postgres
To verify all was working I downloaded pgAdmin from here. Run the installer and then open the application. Right-click on Server and click Create > Server.
On the Create Server dialog enter a Name for your server and then switch over to the Connection tab.
On the Connection tab, use localhost for Host, and in the Password field, use the password you set for POSTGRES_PASSWORD in the docker run command.
Click Save to close the dialog and connect to the server. The following is an example screenshot of what you will see showing the available databases on the server, which is just the default database in this case.
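If you would rather skip pgAdmin, a rough alternative is connecting from the command line; this assumes you either have the psql client installed on the host or fall back to the copy inside the container.

```shell
# From the host, if psql is installed (prompts for the POSTGRES_PASSWORD value).
psql -h localhost -p 5432 -U postgres

# Or use the client already inside the container.
docker exec -it yourContainerName psql -U postgres -c '\l'
```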
Make sure to check out the official docs here for more information as needed.
Other than the storage portion, getting Postgres up and running in Docker was pretty simple. I hope this gives you, as it did me, a good jumping-off point to learn more about both Docker and Postgres.
If anyone has any alternate ways to deal with persistent storage please leave a comment.
Also published on Medium.
19 thoughts on “Setup PostgreSQL on Windows with Docker”
Here’s an alternative way I’ve used:
$PGDATA = "/var/lib/postgres/data"
[Environment]::SetEnvironmentVariable("POSTGRES_PASSWORD", "pAssw0rd", "Process")
[Environment]::SetEnvironmentVariable("PGDATA", $PGDATA, "Process")
# Create a local Docker volume
docker volume create --name test-db-volume -d local
# Run the container
docker run -d `
--name test-db `
-p 15432:5432 `
-v "test-db-volume:$PGDATA" `
-e POSTGRES_PASSWORD `
-e PGDATA `
postgres
# Review the created volume
docker volume inspect test-db-volume
Awesome Elijah, thank you for sharing!
Personally, I never bother with volume-only containers to store the data. I just map a volume to my Postgres container.
The command to create a new container with ports, volumes, etc. can get cluttered quickly, so I also use docker-compose with a docker-compose.yml file that includes the port and volume mappings.
I check the docker-compose.yml file into source control so it’s easy to get my dev environment, including the Postgres container, going on another machine, whether it’s Linux or Windows.
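A minimal docker-compose.yml along the lines described might look like this; the service name, volume name, and password are illustrative, not from the original setup.

```yaml
version: "3"
services:
  db:
    image: postgres
    ports:
      - "5432:5432"
    environment:
      POSTGRES_PASSWORD: yourPassword
    volumes:
      # Named volume keeps the data when the container is recreated.
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```

With that file checked in, `docker-compose up -d` recreates the same environment on any machine.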
Sounds like I need to look into docker-compose. Thank you for the tip Matt!
I agree with you on mapping a local volume on the host to the Postgres container. My logic is that I can still access the database files on the host even if the Docker engine is not running.
Suppose db data is sitting on the host at this location: c:/users/dockerdb (I am using windows OS)
How do you modify Eric’s command:
docker run -p 5432:5432 --name yourContainerName -e POSTGRES_PASSWORD=yourPassword -d postgres
so that the database is located both on the local host and inside the PostgreSQL container, and changes to the database are reflected in both locations simultaneously?
My setup is as follows:
First, on a Windows laptop, I created an Ubuntu VirtualBox VM and installed Docker on top of it (the host).
Second, on the Docker host, I created a container from the image; that container is running a PostgreSQL database.
Now, I am able to access that Postgres DB from the host machine, but not from the Windows laptop (pgAdmin). Can you help me with that?
Windows laptop --> Ubuntu VirtualBox (host), Docker --> container (Postgres); it should be accessible from
Windows laptop (pgAdmin) --> container
Sorry, Sudheekar, that’s not a situation I have attempted. My guess is that the correct port isn’t exposed by the Ubuntu VM by default.
I was wondering if you have found a solution. We are having a similar issue: we have a Docker Galaxy that runs fine on both Mac and Linux but fails on a Windows 10 box due to a Postgres error (abnormal termination) that points to an inability to connect to the underlying db. We have tried a few things but had no success thus far.
Thank you so much for the information! It was very helpful!
Glad to hear it!
Sorry, but in that way, can we create more than one Postgres server on one Windows host?
And will the client connections all be to the same server IP:port?
Sorry, but I’m not sure I understand what you are asking.
Thanks Eric you saved my week.
Glad to hear it!
Great article. But I have a different problem.
Using Docker and PostgreSQL is great, but how do you enable WAL and replication mode? How do you accomplish that?
Thanks, Andrew. I have no idea about your question. Maybe someone will see this and be able to answer it.
Hello, it’s giving an error the moment I try to connect to the db: “connect ECONNREFUSED 127.0.0.1:5432”.
Can you help me?
Hey Anderson. I’m not sure what your issue is. From the error it sounds like it could be a permissions issue with Postgres.
My friend, thanks for the tip.
1. Opened VirtualBox, stopped the default machine, and removed it.
2. Right-clicked Docker Quickstart and selected Run as administrator.