Dev Containers: the (not so) hidden gem?
written by martin · published on · 9 minutes to read

Ever since I started my first software engineering project, setting up the development environment has been a huge pain. Either the documentation was slightly outdated, did not work on WSL2, only worked on Ubuntu, or skipped a step that was obvious to everyone on the team except me. And do you know that super awkward moment when you have to explain why fixing a single button in an application took hours, because your JAVA path needed an update?

I know that pain all too well! …when fixing a simple typo takes hours because you first have to fix your local development environment.

When I learned back in 2022 that Dev Containers were a thing, I was instantly hooked. In this article I will go over my endeavor of adopting Dev Containers, the highs and lows of using them in corporate and private projects, and of course how I got to keep vim.

tl;dr: Dev Containers are cool if you want to automate your development environment, and the “it works on my machine” moments will likely become rarer.

Preamble

During my time at university back in 2012 I was introduced to Vagrant. Back then it felt revolutionary to have multiple environments that could contain totally different configurations and were versioned… sort of. That was also the time when pyenv was released and let me switch between environments on the go. Vagrant was simply too taxing on CPU, memory and battery.

Fast-forward to 2016, when I worked primarily with Node.js, React and Webpack. That was the time when “cattle, not pets” and Heroku were trending, not K8s. For some reason Webpack did not like Windows or OSX that much (it was extremely slow), so we packaged the entire project into a Docker container. It was still slow, but less slow… let’s say acceptably slow. Without knowing it, we had our first Dev Container!

Docker works on my machine! I have always tried to use Linux-based systems whenever possible. Although I see the elegance in Apple products, I was never drawn to them. Since Docker runs natively on Linux, I never had any performance or memory issues. For my coworkers it looked a bit different: WSL barely worked at that time, and the OSX implementation is a bit awkward to this day.

In 2021, I applied for the preview of GitHub Codespaces. This was essentially a hosted, web-based Dev Container as we know it today. The one-click configuration for existing projects made adoption very easy. Especially the integration in VSCode made it easy to set up and connect.

Cloud hosted

Before you get a jump scare just because you read “cloud”: I can assure you Dev Containers can run (and I most often use them that way) on your local machine.

Codespaces or products like Cloud Workstations allow you to have a powerful development environment. Your computer becomes a simple terminal.

The reality is that, more often than not, your life as a developer is much easier if you can develop against running APIs, especially when implementing a complex microservice architecture. So if you have the choice between a correctly set-up database, running dev instances and a hopefully up-to-date dataset, or a `docker compose up` that requires some prayers to work correctly, what would you choose?

And let’s face it, no developer should ever have production data on his or her machine. And yet, I have seen it all too often. A simple solution for the compliance aspect then becomes hosting a VSCode server instance somewhere in a controlled environment. Small companies might also be drawn to that idea, as long as internet connectivity is a given. And besides that, how cool is it that you can fix a bug in production from your phone while you are lounging at a pool?

I often resorted to the locally installed pip or node CLI whenever a Dev Container was not set up correctly. That was the main reason why I (maybe subconsciously) embraced cloud-based development. But this came with its own set of problems. Depending on where I was working (coworking space, home office, train, …), the internet connection was not always perfect. I waited through countless re-connects and lost too many file changes that were not properly saved, or the machine timed out before I committed the changes. In most instances of data loss it was clearly operator error, and I take full responsibility. Still, these things seldom happen when developing locally.

And then there is my love for vim. I am not the only one with this obsession. I must say vim over ssh works very well, as long as you have a stable internet connection. This is one thing that VSCode does much better. Now, more than three years into using Dev Containers, I have found a mix where most projects can be developed either in a cloud-based container or locally on my machine. I use VSCode for projects that only need a quick touch-up and where the web-based UI is typically enough, and local Dev Containers with vim for everything else.

VSCode or Vim

Let me preface this paragraph with this: the choice of editor is highly personal - and I respect that!

Many firms I worked for use IntelliJ or other JetBrains products. Most of these enterprise products should support Dev Containers by now. Although not all bleeding-edge features are ready everywhere, I fully expect that to be solved in the coming months and years. VSCode even allows you to configure, per project, the extensions you want installed and enabled. I found that around 90% of all developers use these company-approved IDEs and are pretty happy with them.
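The per-project extension list lives in the project’s devcontainer.json under `customizations.vscode`. A minimal sketch (the base image and extension IDs are just examples, not a recommendation):

```json
{
  "image": "mcr.microsoft.com/devcontainers/base:ubuntu",
  "customizations": {
    "vscode": {
      "extensions": ["dbaeumer.vscode-eslint", "esbenp.prettier-vscode"]
    }
  }
}
```

Anyone who opens the project in a Dev Container gets the same extensions, without touching their personal VSCode setup.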

And then there is the species of vim users. I thought: if you can configure your dotfiles in your Dev Container config, what stands between me and my beloved vim IDE? As I found out: nothing! You can even have different configs for different projects, clients and environments! Awesome! All you need to do is pass `--dotfiles-repository https://git.sr.ht/~eigenmannmartin/dotfiles` to the devcontainer CLI. You can also install Dev Container features without editing the project’s devcontainer.json by adding `--additional-features='{"ghcr.io/eigenmannmartin/devcontainer-features/neovim:0": {"version":"nightly"}}'`. That way you can have your neovim and fzf binaries without annoying your mates.
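Put together, the invocation looks roughly like this. It is wrapped in a function that only echoes the command so you can inspect it first; drop the `echo` to actually launch the container:

```shell
# Sketch: start a Dev Container with personal dotfiles and an extra feature,
# without touching the project's devcontainer.json. The dotfiles repository
# and the feature reference are the ones mentioned above.
devcontainer_up_personal() {
  # echo instead of executing, so the command can be inspected first
  echo devcontainer up \
    --workspace-folder "$1" \
    --dotfiles-repository 'https://git.sr.ht/~eigenmannmartin/dotfiles' \
    --additional-features '{"ghcr.io/eigenmannmartin/devcontainer-features/neovim:0": {"version":"nightly"}}'
}

devcontainer_up_personal .
```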

What will not work, or at least I haven’t found a nice solution yet, are browser-based OAuth flows.

CLI

There are multiple implementations of the Development Container standard. I stuck with the official reference implementation. For some time this defeated the whole idea, because eventually I wanted to get rid of the Node.js binaries on my machine. I ended up packaging the CLI into a container and writing some aliases to abstract away the complexity.
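The wrapper looks roughly like this; the image name is a placeholder for wherever you publish your packaged CLI, and the mounts make the Docker socket and the current project visible to it:

```shell
# Sketch: run the Node.js-based devcontainer CLI from a container, so no
# local Node.js install is needed. The image name is a placeholder.
devcontainer() {
  docker run --rm -i \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v "$PWD":"$PWD" -w "$PWD" \
    ghcr.io/example/devcontainer-cli:latest "$@"
}
```

Mounting the Docker socket means containers started by the CLI are siblings of the wrapper container, not children, so the projects themselves still run directly on the host’s Docker daemon.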

One feature that was a huge pain was X11 forwarding. Microsoft fixed that first in WSL2, where it works out of the box. On Wayland-based Linux you have to allow connections with `sudo xhost +`. The only other gripe I have is that port forwarding is not as intuitive as with VSCode.

Testing

A huge benefit of developing in a Dev Container is that your environment can be relatively close to what will eventually be production. I know, there are some edge cases, like relying on eBPF when the production cluster has a kernel version lower than 5.4 but your machine runs > 6.0. But in general, one can be pretty sure that if it works on your machine, it will also work on my machine.

I do not know about you, but have you ever tried to debug an error that only occurs inside the CI/CD pipeline? Sadly, I have! Commit and pray that you logged the right variables or fixed the right test data entry. And although not all container-based pipelines are perfect, this setup lets you easily debug on your machine and, most importantly, forces fellow developers to also check in all relevant test and mock data.

Over the years I have seen and tried many different test setups: from docker-compose over overloaded Docker containers and docker-in-docker to Testcontainers. All approaches work, each with its own strengths and weaknesses. Regardless of what you use, in my view the most important aspect is repeatability.

Development

Development is pretty smooth once the Dev Containers are set up. When using Docker without the need for a VM (Windows, OSX, *cough*), performance is very good. Apart from the initial download of the Docker images, you get near-native speeds.

I especially love the fact that my CLI logins are now kept per project/container. No more sweating: ‘sh** - which gcloud project was that?!??’ And I can now literally re-install my dev environment with two commands! This is also the biggest weakness if you work in a bigger team: every time someone changes the Dev Container config, you have to rebuild the container. You lose all logins (unless you have persistence configured) and it takes a few minutes out of your day. This is especially annoying when you have to jump between branches and review changes to the Dev Container config.
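For reference, my “two commands” are essentially a clone plus a `devcontainer up`. Sketched as a dry-run function (the repository URL is a placeholder, and the `echo` makes it print the commands instead of running them):

```shell
# Sketch: rebuild the whole dev environment from scratch.
# Remove the echoes to actually clone and start the container.
reinstall_dev_env() {
  echo git clone "$1" project
  echo devcontainer up --workspace-folder project
}

reinstall_dev_env 'https://git.example.com/myproject.git'
```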

And keep in mind that locally run Dev Containers will not time out and will drain your battery, even when all GUIs are already closed.

Why should you care?

If you are happy with your current setup, please don’t change a thing! Otherwise, this is another tool that can speed you up and reduce toil if you work on different projects with very different dependencies, or if you have projects you only touch once a year. But be aware: if you decide to adopt Dev Containers, there is a learning curve, and it will take some getting used to before you can reap the full benefits.

Where to start?

  1. Install VSCode and create your first container. If you feel fancy, install the Dev Container CLI and then use devcontainer exec to run commands in your Dev Container.
  2. Once you get that running, check out existing features or create your own feature.
  3. You could also try to run GUI applications (e.g. playwright) inside your Dev Container.
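Step 1 above boils down to two CLI calls. A dry-run sketch (the `run` helper only echoes each command; swap it for direct execution once the CLI is installed):

```shell
# Dry-run helper: prints each command instead of executing it.
run() { echo "$@"; }

# Build and start the Dev Container for the current project,
# then execute a command inside it.
run devcontainer up --workspace-folder .
run devcontainer exec --workspace-folder . uname -a
```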

Now you are ready to migrate your first project to Dev Containers.

Updates

26.8.2024: Updated styling for Where to start? paragraph.