I LOVE Dev Containers. Let me start with that. As a developer, I've tried other ways to manage dev environments, but I always came back to Dev Containers. Once you set them up correctly, you don't have to worry about managing dependencies or differences between machines. If you're tired of hearing "it works on my machine," you should give them a shot! And if you want to use Dev Containers, here are 5 things you should do.
But before jumping in, I want to start by explaining why they're better than virtual environments.
Why Dev Containers Can Be Better Than Virtual Environments:
Isolation & Reproducibility:
While virtual environments only isolate your language-level dependencies (e.g. Python packages), Dev Containers provide complete isolation from your host machine. Your development environment, including the OS, system libraries, and all dependencies, is entirely contained within the Dev Container, so you don't have to deal with "it works on my machine" again.
Testing:
As a developer, you often face a scenario where your code works in a virtual environment on your machine, but CI/CD jobs on remote servers still fail. Virtual environments are sufficient for unit tests that focus on your code and its dependencies, but Dev Containers are ideal not only for unit tests but also for integration and system-level tests. Once you start using Dev Containers, you are far less likely to see your code fail in CI/CD jobs for environment-related reasons.
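To close the gap between local runs and CI completely, you can run your CI jobs inside the very same Dev Container. A minimal sketch using the devcontainers/ci GitHub Action (the workflow name, trigger, and test command are illustrative; check the action's documentation for the current release tag):

```yaml
# .github/workflows/ci.yml (illustrative) — build the Dev Container and run tests inside it
name: CI
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build Dev Container and run tests
        uses: devcontainers/ci@v0.3
        with:
          # Command executed inside the container built from .devcontainer/
          runCmd: pytest
```

Because the job builds from the same .devcontainer/ config you use locally, a green local run and a green CI run mean the same thing.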
I hope you're at least a little convinced by now and willing to give it a shot. If so, here are 5 things you should do when using Dev Containers.
1. Always Leverage Pre-Built Images to Speed Up Setup
Unless you are a Docker expert who can build the world's most efficient image, please use pre-built official images. They provide a well-maintained, optimized base for your projects. The community also widely tests and updates them, making sure your environment always starts from a solid foundation.
Example Dockerfile:
# Use an official Python runtime as a parent image
FROM python:3.9-slim
# Set the working directory in the container
WORKDIR /workspace
# Copy the current directory contents into the container at /workspace
COPY . /workspace
# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
# Make port 5000 available to the world outside this container
EXPOSE 5000
# Run app.py when the container launches
CMD ["python", "app.py"]
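If you don't need any custom system packages, you can even skip the Dockerfile entirely and point devcontainer.json at a pre-built Dev Container image. A minimal sketch (adjust the image tag to your Python version):

```json
{
  "name": "Python Dev Container",
  "image": "mcr.microsoft.com/devcontainers/python:3.9",
  "forwardPorts": [5000]
}
```

The mcr.microsoft.com/devcontainers/* images come with common tooling (git, a non-root user, etc.) already configured, so this is the fastest way to get started.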
2. Automate Environment Setup with postCreateCommand
Automating your environment setup with postCreateCommand is a best practice. It lets you consistently configure your development environment across all machines without manual intervention, reducing human error and making sure everyone's environment is set up the same way. Again, you don't have to hear, "It works on my machine."
Example devcontainer.json
{
  "name": "Python Dev Container",
  "build": {
    "dockerfile": "Dockerfile"
  },
  "customizations": {
    "vscode": {
      "settings": {
        "python.defaultInterpreterPath": "/workspace/.venv/bin/python"
      },
      "extensions": [
        "ms-python.python"
      ]
    }
  },
  "postCreateCommand": "python -m venv /workspace/.venv && . /workspace/.venv/bin/activate && pip install -r requirements.txt",
  "forwardPorts": [5000],
  "workspaceFolder": "/workspace"
}
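As the setup steps grow, a one-line postCreateCommand becomes hard to read and maintain. One common option is to move the steps into a shell script and point postCreateCommand at it (the script name .devcontainer/post-create.sh here is hypothetical; the script would contain the same venv and pip steps as the one-liner):

```json
{
  "name": "Python Dev Container",
  "build": { "dockerfile": "Dockerfile" },
  "postCreateCommand": "bash .devcontainer/post-create.sh",
  "forwardPorts": [5000],
  "workspaceFolder": "/workspace"
}
```

A script is easier to review, lint, and reuse (e.g. in CI) than a chain of `&&`-joined commands.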
3. Use Volume Mounts to Persist Data
As you may know, containers are generally ephemeral, meaning that any data you put inside is lost once the container is removed or rebuilt. So, using volumes will save you from a catastrophic disaster when data would otherwise be lost.
Example docker-compose.yaml:
version: '3'
services:
  app:
    image: my-python-app:latest
    build: .
    ports:
      - "5000:5000"
    volumes:
      - ./src:/workspace/src    # Mount the source code directory
      - ./data:/workspace/data  # Mount a data directory to persist application data
      - venv:/workspace/.venv   # Named volume so the virtual environment persists across container runs
    environment:
      - FLASK_ENV=development
    command: ["python", "src/app.py"]
volumes:
  venv:  # Docker-managed volume (bind-mounting a host venv often breaks, since its binaries are platform-specific)
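Dev Containers can attach directly to a Compose service, so the same file drives both `docker compose up` and your editor. A sketch of the devcontainer.json wiring for the file above (the relative path and service name must match your own layout):

```json
{
  "name": "Python Dev Container (Compose)",
  "dockerComposeFile": "../docker-compose.yaml",
  "service": "app",
  "workspaceFolder": "/workspace",
  "shutdownAction": "stopCompose"
}
```

With "shutdownAction": "stopCompose", closing the editor stops the whole Compose stack rather than leaving services running.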
4. Customize VS Code Extensions and Settings for Your Project
If your team uses VS Code, customize its extensions and settings and share them. This will save you a lot of time, especially when onboarding a new joiner. And, of course, enhanced productivity and consistency are a plus. Even for small things like linting or formatting rules, specifying them in your devcontainer.json keeps all developers on your team consistent.
Example devcontainer.json:
{
  "name": "Python Dev Container",
  "customizations": {
    "vscode": {
      "extensions": [
        "ms-python.python",          // Python support
        "ms-python.vscode-pylance",  // Pylance for enhanced Python analysis
        "ms-python.black-formatter", // Black code formatter
        "ms-python.flake8"           // Flake8 for linting
      ],
      "settings": {
        "[python]": {
          "editor.defaultFormatter": "ms-python.black-formatter",
          "editor.formatOnSave": true
        }
      }
    }
  }
}
5. Share Your Dev Containers Configuration with Your Team
Sharing is caring. All the benefits of Dev Containers come from consistency across the team, so make sure you share your devcontainer.json with others. This ensures a standardized dev environment for your team, and, one last time, you don't have to hear "It works on my machine" again. So, include these in your version control:
- .devcontainer/devcontainer.json
- .devcontainer/Dockerfile
- Any other scripts or files for settings
So here are five solid tips to get the most out of Dev Containers in VS Code. As a tech lead, I often feel very frustrated hearing, "It works on my machine." And I know I am not the only one! So, if you want to avoid the same frustration, please give Dev Containers a try, and let me know how it works for you :)