
Your machine is not a dev environment

How I run any project locally without trusting it or setting it up. One command, fully isolated, and I'm in.

When I first started developing for the web, I mainly touched three kinds of files: HTML, CSS and JS. That's mostly fine while everything you build is a static website.

But as soon as you start dealing with backend dependencies such as PHP, Node, Ruby, etc., you quickly realise that version mismatches can be tough to deal with.

Overview

Well, it works on my machine.

This classic is the perfect illustration of one of the things I'm trying to solve here. Honestly, this feels so relevant to any developer that I'm surprised I didn't find a standard out there.

Well, there is Dev Containers, an open specification for defining development environments using containers. It's backed by Microsoft and supported in multiple editors (VS Code, JetBrains and even Neovim). But the reality is that it does way more than I need, and it also feels too coupled to VS Code, which I don't use anymore.

So, as I couldn’t find an industry standard for this, I attempted to do it myself.

Let’s look at what I’m aiming at with my new workflow:

  1. System-agnostic dependencies;
  2. Consistent workflow across projects;
  3. Safe execution of third party code;
  4. Zero modifications to the project;
  5. Disposable environments;

And now let’s go over each one and see why they are important to me, and might just be to you as well.

System-agnostic dependencies

You know when you have a legacy project running on Node 7 or whatever, that you can’t update because some other dependency doesn’t support a newer version of Node?

But at the same time, you still need a recent version of Node like Node 20, because you also work on recent projects.

This problem already has a solution. Tools like nvm or even mise allow you to install and manage multiple versions of the same tool locally on your machine. But I’m aiming for an even cleaner solution. I want to have a clean system. I don’t want to install any project dependency locally. Any! Not even system-wide third party tools that I will need to remember next time I set up a new computer.

Consistent workflow across projects

This is a very first-world kind of problem. But since I was going to rethink this workflow anyway, I wanted something that would let me just start developing without even thinking about how a specific project runs. Is it npm start? Perhaps npm run dev? Maybe python manage.py runserver for a Python project? Or bundle exec rails server for a Ruby on Rails one?

I don’t want this in my life. I want something as simple as dev. That’s it. I want to be able to just cd into a project folder and run dev without ever thinking about what the project is, how old it might be, what the tech stack is and perhaps attempting to remember what the run command might be.

Safe execution of third party code

This was a big motivation behind all this workflow. Probably the biggest one, actually. You know, one thing is developing your own projects from start to finish having complete control over what you are doing.

Another thing is collaborating on team or open source projects where, for some reason, someone pushes code that happens to have a malicious npm postinstall script. Maybe they didn't write it themselves. Maybe it came from an abandoned package they depended on, but it executes code you never intended to run. I honestly don't care how it got there.
What I care about is that, since it runs on my host machine with my user permissions, it has access to my files, my SSH keys, my Git configuration: everything!

And lastly, you might need to collaborate on open source projects you don't fully understand or even trust. Running them directly on your machine could compromise it, sometimes without you even noticing.

Zero modifications to the project

The reality is that most modern projects will already ship some kind of container setup for you to work on top of (more on that later). But if they don't, introducing a Dockerfile might not be what the project's author wants at that time.

But if you are concerned with the previous points, running the code without a container is not going to be an option either.

So this workflow should be able to create a container and run the project code inside the container, without ever adding container-specific files to the project.

This way I’m happy and repo authors are happy too.

Disposable environments

And lastly, if something goes wrong within the container (say you installed the wrong version of a package, it left some files behind, and after hammering at things you can't get the project to run), you can just trash the container and create a new one, without ever worrying it may break your host machine.

This kind of freedom is something I have come to appreciate while self-hosting my own instances of third-party projects. We don't always get things right the first time, and sometimes going back to a clean slate is the easiest way forward.

This new workflow allows me to work in environments very close to what production will be running on, minimizing the risk of stumbling upon situations where certain projects don’t quite run the same way on my machine and in the server.

Besides, I just need to set up the container information once, and then run it as many times as I want.

And as this is not tied to any specific IDE or text editor, I’m free to use whatever I want and still benefit from the same workflow.

So what is this dev command after all?

Before I explain how it works, let me show you the folder structure I’ve adopted for all my projects. It’s dead simple:

Plain Text
my-project/
├── repo/       # The actual project code lives here
└── container/  # (generated) Docker config lives here

The idea is simple: code stays in repo/, and everything Docker-related stays in container/. This keeps the project clean—no Dockerfiles or compose files cluttering up the actual codebase.
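Adopting that layout is just a matter of nesting the code one level down. Something like this, where the paths are purely illustrative:

```shell
# Illustrative: setting up the two-folder layout for a project.
base="$(mktemp -d)"                # stand-in for wherever you keep projects
mkdir -p "$base/my-project/repo"   # the code goes in repo/
cd "$base/my-project"
# Clone or move your project into repo/; container/ gets generated later.
```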

Now, the dev command is a bash script that lives in my ~/dotfiles/scripts folder and gets added to my $PATH. The workflow goes like this:

First, I run dev init to initialize the container environment in any project:

Bash
$ cd my-project
$ dev init

The script will ask me to pick a runtime (like node, python, ruby, etc.) and then fetch available versions from Docker Hub so I can pick the one I need:

Plain Text
Select runtime > node
Select node version > 20
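I haven't dug into how the script does this lookup, but it can be as simple as hitting Docker Hub's public v2 tags endpoint and filtering for plain major-version tags. A sketch of that idea, assuming curl and jq are available (the function names are mine, not the script's):

```shell
# Illustrative sketch: list candidate versions for an official image.
fetch_tags() {
  curl -s "https://hub.docker.com/v2/repositories/library/$1/tags?page_size=100" \
    | jq -r '.results[].name'
}

# Keep only bare numeric tags like "18", "20", "22", newest first.
major_versions() {
  grep -E '^[0-9]+$' | sort -rn
}

# Usage: fetch_tags node | major_versions
```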

It then creates a minimal container/ folder inside my project with just two files:

Dockerfile
container/Dockerfile
FROM node:20
WORKDIR /app
CMD ["bash"]

I can then tweak the CMD line, which is the command that is going to run when this container starts. For instance, I could set it to CMD ["npm", "start"].

YAML
container/compose.yml
name: my-project-dev
services:
  app:
    build: .
    network_mode: host
    working_dir: /app
    volumes:
      - ../repo:/app
    environment:
      - PORT
    stdin_open: true
    tty: true
    init: true
    user: "${HOST_UID}:${HOST_GID}"

The key parts here:

  - ../repo is mounted at /app, so edits on the host are instantly visible inside the container;
  - network_mode: host means any port the app listens on is reachable directly from the host;
  - user: "${HOST_UID}:${HOST_GID}" runs the container as your host user, so files it creates aren't owned by root;
  - stdin_open, tty and init make the container behave like a proper interactive shell.

Then, from the project root folder, I just run:

Bash
$ dev

And I’m dropped into a shell inside the container, with my code mounted at /app. That’s it! No more remembering if it’s npm start, npm run dev, bundle exec rails server, or python manage.py runserver. I just dev and I’m in.
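For reference, the core of such a script can be sketched in a few lines. This is my own illustrative reduction, not the actual implementation; a real dev_init would also generate the compose.yml and prompt for the runtime instead of hardcoding node:20:

```shell
# Illustrative sketch of the two entry points behind a `dev` command.

dev_init() {
  # Scaffold a minimal container/ folder next to repo/.
  mkdir -p container
  printf 'FROM node:20\nWORKDIR /app\nCMD ["bash"]\n' > container/Dockerfile
}

dev_run() {
  # Export host UID/GID so the user mapping in compose.yml resolves,
  # then drop into an interactive shell in the container.
  HOST_UID="$(id -u)" HOST_GID="$(id -g)" \
    docker compose -f container/compose.yml run --rm app
}
```

The full script would then dispatch on its first argument: `dev init` scaffolds, plain `dev` runs.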

The script supports a few other conveniences too, but those two commands cover the everyday flow.

But what if the project already has a Dockerfile?

Then my dev command uses it instead of creating a new one. The script detects if there’s an existing Dockerfile in the project root and adapts accordingly. If the project maintainers have already put thought into how the project should run in a container, I benefit from their work without needing to understand their specific setup.
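The detection itself can be as simple as a file check. A sketch of how it might work, assuming the two-folder layout from earlier (this is my guess at the logic, not the actual code):

```shell
# Illustrative: prefer a Dockerfile the project already ships.
pick_dockerfile() {
  local root="$1"
  if [ -f "$root/repo/Dockerfile" ]; then
    echo "$root/repo/Dockerfile"       # use the project's own container setup
  else
    echo "$root/container/Dockerfile"  # fall back to the generated one
  fi
}
```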

This is the workflow I’ve been using for a while now, and it’s completely changed how I approach unfamiliar projects. Whether it’s a 5-year-old Node project with ancient dependencies, a Python Flask app, or something I’ve never seen before—I dev into it and start working immediately, safely isolated from my host machine.

The script

You can check the script in my dev-command repo if you want to try it out yourself.

Save the script as a file called dev, make it executable (chmod +x dev), put it somewhere in your $PATH, and you're good to go.

Just remember: create a repo/ folder in your project, put your code in there, and run dev init (or just dev, really).

Give it a try. Your future self will thank you.

Takeaways

This workflow removes the need to install project dependencies on my machine. Everything runs in an isolated container, which keeps my system clean and predictable across projects.

It standardizes how I interact with any codebase. Instead of remembering commands or stacks, I use a single entry point and adapt only when necessary.

It also adds a safety layer. Running third-party code inside a container reduces exposure to malicious scripts and unintended side effects on my host machine.

Finally, it embraces disposability. If something breaks, I reset the environment instead of debugging a polluted setup, making iteration faster and less frustrating.
