Mastering Your Dev Environment

A guide to simplifying your projects

– by Justin Rhoades –

The Problem

While working as a consultant at Software Technology Group, I have the opportunity to bounce from client to client and see all kinds of dev environments. Over the years I have built for a range of companies, from large public corporations to small private startups, and one painful trend I've noticed is the lack of good dev environment workflows. Whether it's that people are comfortable with the workflows they've always used or that they simply don't know the modern tools that help build reliable dev environments, I've been on the painful end of taking hours or days to set up a machine just to start development.

With the hope of helping people build better dev environments and ease the burden of onboarding new developers, contractors, and consultants, I'd like to share the workflow I use to build dev environments for my personal projects and how you can use it to simplify yours.

The Tools

The tools that are going to help us build these dev environments are Docker, Docker-Compose, and `.env` files. Thanks to tools like these, our dev environments are reproducible on any machine that can run Docker, and we can be confident that our applications are that much closer to production ready. Let's dive right in and go through each tool with some examples of how it's going to improve our experience.

Docker

In today's world, Docker has taken over the DevOps scene, giving us a powerful tool for deploying our code to production, creating testing environments, and laying the foundation for great automation in infrastructure. By the end of this post, I'm hoping you'll see it as a development tool as well as a DevOps tool.
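
Here's a sketch of the kind of Dockerfile I'm talking about (image tags and paths are illustrative rather than the exact file):

```dockerfile
# Stage 1: install dependencies and build the Vue.js app
FROM node:lts-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: serve the static build output with Nginx on port 80
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
```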

The above code snippet is based on an application I run for personal use. Here's a quick breakdown of what the file is doing: I have a Vue.js application that I'm copying into a prebuilt node-lts Alpine Linux container, where I install the project's dependencies and build it in the first stage of the image. The second stage uses an Alpine Linux image with Nginx to serve the Vue.js application on port 80.

Having a Dockerfile describe how a computer needs to be set up to run your application does two things. First, you can use the Dockerfile to test any changes you make, so your code is always working against the "production machine." Second, you can use the Dockerfile to make sure you have all the system dependencies on your local machine to run the application in dev mode with hot reloading. There is a way to get hot reloading working inside a Dockerfile, but that is beyond the scope of this post.

Docker-Compose

This is where the time savings come in for your entire team. Docker-Compose is a utility that lets you define a group of services to run at the same time and hooks them up on a "network" on your machine. These services can be built from Dockerfiles, use containers already on your machine, or pull images from remote registries like Docker Hub or internal registries like GitLab's. Let's look at a `docker-compose.yml` and break down what's going on.
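
Something along these lines (service names, ports, and credentials are placeholders):

```yaml
version: "3.8"

services:
  frontend:
    build: ./frontend          # Next.js app built from its own Dockerfile
    ports:
      - "3000:3000"
    networks:
      - ihann-network

  api:
    build: ./api               # Go API built from its own Dockerfile
    ports:
      - "8001:8001"
    networks:
      - ihann-network

  db:
    image: postgres:15-alpine
    environment:
      POSTGRES_USER: dev_user
      POSTGRES_PASSWORD: dev_password
      POSTGRES_DB: app_dev
    ports:
      - "5432:5432"            # exposed so host tools like DataGrip can connect
    networks:
      - ihann-network
    volumes:
      - db-data:/var/lib/postgresql/data

networks:
  ihann-network:
    driver: bridge

volumes:
  db-data:
```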

This is a working docker-compose file from another project of mine. In this one I'm running a Next.js frontend, a Go API, and a Postgres database. All of these services are on a bridge network called `ihann-network`, letting them communicate with each other as if they were simply visible on my machine. I also have ports exposed so that my machine can contact them. For example, I can use my DataGrip IDE to access that Postgres database just as if it were installed on my machine directly, even though it's actually running inside a Docker container.

Now, as a developer, if I pull a repository with a docker-compose file, one command should bring the entire project up on my local machine with little to no outward dependencies. No need for a cloud dev environment for your frontend devs to work against. Your backend devs can walk through the frontend while having their favorite debugger running on the backend to make sure no weird calls are being made. Both frontend and backend devs can work directly with the database on their local machines to chase down odd data-related bugs or run tests. There are more use cases to think about, but the nice thing is that if there's a problem, they can purge their containers and rebuild everything to the state they originally pulled down, getting a "latest and greatest" setup no matter what they've done.

.env

`.env` files are a heavily underutilized tool that can help you keep your environments safe. In the cases where you need to connect to an external service or you are working across different environments, `.env` files keep the details you share from damaging your other systems.
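
Here's an example of the kind of file I mean (every value is a placeholder):

```env
# Postgres
POSTGRES_USER=dev_user
POSTGRES_PASSWORD=dev_password
POSTGRES_DB=app_dev
DATABASE_URL=postgres://dev_user:dev_password@db:5432/app_dev

# API
API_PORT=8001
API_SECRET_KEY=local-dev-only-secret

# Frontend
FRONTEND_URL=http://localhost:3000
```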

In the above example you can see how I broke the `.env` up by service, listing all my credentials, URLs, and any other important variables I'd like to set on a per-environment basis.

Let’s Build Something

Let's run through an example. We are going to build the environment for a fullstack application using SvelteKit and Postgres, with an external API written in Rust. Don't get too excited; we are just focusing on building our environment, not the actual application.

To start, I'm going to build the `docker-compose.yml` to host our database.
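
A minimal compose file for that might look like this (credentials are placeholders):

```yaml
version: "3.8"

services:
  db:
    image: postgres:15-alpine
    environment:
      POSTGRES_USER: dev_user
      POSTGRES_PASSWORD: dev_password
      POSTGRES_DB: app_dev
    ports:
      - "5432:5432"
```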

Notice that this DB service is far more stripped down than the one I showed earlier. I don't have a network I want it to connect to, nor do I want to keep the DB persistent at this stage of the process. With our docker-compose file set up, we can run `docker-compose up` to start the DB container and have a database we can develop against for the rest of the project.

Next, we should build some Dockerfiles to be our base for the SvelteKit and Rust programs.

Here is the SvelteKit Dockerfile, split into multiple stages. This helps with caching and speeds up builds. In this example I install dependencies in a dedicated stage, so as long as my `package.json` doesn't change, that stage stays cached and only the build and deployment stages rerun for testing or an actual production deployment.
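
A sketch of that file, assuming the SvelteKit Node adapter (stage names and the port are illustrative):

```dockerfile
# Stage 1: install dependencies (cached as long as package.json is unchanged)
FROM node:lts-alpine AS deps
WORKDIR /app
COPY package*.json ./
RUN npm ci

# Stage 2: build the SvelteKit app
FROM node:lts-alpine AS build
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .
RUN npm run build

# Stage 3: run the production build with the Node adapter's output
FROM node:lts-alpine AS deploy
WORKDIR /app
COPY --from=build /app/build ./build
COPY --from=deps /app/node_modules ./node_modules
COPY package.json ./
EXPOSE 3000
CMD ["node", "build"]
```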

Here's the Rust application's Dockerfile. Notice that, since Rust builds a binary, we don't need the overhead of something like Node and can use a slimmer deployment image. Also notice that we can run arbitrary commands in a Dockerfile while it's building, which is handy for debugging. These should be used wisely, but I wanted to show off the capability while you were here.
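
A sketch of what that can look like (the binary name `api` is illustrative; yours comes from your Cargo.toml):

```dockerfile
# Stage 1: build the release binary
FROM rust:alpine AS build
WORKDIR /app
RUN apk add --no-cache musl-dev
COPY . .
# Arbitrary commands can run mid-build for debugging; this one just
# prints the toolchain version and the files that were copied in.
RUN rustc --version && ls -la
RUN cargo build --release

# Stage 2: a slim image containing only the compiled binary
FROM alpine:3
COPY --from=build /app/target/release/api /usr/local/bin/api
EXPOSE 8001
CMD ["api"]
```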

Now that we have our Dockerfiles, let's assume we've done great work on the actual programming and it's time to test things out on the "network" to make sure there isn't any hard-coded or localhost-dependent behavior going on.
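
The compose file grows to something like this (service and network names are placeholders):

```yaml
version: "3.8"

services:
  client:
    build: ./client                # SvelteKit Dockerfile from above
    ports:
      - "3000:3000"
    environment:
      API_URL: "http://api:8001"   # service name, not localhost
    networks:
      - app-network
    depends_on:
      - api

  api:
    build: ./api                   # Rust Dockerfile from above
    ports:
      - "8001:8001"
    environment:
      DATABASE_URL: "postgres://dev_user:dev_password@db:5432/app_dev"
    networks:
      - app-network
    depends_on:
      - db

  db:
    image: postgres:15-alpine
    environment:
      POSTGRES_USER: dev_user
      POSTGRES_PASSWORD: dev_password
      POSTGRES_DB: app_dev
    ports:
      - "5432:5432"
    networks:
      - app-network

networks:
  app-network:
    driver: bridge
```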

Now that we've put the client and api services on the network, we can run `docker-compose up`, and all three services will come alive so you can begin your testing. All three services can now see each other by service name, so the URL becomes `http://api:8001` instead of `http://localhost:8001`. This lets us catch localhost-dependent bugs quickly, and because the containers are isolated, we can iterate in a "network"-like setting on our local machines.

Final Thoughts

I want to share another tip with you all. Now that you have a Docker-Compose dev environment set up, you'll be wanting to work on development, be it the client or the API. One of the nice tricks with `docker-compose` is that you can run only the specific services you want at a given time. So let's say you're focusing on your client code but don't want to run your API in dev mode: you can use `docker-compose up db api` to bring up just those services, then run your client in dev mode against a stable, and in this case more performant, API, again getting that much closer to developing against production.

There are other tools out there to take this process further, like code-first ORMs that will build the DB schema and seed it with data so your developers have "real" data to work with and are ready for whatever ticket you've given them.

Moreover, by leveraging containerization, you’re not just achieving consistency in development environments, but you’re also preparing your applications for scalable and resilient production deployments. As Docker continues to dominate the DevOps landscape, understanding how to utilize its capabilities for both development and operations is an invaluable skill.  

It’s also important to remember that while Docker provides an essential foundation, it’s only a part of the puzzle. Integrating it with continuous integration and continuous deployment (CI/CD) processes, testing tools, and monitoring solutions can elevate your software development lifecycle to new heights.  

In conclusion, I hope that this post has given you a clear insight into how Docker and Docker-Compose can be utilized beyond just deployment and operations. Embracing such tools not only modernizes your workflows but also ensures that you remain agile and efficient in the ever-evolving world of software development. Remember, a little investment in setting up a well-structured development environment now can save countless hours of troubleshooting and inconsistencies in the future.