How to Use a Remote Docker Server to Speed Up Your Workflow

Has this ever happened to you? You're working on your project, you go to build your Docker image and push it, and it just takes forever while Docker takes its sweet time. Maybe you're building a resource-intensive image, compiling lots of code, or simply pushing a large image that your internet connection can't handle well.

Luckily, there's an easy fix for these issues: Docker lets us offload all those tasks to a remote server, so our local machine doesn't have to do the hard work. All this needs is Docker running on a remote server and, really, a one-line configuration on the client.

So let's set up the server. I'll go with the Docker image from the Marketplace, an Ubuntu 18.04 image with Docker pre-installed. I want a decent amount of processing power, so I'll pick a CPU-Optimized plan, which provides dedicated hyper-threads; this one is fine. I'll choose Frankfurt, make sure my SSH key is selected, and create the Droplet. The Droplet is now created, and we have a
ready-to-use Docker server. Before we start using it, though, as a good security measure we'll create a new user to work with instead of root. I'll copy the Droplet's IP address, go to my terminal, and SSH into it as root. To create the user I'll run adduser and name it sammy, entering a password; the other prompts can be left empty. With the user created, let's add it to the docker group to give it access to the Docker engine running on the server. I'll also copy my SSH key over so I can use it to SSH in as the new user, and change the ownership of those files from root to the user so it can use them. And that's it. Let's try logging in as the new user: that works.
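Collected in one place, the server-side steps above, run as root on the Droplet, look roughly like this (a sketch; the user name sammy mirrors the video, adjust as needed):

```shell
# Create the new user; the password and other prompts can be left empty
adduser sammy

# Add the user to the docker group so it can access the Docker engine
usermod -aG docker sammy

# Copy root's authorized SSH keys so we can log in as sammy directly
mkdir -p /home/sammy/.ssh
cp ~/.ssh/authorized_keys /home/sammy/.ssh/

# Change ownership from root to the new user so it can use the files
chown -R sammy:sammy /home/sammy/.ssh
```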
Great. I'm back at the terminal on my local machine, and all we need to do to use the remote server is set the DOCKER_HOST environment variable to ssh:// followed by the user and the Droplet's IP address. From then on, every docker command we run is actually executed on that host. To verify, we can run docker info: the Name field shows my Droplet's hostname, not my laptop's.
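That one-line client configuration is literally just this (sammy is the user we created; your_droplet_ip is a placeholder for your Droplet's address):

```shell
# Tell the Docker CLI to run every command on the remote engine over SSH.
# your_droplet_ip is a placeholder -- substitute your Droplet's IP address.
export DOCKER_HOST=ssh://sammy@your_droplet_ip
```

With the variable set, docker info should report the Droplet's hostname in its Name field rather than your laptop's.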
Now let's try building our image again and see how it goes. It's pulling the go base image, which I already had cached on my computer, but once that's done it's an apples-to-apples comparison. The image builds fairly quickly, and as it pushes you can see the layers uploading much faster than from my computer, because the Droplet's upload speeds are much better than my internet connection's. And there you go: just like that, we've set up a remote Docker server that we can use to save our laptops from overheating. One thing to
note is that when you run docker build, the build context, meaning all the files and folders accessible from the Dockerfile, is sent to the host, and only then does the build run. So depending on your build context and the number of files in it, the build may actually take longer than on your local machine, and you'll have to find the right balance there. One solution is to create a directory dedicated to the Docker image and copy or link into it only the files that will be used in the image, so that no unneeded files are uploaded to the Docker host to slow the process down.
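A sketch of that dedicated-directory idea, with hypothetical file names (main.go, go.mod, the myapp tag) standing in for whatever your image actually uses:

```shell
# Keep a minimal build context so only these files travel to the remote host
mkdir -p docker-context

# Hard-link just the files the Dockerfile needs (hard links require the same
# filesystem; use cp instead if that's a problem)
ln Dockerfile main.go go.mod docker-context/

# Build with the small directory as the context
docker build -t myapp ./docker-context
```

The file names and tag here are placeholders; the point is that the context sent to the Docker host contains only what the image needs.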
Also, once you've set the DOCKER_HOST variable using export, its value will persist until you close your shell. You can clear it earlier by running unset DOCKER_HOST, or by setting it to an empty string with export DOCKER_HOST=. Once I clear it, I'm back to using my local Docker engine.
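Either form works; a quick sketch of clearing the variable:

```shell
# Option 1: remove the variable entirely
unset DOCKER_HOST

# Option 2: set it to an empty string; the Docker CLI treats an empty
# DOCKER_HOST the same as an unset one and uses the local engine
export DOCKER_HOST=
```

Either way, the next docker command you run talks to your local engine again.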
