If you ask any DevOps engineer about the best way to deploy applications today, they'll very likely talk to you about Docker. Docker is a tool that lets you package and deploy applications in containers. Before containers, applications were built and deployed on virtual machines, and some teams still use them. But Docker has changed the game in several ways.

Docker is "shaping the future by helping us build highly distributed, resilient, fault-tolerant applications. Without these, building very scalable, complex architectures for web applications will be difficult, if not close to impossible," says Osazeme Usen, a Senior Developer at Andela. "Before Docker," he says, "there was the problem of having a standardized environment to run your apps in a predictable way, no matter what platform you put them on."
"It was working fine on my computer just now..."
One of the most common problems Docker solves is programs failing to run, or breaking, when deployed on a machine whose settings differ from those of the machine on which they were built. Docker brings a level of standardization: all the libraries and other dependencies an application needs are packaged into one container, and everything is deployed as a single unit. That way, no matter how the target machine is configured, the application runs the same everywhere.
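To illustrate, here is a minimal sketch of a Dockerfile for a hypothetical Node.js app (the base image, file names, and start command are assumptions for the example, not from the article): everything the app needs, from the runtime down, is declared in the image, so the resulting container behaves the same on any host.

```dockerfile
# Pin the runtime version so it is identical on every machine
FROM node:18-alpine

WORKDIR /app

# Install dependencies exactly as recorded in the lockfile
COPY package*.json ./
RUN npm ci

# Copy the application source into the image
COPY . .

# Command the container runs on start
CMD ["node", "server.js"]
```

Building and running such an image (`docker build -t myapp .` then `docker run myapp`) produces the same environment on a laptop, a CI server, or a production host, which is exactly the standardization described above.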
[Image: Stanley checking out the event schedule]

In June 2018, Andela developer Stanley Ndagi attended the Docker Conference (DockerCon) in San Francisco and has written an extensive recap of the event. It was a 5-day event with top speakers from Google, GE, Docker, Pinterest, and more. Stanley's goal at the conference, he says in his recap, was to get answers to the following:
What considerations do I make when using Docker in Production?
What Docker security tools (not necessarily paid) can I leverage?