Linux.com


Docker: A 'Shipping Container' for Linux Code

Not so very long ago, shipping goods over long distances was a very different matter than it is today. Numerous types of products were often jumbled together in a single vessel, sometimes with untoward consequences. Stack the bricks next to the bananas, for example, and you may have a mess on your hands when the shipment arrives.

It's a similar challenge, in many ways, in today's heterogeneous computing world of multiple stacks and multiple hardware environments. "Dependency hell" is just one possible result, in the words of PaaS provider dotCloud, as applications get deployed across different hardware environments, including public, private and virtualized servers.

It was the invention of the intermodal shipping container back in the 1950s that relieved a considerable amount of shipper and carrier pain, of course. For Linux developers, there's open source Docker.

Build Once, Run Anywhere

"Docker enables any application and its dependencies to be packaged up as a lightweight, portable, self-sufficient container," dotCloud explains. "Containers have standard operations, thus enabling automation. And, they are designed to run on virtually any Linux server."

So, the same container that a developer builds and tests on a laptop will run at scale, in production, on virtual machines, bare-metal servers, OpenStack clusters and public instances -- or combinations thereof.

Developers can build their application once, in other words, and rest assured that it can run consistently anywhere. Operators, meanwhile, can configure their servers once and know that they can run any application.
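That "build once, run anywhere" workflow can be sketched with a minimal Dockerfile. This is an illustrative example only, not taken from the article: the base image, the Python runtime, and the `app.py` file are all hypothetical placeholders standing in for whatever an application actually depends on.

```dockerfile
# Hypothetical example: package a small application with its dependencies
# so the resulting container is self-sufficient.
FROM ubuntu

# Install the runtime the application needs (placeholder dependency).
RUN apt-get update && apt-get install -y python

# Copy the application into the image (app.py is a hypothetical file).
ADD app.py /srv/app.py

# The same command then runs identically on a laptop, a VM, or bare metal.
CMD ["python", "/srv/app.py"]
```

With a file like this in place, something along the lines of `docker build -t myapp .` followed by `docker run myapp` would produce the same behavior on any Linux host running the Docker daemon, which is the portability the quote above describes.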

An Enduring Goal

"As long as people have been developing applications, they've looked for ways to make these applications more portable from environment to environment," Stephen O'Grady, cofounder and principal analyst at RedMonk, told Linux.com. "The approaches over the years have varied, from middleware to virtual machines to virtual appliances. Docker represents a lightweight, container-based approach that is seeing an uptick in popularity at the moment."

Tapping Linux Containers (LXC), cgroups and the Linux kernel itself, Docker launched just a few months ago. Since then, dotCloud joined the Linux Foundation, took on a new CEO and -- just recently -- debuted Docker 0.5.0.

Docker has also been integrated into open source projects such as OpenStack, Chef, Puppet, Vagrant and MCollective, and numerous open source projects have been "dockerized" by the community, including Redis, Memcached, PostgreSQL and Ruby.

'Linux Made Sense'

"Flexibility and portability are becoming paramount in developing, deploying and managing applications in the cloud, particularly at scale and among large enterprise organizations," Jay Lyman, a senior analyst for enterprise software at 451 Research, told Linux.com.

"Docker is a tool that can package an application and its dependencies in a virtual container that can run on any Linux server," Lyman explained. "This helps enable flexibility and portability on where the application can run, whether on premise, public cloud, private cloud, bare metal, etc.

"While its backers hope to expand to other platforms, Linux made sense to start with given its prominence in cloud computing," Lyman added. "Docker also leverages Linux kernel capabilities for management and security."

Editor's Note: Don't miss dotCloud Senior Software Engineer Jerome Petazzoni's talk at LinuxCon/CloudOpen in New Orleans on Monday, Sept. 16 on "LXC, Docker and the Future of Software Delivery."

 

Comments

  • Kieran Grant Said:

    The use of Linux containers is an excellent idea; the combination of cgroups and isolation mechanisms such as pid and mount namespaces should make it very successful, and good for portable platforms. It could even be used for user applications, providing an isolated, standard environment across distros -- though not for performance-sensitive programs that need per-host optimisations. It would allow programs to come pre-packaged with their dependencies, at the cost of larger programs and more security management overhead. Overall, for servers, it appears to be a good idea. It might also work on desktop computers.

  • peacengell Said:

    We should think about security as well. Looks good -- I will check it out in more detail. Future computing = more security. Help ever, hurt never.


