DockerCon Day 2

Docker in the enterprise

Yesterday I sat in some good talks on Docker internals, security, and the latest and greatest additions to Docker. I have to say, they were inspiring, and I’m looking forward to going deeper with Docker. Today is about Docker for the enterprise.

Docker promotes containers as a service (CaaS). Docker Data Center rolls up deployment, monitoring, and remediation of containers into a service. Data Center gives you a CaaS that can run in the cloud, on virtual hardware, or on your on-premises physical hardware. The platform seems to be crowding other products that compete for the same space, so it’s important to research the discriminators of each one. The expo floor at DockerCon is packed with vendors that bring their own value to the ecosystem. It seems like a playing field filled with opportunity.

So back to Docker Data Center. We were given a demo of what it can do. One example during the day two keynote was security monitoring, which dovetails with one of the security talks from the day before. Docker containers are typically built from existing images. For example, the NGINX image that you inherit for your PHP app probably inherited from the stock, trusted Ubuntu image. In a way, this establishes a chain of trust for images. If you inherit from trusted sources, you can take advantage of the patches available throughout your chain.
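A rough sketch of that inheritance (the image names and tags here are illustrative, not from the demo):

    # Hypothetical Dockerfile for an app image (names are illustrative):
    #
    #   FROM nginx:1.11                    # official image, itself built on a trusted OS base
    #   COPY app/ /usr/share/nginx/html/
    #
    # Building it layers your app on top of that chain of trust:
    docker build -t myorg/php-app:1.0 .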

The demo walked through the easy-peasy deployment process to push a container to production. Container signing and enforced verification are configurable. Once a container is deployed, a security process monitors Docker containers for vulnerabilities: the container configuration is analyzed against the vulnerability data for the image library on Docker Hub. The presenters emphasized the importance of building from trusted sources. Data Center includes a security scanner that automatically monitors all deployed code. If a container is flagged with a vulnerability, the dashboard notifies an administrator and remediation begins. In this case, the Dockerfile was updated with the patched Ubuntu image version, rebuilt, and pushed to Data Center. Data Center scans the package and gives it a green light. The package is deployed and the swarm turns green again. Yay, Docker.
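The remediation loop looked roughly like this (image names and tags are illustrative):

    # 1. Edit the Dockerfile to pull in the patched base image, e.g.
    #    FROM ubuntu:14.04  ->  FROM ubuntu:14.04.4
    # 2. Rebuild and push; Data Center rescans the new image automatically.
    docker build -t myorg/php-app:1.1 .
    docker push myorg/php-app:1.1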

Docker Data Center monitoring

Docker Data Center gives the enterprise tools to manage its Docker infrastructure. You can deploy quickly and securely, and it provides mechanisms to scale up, load balance, and perform security scanning. Security-sensitive organizations are using Docker to enforce policies and procedures in a completely repeatable way.

There was more good stuff today. Docker announced an App Store (whaaat?). OK, everybody seems to have an app store, and it makes sense. It will be interesting to see how vendors and the community embrace it. My last thought after the morning keynote is that environments are receding into the background. The monuments to enterprise software we spent years building, maintaining, and complaining about are fading away. If you can be modular, you don’t need to be a Microsoft solution shop or a Linux shop. You can pull in the pieces that you need and concentrate on writing great code. Actually, that’s all we developers want to do anyway.

#DockerCon 2016

DockerCon First Impression



This is my first DockerCon and my first big dev conference. So far it has been a well-organized event. OK, it’s just the keynote, but so far so good.

Notes to self…

  • Get Docker for Mac now!
  • Give Visual Studio Code another look. (Atom still won’t launch on any of my machines.)

The morning opened with a coffee sacrifice to the demo gods. No really, DockerCon is a demo-heavy conference; they take their demos seriously. Opening remarks covered the state of the union: 2,900+ contributors to the ecosystem and a growing application base on Docker Cloud. The cool thing about Docker is that a complete application infrastructure can be packaged and deployed from a GitHub repository. Pretty impressive.

The first demo sold me. From a developer’s point of view, debugging live web applications can be a challenge; the simplest level of debugging is logging and instrumenting the code. This demo ran a Node.js app, which is challenging to debug live. Step one of the demo was to clone the conference’s instavote-app GitHub repository. Step two was to issue the command to launch the containers. That was it. A Python app, talking to a Redis queue, talking to a Java worker node, talking to a Postgres database, with a Node.js front end, was running and operational. Pretty sweet.
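Those two steps looked roughly like this (the repository URL is a stand-in; I didn’t note the exact org, and I’m assuming a Compose file wires up the five services):

    # Clone the demo app and launch the whole multi-service stack.
    git clone https://github.com/<org>/instavote-app.git
    cd instavote-app
    docker-compose up -d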


DockerCon instavote-app

An even more impressive part was the debugging. By dropping the Node.js client code into Visual Studio Code and setting a breakpoint, the editor and Docker container just paused the code. No configuration necessary. This one thing alone can be incredibly useful for developers. I should point out that the container manages the software too: no additional installs like Node, Java, or Postgres were needed to run the system. This seems like a good way to deploy code and enforce static platform versions in development environments.
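I didn’t capture the exact setup on stage, but a sketch of the idea is to publish Node’s (legacy, circa-2016) debug port from the container so the editor can attach; the image name and entrypoint here are hypothetical:

    # Run the front end with the V8 debugger listening, and publish the
    # debug port alongside the app port so the editor can attach to it.
    docker run -p 8080:8080 -p 5858:5858 myorg/vote-frontend \
        node --debug=5858 server.js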

We’re moving on to Swarm now. Swarm is Docker’s new platform for node clustering, with built-in security, self-healing, and fault tolerance. Another impressive demo. I’m looking forward to this now. It seems like magic, but it’s a powerful tool for both developers and DevOps. This should be a good conference.
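The basic swarm-mode flow goes something like this (the service name and image are just examples):

    # Initialize a manager, then run a replicated service; swarm reschedules
    # replicas onto healthy nodes if one fails (the self-healing part).
    docker swarm init --advertise-addr <manager-ip>
    # run the printed "docker swarm join ..." command on each worker, then:
    docker service create --name web --replicas 3 -p 80:80 nginx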

#DockerCon 2016

APIs and Web Clients with MEAN


My background is in Java, where I build web services and APIs. My main job as an API developer is to move, organize, combine, and publish data. I found the MEAN stack to be a great way to prototype and build APIs with a flexible, document-based data store. Finding ways to include Node as an option in my technology stack is also a plus for me.

The truth is, I started experimenting with a MEAN project because of a failed certification test. It was a typical technology exam: study, memorize, practice, and hope for the best. Unfortunately, this test had no practice exam to help flesh out potential questions. Frankly, the topic wasn’t particularly interesting either, but it was still essential for me to prove knowledge of it.

The study guide for this exam was a list of general subject areas and three or four off-the-shelf books as suggested reading. So while I was in the cooldown period before trying again, I took a sidetrack into something more interesting that might also make my studying easier. I wanted to build an app with some realistic questions that I could study flash-card style. Basically, it was an effort to create a practice exam.

The next series of posts will document the steps I took to set up the stack, create the API, and build the client. The client is a single-page application that lets the user create the data set and run through the practice flash cards.

What is the MEAN stack?

The MEAN stack is a web solution stack (like the ever-popular LAMP stack) made up of four technologies: MongoDB, Express.js, AngularJS, and Node.js.

Solution stacks represent a complete suite of technical components used together for applications and services. For example, a typical web application requires a web server, a client interface, and a persistent data store. MongoDB provides the persistent data store. Express.js provides the HTTP engine to service both API and client requests. AngularJS is key for our data-driven client. Finally, Node.js provides the foundation for a data API.
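As a minimal sketch of how the pieces come together locally (package names are the standard ones; this is just an outline, not the setup used in this series):

    # MongoDB: the persistent data store (assumes a local install).
    mongod --dbpath ./data &
    # Node.js + Express.js: the HTTP engine for the API and client pages.
    npm init -y
    npm install express mongodb
    # AngularJS: loaded by the client page, e.g. via a <script> tag.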

There are different packages that can help you get up and running with the MEAN stack; MEAN.IO and MEAN.JS offer a template system and scaffolding to get started. The approach I wanted to take was to walk through an example using an enterprise-ready MEAN framework, StrongLoop.

StrongLoop is a tool I learned about at a Node.js meetup about two years ago but never explored on client work. At the time, I was impressed with how quickly you could craft an API that let you build and store simple JSON data. To be honest, coming from building relational data services and integrating with strict SOAP schemas using Java and Spring, it blew my mind how easy it was to throw data at an API and have it stick. That’s the beauty, and the challenge, of a document-based data store.

The StrongLoop scaffolding provides project creation and management tools like the other packages, but it also provides enterprise-capable API management and monitoring services. Since my introduction at that meetup, IBM acquired StrongLoop and did a soft rebrand as API Connect. This is a good move for StrongLoop and gives them more enterprise cred with IBM. My experience so far is just as slick as I remember, and I hope to see more of them in the future. My example covers the basic scaffolding, API design, and creation using StrongLoop. I haven’t explored the API management tools that set them apart; I’ll save that topic for another time.
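For reference, the scaffolding flow as I remember it looks roughly like this (interactive prompts omitted; the app name is hypothetical, and exact generator names may have shifted since the IBM rebrand):

    # Install the StrongLoop CLI, scaffold an app, and add a model;
    # LoopBack generates REST endpoints for the model automatically.
    npm install -g strongloop
    slc loopback            # scaffold a new LoopBack application
    cd my-api               # hypothetical app name
    slc loopback:model      # define a model interactively
    node .                  # serve the API (explorer at /explorer)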


Step 1: Building an API with StrongLoop.