COPENHAGEN, Denmark — Amazon Web Services wants to be a welcome home for developers and organizations looking to deploy containers. At the DockerCon EU conference here, a pair of AWS technical evangelists shared their wisdom on the best ways to benefit from container deployments.
The terms microservices and containers are often used interchangeably. Abby Fuller, technical evangelist at AWS, offered the definition of microservices coined by Adrian Cockcroft, VP of Cloud Architecture at AWS and formerly cloud architect at Netflix.
So What Are Microservices?
Microservices are a "service-oriented architecture composed of loosely coupled elements that have bounded contexts," according to Fuller.
With that definition out of the way, Fuller provided a few general best practices for running containers on AWS:
- Rely on the public API
- Use the right tool for the job
- Secure your services
- Automate where possible
When it comes to automation, the basic idea is to do less work by hand. In the container space, automation is often achieved with orchestration tools. Among the most popular in use today is Kubernetes, which can also run on AWS. AWS evangelist Tiffany Jernigan noted that Kubernetes can group the containers that make up an application into logical units for easy management and discovery.
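As an illustrative sketch of that grouping (all names here are hypothetical), a minimal Kubernetes Deployment manifest labels an application's pods so they can be managed and discovered as one logical unit:

```yaml
# Hypothetical Deployment: three replicas of a web container,
# grouped by the "app: web" label that Services use for discovery
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web          # the logical unit: every pod carrying this label
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
      - name: web
        image: example/web:1.0   # placeholder image name
        ports:
        - containerPort: 8080
```

Applying this with `kubectl apply -f` lets Kubernetes handle scheduling and replacement of the three replicas automatically.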
Avoiding Points of Failure When Managing Container Microservices
Jernigan advised the session attendees to beware of points of failure when managing container microservices. One of the ways to do that is by using source and version control technologies such as git or the AWS CodeCommit service.
Another best practice advocated by Jernigan is to always try to use smaller container application images. Smaller images mean faster builds and deploys, she said.
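One common way to get smaller images, sketched here with hypothetical project names, is a multi-stage Dockerfile build: compile in a full toolchain image, then copy only the finished artifact into a minimal base image.

```dockerfile
# Stage 1: build in a full toolchain image (hypothetical Go project)
FROM golang:1.21 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /app .

# Stage 2: ship only the compiled binary on a minimal base
FROM alpine:3.19
COPY --from=build /app /app
ENTRYPOINT ["/app"]
```

The final image contains just the binary and a small base layer, so builds push and pull faster than a full toolchain image would.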
Monitoring and logging tools are also an important part of the container lifecycle. Among the tools mentioned by Jernigan were AWS CloudWatch and Datadog. She suggested that container users enable logging from both the containers and the hosts. While monitoring is important, she cautioned that it's also important to log the right data to avoid unnecessary noise.
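As one hedged example of container-level logging, Docker's `awslogs` log driver can ship a container's stdout/stderr to CloudWatch Logs; the service, region, and log group names below are placeholders:

```yaml
# docker-compose.yml fragment (hypothetical names) routing container
# output to a CloudWatch Logs group via the awslogs driver
services:
  web:
    image: example/web:1.0
    logging:
      driver: awslogs
      options:
        awslogs-region: us-east-1
        awslogs-group: my-app-logs
```

Host-level logs (e.g., the Docker daemon or system logs) still need a separate agent, which is why Jernigan's advice covers both layers.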
Jernigan additionally recommended that container users set reasonable resource limits and define a scaling policy that can scale up or down, so that resources don't sit idle.
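In Kubernetes terms, that advice maps to resource requests/limits on the pod spec plus an autoscaler. The fragment below is a sketch with hypothetical names and thresholds, not a recommended production configuration:

```yaml
# Hypothetical container spec fragment: bound CPU and memory usage
resources:
  requests:
    cpu: "250m"
    memory: "256Mi"
  limits:
    cpu: "500m"
    memory: "512Mi"
---
# Hypothetical HorizontalPodAutoscaler: scale the "web" Deployment
# between 2 and 10 replicas, targeting 70% average CPU utilization
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web
  minReplicas: 2
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70
```

The requests give the scheduler a sizing baseline, the limits cap runaway containers, and the autoscaler shrinks the replica count when load drops, which addresses the idle-resource concern.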
"The bottom line: use what works for you," she said.
Sean Michael Kerner is a senior editor at ServerWatch and InternetNews.com. Follow him on Twitter @TechJournalist.