Microservices are an approach to software architecture that builds a large, complex application from multiple small components, each of which performs a single function such as authentication, notification, or payment processing. Each microservice is a distinct unit within the software development project, with its own codebase, infrastructure, and database. The microservices work together, communicating through web APIs or messaging queues to respond to incoming events.

A useful, succinct definition from Sam Newman’s book, Building Microservices, is that microservices are “small, autonomous services that work together”. This encompasses three key aspects of microservices. They are small enough to be worked on by individual developers or small teams, and they focus on doing one thing well. They are autonomous, so they can be deployed and scaled independently, and their internals can change without consulting the teams in charge of other microservices. This autonomy is possible because, as the microservices work together, they communicate through well-defined APIs or similar mechanisms.

Developing with Microservices

A microservices architecture is frequently adopted to solve the problems that arise with other architectural models as projects grow. Traditional, monolithic architectures might logically separate functions into component modules, but all modules are kept in a single codebase and there are usually complex interdependencies between them, which makes it difficult to change the code for one module without breaking others. Even if developers concentrate on just a few modules, they have to spend time and energy tracking changes across the entire codebase, because changes in other modules might affect their own. Hiring new developers to fuel growth yields quickly diminishing returns, because new hires need a long time to master the huge codebase before they can safely add features or fix bugs.

Componentizing software functions into microservices makes it easier to scale up a project. With individual codebases for separate systems, it becomes easier for developers to reason about the effects of changing code. With individual deployments and infrastructure for different services, it becomes easier for DevOps teams to add more computing resources only where they’re needed.

Building microservices-based applications requires understanding how the components of your application work together, and designing interfaces to those components that allow you to tease them apart. As Adrian Cockcroft, formerly the lead Cloud Architect at Netflix, explains, the goal in a microservices architecture is for the component microservices in an application to interact with one another with the same kind of loose coupling and independence they would have when interacting with systems from an external service provider.
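As a concrete illustration, the sketch below shows a single-purpose service behind a narrow, well-defined HTTP API; the service's responsibility (notifications), its port, endpoint path, and JSON payload are illustrative assumptions, not details taken from any particular deployment.

```go
// Minimal sketch of a single-purpose microservice with a well-defined API.
// Other services depend only on the HTTP contract; the internals can change
// without coordinating with the teams that own those services.
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Notification is the entire contract this service exposes to its callers.
type Notification struct {
	Recipient string `json:"recipient"`
	Message   string `json:"message"`
}

func main() {
	http.HandleFunc("/notifications", func(w http.ResponseWriter, r *http.Request) {
		var n Notification
		if err := json.NewDecoder(r.Body).Decode(&n); err != nil {
			http.Error(w, "invalid payload", http.StatusBadRequest)
			return
		}
		// How the notification is actually delivered (queue, email, SMS) is an
		// internal detail hidden behind this endpoint.
		log.Printf("accepted notification for %s", n.Recipient)
		w.WriteHeader(http.StatusAccepted)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Because the only shared surface is the HTTP contract, the service can be rewritten, redeployed, or scaled without the coordination overhead that tightly coupled monolith modules impose.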

Managing Microservices

Microservices are frequently combined with some form of containerization, and most of the management tools for services are centered on managing and scaling containers. Common management tools like Kubernetes and Docker Swarm are designed with microservices in mind. Microservices are most often deployed on platforms for managing containers or virtual machines, such as Amazon’s EC2 Container Service, the Google Cloud Platform Container Engine, and the Microsoft Azure Container Service.
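In practice, a service fits into these container platforms by exposing simple health endpoints that the orchestrator can probe. The sketch below assumes Kubernetes-style HTTP liveness and readiness checks and uses the conventional /healthz and /readyz paths; the exact paths and port are assumptions you would configure in your deployment manifests.

```go
// Minimal sketch of health endpoints a container orchestrator can probe.
// The /healthz and /readyz paths are common conventions, not requirements
// of any specific platform.
package main

import (
	"log"
	"net/http"
	"sync/atomic"
)

func main() {
	var ready atomic.Bool // requires Go 1.19+

	// Liveness: the process is up and able to handle requests at all.
	http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})

	// Readiness: dependencies (database, message broker, caches) are available,
	// so the orchestrator can start routing traffic to this instance.
	http.HandleFunc("/readyz", func(w http.ResponseWriter, r *http.Request) {
		if ready.Load() {
			w.WriteHeader(http.StatusOK)
			return
		}
		w.WriteHeader(http.StatusServiceUnavailable)
	})

	// Flip to ready once startup work has completed.
	ready.Store(true)

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```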

Deploying microservices is often one of the most challenging aspects of switching over from a monolithic architecture, because it requires taking into account API versions and integration testing across multiple domains, which are nonissues for a monolith. As such, automated monitoring is critical to microservices deployment to ensure that each component is working smoothly. Partial failures in microservices applications are much more common than in monoliths, and the system needs to be designed with fault management in mind.
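One common way to design for partial failure is to put a deadline on every call to a downstream service and fall back to a degraded response when it is exceeded. The following sketch assumes a hypothetical recommendation service; the URL, timeout, and fallback value are illustrative only.

```go
// Minimal sketch of tolerating a partial failure: call a downstream service
// with a timeout and degrade gracefully instead of letting the failure cascade.
package main

import (
	"context"
	"fmt"
	"io"
	"net/http"
	"time"
)

// fetchRecommendations queries a (hypothetical) downstream service, honoring
// the caller's deadline.
func fetchRecommendations(ctx context.Context) (string, error) {
	req, err := http.NewRequestWithContext(ctx, http.MethodGet,
		"http://recommendations.internal/api/v1/top", nil)
	if err != nil {
		return "", err
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	return string(body), err
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 300*time.Millisecond)
	defer cancel()

	recs, err := fetchRecommendations(ctx)
	if err != nil {
		// Degrade gracefully: show a cached default rather than failing the request.
		recs = "popular items (cached default)"
	}
	fmt.Println("showing:", recs)
}
```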

How Can NGINX Plus Help?

NGINX Plus and NGINX are the best-in-class load-balancing solutions used by high-traffic websites such as Dropbox, Netflix, and Zynga. More than 266 million websites worldwide, including the majority of the 100,000 busiest websites, rely on NGINX Plus and NGINX to deliver their content quickly, reliably, and securely.

As a software-based application delivery controller (ADC), NGINX Plus is designed to provide the speed, configurability, and reliability that are essential to modern microservices architectures:

  • NGINX Plus provides on-the-fly reconfiguration for simple service management and integrates easily with common microservices management tools such as Kubernetes (see the sketch after this list). Leading companies like Netflix use NGINX at the core of their microservices deployments.
  • Operating at scale demands detailed monitoring. NGINX Plus offers robust live activity monitoring so you can see how services are responding to load and focus your resources where they’re needed.
  • As a software-based load balancer, NGINX Plus is perfect for service discovery in multi-service deployments where configurations are constantly changing.
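As an example of the on-the-fly reconfiguration mentioned in the first bullet above, a deployment script or orchestration hook can register a new service instance with an NGINX Plus upstream group through its REST API, so the load balancer starts sending it traffic without a reload. This is only a sketch: the API path, version number, upstream group name, and instance address are assumptions; check the NGINX Plus documentation for the exact interface in your release.

```go
// Hedged sketch: adding a new backend server to an NGINX Plus upstream group
// via its dynamic configuration API. The endpoint path and payload below are
// assumptions and vary by NGINX Plus version.
package main

import (
	"bytes"
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Assumed API endpoint for an upstream group named "backend".
	url := "http://nginx-plus.example.internal/api/6/http/upstreams/backend/servers"

	// The new service instance to add to the load-balancing pool.
	payload := []byte(`{"server": "10.0.0.42:8080"}`)

	resp, err := http.Post(url, "application/json", bytes.NewReader(payload))
	if err != nil {
		log.Fatalf("registering instance: %v", err)
	}
	defer resp.Body.Close()

	fmt.Println("NGINX Plus responded with:", resp.Status)
}
```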