
Application development and delivery have been changing drastically over the past several years, and the growth of the public cloud is a major catalyst for this change. Public cloud provides the ability to provision and scale infrastructure on demand, with almost limitless possibilities for distributing services across all corners of the world. These changes have shifted not only how applications are developed, but also how they are delivered.

The legacy approach to application development typically involves hosting your own data center or building an application‑centric infrastructure within a colocation facility. Many organizations distribute their infrastructure across numerous physical locations, whether that means a set of regional offices equipped to run IT infrastructure or a group of colocation facilities strategically located near a user base.

To provide proper availability for these applications, hardware load balancers are commonly deployed at each physical location to deliver access to all of the enterprise’s applications. Updates to those applications often require a configuration change on the load balancer, which typically means opening a ticket with the network operations team. These implementations tend to be fairly static because the physical hardware is dedicated to a specific location.

Fast forward to the present era of application delivery in the cloud. Rather than needing to build a private data center from the ground up or expand an existing infrastructure for new application builds, users of public cloud can consume resources on demand. Developers can rapidly deploy resources across a handful of public cloud providers within any region of the world they choose with just a few clicks of a mouse.

To take advantage of the agility and scalability provided by the public cloud, you need a lightweight, multi‑cloud application delivery platform like NGINX. Its small footprint and high performance mean you can deploy a load balancer for each application, rather than having to rely on the kind of heavyweight load balancer that was originally built for north‑south traffic in the data center.
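To make that concrete, here is a minimal sketch of what a per‑application load balancer can look like in NGINX configuration. The upstream name, backend addresses, and hostname are illustrative placeholders rather than values from any real deployment.

    # Minimal per-application load balancer (illustrative names and addresses)
    upstream app_backend {
        least_conn;                  # send each request to the server with the fewest active connections
        server 10.0.1.11:8080;
        server 10.0.1.12:8080;
        server 10.0.1.13:8080;
    }

    server {
        listen 80;
        server_name app.example.com;

        location / {
            proxy_pass http://app_backend;
            proxy_set_header Host            $host;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }

Because the entire configuration is a few dozen lines of text, it can live in the application’s own repository and be rolled out alongside the code it fronts.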

In many modern applications, NGINX starts life as part of the application stack itself, which empowers developers to build application delivery directly into the application. NGINX also provides the kind of scale the public cloud can demand: as applications grow and shrink in response to demand, their application delivery needs are met in real time. The dynamic nature of the public cloud, coupled with the small footprint and open source nature of NGINX, makes for a powerful solution for modern‑day application delivery.
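One way to keep up with instances that come and go is to have NGINX re‑resolve a backend hostname at runtime rather than pinning IP addresses at startup. The sketch below assumes a DNS name (app.internal.example) that the platform keeps up to date and a resolver at 10.0.0.2; both values are placeholders.

    # Follow a scaling group of backends via DNS (placeholder names and addresses)
    resolver 10.0.0.2 valid=10s;        # re-query DNS every 10 seconds

    server {
        listen 80;

        location / {
            set $backend http://app.internal.example:8080;
            proxy_pass $backend;        # using a variable makes NGINX resolve the name at request time
        }
    }

NGINX Plus adds a resolve parameter on upstream server directives and an API for changing upstream membership on the fly, but the variable‑based approach above works with open source NGINX as well.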

While public cloud adoption continues to see tremendous growth, private cloud data centers are not going away any time soon. Many organizations have made large investments in private cloud infrastructure that will still be viable for years to come. The market for hybrid cloud is forecast to be very strong for at least the next handful of years, which means physical appliances will continue to have a place in the application delivery stack.

A hybrid cloud approach augments the traditional app‑delivery design by keeping hardware appliances at the front of the stack, where they perform Layer 4 functionality. Deploying a lightweight and flexible tool such as NGINX closer to the application for Layer 7 functionality enables DevOps teams to scale the application itself. The hardware appliances have the dedicated resources to run more sophisticated traffic management and enforce security policies, ensuring that global traffic is delivered to the proper location, while NGINX runs a subset of those services closer to the app and changes dynamically with the application as needed.
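As a rough sketch of that division of labor, the configuration below shows NGINX acting as the Layer 7 tier behind a Layer 4 appliance that forwards connections with the PROXY protocol. The network range, location paths, and upstream group names are assumptions for illustration only.

    # NGINX as the Layer 7 tier behind a Layer 4 appliance (illustrative values)
    server {
        listen 80 proxy_protocol;              # the L4 appliance is assumed to send the PROXY protocol

        set_real_ip_from 192.168.10.0/24;      # trust only the appliance's network
        real_ip_header   proxy_protocol;       # recover the original client address

        location /api/ {
            proxy_pass http://api_backend;     # api_backend and web_backend are upstream groups defined elsewhere
        }

        location / {
            proxy_pass http://web_backend;
        }
    }

If the appliance proxies connections rather than forwarding them transparently, the PROXY protocol lets the NGINX tier still log and rate‑limit by the real client address while making its own Layer 7 routing decisions close to the app.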

IT continues to evolve and change the landscape of how applications are delivered. Traditional environments will continue to intersect with newer hyperscale approaches in the foreseeable future. NGINX provides tools that empower efficient application delivery across the entire spectrum.

Editor – Want to try NGINX Plus? Start your free 30-day trial today or contact us to discuss your use cases.




About The Author

Adam Fisher

Cloud & DevOps Engineer

About F5 NGINX

F5, Inc. is the company behind NGINX, the popular open source project. We offer a suite of technologies for developing and delivering modern applications. Together with F5, our combined solution bridges the gap between NetOps and DevOps, with multi-cloud application services that span from code to customer.

Learn more at nginx.com or join the conversation by following @nginx on Twitter.