
This blog is the third in our five‑part series about Kubernetes networking for Microservices March 2022.

Also be sure to download our free eBook, Managing Kubernetes Traffic with NGINX: A Practical Guide, for detailed guidance on implementing Kubernetes networking with NGINX.

Are you planning to serve API requests from Kubernetes and wondering about best practices for deploying API gateways? Then Unit 2 is for you!

Three activities guide you progressively from a high‑level overview to practical application. We suggest you complete all three to get the best experience.

Step 1: Watch the Livestream (1 Hour)

Each Microservices March livestream provides a high‑level overview of the topic featuring subject matter experts from learnk8s and NGINX. If you miss the live airing on March 14 – don’t worry! You can catch it on demand.

In this episode, we answer the question “How do I expose APIs in Kubernetes?” and cover:

  • Best practices for deploying API gateways in Kubernetes
  • Authorization and authentication
  • OpenID Connect (OIDC)
  • Rate limiting

Step 2: Deepen Your Knowledge (1–2 Hours)

We expect you’ll have more questions after the livestream – that’s why we curated a collection of relevant reading and videos. This Unit’s deep dive covers two topics: tool options for deploying API gateways in Kubernetes and use cases you can accomplish with these tools.

Blog | How Do I Choose? API Gateway vs. Ingress Controller vs. Service Mesh
Start by reading this blog, which guides you through the decision about which technology to use for API gateway use cases, with sample scenarios for north‑south and east‑west API traffic.

Webinar | API Gateway Use Cases for Kubernetes
In addition to discussing the various tools and use cases, our experts demo how you can use an Ingress controller and service mesh to accomplish API gateway use cases.

Now that you have a good foundation in how API gateways can be deployed in Kubernetes, it’s a good time to dive into common use cases. The following blogs walk through how to implement two common use cases using Kubernetes tools: identity verification and rate limiting.

Blog | Implementing OpenID Connect Authentication in Kubernetes with Okta and NGINX Ingress Controller
The Ingress controller is an ideal location for centralized authentication and authorization in Kubernetes, which can help you reduce errors and increase efficiency. In this blog, we walk through how to implement single sign‑on with NGINX Ingress Controller as the relying party and Okta as the identity provider in the OIDC Authorization Code Flow.
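To give you a feel for the configuration before you read the blog, here is a minimal sketch of an OIDC Policy referenced by a VirtualServer with NGINX Ingress Controller. The Okta tenant URL, client ID, Secret name, and hostname are placeholders for illustration only; the blog and the NGINX Ingress Controller documentation spell out the exact fields your version supports.

# Sketch only: OIDC Policy for NGINX Ingress Controller; all identifiers are placeholders
apiVersion: k8s.nginx.org/v1
kind: Policy
metadata:
  name: oidc-policy
spec:
  oidc:
    clientID: <okta-client-id>
    clientSecret: oidc-secret    # Kubernetes Secret holding the Okta client secret
    authEndpoint: https://<tenant>.okta.com/oauth2/default/v1/authorize
    tokenEndpoint: https://<tenant>.okta.com/oauth2/default/v1/token
    jwksURI: https://<tenant>.okta.com/oauth2/default/v1/keys
    scope: openid
---
# The Policy takes effect when a VirtualServer references it
apiVersion: k8s.nginx.org/v1
kind: VirtualServer
metadata:
  name: webapp
spec:
  host: webapp.example.com
  policies:
    - name: oidc-policy
  upstreams:
    - name: webapp
      service: webapp-svc
      port: 80
  routes:
    - path: /
      action:
        pass: webapp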

Blog & Video | How to Use NGINX Service Mesh for Rate Limiting
High request volume can overwhelm your Kubernetes services, which is why limiting each client to a reasonable number of requests is a crucial component of your resilience strategy. In this blog and video, we demonstrate how it can take less than 10 minutes to define and apply a rate‑limiting policy using NGINX Service Mesh.
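As a rough illustration of the approach, a rate‑limiting policy in NGINX Service Mesh is expressed as a RateLimit custom resource along the lines of the sketch below. The service, deployment, and namespace names are placeholders, and the API version and fields may vary by release, so treat this as an outline rather than a copy‑paste configuration; the blog and video show the real thing.

# Sketch only: NGINX Service Mesh RateLimit resource; names and namespaces are placeholders
apiVersion: specs.smi.nginx.com/v1alpha1
kind: RateLimit
metadata:
  name: api-rate-limit
  namespace: default
spec:
  destination:              # the service being protected
    kind: Service
    name: api-svc
    namespace: default
  sources:                  # the clients the limit applies to
    - kind: Deployment
      name: frontend
      namespace: default
  rate: 10r/m               # allow 10 requests per minute per source
  burst: 0
  delay: nodelay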

Bonus Research

If you’re keen to deepen your knowledge on APIs and API gateways – and have more than 1–2 hours to spend – then we suggest a series of additional resources:

eMagazine | Real-Time APIs: Design, Operation, and Observation
This eMagazine addresses how to plan, architect, and deploy real‑time APIs that are both reliable and flexible.

eBook | The NGINX Real-Time API Handbook
This eBook is a comprehensive guide to reducing latency in your applications and APIs without making any compromises. It covers trends driving real‑time APIs, a practical reference architecture for any enterprise, and helpful vendor comparisons and case studies to make the business case for investing in the proper API management solution.

eBook | API Traffic Management 101: From Monitoring to Managing and Beyond
In this O’Reilly eBook, Mike Amundsen introduces developers and network administrators to the basic concepts and challenges of monitoring and managing API traffic. You’ll learn approaches for observing and controlling external (north‑south) traffic and for optimizing internal (east‑west) traffic. You’ll also examine the business value of good API traffic practice that connects your business goals and internal progress measurements to useful traffic monitoring, reporting, and analysis.


Step 3: Get Hands On (1 Hour)

Even with all the best webinars and research, there’s nothing quite like getting your hands on the tech. The labs run you through common scenarios to reinforce your learning.

In our second self‑paced lab, you use NGINX Ingress Controller to implement rate limiting and prevent an API and app from getting overwhelmed by too many requests. Watch this walkthrough of the lab to see it in action and learn the “why” behind each step.
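If you'd like a preview of the configuration before starting the lab, rate limiting with NGINX Ingress Controller follows the same Policy‑plus‑VirtualServer pattern sketched earlier for OIDC. The sketch below limits each client IP address to 10 requests per second; the policy name and values are placeholders, and the lab and tutorial give the exact resources to apply.

# Sketch only: rate-limiting Policy for NGINX Ingress Controller
apiVersion: k8s.nginx.org/v1
kind: Policy
metadata:
  name: rate-limit-policy
spec:
  rateLimit:
    rate: 10r/s                   # allow 10 requests per second
    key: ${binary_remote_addr}    # apply the limit per client IP address
    zoneSize: 10M                 # shared memory zone for tracking clients
# Reference the Policy by name under spec.policies in a VirtualServer to apply it to a host or route.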

To access the lab, you need to register for Microservices March 2022. If you’re already registered, the email you received with the Unit 2 Learning Guide includes access instructions. Alternatively, you can try out the lab in your own environment, using NGINX Tutorial: Protect Kubernetes APIs with Rate Limiting as a guide.

Why Register for Microservices March?

While some of the activities (the livestreams and blogs) are freely available, we need to collect just a little personal information to get you set up with the full experience. Registration gives you:

  • Access to four self‑paced labs where you can get hands‑on with the tech via common scenarios
  • Membership in the Microservices March Slack channel for asking questions of the experts and networking with fellow participants
  • Weekly learning guides to help you stay on top of the agenda
  • Calendar invites for the livestreams

What’s Next?

Unit 3: Microservices Security Pattern in Kubernetes begins on March 21. Learn about the sidecar pattern, policies that can make your services more secure and resilient, service meshes, mTLS, and end-to-end encryption.

For detailed guidance on implementing Kubernetes networking with NGINX, download our eBook, Managing Kubernetes Traffic with NGINX: A Practical Guide.


About The Author

Jenn Gile

Head of Product Marketing, NGINX

About F5 NGINX

F5, Inc. is the company behind NGINX, the popular open source project. We offer a suite of technologies for developing and delivering modern applications. Together with F5, our combined solution bridges the gap between NetOps and DevOps, with multi-cloud application services that span from code to customer.

Learn more at nginx.com or join the conversation by following @nginx on Twitter.