For online retailers, every second counts. NGINX provides the easy-to-deploy tools your DevOps team needs to create a seamless and secure shopping experience.
Traffic surges and server overload are serious concerns for online retailers at any time of year, but especially during the year-end "Cyber Season" of shopping. We detail ways to leverage NGINX's production-grade app delivery solutions to prepare for sudden traffic spikes.
NGINX Service Mesh Release 1.1.0 introduces three key enhancements that make it easier to deploy and manage our production-ready service mesh in Kubernetes: Helm support, air-gap installation, and in-place upgrades.
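With Helm support, deployment and in-place upgrades reduce to a couple of commands. A minimal sketch, assuming the chart is published in the `nginx-stable` Helm repository under the name `nginx-service-mesh` and using a hypothetical release name `nsm`; adjust names and namespaces to your environment:

```shell
# Register the NGINX Helm repository (assumed location) and refresh the index
helm repo add nginx-stable https://helm.nginx.com/stable
helm repo update

# Install the mesh control plane into its own namespace
helm install nsm nginx-stable/nginx-service-mesh \
  --namespace nginx-mesh --create-namespace

# Later, perform an in-place upgrade to a newer chart version
helm upgrade nsm nginx-stable/nginx-service-mesh --namespace nginx-mesh
```

For air-gapped environments, the same chart can be pulled once with `helm pull` and installed from the local archive.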
A service mesh can make a Kubernetes environment more complicated if it must be configured separately from the Ingress controller. In this demo and blog we show how to integrate NGINX Plus Ingress Controller with NGINX Service Mesh to control both ingress and egress mTLS traffic.
NGINX recently became the most popular web server in the world, according to W3Techs. We're profoundly grateful to the NGINX community, who've brought us to this milestone, and look forward to providing even more tools to help you optimize delivery of your modern and cloud-native apps.
If services in a Kubernetes environment exchange sensitive data, it's important to control communication among them. With NGINX Service Mesh, it takes less than 10 minutes to set up fine‑grained control. Watch the video demo and check out our step-by-step tutorial.
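NGINX Service Mesh implements the Service Mesh Interface (SMI) APIs, so the kind of fine-grained policy the demo walks through can be sketched with standard SMI resources. The service-account and route names below are hypothetical:

```yaml
# Allow only pods running as frontend-sa to call pods running as backend-sa.
# All names here are illustrative, not from the demo.
apiVersion: specs.smi-spec.io/v1alpha3
kind: HTTPRouteGroup
metadata:
  name: api-routes
spec:
  matches:
  - name: all-requests
    pathRegex: ".*"
    methods: ["*"]
---
apiVersion: access.smi-spec.io/v1alpha2
kind: TrafficTarget
metadata:
  name: allow-frontend
spec:
  destination:
    kind: ServiceAccount
    name: backend-sa
    namespace: default
  sources:
  - kind: ServiceAccount
    name: frontend-sa
    namespace: default
  rules:
  - kind: HTTPRouteGroup
    name: api-routes
    matches:
    - all-requests
```

Requests from any other service account are denied once an access policy is in place.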
Traffic splitting is a valuable tool for app development, reducing the risk of outages during app upgrades. With NGINX Service Mesh, it takes less than 10 minutes to implement blue-green and canary deployments. Watch the video demo and check out our step-by-step tutorials.
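A canary rollout of the kind shown in the demo can be expressed with the SMI `TrafficSplit` resource, which NGINX Service Mesh supports. The service names and weights below are hypothetical placeholders:

```yaml
# Send 90% of traffic to the stable version and 10% to the canary.
# Shifting the weights toward my-app-v2 completes the rollout.
apiVersion: split.smi-spec.io/v1alpha3
kind: TrafficSplit
metadata:
  name: my-app-split
spec:
  service: my-app          # root service that clients address
  backends:
  - service: my-app-v1     # stable version
    weight: 90
  - service: my-app-v2     # canary version
    weight: 10
```

A blue-green deployment is the degenerate case: weights of 100 and 0, flipped in a single edit.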
High request volume can overwhelm your Kubernetes services. With NGINX Service Mesh, it takes less than 10 minutes to define a rate-limiting policy that limits each client to a reasonable number of requests. Watch the video demo and follow along in the transcript provided.
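A policy of the kind shown in the demo uses the mesh's `RateLimit` custom resource. A minimal sketch, assuming the `specs.smi.nginx.com/v1alpha1` API group and hypothetical workload names; check the release documentation for the exact fields in your version:

```yaml
# Limit the frontend Deployment to 10 requests per minute against backend-svc.
# Names and values are illustrative.
apiVersion: specs.smi.nginx.com/v1alpha1
kind: RateLimit
metadata:
  name: backend-rate-limit
spec:
  destination:
    kind: Service
    name: backend-svc
    namespace: default
  sources:
  - kind: Deployment
    name: frontend
    namespace: default
  rate: 10r/m   # steady-state limit per listed source
```

Requests beyond the configured rate are rejected rather than forwarded, protecting the destination service from overload.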
NGINX Service Mesh is free and scales with your Kubernetes environment at every stage of your microservices journey, from open source project to secure, scalable, enterprise-grade solution.