Oracle Cloud Infrastructure (OCI) Flexible Load Balancer is a highly available, cloud-native service that automatically distributes incoming application connections, from the internet or from within a private network, across multiple compute resources for resiliency and performance. Load balancers can distribute traffic across multiple fault domains, availability domains, and OCI regions based on session persistence, request, and URL characteristics.
A load balancer improves resource utilization by directing requests across application services that operate in parallel. As demand increases, the number of application services can be increased, and the load balancer will use them to balance the processing of requests.
Monolithic legacy applications typically scale by moving to larger hardware. With a load balancer, multiple smaller instances can run in parallel while still presenting a single entry point. For both legacy and cloud-native application resources, the load balancer stops sending requests to backend resources that become unresponsive, directing traffic to healthy resources instead.
Application services can live in multiple locations, including OCI, on-premises, and other clouds. A load balancer provides a convenient, single point of entry, and can direct requests to the appropriate backend, which can be in OCI, on-premises, or on other clouds.
OCI Flexible Load Balancer supports web traffic (HTTP and HTTPS) as well as generic TCP traffic. A public load balancer accepts traffic from the internet, while a private load balancer is reachable only from within your private network.
A load balancer has listeners, each of which accepts a single traffic type (HTTP, HTTPS, or TCP). A load balancer can have multiple listeners in order to accept multiple traffic streams.
Load balancers are regional services. Each load balancer has two load balancer devices that provide failover capability. In a region with multiple availability domains, the devices are automatically distributed across two availability domains.
Define one or more backend sets and include compute resources as backend servers in these sets. You can then define health checks so the load balancer can determine whether a compute resource is operational or should be excluded.
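As an illustration of the idea behind health checks, the sketch below probes a backend over TCP and keeps only responsive backends in rotation. The backend addresses are hypothetical, and real OCI health checks are configured declaratively on the backend set rather than coded by hand.

```python
# Minimal sketch of a TCP health check, similar in spirit to the checks
# a load balancer runs against its backend set. Hosts/ports are hypothetical.
import socket

def tcp_health_check(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the backend can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical backend set; only backends that pass the check receive traffic:
backends = [("10.0.0.11", 8080), ("10.0.0.12", 8080)]
# healthy = [b for b in backends if tcp_health_check(*b)]
```

A real service also re-checks on an interval and requires several consecutive failures before removing a backend, to avoid flapping on transient errors.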
Session persistence is available, which helps ensure that requests from a particular client will always go to the same compute resource.
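To illustrate the underlying idea of persistence, the sketch below deterministically pins a client to one backend by hashing a client identifier (names are hypothetical; OCI's session persistence is cookie-based rather than hash-based):

```python
# Sketch of sticky routing: the same client identifier always maps to the
# same backend, so session state on that backend remains reachable.
import hashlib

def sticky_backend(client_id: str, backends: list[str]) -> str:
    """Deterministically map a client (e.g. an IP or cookie value) to a backend."""
    digest = hashlib.sha256(client_id.encode()).digest()
    return backends[int.from_bytes(digest[:4], "big") % len(backends)]
```

Because the mapping depends only on the client identifier and the backend list, repeated requests from the same client land on the same server.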
Requests are directed to the compute resources according to a routing policy, such as round robin, IP hash, or least connections.
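The two most common policies can be sketched in a few lines; this is an illustration of the algorithms, not the service's implementation:

```python
# Round robin hands each new request to the next backend in turn;
# least connections picks the backend currently serving the fewest clients.
from itertools import cycle

class RoundRobin:
    def __init__(self, backends: list[str]):
        self._it = cycle(backends)

    def pick(self) -> str:
        return next(self._it)

def least_connections(active: dict[str, int]) -> str:
    """active maps backend name -> current connection count."""
    return min(active, key=active.get)
```

Round robin works well for uniform, short-lived requests; least connections adapts better when request durations vary widely.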
Optionally, you can define routing policies based on HTTP header or URL to further direct requests to specific compute resources.
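Such rules can be pictured as a small decision function over the request path and headers. The rule set and backend-set names below are hypothetical, purely to show the shape of header- and URL-based routing:

```python
def choose_backend_set(path: str, headers: dict[str, str]) -> str:
    """Hypothetical routing rules: send API calls and mobile clients to
    dedicated backend sets, and everything else to the default pool."""
    if path.startswith("/api/"):
        return "api-backend-set"
    if "Mobile" in headers.get("User-Agent", ""):
        return "mobile-backend-set"
    return "default-backend-set"
```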
This reference architecture shows a highly available web application running in OCI using load balancers.
This reference architecture shows how to implement modern DevOps architecture using load balancers.
This reference architecture uses native OCI monitoring and notification services to respond to load balancer threshold conditions, calls Oracle Functions to evaluate the condition, and uses redirect rules to forward custom error messages stored in OCI Object Storage.