Simplify the operation of enterprise-grade Kubernetes at scale. Easily deploy and manage resource-intensive workloads such as AI with automatic scaling, patching, and upgrades.
CIO magazine recognizes OCI for its expertise in delivering cutting-edge Kubernetes solutions, supporting scalable and efficient application development.
OKE is the lowest-cost Kubernetes service among the major hyperscalers, especially for serverless workloads.
OKE automatically adjusts compute resources based on demand, which can reduce your costs.
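At the workload level, this kind of demand-based scaling is typically expressed with a standard Kubernetes HorizontalPodAutoscaler. The sketch below assumes a hypothetical Deployment named `web`; the thresholds are illustrative, not OKE defaults.

```yaml
# Illustrative HorizontalPodAutoscaler: scales a hypothetical "web"
# Deployment between 2 and 10 replicas based on average CPU usage.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web
  minReplicas: 2
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70
```

Pod-level autoscaling like this pairs with node-level autoscaling, which adds or removes worker nodes as the pod demand changes.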
GPUs can be scarce, but OKE job scheduling makes it easy to maximize resource utilization.
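As a minimal sketch of GPU job scheduling, a standard Kubernetes Job can request a GPU and wait in the scheduler's queue until capacity frees up, which keeps scarce accelerators busy. The container image and training script below are placeholders.

```yaml
# Illustrative Job requesting one NVIDIA GPU. The pod stays Pending
# until a GPU node has capacity, so accelerators are never idle
# while work is queued.
apiVersion: batch/v1
kind: Job
metadata:
  name: gpu-training-job
spec:
  backoffLimit: 2
  template:
    spec:
      restartPolicy: Never
      containers:
      - name: train
        image: example.io/train:latest    # placeholder image
        command: ["python", "train.py"]   # hypothetical script
        resources:
          limits:
            nvidia.com/gpu: 1
```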
OKE is consistent across clouds and on-premises, enabling portability and avoiding vendor lock-in.
OKE reduces the time and cost needed to manage the complexities of Kubernetes infrastructure.
Automatic upgrades and security patching boost reliability for the control plane and worker nodes.
Kubernetes is the go-to platform for deploying AI workloads. OKE powers Oracle Cloud Infrastructure (OCI) AI services.
– The initial build stage of an AI project involves defining the problem and preparing data to create models.
– Kubernetes clusters can significantly improve efficiency by granting shared access to expensive and often limited GPU resources while providing secure and centrally managed environments.
– Kubeflow, a Kubernetes-related open source project, provides a comprehensive framework designed to streamline the building, training, and deployment of models.
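A distributed training run under Kubeflow might be declared as a PyTorchJob, one of the custom resources provided by the Kubeflow Training Operator. This is a sketch, assuming the operator is installed in the cluster; the image and script names are placeholders.

```yaml
# Sketch of a Kubeflow PyTorchJob (requires the Kubeflow Training
# Operator). One master coordinates two GPU-backed workers.
apiVersion: kubeflow.org/v1
kind: PyTorchJob
metadata:
  name: distributed-train
spec:
  pytorchReplicaSpecs:
    Master:
      replicas: 1
      restartPolicy: OnFailure
      template:
        spec:
          containers:
          - name: pytorch                    # container must be named "pytorch"
            image: example.io/train:latest   # placeholder image
            command: ["python", "train.py"]  # hypothetical script
    Worker:
      replicas: 2
      restartPolicy: OnFailure
      template:
        spec:
          containers:
          - name: pytorch
            image: example.io/train:latest
            command: ["python", "train.py"]
            resources:
              limits:
                nvidia.com/gpu: 1
```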
OKE is built on top of OCI, offering a complete stack of high-performance infrastructure designed for AI/ML workloads, including:
– The full range of NVIDIA GPUs, including the H100, A100, A10, and more
– Ultrafast RDMA networks
Using OKE self-managed nodes, you can run AI/ML building workloads on your Kubernetes clusters.
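One common pattern is to steer ML pods onto self-managed GPU nodes with a label and matching taint. The label and taint below are hypothetical; you would apply your own when registering the nodes.

```yaml
# Sketch: pin an ML workload to self-managed GPU nodes.
# The "node-role: gpu-selfmanaged" label and the "dedicated=gpu"
# taint are hypothetical values you would set on your own nodes.
apiVersion: v1
kind: Pod
metadata:
  name: finetune
spec:
  nodeSelector:
    node-role: gpu-selfmanaged
  tolerations:
  - key: dedicated
    operator: Equal
    value: gpu
    effect: NoSchedule
  containers:
  - name: trainer
    image: example.io/train:latest   # placeholder image
    resources:
      limits:
        nvidia.com/gpu: 1
```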
“Many OCI AI services run on OCI Kubernetes Engine (OKE), Oracle’s managed Kubernetes service. In fact, our engineering team experienced a 10X performance improvement with OCI Vision just by switching from an earlier platform to OKE. It’s that good.”
VP of OCI AI Services, Oracle Cloud Infrastructure
Deploy simple microservices packaged as Docker containers that communicate via a common API.
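In Kubernetes terms, each such microservice is typically a Deployment plus a Service that gives other services a stable in-cluster endpoint for its API. The `orders` service and its image below are placeholders.

```yaml
# Minimal sketch: a containerized microservice (placeholder image)
# exposed to the rest of the cluster through a stable Service.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders
spec:
  replicas: 2
  selector:
    matchLabels:
      app: orders
  template:
    metadata:
      labels:
        app: orders
    spec:
      containers:
      - name: orders
        image: example.io/orders:1.0   # placeholder image
        ports:
        - containerPort: 8080
---
# Other microservices reach this one at http://orders (port 80).
apiVersion: v1
kind: Service
metadata:
  name: orders
spec:
  selector:
    app: orders
  ports:
  - port: 80
    targetPort: 8080
```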
Discover best practices for deploying a serverless virtual node pool using the provided Terraform automation and reference architecture.
Find out how Tryg Insurance reduced its costs by 50% via dynamic rightsizing.
Mickey Boxell, Product Management
OKE add-ons let you offload the management of cluster operational software to Oracle, while retaining the flexibility to customize that software or opt out of the defaults entirely and bring your own equivalents. We are excited to share the release of four additional add-ons: the Kubernetes Cluster Autoscaler, the Istio service mesh, the OCI native ingress controller, and the Kubernetes Metrics Server. We have also added support for new configuration arguments that provide greater control over the add-ons deployed to your clusters.
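For context on what the Cluster Autoscaler manages, its upstream configuration bounds each node pool with a `--nodes=min:max:<node-pool-ID>` flag. The fragment below is a hedged sketch of that container spec; the cloud-provider value and node pool OCID are placeholders to be checked against the OKE documentation.

```yaml
# Illustrative fragment of a Cluster Autoscaler container spec.
# Each --nodes flag bounds one OKE node pool as min:max:<OCID>;
# the OCID and provider value below are placeholders.
    spec:
      containers:
      - name: cluster-autoscaler
        command:
        - ./cluster-autoscaler
        - --cloud-provider=oci-oke
        - --nodes=1:5:ocid1.nodepool.oc1...placeholder
```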
Read the complete post

Get 30 days of access to CI/CD tools, managed Terraform, telemetry, and more.
Explore deployable reference architectures and solutions playbooks.
Empower app development with Kubernetes, Docker, serverless, APIs, and more.
Reach our associates for sales, support, and other questions.