OCI Generative AI services improve performance and context awareness thanks to OCI Cache

OCI Generative AI services use Oracle Cloud Infrastructure Cache to store session history with low latency and high security.


OCI Cache lets us maintain full chat history with low latency, high performance, and secure multitenant access—a critical part of OCI Generative AI services.

Jun Qian, Vice President, AI Sciences and Development, Oracle

Oracle Cloud Infrastructure (OCI) Generative AI services need to keep track of conversation history so users can interact with intelligent agents naturally and with context. The engineering team relies on OCI Cache, a fully managed in-memory store that provides low latency and high availability for session data. Access controls enable secure multitenant separation and allow clusters to be consolidated globally, reducing complexity as the services scale across regions. OCI Cache is now a critical part of the OCI GenAI real-time architecture, enabling fast responses while reducing operational overhead so the team can focus on advancing the intelligence of OCI AI agents.
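To make the pattern concrete, the sketch below shows how per-session chat history might be stored in a Redis-compatible in-memory cache such as OCI Cache. This is a minimal illustration, not Oracle's implementation: the endpoint, key naming scheme, tenant separation by key prefix, and retention window are all assumptions made for the example.

```python
# Illustrative sketch only: session history in a Redis-compatible cache.
# Endpoint, key scheme, and TTL are assumed values, not OCI defaults.
import json
import redis

# Connect to the cache cluster over TLS; host and port are placeholders.
cache = redis.Redis(
    host="my-cache-endpoint.example.com",
    port=6379,
    ssl=True,
    decode_responses=True,
)

SESSION_TTL_SECONDS = 3600  # assumed retention window for a conversation


def session_key(tenant_id: str, session_id: str) -> str:
    # Prefixing keys by tenant keeps each tenant's sessions logically separate.
    return f"chat:{tenant_id}:{session_id}"


def append_turn(tenant_id: str, session_id: str, role: str, content: str) -> None:
    """Append one conversation turn and refresh the session's expiry."""
    key = session_key(tenant_id, session_id)
    cache.rpush(key, json.dumps({"role": role, "content": content}))
    cache.expire(key, SESSION_TTL_SECONDS)


def load_history(tenant_id: str, session_id: str) -> list[dict]:
    """Fetch the full chat history to supply conversational context to the model."""
    key = session_key(tenant_id, session_id)
    return [json.loads(item) for item in cache.lrange(key, 0, -1)]
```

In this sketch, each conversation lives in a Redis list keyed by tenant and session, so reads and writes stay O(1) appends and a single range fetch, which is what keeps retrieval of full chat history low latency.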

Published: October 8, 2025