Industry-First In-Database LLMs And Automated In-Database Vector Store

Oracle HeatWave GenAI enables customers to build generative AI applications without AI expertise, data movement, or additional cost, while being 30X faster than Snowflake, 18X faster than Google BigQuery, and 15X faster than Databricks for vector processing.

Oracle has announced the general availability of HeatWave GenAI, which introduces the industry’s first in-database large language models (LLMs), an automated in-database vector store, scale-out vector processing, and the capability for contextual conversations in natural language informed by unstructured content. This allows customers to leverage generative AI for their enterprise data without needing AI expertise or moving data to a separate vector database. HeatWave GenAI is immediately available in all Oracle Cloud regions, Oracle Cloud Infrastructure (OCI) Dedicated Region, and across clouds at no additional cost for HeatWave customers.

HeatWave GenAI enables developers to create a vector store for enterprise unstructured content with a single SQL command, using built-in embedding models. Users can then run natural language searches answered by either in-database or external LLMs. Because of HeatWave's scale and performance, there is no need to provision GPUs, which reduces application complexity, increases performance, improves data security, and lowers costs.
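
The single-SQL-command workflow can be driven from any MySQL client. The sketch below uses Python with the mysql-connector-python driver; the sys.VECTOR_STORE_LOAD and sys.ML_RAG routine names follow Oracle's HeatWave documentation but are assumptions here, and the connection details, Object Storage URI, option values, and output handling are illustrative placeholders rather than a definitive implementation.

    import json
    import mysql.connector  # pip install mysql-connector-python

    # Connect to a HeatWave-enabled MySQL DB System (placeholder credentials).
    conn = mysql.connector.connect(
        host="heatwave.example.com", user="app_user",
        password="app_password", database="demo")
    cur = conn.cursor(buffered=True)

    # One call ingests unstructured files from Object Storage, chunks them,
    # generates embeddings with a built-in model, and loads the vector store
    # (routine name and options assumed from HeatWave documentation).
    cur.execute(
        "CALL sys.VECTOR_STORE_LOAD("
        "'oci://docs-bucket@namespace/manuals/', "
        "'{\"table_name\": \"manuals_vectors\"}')")

    # Ask a natural-language question; retrieval-augmented generation (RAG)
    # runs inside the database against the vector store just created.
    cur.execute("SET @options = JSON_OBJECT('vector_store', "
                "JSON_ARRAY('demo.manuals_vectors'))")
    cur.execute("CALL sys.ML_RAG('How do I rotate the API keys?', @answer, @options)")
    cur.execute("SELECT @answer")
    answer = json.loads(cur.fetchone()[0])  # JSON with generated text and citations
    print(answer.get("text"))

    cur.close()
    conn.close()

The same pattern can point at an external LLM instead of an in-database one by changing the options JSON, which matches the article's claim that either model type can answer natural language queries.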

Key features include:

  • In-database LLMs: Simplify generative AI application development at a lower cost, allowing customers to perform tasks such as data search, content generation, and retrieval-augmented generation (RAG) within the database.
  • Automated in-database vector store: Enables generative AI use with business documents without moving data to a separate vector database, automating all steps within the database.
  • Scale-out vector processing: Delivers fast semantic search results with high accuracy, leveraging a new VECTOR data type and an optimized distance function in standard SQL queries (see the sketch after this list).
  • HeatWave Chat: A Visual Studio Code plug-in for MySQL Shell that provides a graphical interface for HeatWave GenAI and lets users ask questions in natural language or SQL, with contextual conversation history and citations.
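
As a companion sketch for the scale-out vector processing item, the example below issues a plain SQL semantic search from Python. The sys.ML_EMBED_ROW routine, the DISTANCE() function, and the segment/segment_embedding column names follow naming used in HeatWave's documentation but are assumptions here; the embedding model, table, and credentials are illustrative.

    import mysql.connector  # pip install mysql-connector-python

    conn = mysql.connector.connect(
        host="heatwave.example.com", user="app_user",
        password="app_password", database="demo")
    cur = conn.cursor(buffered=True)

    # Embed the search phrase with an in-database embedding model
    # (routine and model names assumed from HeatWave documentation).
    cur.execute(
        "SELECT sys.ML_EMBED_ROW('reset a forgotten password', "
        "JSON_OBJECT('model_id', 'all_minilm_l12_v2')) INTO @query_vec")

    # Standard SQL semantic search: rank stored document chunks by cosine
    # distance between their VECTOR column and the query vector.
    cur.execute(
        "SELECT segment, DISTANCE(segment_embedding, @query_vec, 'COSINE') AS dist "
        "FROM demo.manuals_vectors ORDER BY dist LIMIT 3")
    for segment, dist in cur.fetchall():
        print(f"{dist:.4f}  {str(segment)[:80]}")

    cur.close()
    conn.close()

Because the search is ordinary SQL, it can be combined with joins and filters in existing application queries, without provisioning GPUs or a separate vector database.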

P Saravanan, VP of Cloud Engineering at Oracle India, emphasized HeatWave GenAI’s potential to revolutionize data management and analysis for Indian enterprises, enabling more efficient insight extraction and data-driven decision-making at no additional cost.

“HeatWave’s stunning pace of innovation continues with the addition of HeatWave GenAI,” said Edward Screven, Oracle’s chief corporate architect. “Integrated and automated AI enhancements now allow developers to build rich generative AI applications faster, without requiring AI expertise or moving data.”

Vijay Sundhar, CEO of SmarterD, highlighted the benefits of HeatWave GenAI, stating, “Support for in-database LLMs and vector creation leads to a significant reduction in application complexity and predictable inference latency, all at no additional cost. This democratizes generative AI and boosts productivity.”
