Oracle bakes LLMs and vector support directly into HeatWave GenAI database

Jun 26, 2024 | Technology

Oracle is expanding its HeatWave cloud database service with a set of new generative AI services, known collectively as HeatWave GenAI.

The HeatWave platform was formerly branded as MySQL HeatWave. It provides a cloud-managed, extended version of the MySQL database with both transactional and analytical functionality. Last year, Oracle extended the platform with HeatWave Lakehouse, which adds data lakehouse capabilities.

With HeatWave GenAI, Oracle is bringing vector processing and advanced AI functionality into the database itself. While there is no shortage of database vendors adding vector support to enable generative AI and retrieval-augmented generation (RAG), few if any integrate large language models (LLMs) directly as an in-database capability. That is what Oracle is doing, directly integrating quantized versions of Llama 3 and Mistral LLMs. According to Oracle, running an LLM inside the database can improve performance and enable new types of applications that complement HeatWave's existing AutoML (automated machine learning) functionality.
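To make the idea concrete, here is a minimal, hedged sketch of what invoking an in-database LLM might look like from a Python client over a standard MySQL connection. The routine name sys.ML_GENERATE, its JSON options, the model identifier, and the connection details are illustrative assumptions modeled on Oracle's announcement, not a verified HeatWave GenAI API.

```python
# Illustrative sketch only: the stored-procedure name, its arguments, and the
# model identifier below are assumptions, not Oracle's documented interface.
import mysql.connector

# Hypothetical HeatWave endpoint and credentials.
conn = mysql.connector.connect(
    host="heatwave-host.example.com",
    user="app_user",
    password="app_password",
    database="demo",
)
cur = conn.cursor()

# Ask the in-database LLM a question; the result lands in a session variable.
cur.execute(
    """
    CALL sys.ML_GENERATE(
        "Summarize last quarter's support tickets",
        JSON_OBJECT("task", "generation", "model_id", "mistral-7b-instruct"),
        @output
    )
    """
)
cur.execute("SELECT @output")
print(cur.fetchone()[0])

cur.close()
conn.close()
```

The point of the sketch is the shape of the workflow: the prompt, the model invocation, and the response all stay inside the database session, with no external inference endpoint to provision or call out to.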

“Customers don’t need to wait to provision a GPU or to pay for the cost of invocation of an external service,” Nipun Agarwal, senior VP of MySQL and HeatWave at Oracle, told VentureBeat. “Furthermore, since all the LLM invocation is also happening inside HeatWave, it provides a lot more synergy with AutoML and other capabilities of HeatWave which are running inside the database.”

Bringing AI closer to data with in-database LLMs

Oracle claims that the introduction of in-database LLMs is an industry first.

Agarwal explained that having in-database LLM capabilit …
