The Oracle AI Explorer for Apps (the AI Explorer) provides a streamlined environment where developers and data scientists can explore the potential of Generative Artificial Intelligence (GenAI) combined with Retrieval-Augmented Generation (RAG) capabilities. By integrating Oracle Database AI Vector Search, the AI Explorer enables users to enhance existing Large Language Models (LLMs) through RAG. This method significantly improves the performance and accuracy of AI models, helping to avoid common issues such as knowledge cutoff and hallucinations.
- GenAI: Powers the generation of text, images, or other data based on prompts using pre-trained LLMs.
- RAG: Enhances LLMs by retrieving relevant, real-time information, allowing models to provide up-to-date and accurate responses.
- Vector Database: A database, including Oracle Database 23ai, that can natively store and manage vector embeddings and handle the unstructured data they describe, such as documents, images, video, or audio.
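The RAG flow described above can be sketched independently of any specific vector store: embed the documents, retrieve those nearest to the question, and prepend them to the LLM prompt. Below is a minimal illustration using a toy bag-of-words embedding; the AI Explorer itself uses a configured embedding model and Oracle Database 23ai AI Vector Search for these steps.

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words embedding; a real system calls an embedding model."""
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(v * v for v in counts.values()))
    return {word: v / norm for word, v in counts.items()}

def cosine(a, b):
    """Cosine similarity between two sparse unit vectors."""
    return sum(a[w] * b.get(w, 0.0) for w in a)

def retrieve(question, documents, k=1):
    """Return the k documents most similar to the question (vector-search stand-in)."""
    q = embed(question)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(question, documents):
    """Augment the LLM prompt with retrieved context (the retrieval in RAG)."""
    context = retrieve(question, documents)
    return "Context:\n" + "\n".join(context) + f"\n\nQuestion: {question}"

docs = [
    "Oracle Database 23ai supports AI Vector Search.",
    "Streamlit is a framework for data apps.",
]
print(build_prompt("What supports vector search?", docs))
```

The augmented prompt, rather than the bare question, is what gets sent to the chat model, which is how RAG grounds responses in current data.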
## AI Explorer Features
- Configuring Embedding and Chat Models
- Splitting and Embedding Documentation
- Modifying System Prompts (Prompt Engineering)
- Experimenting with LLM Parameters
- Testbed for auto-generated or existing Q&A datasets
The AI Explorer streamlines the entire workflow from prototyping to production, making it easier to create and deploy RAG-powered GenAI solutions using the Oracle Database.
## Getting Started
The AI Explorer is available to install in your own environment, which may be a developer’s desktop, on-premises data center environment, or a cloud provider. It can be run either on bare-metal, within a container, or in a Kubernetes Cluster.
> **Prefer a Step-by-Step?** The Walkthrough is a great way to familiarize yourself with the AI Explorer and its features in a development environment.
### Prerequisites
- Oracle Database 23ai, incl. Oracle Database 23ai Free
- Python 3.11 (for running Bare-Metal)
- Container Runtime, e.g. docker/podman (for running in a Container)
- Access to an Embedding and Chat Model:
  - API Keys for Third-Party Models
  - On-Premises Models*
*Oracle recommends running On-Premises Models on hardware with GPUs. For more information, please review the AI Explorer documentation.
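Before installing, it can be useful to confirm which of the required tools are already on your `PATH`. The following is a hedged sketch (the helper name is hypothetical, not part of the AI Explorer); the command names are the ones listed above.

```python
import shutil

def check_prereqs(commands):
    """Map each command name to whether it is found on PATH."""
    return {cmd: shutil.which(cmd) is not None for cmd in commands}

# python3.11 is needed for Bare-Metal; podman or docker for Containers
for cmd, found in check_prereqs(["python3.11", "podman", "docker"]).items():
    print(f"{cmd}: {'found' if found else 'missing'}")
```

Only one of `podman` or `docker` is required for the container route.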
### Bare-Metal Installation
To run the application on bare-metal, download the [source]({{ .Site.Params.GitHubRepo }}) and, from the `src/` directory:
Create and activate a Python Virtual Environment:
```shell
cd src/
python3.11 -m venv .venv
source .venv/bin/activate
pip3.11 install --upgrade pip wheel
```
Install the Python modules:
```shell
pip3.11 install -e ".[all]"
```
Start Streamlit:
```shell
streamlit run launch_client.py --server.port 8501
```
Navigate to http://localhost:8501 and configure the AI Explorer.
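If the page does not load, a quick way to check whether the Streamlit server is answering on port 8501 is sketched below; the `is_up` helper is hypothetical and uses only the Python standard library.

```python
import urllib.error
import urllib.request

def is_up(url, timeout=2):
    """Return True if the server at url answers an HTTP request at all."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 500
    except urllib.error.HTTPError:
        return True  # the server responded, just with an error status
    except Exception:
        return False  # connection refused, timeout, bad hostname, etc.

print(is_up("http://localhost:8501"))
```

A `False` result usually means the `streamlit run` process is not running or is bound to a different port.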
### Container Installation
> **Same… but Different:** References to `podman` commands, if applicable to your environment, can be substituted with `docker`.
To run the application in a container, download the [source]({{ .Site.Params.GitHubRepo }}) and, from the top-level directory:
Build the image. From inside the `src/` directory, build the `ai-explorer-aio` image:

```shell
cd src/
podman build -t ai-explorer-aio .
```
Start the Container:
```shell
podman run -p 8501:8501 -it --rm ai-explorer-aio
```
Navigate to http://localhost:8501 and configure the AI Explorer.
### Advanced Installation
The AI Explorer is designed to operate within a Microservices Architecture, leveraging Microservices Infrastructure like Kubernetes. Review AI Explorer components and the additional Microservices documentation for more information.