AIChat

Use the hooks in this section to handle the data flow in your chat interface.

Hooks

useAIAgent
A hook that creates an AIAgent instance, which handles requests to the LLM API and parses the response (JSON or SSE).
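
For orientation, a minimal sketch of creating an agent inside a component; the option names follow the useAIAgent table in the API section below, while the import path and endpoint are placeholders rather than confirmed values.

```tsx
import { useAIAgent } from '@coral/ai-chat'; // hypothetical import path

function ChatAgentExample() {
  // Options follow the useAIAgent prop table in the API section below.
  const agent = useAIAgent({
    baseUrl: 'https://llm.example.com/v1/chat/completions', // placeholder endpoint
    model: 'gpt-4o', // placeholder model name
    stream: true,    // parse the response as SSE instead of a single JSON body
  });

  // The agent instance is then passed to useAIChat (see below).
  return null;
}
```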

useAIChat
Use this hook to manage the messages in the thread. Pass it the agent instance you created with useAIAgent, along with a transformMessage function that transforms messages returned from the LLM API into the desired format.
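
A sketch of passing the agent and a transformMessage callback to useAIChat; the returned { messages } shape and the message fields are assumptions for illustration, not the hook's confirmed API.

```tsx
import { useAIAgent, useAIChat } from '@coral/ai-chat'; // hypothetical import path

function ThreadExample() {
  const agent = useAIAgent({ baseUrl: '/api/llm', model: 'gpt-4o', stream: true });

  // transformMessage converts the raw LLM API payload into the shape your UI renders.
  const { messages } = useAIChat({
    agent,
    transformMessage: (raw: any) => ({
      role: raw.role ?? 'assistant',
      content: raw.content ?? '',
    }),
  });

  return (
    <ul>
      {messages.map((m: any, i: number) => (
        <li key={i}>{m.role}: {m.content}</li>
      ))}
    </ul>
  );
}
```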

useAutoScroll
Use this hook if you want the scroll position to adjust automatically. For example, when a new message is sent, the view scrolls to the message that is awaiting the LLM API response.
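
The hook's exact signature is not documented on this page, so the snippet below is only a hand-rolled equivalent of the behavior it automates (scrolling to the bottom whenever a message is added), not the hook's actual API.

```tsx
import { useEffect, useRef } from 'react';

function useManualAutoScroll(messageCount: number) {
  const containerRef = useRef<HTMLDivElement>(null);

  useEffect(() => {
    // Whenever a message is added, scroll down to the entry awaiting the LLM response.
    containerRef.current?.scrollTo({
      top: containerRef.current.scrollHeight,
      behavior: 'smooth',
    });
  }, [messageCount]);

  return containerRef;
}
```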

Example

Basic Usage
Below is a basic example of using the hooks above together with the Coral AI chat components to build your own chat interface.
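
The sketch below shows how the pieces might fit together; the import path, the BubbleList component, the useAutoScroll signature, and the { messages, send } return shape are assumptions for illustration, not the library's confirmed API.

```tsx
import { useRef } from 'react';
// Hypothetical import path; hook and component names follow the sections above.
import { useAIAgent, useAIChat, useAutoScroll, BubbleList, type BubbleListRef } from '@coral/ai-chat';

function BasicChat() {
  const bubbleListRef = useRef<BubbleListRef>(null);

  // 1. Create the agent that talks to the LLM API.
  const agent = useAIAgent({
    baseUrl: 'https://llm.example.com/v1/chat/completions', // placeholder endpoint
    model: 'gpt-4o',
    stream: true,
  });

  // 2. Manage the thread; the returned { messages, send } shape is an assumption.
  const { messages, send } = useAIChat({
    agent,
    bubbleListRef,
    transformMessage: (raw: any) => ({ role: raw.role, content: raw.content }),
  });

  // 3. Keep the bubble list scrolled to the pending reply (signature assumed).
  useAutoScroll(bubbleListRef);

  return (
    <div>
      <BubbleList ref={bubbleListRef} messages={messages} />
      <button onClick={() => send({ role: 'user', content: 'Hello!' })}>Send</button>
    </div>
  );
}
```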

API

StreamReader
Prop name      | Type                | Default | Description
readableStream | ReadableStream<any> |         |
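
The table above only documents a readableStream prop, so the sketch below simply hands StreamReader a stream such as a fetch response body; the import path and the assumption that StreamReader renders the streamed content are not confirmed by this page.

```tsx
import { StreamReader } from '@coral/ai-chat'; // hypothetical import path

// Sketch only: pass any ReadableStream (e.g. a fetch response body)
// to the readableStream prop documented above.
function StreamedReply({ stream }: { stream: ReadableStream<any> }) {
  return <StreamReader readableStream={stream} />;
}
```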
useAIAgent
Prop name         | Type                   | Default | Description
baseUrl           | string                 |         | The base URL of the LLM API.
dangerouslyApiKey | string                 |         | The API key to use for the request; use with caution.
defaultHeaders    | Record<string, string> |         | The default headers to use for the request.
messages          | any[]                  |         | The messages to send to the LLM API; subject to the LLM API specification.
model             | string                 |         | The model to use for the request; subject to the LLM API specification.
stream            | boolean                |         | Whether to stream the response from the LLM API.
user              | string                 |         | A user ID to send to the model; subject to the LLM API specification.
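
As a complement to the table, a sketch passing the full option set; all values are placeholders, and it assumes useAIAgent accepts these options directly.

```tsx
import { useAIAgent } from '@coral/ai-chat'; // hypothetical import path

function FullOptionsExample() {
  // Placeholder values; each option maps to a row in the table above.
  const agent = useAIAgent({
    baseUrl: 'https://llm.example.com/v1/chat/completions',
    dangerouslyApiKey: '<api-key>', // handle with caution; do not commit real keys
    defaultHeaders: { 'X-Client': 'coral-ai-chat-demo' },
    messages: [{ role: 'system', content: 'You are a helpful assistant.' }],
    model: 'gpt-4o',
    stream: true,
    user: 'user-12345',
  });

  return null; // pass `agent` to useAIChat in a real component
}
```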
useAIChat
Prop name     | Type                     | Default | Description
bubbleListRef | RefObject<BubbleListRef> |         |
messages      | any[]                    |         |
scrollOptions | ScrollToOptions          |         |
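
A sketch combining the documented props with the agent and transformMessage options from the Hooks section; any names beyond the table are assumptions.

```tsx
import { useRef } from 'react';
import { useAIAgent, useAIChat, type BubbleListRef } from '@coral/ai-chat'; // hypothetical import path

function ChatOptionsExample() {
  const bubbleListRef = useRef<BubbleListRef>(null);
  const agent = useAIAgent({ baseUrl: '/api/llm', model: 'gpt-4o', stream: true });

  // bubbleListRef, messages and scrollOptions follow the table above;
  // agent and transformMessage come from the Hooks section.
  const chat = useAIChat({
    agent,
    bubbleListRef,
    messages: [],                          // seed the thread with existing messages
    scrollOptions: { behavior: 'smooth' }, // forwarded when auto-scrolling
    transformMessage: (raw: any) => raw,   // identity transform as a placeholder
  });

  return null;
}
```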