DB - chatmessage

Abstract

Stores individual messages within a chat session. Each row holds the role (user / assistant / system), the text content, the LLM provider and model that generated it, the source document chunks used for RAG context, and the detected intent mode.


Table Info

| Property | Value |
| --- | --- |
| Table Name | `chatmessage` |
| SQLAlchemy Model | `backend/app/models/chat.py :: ChatMessage` |
| Pydantic Schema | Inline in `backend/app/api/v1/chat.py` |
| Migration | `SQLModel.metadata.create_all(engine)` in `backend/app/core/database.py :: init_db()` |
| TimescaleDB Hypertable | No |

Columns

| Column | Type | Nullable | Default | Notes |
| --- | --- | --- | --- | --- |
| `id` | UUID | No | `uuid.uuid4()` | Primary key |
| `session_id` | UUID | No | — | FK → `chatsession.id` with `ondelete="CASCADE"` |
| `role` | VARCHAR | No | — | `user`, `assistant`, or `system`; indexed for fast per-role filtering |
| `content` | TEXT | No | — | Full message text (no length limit) |
| `provider` | VARCHAR | No | `"ollama"` | LLM provider that generated this message |
| `model` | VARCHAR | No | `""` | Model name used; empty string for user messages |
| `sources` | JSON | Yes | `null` | Retrieved Qdrant chunks used as context (assistant messages only) |
| `detected_mode` | VARCHAR(50) | Yes | `null` | Intent classifier output for this turn (assistant messages only) |
| `created_at` | TIMESTAMP | No | `datetime.utcnow()` | Message ordering key |
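The column defaults above can be illustrated with a plain-Python sketch. `MessageRow` is a hypothetical stand-in used only for this example; the real model is `ChatMessage` in `backend/app/models/chat.py`.

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any, List, Optional

# Hypothetical stand-in mirroring the chatmessage columns and their defaults.
@dataclass
class MessageRow:
    session_id: uuid.UUID
    role: str                             # "user", "assistant", or "system"
    content: str
    id: uuid.UUID = field(default_factory=uuid.uuid4)
    provider: str = "ollama"              # default LLM provider
    model: str = ""                       # empty string for user messages
    sources: Optional[List[Any]] = None   # assistant messages only
    detected_mode: Optional[str] = None   # assistant messages only
    created_at: datetime = field(default_factory=datetime.utcnow)

user_msg = MessageRow(session_id=uuid.uuid4(), role="user", content="hi")
print(user_msg.provider, user_msg.model == "", user_msg.sources)  # → ollama True None
```

Note that `provider` defaults to `"ollama"` even for user messages, while `model`, `sources`, and `detected_mode` stay empty or null until an assistant turn fills them in.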

Constraints & Indexes

| Type | Columns | Notes |
| --- | --- | --- |
| PRIMARY KEY | `id` | UUID v4 |
| FOREIGN KEY | `session_id` → `chatsession.id` | `ondelete="CASCADE"` — deleting a session removes all its messages |
| INDEX | `role` | `Field(index=True)` — supports filtering messages by role |
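The cascade semantics can be demonstrated with a minimal in-memory SQLite copy of the two tables. This is only an illustration of `ondelete="CASCADE"`; the real tables are PostgreSQL, created by SQLModel.

```python
import sqlite3

# Minimal in-memory illustration of the FK cascade (not the real schema).
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.execute("CREATE TABLE chatsession (id TEXT PRIMARY KEY)")
conn.execute("""
    CREATE TABLE chatmessage (
        id TEXT PRIMARY KEY,
        session_id TEXT NOT NULL
            REFERENCES chatsession(id) ON DELETE CASCADE,
        role TEXT NOT NULL,
        content TEXT NOT NULL
    )
""")
conn.execute("INSERT INTO chatsession VALUES ('s1')")
conn.execute("INSERT INTO chatmessage VALUES ('m1', 's1', 'user', 'hi')")
conn.execute("INSERT INTO chatmessage VALUES ('m2', 's1', 'assistant', 'hello')")

# Deleting the session removes all of its messages.
conn.execute("DELETE FROM chatsession WHERE id = 's1'")
remaining = conn.execute("SELECT COUNT(*) FROM chatmessage").fetchone()[0]
print(remaining)  # → 0
```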

Entity Relationships

`chatsession` (1) ──< (N) `chatmessage`: each session owns many messages via `session_id`; deleting a session cascades to its messages.


SQLAlchemy Model (reference snapshot)

```python
import uuid
from datetime import datetime
from typing import Any, List, Optional

import sqlalchemy.dialects.postgresql as pg
from sqlalchemy import JSON, Column, ForeignKey
from sqlmodel import Field, Relationship, SQLModel


class ChatMessage(SQLModel, table=True):
    id: uuid.UUID = Field(default_factory=uuid.uuid4, primary_key=True)
    session_id: uuid.UUID = Field(
        sa_column=Column(pg.UUID(as_uuid=True), ForeignKey("chatsession.id", ondelete="CASCADE"), nullable=False)
    )
    role: str = Field(index=True)  # user, assistant, system
    content: str
    provider: str = Field(default="ollama")
    model: str = Field(default="")
    sources: Optional[List[Any]] = Field(
        default=None, sa_column=Column(JSON, nullable=True)
    )
    detected_mode: Optional[str] = Field(default=None, max_length=50)
    created_at: datetime = Field(default_factory=datetime.utcnow)
    session: ChatSession = Relationship(back_populates="messages")  # ChatSession is defined in the same module
```

Service Layer

| Operation | File | Notes |
| --- | --- | --- |
| Persist user message | `backend/app/services/chat_history_service.py` | Called before LLM inference |
| Persist assistant message | `backend/app/services/chat_history_service.py` | Called after streaming completes (via `BackgroundTasks`) |
| Fetch session messages | `backend/app/api/v1/chat.py :: GET /chat/history/{id}` | Ordered by `created_at` |
| Streaming ask | `backend/app/api/v1/chat.py :: GET /chat/ask-stream` | SSE; assistant message saved in a background task after the stream ends |
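The stream-then-persist pattern above can be sketched in pure Python: tokens are yielded to the client as SSE events, and only once the stream is drained is the assembled assistant message handed to a save callback (in the real app this runs via FastAPI `BackgroundTasks`). `sse_stream` and `save_assistant_message` are hypothetical names for this sketch.

```python
from typing import Callable, Iterator, List

saved: List[dict] = []

def save_assistant_message(content: str) -> None:
    # Stand-in for chat_history_service persisting to the chatmessage table.
    saved.append({"role": "assistant", "provider": "ollama", "content": content})

def sse_stream(tokens: Iterator[str], on_complete: Callable[[str], None]) -> Iterator[str]:
    parts: List[str] = []
    for tok in tokens:
        parts.append(tok)
        yield f"data: {tok}\n\n"     # one SSE event per token
    on_complete("".join(parts))      # runs only after the stream is fully consumed

events = list(sse_stream(iter(["Hel", "lo"]), save_assistant_message))
print(len(events), saved[0]["content"])  # → 2 Hello
```

Deferring the save until after the generator is exhausted is what lets the endpoint return the first token immediately without waiting on a database write.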

| Role | Link |
| --- | --- |
| Parent table | DB - chatsession |
| Chat API | `backend/app/api/v1/chat.py` |
| Chat history service | `backend/app/services/chat_history_service.py` |
| DevOps | DevOps - DocRAG |