The overall components and structure are as follows:

flowchart TD
    title["Conversation Flow"]
    style title fill:none,stroke:none,color:black,font-size:16px
    
    title --- A
    
    A[User Input] --> B{Message Processing}
    B --> C["Local Session Storage (only for dev)"]
    C --> D[API Request]
    D --> E[Backend Fine-tuned LLM]
    E --> F[Generate Response]   
    F --> G[Update UI with Message]
    G --> |New User Message|A
    
    %% Node styling
    classDef boxStyle fill:#ffcccb,stroke:#ff6b6b,stroke-width:1px,color:#333333
    classDef titleClass fill:none,stroke:none
    
    A:::boxStyle
    B:::boxStyle
    C:::boxStyle
    D:::boxStyle
    E:::boxStyle
    F:::boxStyle
    G:::boxStyle
    title:::titleClass

Conversation Flow

The conversation flow for doko.ai is as follows:

  1. User Input: The user types and submits a message through the chat interface.
  2. Message Processing: The system processes the message, formats it, and prepares it for both storage and the backend.
  3. Local Session Storage: Since the project is still in development, messages are stored locally to maintain conversation context for fast prototyping and testing (see the client sketch after this list).
  4. API Request: The processed message is sent to our FastAPI backend (see the backend sketch after this list).
  5. Fine-tuned LLM: The message and its conversation context are passed to our fine-tuned LLM.
  6. Generate Response: The model processes the query and conversation history to generate a contextually relevant response.
  7. Update UI with Message: The response is returned to the frontend and displayed in the chat interface with smooth animations.
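
To make steps 4 through 6 concrete, here is a minimal sketch of what the FastAPI side could look like. It assumes a single `/chat` endpoint and a placeholder `FineTunedModel` class standing in for however the fine-tuned LLM is actually loaded and invoked; neither name comes from the doko.ai codebase.

```python
# Minimal backend sketch (steps 4-6), assuming FastAPI + Pydantic.
# FineTunedModel and the /chat route are illustrative placeholders,
# not the actual doko.ai implementation.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class ChatRequest(BaseModel):
    message: str
    history: list[dict] = []  # prior turns, e.g. {"role": "user", "content": "..."}


class ChatResponse(BaseModel):
    reply: str


class FineTunedModel:
    """Stand-in for the real fine-tuned LLM client."""

    def generate(self, message: str, history: list[dict]) -> str:
        # A real implementation would run inference over the full conversation context.
        return f"(model reply to: {message})"


model = FineTunedModel()


@app.post("/chat", response_model=ChatResponse)
def chat(req: ChatRequest) -> ChatResponse:
    # Steps 5-6: pass the message plus history to the model and return its reply.
    return ChatResponse(reply=model.generate(req.message, req.history))
```

Run with `uvicorn app:app --reload` (assuming the file is saved as `app.py`).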
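
Steps 1 through 4 happen in the frontend; for prototyping, the same idea can be exercised from a small Python client that keeps the conversation in an in-memory list (the dev-only stand-in for local session storage) and posts each message to the endpoint sketched above. The `http://localhost:8000/chat` URL and payload shape are assumptions, not a documented doko.ai API.

```python
# Prototype client sketch for steps 1-4: hold the conversation locally and
# send each new message, plus history, to the backend. URL and payload shape
# match the backend sketch above and are assumptions, not doko.ai's API.
import requests

session_history: list[dict] = []  # local session storage (dev only)


def send_message(text: str) -> str:
    user_msg = {"role": "user", "content": text}  # step 2: format the message
    # Step 4: API request to the FastAPI backend with the full context.
    resp = requests.post(
        "http://localhost:8000/chat",
        json={"message": text, "history": session_history},
        timeout=30,
    )
    resp.raise_for_status()
    reply = resp.json()["reply"]
    # Step 3: update local session storage with both turns.
    session_history.extend([user_msg, {"role": "assistant", "content": reply}])
    return reply


print(send_message("Hello, doko.ai!"))  # step 7 would render this reply in the chat UI
```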