Here's a summary of what we planned initially:

Current System:

[Diagram of the current system: image.png]

Finalized orchestration and workflow:

1. High-Level System Workflow

The entire system operates as a robust pipeline, ensuring that every user interaction is processed systematically, from initial triage to task execution and resolution.

  1. Interfacing & Initial Triage: User interactions originate from various community platforms such as Discord, Slack, GitHub, and Discourse. Dedicated bots on these platforms serve as the first line of defense and classification. Their primary responsibilities include classifying incoming interactions and enriching them with priority and routing metadata.
Status: only moderation still needs to be introduced. The classification router is already done and the internal bots only need to be configured, so no changes are required here, though enhancements may be needed.

  2. Prioritized Task Queuing: The triaged task, now enriched with priority and routing metadata, is pushed into a RabbitMQ message queue. This component is vital for decoupling the platform bots from the processing core and for ensuring that higher-priority tasks are consumed first (see the sketch after this list).
Status: this part is also finalized and working; it may need enhancements.

  3. LangGraph Processing Core: This is the brain of the operation. A LangGraph application consumes tasks from the RabbitMQ queue and initiates the main workflow.
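As a rough illustration of steps 1 and 2, the sketch below shows how a triage bot might publish an enriched task to a priority-enabled RabbitMQ queue using pika. The queue name, task fields, and priority scale are assumptions for illustration, not the finalized schema.

```python
import json

import pika

# Connect to a local RabbitMQ broker (connection details are placeholders).
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# Durable queue with priority support so triage metadata can influence ordering.
channel.queue_declare(
    queue="devrel_tasks",  # hypothetical queue name
    durable=True,
    arguments={"x-max-priority": 10},
)

# A triaged task enriched with priority and routing metadata (field names are illustrative).
task = {
    "platform": "discord",
    "channel_id": "123456789",
    "user_id": "987654321",
    "message": "How do I configure the webhook integration?",
    "category": "support_question",
    "priority": 7,
}

channel.basic_publish(
    exchange="",
    routing_key="devrel_tasks",
    body=json.dumps(task),
    properties=pika.BasicProperties(
        priority=task["priority"],  # broker-level priority for ordering
        delivery_mode=2,            # persist the message to disk
    ),
)
connection.close()
```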

2. Core LangGraph Architecture

The architecture is built around a single, intelligent agent that acts as a supervisor, orchestrating a toolkit of specialized nodes. This design provides a highly robust and flexible model for handling diverse DevRel tasks.
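As a rough sketch of the shared state such a supervisor graph might pass between nodes (field names and types are assumptions, not the finalized schema, and the langgraph and langchain-core packages are assumed dependencies):

```python
from typing import Annotated, Any, TypedDict

from langchain_core.messages import BaseMessage
from langgraph.graph.message import add_messages


class DevRelState(TypedDict):
    # Raw task payload consumed from RabbitMQ (platform, user, message, priority, ...).
    task: dict[str, Any]
    # Conversation history loaded by gather_context_node; the agent appends new turns.
    messages: Annotated[list[BaseMessage], add_messages]
    # Name of the specialized node the supervisor delegated to last, if any.
    last_tool: str | None
```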

  1. gather_context_node (Entry Point): This node serves a single, clear purpose: it runs only once at the beginning of a new session, consumes a task from the queue, and populates the initial state with the user's conversation history from the database. After this, it passes control directly to the DevRel Agent (a minimal wiring sketch follows).
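To tie the pieces together, here is a minimal sketch of how the queue consumer, gather_context_node, and a stubbed DevRel Agent node could be wired with LangGraph. Apart from gather_context_node, the node names, state shape, database helper, and queue details are assumptions for illustration, not the finalized implementation.

```python
import json
from typing import Annotated, Any, TypedDict

import pika
from langgraph.graph import END, START, StateGraph
from langgraph.graph.message import add_messages


class DevRelState(TypedDict):
    task: dict[str, Any]                     # payload consumed from RabbitMQ
    messages: Annotated[list, add_messages]  # conversation history plus new turns


def load_conversation_history(user_id: str | None) -> list:
    # Placeholder: fetch this user's prior messages from persistent storage.
    return []


def gather_context_node(state: DevRelState) -> dict:
    # Runs once per session: load the user's prior conversation from the database.
    history = load_conversation_history(state["task"].get("user_id"))
    return {"messages": history}


def devrel_agent_node(state: DevRelState) -> dict:
    # Supervisor agent: in the real system this orchestrates specialized tool nodes;
    # here it is stubbed with a canned reply.
    reply = {"role": "assistant",
             "content": f"Handling: {state['task'].get('message', '')}"}
    return {"messages": [reply]}


# Wire the graph: gather_context runs first, then hands control to the agent.
builder = StateGraph(DevRelState)
builder.add_node("gather_context", gather_context_node)
builder.add_node("devrel_agent", devrel_agent_node)
builder.add_edge(START, "gather_context")
builder.add_edge("gather_context", "devrel_agent")
builder.add_edge("devrel_agent", END)
graph = builder.compile()


def on_task(channel, method, properties, body) -> None:
    # RabbitMQ callback: run the graph for each consumed task, then acknowledge.
    task = json.loads(body)
    graph.invoke({"task": task, "messages": []})
    channel.basic_ack(delivery_tag=method.delivery_tag)


if __name__ == "__main__":
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="devrel_tasks", durable=True,
                          arguments={"x-max-priority": 10})
    channel.basic_consume(queue="devrel_tasks", on_message_callback=on_task)
    channel.start_consuming()
```

In a fuller version, devrel_agent_node would route to the specialized tool nodes via conditional edges rather than returning a canned reply; the linear wiring above only demonstrates the once-per-session entry point handing off to the agent.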