Design and implement the LLM layer responsible for: (1) analyzing user news queries into structured intents/entities/clarity scores, and (2) generating concise, engaging article summaries and other LLM-driven enrichments (e.g., headlines, short blurbs). The LLM layer should expose clean input/output contracts for downstream services and be easy to test and tune independently of ingestion, storage, or trending simulation components.
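As a rough illustration of the "clean input/output contracts" requirement, the sketch below (Python assumed; the names LLMLayer, QueryAnalysis, ArticleSummary, analyze_query, and summarize_article are hypothetical, not prescribed) separates the two LLM interactions behind a single interface so each can be tested and tuned independently of ingestion, storage, or trending simulation.

```python
# A minimal sketch of the LLM layer's public surface. All names here are
# illustrative assumptions; the actual contract may differ.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class QueryAnalysis:
    """Structured result of analyzing a user query (see the Output schema below)."""
    intents: list[dict]
    clarity_score: float


@dataclass
class ArticleSummary:
    """LLM-generated enrichment for an article (summary plus optional extras)."""
    headline: str
    blurb: str


class LLMLayer(ABC):
    """Input/output contract consumed by downstream services."""

    @abstractmethod
    def analyze_query(self, query: str) -> QueryAnalysis:
        """Interaction 1: convert a free-text query into structured intents."""

    @abstractmethod
    def summarize_article(self, article_text: str) -> ArticleSummary:
        """Interaction 2: generate a concise headline and short blurb."""
```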
There will be two LLM interactions: query analysis and summary generation.
The first analyzes user news queries and extracts structured information to enable precise news retrieval, converting free-text queries into a machine-readable format that captures user intent, entities, and query clarity.
Input
query (string): The user's news-related question or request.
Output
{
  "intents": [
    {
      "type": "category | score | search | source | nearby | trending",
      "intent_weight": 0.0,
      "entities": ["list of people, organizations, locations, events"],
      "locations": [{"lat": 0.0, "long": 0.0}] // optional, only for 'nearby'
    }
  ],
  "clarity_score": 0.0
}
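A downstream consumer would typically validate this JSON before trusting it. The sketch below (Python again; parse_query_analysis, Intent, and the typed QueryAnalysis are hypothetical refinements of the interface sketch above, and the assumption that intent_weight and clarity_score fall in [0, 1] is not stated in the schema) parses the model output into typed objects and drops intents whose type is outside the allowed set.

```python
# Hypothetical parser for the Output schema above: parses the raw model JSON,
# normalizes out-of-range scores, and filters unknown intent types.
import json
from dataclasses import dataclass, field

VALID_INTENT_TYPES = {"category", "score", "search", "source", "nearby", "trending"}


@dataclass
class Intent:
    type: str
    intent_weight: float
    entities: list[str] = field(default_factory=list)
    locations: list[dict] = field(default_factory=list)  # only populated for 'nearby'


@dataclass
class QueryAnalysis:
    intents: list[Intent]
    clarity_score: float


def parse_query_analysis(raw: str) -> QueryAnalysis:
    """Parse and lightly validate the LLM's JSON output for a query analysis."""
    data = json.loads(raw)
    intents: list[Intent] = []
    for item in data.get("intents", []):
        if item.get("type") not in VALID_INTENT_TYPES:
            continue  # drop intents the schema does not allow
        intents.append(
            Intent(
                type=item["type"],
                # Clamp to [0, 1]; the expected range is an assumption here.
                intent_weight=max(0.0, min(1.0, float(item.get("intent_weight", 0.0)))),
                entities=list(item.get("entities", [])),
                locations=list(item.get("locations", [])),
            )
        )
    clarity = max(0.0, min(1.0, float(data.get("clarity_score", 0.0))))
    return QueryAnalysis(intents=intents, clarity_score=clarity)
```

Keeping parsing in one place like this makes the query-analysis contract easy to unit test with canned model responses, independent of the rest of the pipeline.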