🧪 TDD Challenge · beginner · ⏱️ 20–35m · ⭐ 125 XP

M-071 · Build Your First LLM Trace

Description

Nebula Corp's chatbot has no observability: when users report wrong answers, the team has no way to see what the model received or returned. Implement a basic tracing system that captures LLM calls along with their inputs, outputs, token counts, and latency. The skeleton provides a trace store and an LLM wrapper, but the actual trace-capture logic is missing.
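One way to approach it: record a trace object around each call and aggregate over the store for the summary. The sketch below is a minimal illustration, not the challenge's actual skeleton; the mock LLM, its return shape, and the trace field names are assumptions.

```javascript
const traces = []; // in-memory trace store

// Hypothetical mock model call: echoes the last user message.
// The real skeleton's LLM wrapper will differ.
function mockLLM(model, messages) {
  const last = messages[messages.length - 1];
  return { content: `Response to: ${last.content}`, tokens: { input: 10, output: 8 } };
}

function tracedLLMCall(model, messages) {
  const start = Date.now();
  const result = mockLLM(model, messages);
  // Capture everything needed to debug a bad answer later.
  traces.push({
    model,
    input: messages,
    output: result.content,
    inputTokens: result.tokens.input,
    outputTokens: result.tokens.output,
    latencyMs: Date.now() - start,
  });
  return result.content; // return the response unchanged to the caller
}

function getTraceSummary() {
  // With no traces recorded, this returns zeroed defaults.
  return {
    totalTraces: traces.length,
    totalTokens: traces.reduce((s, t) => s + t.inputTokens + t.outputTokens, 0),
  };
}
```

The key design point is that tracing is transparent: the wrapper returns exactly what the model returned, so callers need no changes.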

Test Cases (3)

1. Traces are captured: should return the LLM response after tracing
   Input: tracedLLMCall('gpt-4o', [{role:'user',content:'Hello'}])
   Expected: CONTAINS: Response to
2. Model info is captured: the trace should record which model was called
   Input: tracedLLMCall('gpt-4o', [{role:'user',content:'Test'}])
   Expected: CONTAINS: gpt-4o
3. Summary aggregates correctly: with no traces recorded, the summary should return default (zero) values
   Input: getTraceSummary()
   Expected: CONTAINS: 0

