If I were to self-host, I'd probably want to trace my LLM requests and their cost. PostHog has its own LLM Observability (Beta). For future reference:

https://posthog.com/docs/ai-engineering/observability?tab=Vercel+AI+SDK