LLM Observability with PostHog

If I were to self-host, I'd probably want to trace my LLM requests and their cost. PostHog has its own LLM Observability feature (currently in Beta). For future reference:

https://posthog.com/docs/ai-engineering/observability?tab=Vercel+AI+SDK
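For my own reference, the linked docs (Vercel AI SDK tab) wire it up roughly like this. A minimal sketch assuming the `posthog-node` and `@posthog/ai` packages from the page above; the API key, distinct ID, and properties are placeholders:

```ts
import { PostHog } from "posthog-node";
import { withTracing } from "@posthog/ai";
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const phClient = new PostHog("<ph_project_api_key>", {
  host: "https://us.i.posthog.com",
});

const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Wrap the model so each call sends a generation event
// (latency, token usage, cost) to PostHog.
const model = withTracing(openai("gpt-4o-mini"), phClient, {
  posthogDistinctId: "user_123", // placeholder user ID
  posthogProperties: { conversation_id: "abc123" }, // placeholder metadata
});

const { text } = await generateText({
  model,
  prompt: "Tell me a joke about hedgehogs",
});
console.log(text);

await phClient.shutdown(); // flush queued events before exit
```

Everything then shows up in PostHog's LLM Observability dashboard without touching the rest of the AI SDK code, which is the appeal if you're already self-hosting PostHog.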

Status: Rejected
Board: 💡 Feature Request
Date: 10 months ago
Author: iboughtbed
