This skill helps you debug token counts and optimize prefix caching across OpenAI, Anthropic, and Gemini by applying provider-specific usage rules.
npx playbooks add skill letta-ai/letta --skill llm-provider-usage-statistics
This skill provides a specialized system prompt that configures your AI coding agent as an LLM provider usage-statistics expert, with a detailed methodology and structured output formats.
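Each provider reports usage under different field names, which is the kind of detail the skill's provider-specific rules cover. As an illustration, the sketch below normalizes the usage payloads from OpenAI (`usage.prompt_tokens_details.cached_tokens`), Anthropic (`usage.cache_read_input_tokens`), and Gemini (`usageMetadata.cachedContentTokenCount`) into one shape. The sample payload and the assumption that Anthropic's `input_tokens` excludes cache reads and writes should be checked against the current provider docs.

```python
# Sketch: normalize usage payloads from the three providers into one shape.
# Field names follow each provider's public API responses; the sample payload
# is illustrative, not taken from a real request.

def normalize_usage(provider: str, usage: dict) -> dict:
    """Return {prompt, completion, cached} token counts for a provider payload."""
    if provider == "openai":
        details = usage.get("prompt_tokens_details") or {}
        return {
            "prompt": usage["prompt_tokens"],
            "completion": usage["completion_tokens"],
            # OpenAI reports cached tokens as a subset of prompt_tokens.
            "cached": details.get("cached_tokens", 0),
        }
    if provider == "anthropic":
        # Assumption: Anthropic's input_tokens excludes cache reads/writes,
        # so total prompt size is the sum of all three fields.
        cached = usage.get("cache_read_input_tokens", 0)
        written = usage.get("cache_creation_input_tokens", 0)
        return {
            "prompt": usage["input_tokens"] + cached + written,
            "completion": usage["output_tokens"],
            "cached": cached,
        }
    if provider == "gemini":
        return {
            "prompt": usage["promptTokenCount"],
            "completion": usage["candidatesTokenCount"],
            "cached": usage.get("cachedContentTokenCount", 0),
        }
    raise ValueError(f"unknown provider: {provider}")


sample = {"input_tokens": 12, "output_tokens": 40,
          "cache_read_input_tokens": 2048, "cache_creation_input_tokens": 0}
print(normalize_usage("anthropic", sample))
```

A normalized view like this makes cache-hit rates comparable across providers, e.g. `cached / prompt` as the fraction of the prompt served from cache.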
Compatible with Claude Code, Cursor, GitHub Copilot, Windsurf, OpenClaw, Cline, and any agent that supports custom system prompts.