
🧪 LLM Provider Usage Statistics

This skill helps you debug token counts and optimize prefix caching across OpenAI, Anthropic, and Gemini by applying provider-specific usage rules.

QUICK INSTALL
npx playbooks add skill letta-ai/letta --skill llm-provider-usage-statistics

About

This skill helps you debug token counts and optimize prefix caching across OpenAI, Anthropic, and Gemini by applying provider-specific usage rules. It provides a specialized system prompt that configures your AI coding agent as an LLM provider usage statistics expert, with a detailed methodology and structured output formats.
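The core difficulty the skill addresses is that each provider reports usage differently. As a minimal sketch (field names follow each provider's response schemas; the normalization rules in the comments are assumptions worth verifying against current API docs), usage can be mapped to a common shape like this:

```python
# Hypothetical helper illustrating provider-specific usage semantics.
# The key differences, as assumed here:
#   - OpenAI: prompt_tokens_details.cached_tokens is a SUBSET of prompt_tokens.
#   - Anthropic: input_tokens EXCLUDES cache reads/writes, so the true prompt
#     size is input_tokens + cache_read_input_tokens + cache_creation_input_tokens.
#   - Gemini: cachedContentTokenCount is a SUBSET of promptTokenCount.

def normalize_usage(provider: str, usage: dict) -> dict:
    """Return {"prompt": total prompt tokens, "cached": cache-hit tokens, "output": output tokens}."""
    if provider == "openai":
        details = usage.get("prompt_tokens_details") or {}
        return {
            "prompt": usage["prompt_tokens"],
            "cached": details.get("cached_tokens", 0),
            "output": usage["completion_tokens"],
        }
    if provider == "anthropic":
        cache_read = usage.get("cache_read_input_tokens", 0)
        cache_write = usage.get("cache_creation_input_tokens", 0)
        return {
            "prompt": usage["input_tokens"] + cache_read + cache_write,
            "cached": cache_read,
            "output": usage["output_tokens"],
        }
    if provider == "gemini":
        return {
            "prompt": usage["promptTokenCount"],
            "cached": usage.get("cachedContentTokenCount", 0),
            "output": usage.get("candidatesTokenCount", 0),
        }
    raise ValueError(f"unknown provider: {provider}")
```

Comparing the normalized `cached` value against `prompt` across turns is one way to confirm whether prefix caching is actually taking effect.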

Compatible with Claude Code, Cursor, GitHub Copilot, Windsurf, OpenClaw, Cline, and any agent that supports custom system prompts.

Example Prompts

Get started: "Help me use the LLM Provider Usage Statistics skill effectively."

System Prompt (21 words)

This skill helps you debug token counts and optimize prefix caching across OpenAI, Anthropic, and Gemini by applying provider-specific usage rules.
