AI Dynamics

Global AI News Aggregator

Anthropic API Introduces Cache-Aware Rate Limits for Prompt Cache

We've rolled out cache-aware rate limits in the Anthropic API. This means that prompt cache read tokens no longer count against your input tokens per minute (ITPM) rate limit.
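The behavior above can be sketched with a minimal Messages API request body that opts a large system prompt into prompt caching. This is an illustrative sketch, not an official example: the `cache_control` block and the `cache_read_input_tokens` usage field follow Anthropic's prompt-caching documentation, but the model name and prompt contents here are placeholders you should verify against the current API reference.

```python
# Sketch: building a Messages API request whose system prompt is marked
# cacheable. Assumed field names per Anthropic's prompt-caching docs.
import json


def build_cached_request(system_prompt: str, user_message: str) -> dict:
    """Build a request body with a cacheable system prompt.

    On a cache hit, reused tokens are reported as
    `cache_read_input_tokens` in the response's `usage` block and,
    with cache-aware rate limits, no longer count against the
    input-tokens-per-minute (ITPM) limit.
    """
    return {
        "model": "claude-3-5-sonnet-latest",  # placeholder model name
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": system_prompt,
                # Marks this block as a cache breakpoint so subsequent
                # requests can read it from the prompt cache.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": user_message}],
    }


payload = build_cached_request("Long, stable system instructions..." * 50, "Hello")
print(json.dumps(payload["system"][0]["cache_control"]))
```

In practice you would send this payload via the `anthropic` SDK or a plain HTTPS POST; the rate-limit change means only the non-cached portion of the prompt draws down your ITPM budget.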

→ View original post on X — @alexalbert__
