LangChain LLM Interface Improvements and New Features

There have been several additions to the LLM interface over the past few days: a method to estimate the number of tokens in a prompt, batching of inputs, returning extra provider information, and serialization of LLM configurations. Thanks to @AkashSamant4, @thepromptking, and @0xAwill. Let's walk through these improvements:
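Before diving in, here is a minimal, framework-free sketch of what these four additions look like in practice. The class and method names below (`EchoLLM`, `get_num_tokens`, `generate`, `save`, `llm_output`) are illustrative stand-ins modeled on the shape of LangChain's LLM interface, not its exact API.

```python
from dataclasses import dataclass, field
import json

@dataclass
class Generation:
    text: str

@dataclass
class LLMResult:
    generations: list                               # one list of Generation per input prompt
    llm_output: dict = field(default_factory=dict)  # extra provider information

class EchoLLM:
    """Hypothetical toy LLM used to illustrate the four interface additions."""

    def __init__(self, model_name: str = "echo-1", temperature: float = 0.0):
        self.model_name = model_name
        self.temperature = temperature

    def get_num_tokens(self, text: str) -> int:
        # Token estimation: a crude whitespace count here; real
        # implementations delegate to the model's tokenizer.
        return len(text.split())

    def generate(self, prompts: list) -> LLMResult:
        # Batching: one call accepts many prompts at once.
        gens = [[Generation(text=p.upper())] for p in prompts]
        # Extra information: returned alongside the generations
        # instead of being thrown away.
        extra = {
            "model_name": self.model_name,
            "prompt_tokens": sum(self.get_num_tokens(p) for p in prompts),
        }
        return LLMResult(generations=gens, llm_output=extra)

    def save(self, path: str) -> None:
        # Serialization: persist the configuration (not weights) so the
        # same LLM setup can be reloaded later.
        with open(path, "w") as f:
            json.dump({"model_name": self.model_name,
                       "temperature": self.temperature}, f)

llm = EchoLLM()
result = llm.generate(["hello world", "batching works"])
print(result.generations[0][0].text)       # HELLO WORLD
print(result.llm_output["prompt_tokens"])  # 4
```

The point of the sketch is the shape of the interface: batched input, structured output with a side channel for provider metadata, and a config object small enough to serialize to JSON.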