LangChain v0.0.38: New LLM Caching Feature Released

New in v0.0.38: LLM caching. Read more about it here: https://langchain.readthedocs.io/en/latest/examples/prompts/llm_functionality.html#Caching

Thanks to … @AkashSamant4 for making LLM configurations serializable, a prerequisite for this, and to @brucehammer and @DannyHabibs for highlighting this as a valuable feature.
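To illustrate why serializable LLM configurations matter for caching: a cache key must capture both the prompt and the model settings, so that the same prompt sent with different settings (e.g. a different temperature) is not served a stale answer. Below is a minimal, self-contained sketch of that idea in plain Python — the class and function names are illustrative, not LangChain's actual API; see the linked docs for the real usage.

```python
import hashlib
import json


class InMemoryLLMCache:
    """Toy in-memory cache for LLM completions.

    Illustrative only -- not LangChain's implementation. The key point
    is that the cache key is derived from the prompt *and* the
    serialized LLM configuration together.
    """

    def __init__(self):
        self._store = {}

    def _key(self, prompt, llm_config):
        # Serializing the config (sorted keys for determinism) means
        # identical (prompt, settings) pairs map to the same key.
        blob = json.dumps({"prompt": prompt, "config": llm_config},
                          sort_keys=True)
        return hashlib.sha256(blob.encode("utf-8")).hexdigest()

    def lookup(self, prompt, llm_config):
        return self._store.get(self._key(prompt, llm_config))

    def update(self, prompt, llm_config, completion):
        self._store[self._key(prompt, llm_config)] = completion


def cached_generate(llm_call, cache, prompt, llm_config):
    """Return a cached completion if present; otherwise call the LLM
    once and store the result for next time."""
    hit = cache.lookup(prompt, llm_config)
    if hit is not None:
        return hit
    completion = llm_call(prompt)
    cache.update(prompt, llm_config, completion)
    return completion
```

With this shape, repeating an identical call skips the (slow, billable) LLM request, while changing any config field produces a different key and forces a fresh call.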