Wdym loading documents into structured tools? Gonna add something like this today/tmrw as well! @hwchase17
-
Question Answering with Citations Using LangChain Functions
Question Answering with citations: Ahead of our webinar on Wednesday, more `functions` goodness from @jxnlco: answer a question (with citations) from a piece of context. Uses `functions` to specify the return schema of the answer + supporting facts. https://python.langchain.com/docs/modules/chains/additional/qa_citations
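As a rough sketch of the idea (the function name and field names below are hypothetical, not the chain's actual schema), the return schema for "answer + supporting facts" can be expressed as a JSON-schema function definition passed to the model:

```python
# Hypothetical function schema: the model is asked to "call" this function,
# which forces its reply into an answer-plus-citations structure.
qa_with_citations_fn = {
    "name": "answer_with_citations",  # hypothetical name
    "description": "Answer the question using only the given context.",
    "parameters": {
        "type": "object",
        "properties": {
            "answer": {"type": "string", "description": "The final answer."},
            "citations": {
                "type": "array",
                "description": "Verbatim supporting facts from the context.",
                "items": {"type": "string"},
            },
        },
        "required": ["answer", "citations"],
    },
}
```

The model then fills in `answer` and `citations` as the function's arguments instead of free-form text.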
-
Memory Types and Reflection for Improved Chatbot Performance
The most interesting thing here IMO is the exploration of multiple different types of "memory" for creating chatbots
— Harrison Chase (@hwchase17) June 19, 2023
Baseline would just be retrieval over the raw corpus, but by doing some reflection-like things as a preprocessing step, you can get better results. https://t.co/XKeHg3Rgp0
-
Webinar on Functions: Use Cases and Q&A Session
To hear more about `functions`, join us (me, Atty, @fpingham, and @jxnlco) for an exciting webinar this Wednesday! We'll cover how to use it, what some common use cases are, and then answer any and all questions!
-
Adding Function Chains: Extraction, Tagging, Question-Answering
So that's the general formula for how we're adding `functions` chains. So far we've added:
– Extraction
– Tagging
– Question-Answering with citations
We're EXTREMELY open to contributions here – with this formula it should be a pretty easy addition.
-
LLM Functions and Pydantic Schema Parsing Integration
llm_kwargs: this is where we specify `functions` and `function_call`
output_parser: this is where we parse the `function_call` response into either a string, JSON object, or Pydantic object
@pydantic is really nice for letting users specify schema in a Pythonic way!
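To illustrate the "parse into a typed object" step without extra dependencies, here is a minimal sketch using a stdlib dataclass as a stand-in for a Pydantic model (all names and the schema are hypothetical):

```python
import json
from dataclasses import dataclass

# Stand-in for a Pydantic model defining the expected output schema.
@dataclass
class Answer:
    answer: str
    facts: list

def parse_function_call(message: dict) -> Answer:
    """Parse the `function_call` arguments on a chat response message
    into a typed object, roughly what the output_parser does."""
    args = json.loads(message["function_call"]["arguments"])
    return Answer(**args)

# Example response message in the OpenAI chat format.
msg = {
    "content": None,
    "function_call": {
        "name": "answer_with_facts",  # hypothetical function name
        "arguments": '{"answer": "42", "facts": ["stated in the context"]}',
    },
}
result = parse_function_call(msg)
```

With Pydantic instead of a dataclass, field validation comes for free, which is exactly why specifying schema that way is attractive.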
-
LLM Configuration and Prompt Templates Setup Guide
Breaking that down:
llm: this is the language model; at the moment it needs to be an @OpenAI Chat model
prompt: this is the prompt template to use. Generally it should be a list of message templates
This is pretty standard, nothing new so far.
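A "list of message templates" can be sketched in plain Python like this (the templates and the helper below are hypothetical, mimicking what a chat prompt template does):

```python
# Hypothetical message templates: (role, template) pairs whose placeholders
# are filled with the user's input before being sent to the chat model.
prompt_messages = [
    ("system", "You are a helpful assistant that extracts information."),
    ("human", "Extract the requested fields from this text:\n{input}"),
]

def format_messages(messages, **kwargs):
    """Fill each template's placeholders, returning chat-format messages."""
    return [
        {"role": role, "content": tmpl.format(**kwargs)}
        for role, tmpl in messages
    ]

formatted = format_messages(prompt_messages, input="Jason is 25 years old.")
```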
-
Creating LLM Chains with LangChain Method Pattern
Let's put it all together! We've created (with the help of @fpingham and @jxnlco) a few chains using this method. All follow the same pattern:
```
chain = LLMChain(
    llm=llm,
    prompt=prompt,
    llm_kwargs=llm_kwargs,
    output_parser=output_parser,
)
```
-
Function Call OutputParser for API Response Handling
When doing this, the `content` field on the response message will be blank. The real response is in the `function_call` section. To make it easier to use that information, we've added an OutputParser that picks out that information.
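A minimal sketch of what such a parser does (the response shape follows the OpenAI chat format; the parser function itself is hypothetical):

```python
import json

def parse_response_message(message: dict):
    """If the model answered via a function call, return its parsed
    arguments; otherwise fall back to the plain `content` string."""
    call = message.get("function_call")
    if call is not None:  # `content` is blank in this case
        return json.loads(call["arguments"])
    return message["content"]

msg = {
    "content": "",  # blank when the model uses a function call
    "function_call": {
        "name": "tag_text",  # hypothetical function name
        "arguments": '{"sentiment": "positive"}',
    },
}
parsed = parse_response_message(msg)
```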
-
Force AI Function Output Format with function_call Parameter
You can do this by passing in not only the `functions` parameter but also the `function_call` parameter. The `function_call` parameter forces the model to respond using a particular function – allowing you to guarantee the output in a specific format.
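Concretely, the two parameters sit side by side in the request, something like this (the schema below is a hypothetical example):

```python
# Passing `functions` alone lets the model decide whether to call one.
# Adding `function_call` with a specific name forces that function,
# so the output is guaranteed to match its parameter schema.
extract_person = {
    "name": "extract_person",  # hypothetical schema
    "parameters": {
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "age": {"type": "integer"},
        },
        "required": ["name", "age"],
    },
}

llm_kwargs = {
    "functions": [extract_person],
    "function_call": {"name": "extract_person"},  # force this function
}
```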