Returning extra information

Previously, the LLM interface returned only the most likely string. There is now a new interface that returns more information: the top n strings for each input, as well as LLM-specific information (h/t @0xAwill for adding the token counts you see below).
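As a rough sketch of what such an interface could look like (the names `Completion`, `LLMResult`, and `best_strings` are hypothetical, not the project's actual API), each input maps to a ranked list of candidate strings, each carrying token-count metadata:

```python
from dataclasses import dataclass

@dataclass
class Completion:
    text: str
    # LLM-specific metadata: token counts for prompt and completion
    prompt_tokens: int
    completion_tokens: int

@dataclass
class LLMResult:
    # One list of candidates per input, most likely first
    completions: list[list[Completion]]

def best_strings(result: LLMResult) -> list[str]:
    """Recover the old behaviour: only the most likely string per input."""
    return [candidates[0].text for candidates in result.completions]

result = LLMResult(completions=[
    [Completion("Paris", 5, 1), Completion("Lyon", 5, 1)],
    [Completion("Berlin", 5, 1)],
])
print(best_strings(result))  # → ['Paris', 'Berlin']
```

Callers that only want the old single-string behaviour can keep using a helper like `best_strings`, while new callers can inspect the full ranked lists and token counts.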
LLM Interface Enhancement: Extended Output with Top Results and Token Counts