Estimating the number of tokens

Each model has a context window of a certain length: you can only pass in strings of up to that many tokens. There is now a method on each LLM class to calculate the number of tokens in a string for that model.
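As an illustrative sketch (the class and method names here are hypothetical, not the library's actual API), such a method might look like the following. The word-and-punctuation count is only a rough placeholder; real implementations delegate to a model-specific tokenizer such as tiktoken.

```python
import re


class LLM:
    """Hypothetical base class for illustration; not a specific library's API."""

    def get_num_tokens(self, text: str) -> int:
        # Placeholder approximation: count word chunks and punctuation marks.
        # Real models use their own subword (e.g. BPE) tokenizers.
        return len(re.findall(r"\w+|[^\w\s]", text))

    def fits_context_window(self, text: str, context_window: int) -> bool:
        # Reject prompts that would overflow the model's context window.
        return self.get_num_tokens(text) <= context_window


llm = LLM()
print(llm.get_num_tokens("Hello, world!"))  # "Hello", ",", "world", "!" -> 4
print(llm.fits_context_window("Hello, world!", context_window=4096))
```

Having the count live on the model class lets each model substitute its own tokenizer, since the same string can tokenize to different lengths under different models.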