Fun fact: LLaMA-Adapter is not LLaMA-specific. You can use it to finetune any LLM.
— Sebastian Raschka (@rasbt) June 8, 2023
Below, I finetuned a 40B Falcon model using LLaMA-Adapter (it also works on a single GPU with 20 GB RAM).
Have been heads down tinkering & will follow up with more benchmarks! https://t.co/2Fr6faChv2
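The reason LLaMA-Adapter transfers to other LLMs like Falcon is that its core mechanism is model-agnostic: a learnable adapter branch is added to the frozen model's attention output through a zero-initialized gate, so training starts exactly at the base model's behavior. Below is a minimal, hypothetical sketch of that zero-init gating idea in plain Python (illustrative only, not the actual implementation):

```python
# Minimal sketch of LLaMA-Adapter's zero-init gating (illustrative,
# not the real implementation): the adapter branch is scaled by a
# learnable gate that starts at 0, so at the beginning of finetuning
# the combined output equals the frozen model's output.

def gated_adapter_output(frozen_out, adapter_out, gate):
    """Combine the frozen-model output with the adapter branch via a gate."""
    return [f + gate * a for f, a in zip(frozen_out, adapter_out)]

frozen = [0.5, -1.2, 3.0]   # output of the frozen attention layer (toy values)
adapter = [0.1, 0.4, -0.2]  # output of the learnable adaption-prompt branch

# With gate = 0.0 (its initial value), the base model is unchanged:
print(gated_adapter_output(frozen, adapter, 0.0))  # → [0.5, -1.2, 3.0]
```

As the gate is trained away from zero, the adapter's influence is blended in gradually, which is what makes the method stable to train on top of any frozen transformer.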