Omar Hosney

PEFT (Parameter-Efficient Fine-Tuning) πŸ€— Cheat Sheet

1. Introduction to PEFT πŸš€

2. PEFT Methodologies 🎯

3. Adapter Methods 🧩

4. Quick Tour of PEFT πŸš€

5. Advanced Applications 🌟

6. Advanced Configurations πŸ› οΈ

7. Model Merging & Quantization πŸ› οΈ
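
The quick-tour workflow behind the contents above is always the same three steps: wrap a base model with a PEFT config, train as usual, and save only the adapter weights. A minimal sketch, assuming an illustrative base model (facebook/opt-350m) and LoRA hyperparameters that are not taken from this page:

```python
# pip install peft transformers
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

# 1. Load any base model (the model name is just an example).
base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")

# 2. Wrap it with a PEFT config; only the adapter weights become trainable.
config = LoraConfig(task_type=TaskType.CAUSAL_LM, r=8, lora_alpha=16, lora_dropout=0.05)
model = get_peft_model(base_model, config)
model.print_trainable_parameters()  # prints the small trainable fraction

# 3. After training, save just the adapter (megabytes, not the full model).
model.save_pretrained("opt-350m-lora-adapter")
```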


Different Adapters

Low-Rank Adaptation (LoRA) ✨

Mixture of LoRA Experts (X-LoRA) πŸ€–

Low-Rank Hadamard Product (LoHa) 🧩

Low-Rank Kronecker Product (LoKr) πŸ”—

Orthogonal Finetuning (OFT) 🎯

Orthogonal Butterfly (BOFT) πŸ¦‹

Adaptive Low-Rank Adaptation (AdaLoRA) πŸ› οΈ

Llama-Adapter πŸ¦™
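
Every adapter listed above follows the same pattern in πŸ€— PEFT: pick the matching config class, then call get_peft_model. A sketch contrasting two of the low-rank variants, with illustrative hyperparameters and an example base model:

```python
from transformers import AutoModelForCausalLM
from peft import LoHaConfig, LoKrConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")  # example model

# LoHa: expresses the weight update as a Hadamard product of low-rank factors.
loha = LoHaConfig(r=8, alpha=16, target_modules=["q_proj", "v_proj"], module_dropout=0.1)

# LoKr: the same idea with a Kronecker product, often even more parameter-frugal.
lokr = LoKrConfig(r=8, alpha=16, target_modules=["q_proj", "v_proj"])

model = get_peft_model(base, loha)  # or: get_peft_model(base, lokr)
model.print_trainable_parameters()
```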


Soft Prompts

πŸ“Œ Prompt Tuning

πŸ“Œ Prefix Tuning

πŸ“Œ P-Tuning

πŸ“Œ Multitask Prompt Tuning
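
Unlike adapters, the soft-prompt methods above leave the model's weights entirely untouched and instead learn a small set of virtual token embeddings. A minimal sketch of prompt tuning and prefix tuning, again with an example base model and illustrative settings:

```python
from transformers import AutoModelForCausalLM
from peft import (PromptTuningConfig, PromptTuningInit, PrefixTuningConfig,
                  TaskType, get_peft_model)

base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")  # example model

# Prompt tuning: learn a handful of virtual token embeddings, optionally
# initialized from real text; the base model stays frozen.
prompt_cfg = PromptTuningConfig(
    task_type=TaskType.CAUSAL_LM,
    num_virtual_tokens=8,
    prompt_tuning_init=PromptTuningInit.TEXT,
    prompt_tuning_init_text="Classify the sentiment of this review:",
    tokenizer_name_or_path="facebook/opt-350m",
)

# Prefix tuning: learn virtual key/value prefixes for every attention layer.
prefix_cfg = PrefixTuningConfig(task_type=TaskType.CAUSAL_LM, num_virtual_tokens=20)

model = get_peft_model(base, prompt_cfg)  # or: get_peft_model(base, prefix_cfg)
model.print_trainable_parameters()
```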


IA3 and BOFT πŸ“

IA3 Overview πŸš€

IA3 in Practice πŸ’‘

OFT and BOFT Overview βš™οΈ

BOFT Key Features πŸ”‘

BOFT Parameters πŸ“Š

Example Usage πŸ› οΈ
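
Putting the IA3 and BOFT pieces above together, a minimal usage sketch (the base model and all hyperparameters are illustrative, not from this page):

```python
from transformers import AutoModelForCausalLM
from peft import IA3Config, BOFTConfig, TaskType, get_peft_model

base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")  # example model

# IA3: learn per-channel scaling vectors for keys, values, and a feed-forward
# projection; feedforward_modules must be a subset of target_modules.
ia3_cfg = IA3Config(
    task_type=TaskType.CAUSAL_LM,
    target_modules=["k_proj", "v_proj", "fc2"],
    feedforward_modules=["fc2"],
)

# BOFT: multiply frozen weights by a product of sparse orthogonal butterfly
# factors; boft_block_size must divide the target layer's input dimension.
boft_cfg = BOFTConfig(
    boft_block_size=4,
    boft_n_butterfly_factor=2,
    target_modules=["q_proj", "v_proj"],
    boft_dropout=0.1,
    bias="boft_only",
)

model = get_peft_model(base, ia3_cfg)  # or: get_peft_model(base, boft_cfg)
model.print_trainable_parameters()
```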