---
language:
  - en
library_name: transformers
pipeline_tag: text-generation
datasets:
  - yahma/alpaca-cleaned
license: apache-2.0
---

# speechless-mistral-moloras-7b

**This model is released for testing purposes only.**

The mixture-of-multi-loras router assembles LoRA modules automatically: it uses a gradient-free approach to obtain the combination coefficients of the LoRA modules and requires only a handful of inference steps to adapt to unseen tasks.

In total, 6 LoRA modules are extracted from speechless-mistral-7b-dare-0.85.

Code: https://github.com/uukuguy/multi_loras?tab=readme-ov-file#mixture-of-multi-loras
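To make the routing idea concrete, here is a minimal numpy sketch of combining multiple LoRA deltas with per-module coefficients found by gradient-free search. This is an illustration only, not the repository's actual implementation: the shapes, the toy scoring objective, and the random-search strategy are all assumptions.

```python
import numpy as np

# Toy setup: a frozen base weight W and several LoRA pairs (A_i, B_i).
# The effective weight is W + sum_i c_i * (B_i @ A_i); the coefficients
# c_i are chosen without gradients, here by scoring random candidates
# on a tiny batch (standing in for "a handful of inference steps").
rng = np.random.default_rng(0)
d, r, n_loras = 8, 2, 3

W = rng.normal(size=(d, d))                  # frozen base weight
loras = [(rng.normal(size=(r, d)),           # A_i: down-projection
          rng.normal(size=(d, r)))           # B_i: up-projection
         for _ in range(n_loras)]

def assemble(coeffs):
    """Merge the LoRA deltas into the base weight with per-module coefficients."""
    delta = sum(c * B @ A for c, (A, B) in zip(coeffs, loras))
    return W + delta

# Toy objective: match outputs produced by a known coefficient vector.
x = rng.normal(size=(4, d))
y = x @ assemble(np.array([0.5, 0.2, 0.3])).T

def loss(coeffs):
    return float(np.mean((x @ assemble(coeffs).T - y) ** 2))

# Gradient-free search: evaluate a few candidate vectors, keep the best.
candidates = rng.uniform(0.0, 1.0, size=(64, n_loras))
best = min(candidates, key=loss)
```

The same pattern scales to real checkpoints by replacing `assemble` with a per-layer merge of the extracted LoRA weights and `loss` with a score computed from a few forward passes.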

## LM-Evaluation-Harness (Open LLM Leaderboard)

| Metric     | Value |
| ---------- | ----- |
| ARC        | 59.98 |
| HellaSwag  | 83.29 |
| MMLU       | 64.12 |
| TruthfulQA | 42.15 |
| Winogrande | 78.37 |
| GSM8K      | 37.68 |
| Average    | 60.93 |