---
license: apache-2.0
datasets:
  - wikitext
  - ptb_text_only
language:
  - en
metrics:
  - perplexity
pipeline_tag: text-generation
model-index:
  - name: distilgpt2
    results:
      - task:
          type: text-generation
        dataset:
          name: penn_treebank
          type: ptb_text_only
        metrics:
          - name: perplexity@BASELINE
            type: dmx-perplexity
            value: 63.45857238769531
          - name: perplexity@FALLBACK
            type: dmx-perplexity
            value: 64.36720275878906
      - task:
          type: text-generation
        dataset:
          name: wikitext2
          type: wikitext-2-raw-v1
        metrics:
          - name: perplexity@BASELINE
            type: dmx-perplexity
            value: 46.05925369262695
          - name: perplexity@FALLBACK
            type: dmx-perplexity
            value: 46.570838928222656
---

This is a d-Matrix functional reference of the GPT2 model family, covering the following revisions:

The reference provides the following functional configurations:

| Configuration | Explanation |
| --- | --- |
| `BASELINE` | a reference functionally equivalent to the original model |
| `BASIC` | all linear algebraic operands quantized to BFP16-64, and all other operations transformed to approximated kernel simulations |
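To illustrate the block floating-point idea behind a format like BFP16-64, here is a generic sketch of block quantization: a block of values shares a single exponent and each element keeps only a narrow signed mantissa. This is a hypothetical illustration of the general technique, not the exact d-Matrix format; `bfp_quantize` and its parameters are made up for this example.

```python
import math

def bfp_quantize(block, mantissa_bits=8):
    """Quantize a block of floats to signed mantissas sharing one exponent.

    Generic block floating-point sketch (not the d-Matrix BFP16-64 format):
    the whole block shares an exponent derived from its largest magnitude,
    and each element is rounded to a narrow signed-integer mantissa.
    Returns the dequantized (lossy) values.
    """
    amax = max(abs(v) for v in block)
    if amax == 0.0:
        return list(block)
    shared_exp = math.floor(math.log2(amax))
    # Step size chosen so the largest value still fits in the mantissa range.
    scale = 2.0 ** (shared_exp - mantissa_bits + 2)
    lo, hi = -(2 ** (mantissa_bits - 1)), 2 ** (mantissa_bits - 1) - 1
    mantissas = [min(hi, max(lo, round(v / scale))) for v in block]
    return [m * scale for m in mantissas]
```

Values close to the block maximum survive nearly intact, while values much smaller than the maximum lose precision (or round to zero), which is the characteristic trade-off of shared-exponent formats.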

## Usage

Install d-Matrix ML Tools first:

```shell
pip install dmx-mltools
```

```python
from mltools.dmx import pipeline

pipe = pipeline(
    task="text-generation",
    model="d-matrix/gpt2",
    revision="temp-distilgpt2",
    dmx_config="BASELINE",  # see the configuration table above
    trust_remote_code=True,
    # device_map="auto",  # enable model parallelism on multi-GPU nodes
)

results = pipe.evaluate(metric="d-matrix/perplexity", dataset="wikitext-2")
```
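The reported metric is perplexity, the exponential of the average per-token negative log-likelihood. A minimal sketch of the computation (an illustration of the metric's definition, not the `d-matrix/perplexity` implementation):

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp(mean negative log-likelihood over all tokens)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# If every token has probability 1/2 (NLL = ln 2), perplexity is close to 2.
print(perplexity([math.log(2)] * 4))
```

Lower is better: a perplexity of 46 roughly means the model is, on average, as uncertain as if it were choosing uniformly among 46 tokens at each step.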

## Evaluation results

Perplexity (lower is better) of the functional configurations, as recorded in the model-index metadata above:

| Dataset | `perplexity@BASELINE` | `perplexity@FALLBACK` |
| --- | --- | --- |
| penn_treebank (`ptb_text_only`) | 63.4586 | 64.3672 |
| wikitext2 (`wikitext-2-raw-v1`) | 46.0593 | 46.5708 |