
AMRBART is a pretrained semantic parser that converts a sentence into an abstract meaning representation (AMR) graph. You may find our paper here (arXiv). The original implementation is available here.


News🎈

  • (2022/12/10) Fixed max_length bugs in AMR parsing and updated the results.
  • (2022/10/16) Released the AMRBART-v2 model, which is simpler, faster, and stronger.

Requirements

  • python 3.8
  • pytorch 1.8
  • transformers 4.21.3
  • datasets 2.4.0
  • Tesla V100 or A100

We recommend using conda to manage virtual environments:

conda env update --name <env> --file requirements.yml
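
Once the environment has been created or updated, activate it before running any of the commands below (standard conda usage; replace <env> with the name you chose above):

conda activate <env>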

Data Processing

You may download the AMR corpora at LDC.

Please follow this repository to preprocess AMR graphs:

bash run-process-acl2022.sh

Usage

Our model is available on Hugging Face. Here is how to initialize an AMR parsing model in PyTorch:

from transformers import BartForConditionalGeneration
from model_interface.tokenization_bart import AMRBartTokenizer      # We use our own tokenizer to process AMRs

model = BartForConditionalGeneration.from_pretrained("xfbai/AMRBART-large-finetuned-AMR3.0-AMRParsing-v2")
tokenizer = AMRBartTokenizer.from_pretrained("xfbai/AMRBART-large-finetuned-AMR3.0-AMRParsing-v2")
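
For reference, here is a minimal parsing sketch built on the snippet above. It assumes the custom tokenizer supports the standard tokenizer call and decode; the example sentence, max_length, and beam size are illustrative, and the original repository applies additional AMR-specific pre- and post-processing:

import torch

sentence = "The boy wants to go."
# Encode the sentence as a standard seq2seq input.
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    # Beam search; max_length matters here (see the 2022/12/10 bug-fix note above).
    output_ids = model.generate(**inputs, max_length=512, num_beams=5)
# Decode the predicted linearized AMR graph.
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))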