Longformer
This model was released on 2020-04-10 and added to Hugging Face Transformers on 2020-11-16.
Longformer is a transformer model designed for processing long documents. The self-attention operation usually scales quadratically with sequence length, preventing transformers from processing longer sequences. The Longformer attention mechanism overcomes this by scaling linearly with sequence length. It combines local windowed attention with task-specific global attention, enabling efficient processing of documents with thousands of tokens.
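The size of the local window is exposed through `LongformerConfig`. The sketch below is illustrative only; `attention_window` is a real configuration parameter, but the value shown is an assumption rather than a recommended setting.

```py
from transformers import LongformerConfig, LongformerModel

# attention_window sets the local sliding-window size; it accepts a single
# value shared by all layers or a list with one value per layer
config = LongformerConfig(attention_window=512)

# randomly initialized model, for illustration only
model = LongformerModel(config)
```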
You can find all the original Longformer checkpoints under the Ai2 organization.
The example below demonstrates how to fill the `<mask>` token with `Pipeline`, `AutoModel`, and from the command line.
```py
import torch
from transformers import pipeline

pipeline = pipeline(
    task="fill-mask",
    model="allenai/longformer-base-4096",
    dtype=torch.float16,
    device=0,
)
pipeline(
    """San Francisco 49ers cornerback Shawntae Spencer will miss the rest of the <mask> with a torn ligament in his left knee.

Spencer, a fifth-year pro, will be placed on injured reserve soon after undergoing surgery Wednesday to repair the ligament. He injured his knee late in the 49ers’ road victory at Seattle on Sept. 14, and missed last week’s victory over Detroit.

Tarell Brown and Donald Strickland will compete to replace Spencer with the 49ers, who kept 12 defensive backs on their 53-man roster to start the season. Brown, a second-year pro, got his first career interception last weekend while filling in for Strickland, who also sat out with a knee injury."""
)
```

```py
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")
model = AutoModelForMaskedLM.from_pretrained("allenai/longformer-base-4096")

text = """San Francisco 49ers cornerback Shawntae Spencer will miss the rest of the <mask> with a torn ligament in his left knee.

Spencer, a fifth-year pro, will be placed on injured reserve soon after undergoing surgery Wednesday to repair the ligament. He injured his knee late in the 49ers’ road victory at Seattle on Sept. 14, and missed last week’s victory over Detroit.

Tarell Brown and Donald Strickland will compete to replace Spencer with the 49ers, who kept 12 defensive backs on their 53-man roster to start the season. Brown, a second-year pro, got his first career interception last weekend while filling in for Strickland, who also sat out with a knee injury."""

input_ids = tokenizer([text], return_tensors="pt")["input_ids"]
logits = model(input_ids).logits

masked_index = (input_ids[0] == tokenizer.mask_token_id).nonzero().item()
probs = logits[0, masked_index].softmax(dim=0)
values, predictions = probs.topk(5)
tokenizer.decode(predictions).split()
```

```bash
echo -e "San Francisco 49ers cornerback Shawntae Spencer will miss the rest of the <mask> with a torn ligament in his left knee." | transformers run --task fill-mask --model allenai/longformer-base-4096 --device 0
```
- Longformer is based on RoBERTa and doesn’t have `token_type_ids`. You don’t need to indicate which token belongs to which segment; just separate the segments with the separator token `</s>` or `tokenizer.sep_token`.
- You can set which tokens attend locally and which attend globally with the `global_attention_mask` at inference (see this example for more details; a minimal sketch also follows this list). A value of `0` means a token attends locally and a value of `1` means a token attends globally.
- `LongformerForMaskedLM` is trained like `RobertaForMaskedLM` and should be used as shown below. Note that the mask token for the RoBERTa tokenizer is `<mask>`, not `[MASK]`.

```py
input_ids = tokenizer.encode("This is a sentence from <mask> training data", return_tensors="pt")
mlm_labels = tokenizer.encode("This is a sentence from the training data", return_tensors="pt")
loss = model(input_ids, labels=mlm_labels).loss
```
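Here is a minimal sketch of the first two notes above at inference, assuming a toy two-segment input where only the first (`<s>`) token needs to attend globally; the segment text is hypothetical.

```py
import torch
from transformers import AutoTokenizer, LongformerModel

tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")
model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

# No token_type_ids: separate the two segments with the separator token instead
text = "first segment" + tokenizer.sep_token + "second segment"
inputs = tokenizer(text, return_tensors="pt")

# 0 = local attention, 1 = global attention; here only the <s> token attends globally
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

outputs = model(**inputs, global_attention_mask=global_attention_mask)
```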
LongformerConfig
[[autodoc]] LongformerConfig
LongformerTokenizer
[[autodoc]] LongformerTokenizer
LongformerTokenizerFast
[[autodoc]] LongformerTokenizerFast
Longformer specific outputs
[[autodoc]] models.longformer.modeling_longformer.LongformerBaseModelOutput
[[autodoc]] models.longformer.modeling_longformer.LongformerBaseModelOutputWithPooling
[[autodoc]] models.longformer.modeling_longformer.LongformerMaskedLMOutput
[[autodoc]] models.longformer.modeling_longformer.LongformerQuestionAnsweringModelOutput
[[autodoc]] models.longformer.modeling_longformer.LongformerSequenceClassifierOutput
[[autodoc]] models.longformer.modeling_longformer.LongformerMultipleChoiceModelOutput
[[autodoc]] models.longformer.modeling_longformer.LongformerTokenClassifierOutput
LongformerModel
[[autodoc]] LongformerModel
    - forward
LongformerForMaskedLM
[[autodoc]] LongformerForMaskedLM
    - forward
LongformerForSequenceClassification
[[autodoc]] LongformerForSequenceClassification
    - forward
LongformerForMultipleChoice
[[autodoc]] LongformerForMultipleChoice
    - forward
LongformerForTokenClassification
[[autodoc]] LongformerForTokenClassification
    - forward
LongformerForQuestionAnswering
[[autodoc]] LongformerForQuestionAnswering
    - forward