ERNIE
This model was released on 2019-04-19 and added to Hugging Face Transformers on 2022-09-09.
ERNIE 1.0, ERNIE 2.0, ERNIE 3.0, ERNIE-Gram, and ERNIE-Health are a series of powerful models proposed by Baidu, with particularly strong performance on Chinese tasks.
ERNIE (Enhanced Representation through kNowledge IntEgration) is designed to learn language representations enhanced by knowledge masking strategies, which include entity-level masking and phrase-level masking.
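The intuition is that masking a whole entity or phrase forces the model to use external knowledge rather than local cues. The following is a minimal, illustrative sketch of that idea (not Baidu's actual pretraining code); the tokens and entity span are hypothetical.

```py
# Illustrative sketch of entity-level masking (not Baidu's implementation).
# BERT-style token masking might hide only "harry", which the model can
# recover trivially from "potter". ERNIE masks the whole entity span.
tokens = ["harry", "potter", "is", "a", "series", "of", "fantasy", "novels"]
entity_spans = [(0, 2)]  # hypothetical span covering the entity "harry potter"

masked = list(tokens)
for start, end in entity_spans:
    for i in range(start, end):
        masked[i] = "[MASK]"

print(masked)
# ['[MASK]', '[MASK]', 'is', 'a', 'series', 'of', 'fantasy', 'novels']
```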
Other ERNIE models released by Baidu can be found at Ernie 4.5 and Ernie 4.5 MoE.
Click on the ERNIE models in the right sidebar for more examples of how to apply ERNIE to different language tasks.
The example below demonstrates how to predict the [MASK] token with Pipeline, AutoModel, and from the command line.
```py
from transformers import pipeline

pipeline = pipeline(task="fill-mask", model="nghuyong/ernie-3.0-xbase-zh")
pipeline("巴黎是[MASK]国的首都。")
```

```py
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-3.0-xbase-zh")
model = AutoModelForMaskedLM.from_pretrained(
    "nghuyong/ernie-3.0-xbase-zh",
    dtype=torch.float16,
    device_map="auto",
)
inputs = tokenizer("巴黎是[MASK]国的首都。", return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model(**inputs)
    predictions = outputs.logits

masked_index = torch.where(inputs["input_ids"] == tokenizer.mask_token_id)[1]
predicted_token_id = predictions[0, masked_index].argmax(dim=-1)
predicted_token = tokenizer.decode(predicted_token_id)

print(f"The predicted token is: {predicted_token}")
```

```bash
echo -e "巴黎是[MASK]国的首都。" | transformers run --task fill-mask --model nghuyong/ernie-3.0-xbase-zh --device 0
```

Model variants are available in different sizes and languages; a loading sketch follows the table below.
| Model Name | Language | Description |
|---|---|---|
| ernie-1.0-base-zh | Chinese | Layers: 12, Heads: 12, Hidden: 768 |
| ernie-2.0-base-en | English | Layers: 12, Heads: 12, Hidden: 768 |
| ernie-2.0-large-en | English | Layers: 24, Heads: 16, Hidden: 1024 |
| ernie-3.0-base-zh | Chinese | Layers: 12, Heads: 12, Hidden: 768 |
| ernie-3.0-medium-zh | Chinese | Layers: 6, Heads: 12, Hidden: 768 |
| ernie-3.0-mini-zh | Chinese | Layers: 6, Heads: 12, Hidden: 384 |
| ernie-3.0-micro-zh | Chinese | Layers: 4, Heads: 12, Hidden: 384 |
| ernie-3.0-nano-zh | Chinese | Layers: 4, Heads: 12, Hidden: 312 |
| ernie-health-zh | Chinese | Layers: 12, Heads: 12, Hidden: 768 |
| ernie-gram-zh | Chinese | Layers: 12, Heads: 12, Hidden: 768 |
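As a minimal sketch, any variant in the table can be loaded by name. This assumes the checkpoints are hosted under the nghuyong namespace on the Hub (see Resources below); swap in any other model name from the table.

```py
import torch
from transformers import AutoModel, AutoTokenizer

# Load the English large variant from the table above; for the smallest
# Chinese variant, use "nghuyong/ernie-3.0-nano-zh" instead.
tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-2.0-large-en")
model = AutoModel.from_pretrained(
    "nghuyong/ernie-2.0-large-en",
    dtype=torch.float16,
    device_map="auto",
)

inputs = tokenizer("ERNIE models come in several sizes.", return_tensors="pt").to(model.device)
with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state

print(hidden_states.shape)  # (1, seq_len, 1024): hidden size matches the table row
```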
Resources
You can find all the supported models on Hugging Face's model hub at huggingface.co/nghuyong, and model details in PaddlePaddle's official repos: PaddleNLP and ERNIE's legacy branch.
ErnieConfig
[[autodoc]] ErnieConfig
    - all
Ernie specific outputs
[[autodoc]] models.ernie.modeling_ernie.ErnieForPreTrainingOutput
ErnieModel
[[autodoc]] ErnieModel
    - forward
ErnieForPreTraining
[[autodoc]] ErnieForPreTraining
    - forward
ErnieForCausalLM
[[autodoc]] ErnieForCausalLM
    - forward
ErnieForMaskedLM
[[autodoc]] ErnieForMaskedLM
    - forward
ErnieForNextSentencePrediction
[[autodoc]] ErnieForNextSentencePrediction
    - forward
ErnieForSequenceClassification
[[autodoc]] ErnieForSequenceClassification
    - forward
ErnieForMultipleChoice
[[autodoc]] ErnieForMultipleChoice
    - forward
ErnieForTokenClassification
[[autodoc]] ErnieForTokenClassification
    - forward
ErnieForQuestionAnswering
[[autodoc]] ErnieForQuestionAnswering
    - forward