Updated: 2023-12-02 22:27:22
The models from Hugging Face can be used out of the box with the transformers
library. They can be used with different backends (TensorFlow, PyTorch).
Using Hugging Face transformers with Keras is shown here (in the "create_model" function).
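The idea behind that pattern can be sketched roughly as follows. This is a minimal sketch, not the linked article's actual code: the function name `create_model`, the learning rate, and the loss/metric choices are all assumptions. Hugging Face's TF model classes subclass `keras.Model`, so they can be compiled and fit like any Keras model; `from_pt=True` converts PyTorch-only checkpoints on the fly.

```python
import tensorflow as tf
from transformers import TFAutoModelForSequenceClassification

def create_model(checkpoint="Hate-speech-CNERG/dehatebert-mono-arabic", lr=2e-5):
    # Load the transformer as a Keras model (converting from PyTorch weights
    # if the checkpoint ships no TF weights). Downloads on first call.
    model = TFAutoModelForSequenceClassification.from_pretrained(
        checkpoint, from_pt=True
    )
    # Compile like any other Keras model; the model outputs raw logits,
    # hence from_logits=True.
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )
    return model
```

Calling `create_model()` triggers the weight download, so the function definition itself stays cheap.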
Generally speaking, you can load a Hugging Face transformer using the example code from the model card (the "Use in Transformers" button):
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("Hate-speech-CNERG/dehatebert-mono-arabic")
model = AutoModelForSequenceClassification.from_pretrained("Hate-speech-CNERG/dehatebert-mono-arabic")
Then, to run inference, the documentation shows how to get the output from a loaded transformer model:
import torch

inputs = tokenizer("صباح الخير", return_tensors="pt")
# We're not interested in labels here, just the model's inference
# labels = torch.tensor([1]).unsqueeze(0)  # Batch size 1
outputs = model(**inputs)  # , labels=labels)
# The model returns logits, but we want probabilities, so we apply softmax
probs = torch.softmax(outputs.logits, dim=1)
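To turn those probabilities into a readable prediction, you can take the argmax and look the index up in a label mapping. The helper below is a minimal sketch: the `id2label` dict passed in the usage example is hypothetical (a real checkpoint exposes its own mapping via `model.config.id2label`), and the logits are made up for illustration.

```python
import torch

def logits_to_label(logits: torch.Tensor, id2label: dict):
    # Convert raw logits to probabilities over the classes
    probs = torch.softmax(logits, dim=1)
    # Index of the most likely class for the first (and only) example
    pred_id = int(probs.argmax(dim=1).item())
    return id2label[pred_id], float(probs[0, pred_id])

# Usage with made-up logits for a hypothetical 2-class mapping:
example_logits = torch.tensor([[0.3, 2.1]])
label, confidence = logits_to_label(example_logits, {0: "NON_HATE", 1: "HATE"})
```

In practice you would pass `outputs.logits` and `model.config.id2label` instead of the made-up values.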