
GitHub Repository: huggingface/notebooks
Path: blob/main/course/fr/chapter2/section2_tf.ipynb
Kernel: Python 3

Behind the pipeline (TensorFlow)

Install the 🤗 Transformers library to run this notebook.

!pip install transformers[sentencepiece]
from transformers import pipeline

classifier = pipeline("sentiment-analysis", model="tblard/tf-allocine")
classifier(
    ["J'ai attendu un cours d'HuggingFace toute ma vie.", "Je déteste tellement ça !"]
)
from transformers import AutoTokenizer

checkpoint = "tblard/tf-allocine"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
raw_inputs = [
    "J'ai attendu un cours d'HuggingFace toute ma vie.",
    "Je déteste tellement ça !",
]
inputs = tokenizer(raw_inputs, padding=True, truncation=True, return_tensors="tf")
print(inputs)
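As a rough illustration of what `padding=True` does, here is a pure-Python sketch (the token IDs below are made up, not produced by this tokenizer):

```python
# Padding sketch: shorter sequences are extended with a pad ID so every row
# has the same length, and attention_mask marks real tokens (1) vs. padding (0).
# The token IDs here are invented for illustration only.
seqs = [[101, 2023, 2003, 102], [101, 2307, 102]]
pad_id = 0
max_len = max(len(s) for s in seqs)
input_ids = [s + [pad_id] * (max_len - len(s)) for s in seqs]
attention_mask = [[1] * len(s) + [0] * (max_len - len(s)) for s in seqs]
print(input_ids)       # all rows now have equal length
print(attention_mask)  # trailing 0s mark the padded positions
```

The real tokenizer returns the same two fields, `input_ids` and `attention_mask`, as TensorFlow tensors when `return_tensors="tf"` is passed.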
from transformers import TFAutoModel

checkpoint = "tblard/tf-allocine"
model = TFAutoModel.from_pretrained(checkpoint)
outputs = model(**inputs) print(outputs.last_hidden_state.shape)
from transformers import TFAutoModelForSequenceClassification

checkpoint = "tblard/tf-allocine"
model = TFAutoModelForSequenceClassification.from_pretrained(checkpoint)
outputs = model(**inputs)
print(outputs.logits.shape)
print(outputs.logits)
import tensorflow as tf

predictions = tf.math.softmax(outputs.logits, axis=-1)
print(predictions)
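To see what `tf.math.softmax` computes, here is the same formula applied by hand to one row of example logits (the values are illustrative, not the model's actual output):

```python
import math

# Manual softmax over a single row of made-up logits.
# tf.math.softmax applies this row by row along the chosen axis.
logits = [-1.5, 1.5]
shifted = [x - max(logits) for x in logits]  # subtract the max for numerical stability
exps = [math.exp(x) for x in shifted]
probs = [e / sum(exps) for e in exps]
print(probs)  # probabilities sum to 1; the larger logit gets the larger probability
```

Subtracting the maximum before exponentiating does not change the result but avoids overflow for large logits.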
model.config.id2label
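The `id2label` mapping lets you turn each row's highest-probability index into a readable label. A hypothetical sketch (this `id2label` dict and these probabilities are assumed for illustration; the real mapping comes from `model.config.id2label`):

```python
# Hypothetical example: pick each row's argmax and look up its label.
# Both id2label and the probabilities below are invented for illustration.
id2label = {0: "NEGATIVE", 1: "POSITIVE"}
predictions = [[0.04, 0.96], [0.99, 0.01]]  # made-up softmax outputs, one row per sentence
labels = [id2label[max(range(len(row)), key=row.__getitem__)] for row in predictions]
print(labels)
```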