Path: blob/master/GenAI Transformers Basics/4 Hugging Face lib functions.ipynb
Kernel: Python 3 (ipykernel)
The Hugging Face transformers library provides a variety of pretrained models and pipelines that cover many natural language processing (NLP) tasks beyond sentiment analysis. The cells below demonstrate several of these pipelines, each applied to a different task.
pip install wordcloud --trusted-host pypi.org --trusted-host files.pythonhosted.org transformers==4.9.2 torch==1.9.0
In [2]:
Out[2]:
[{'label': 'NEGATIVE', 'score': 0.9996954202651978}]
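The input for this cell is not preserved in the export. A minimal sketch of a sentiment-analysis pipeline call that produces output of this shape (the example sentence is an assumption, not the original input):

from transformers import pipeline

# Default sentiment-analysis pipeline (a DistilBERT model fine-tuned on SST-2)
sentiment_classifier = pipeline("sentiment-analysis")

# The original sentence is not shown; any clearly negative sentence
# yields a list of {'label', 'score'} dicts like the output above
result = sentiment_classifier("I really did not enjoy this movie at all.")
print(result)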
In [3]:
Out[3]:
C:\Users\suyashi144893\Anaconda3\lib\site-packages\transformers\pipelines\token_classification.py:154: UserWarning: `grouped_entities` is deprecated and will be removed in version v5.0.0, defaulted to `aggregation_strategy="AggregationStrategy.SIMPLE"` instead.
warnings.warn(
[{'entity_group': 'PER', 'score': 0.9957955, 'word': 'Ashi', 'start': 0, 'end': 4}, {'entity_group': 'LOC', 'score': 0.99833214, 'word': 'Delhi', 'start': 16, 'end': 21}]
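The input cell is likewise not shown. A sketch of a named-entity-recognition call consistent with the deprecation warning and the grouped output above (the sentence is an assumption, chosen to contain a person and a location):

from transformers import pipeline

# NER pipeline; grouped_entities=True merges word pieces into whole entities,
# which triggers the deprecation warning shown above (aggregation_strategy replaces it)
ner_pipeline = pipeline("ner", grouped_entities=True)

# Illustrative sentence; it yields PER and LOC entity groups like the output above
print(ner_pipeline("Ashi lives in Delhi"))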
In [4]:
Out[4]:
{'score': 0.975246012210846, 'start': 40, 'end': 53, 'answer': 'New York City'}
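A sketch of a question-answering pipeline call that returns an answer-span dict like the one above; the question and context here are assumptions, not the original cell's input:

from transformers import pipeline

# Extractive question answering over a given context
qa_pipeline = pipeline("question-answering")

# Both strings are illustrative; the pipeline returns the span of the context
# that best answers the question, with character offsets and a confidence score
result = qa_pipeline(
    question="Where did she move to?",
    context="After finishing her degree in Mumbai, she moved to New York City for work.",
)
print(result)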
In [5]:
Out[5]:
C:\Users\suyashi144893\Anaconda3\lib\site-packages\torch\_tensor.py:575: UserWarning: floor_divide is deprecated, and will be removed in a future version of pytorch. It currently rounds toward 0 (like the 'trunc' function NOT 'floor'). This results in incorrect rounding for negative values.
To keep the current behavior, use torch.div(a, b, rounding_mode='trunc'), or for actual floor division, use torch.div(a, b, rounding_mode='floor'). (Triggered internally at ..\aten\src\ATen\native\BinaryOps.cpp:467.)
return torch.floor_divide(self, other)
A good human embodies kindness, empathy, and integrity. They act selflessly, helping others and showing compassion. Honesty and respect guide their interactions, fostering trust and positive relationships.
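The output above reads like a condensed version of a longer passage, which suggests this cell used the summarization pipeline; a hedged sketch under that assumption (the input passage, length limits, and model are also assumptions):

from transformers import pipeline

# Summarization pipeline; the exact model used in the original cell is not shown
summarizer = pipeline("summarization")

# Placeholder passage on the same theme as the output above
long_text = (
    "A good human being is often described through qualities such as kindness, "
    "empathy, honesty, and integrity. Such a person acts selflessly, helps others "
    "without expecting anything in return, shows compassion in difficult situations, "
    "and treats everyone with respect, which builds trust and lasting positive relationships."
)

summary = summarizer(long_text, max_length=60, min_length=20, do_sample=False)
print(summary[0]["summary_text"])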
In [7]:
Out[7]:
Enter initial text or topic:Language
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
Language by the Book Society of America.
It's so nice to be able to be so transparent about how we are going to use our power.
I've been very lucky the time is right now. I have lots of work to
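For this cell, the interactive prompt and the pad_token_id message point to a GPT-2 text-generation pipeline reading its prompt from input(); a sketch along those lines (max_length and num_return_sequences are assumptions):

from transformers import pipeline

# Text generation with the default GPT-2 model
# (eos_token_id 50256 in the log above is the GPT-2 end-of-text token)
text_generator = pipeline("text-generation", model="gpt2")

# Matches the "Enter initial text or topic:" prompt shown above
prompt = input("Enter initial text or topic:")

outputs = text_generator(prompt, max_length=50, num_return_sequences=1)
print(outputs[0]["generated_text"])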