Machine Learning - Transformers

Overview

Transformers use self-attention to model dependencies without recurrence or convolution, enabling parallel training and state-of-the-art results across NLP, vision, and audio.
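To make the self-attention idea concrete, here is a minimal NumPy sketch of single-head scaled dot-product attention; the shapes, weight matrices, and function name are illustrative assumptions, not a real library API.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Project the input sequence to queries, keys, and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Scaled dot-product scores: (seq_len, seq_len)
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted mix of all value vectors
    return weights @ V

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.standard_normal((seq_len, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one d_k-dimensional vector per position
```

Because every position attends to every other position in one matrix multiply, the whole sequence is processed in parallel, which is the property that removes the need for recurrence.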

Quick demo (Hugging Face Transformers)

# pip install transformers torch --upgrade
from transformers import pipeline

# Downloads and caches a default sentiment model on first run
clf = pipeline("sentiment-analysis")
# Returns a list of dicts with 'label' and 'score' keys
print(clf("I love practical ML!"))  # [{'label': 'POSITIVE', 'score': ...}]