
Using Hugging Face Models

Install the Hugging Face Transformers library and use thousands of AI models.

Intermediate · 25 min

Setup Steps

1. Install the Transformers library:

```bash
pip install transformers torch accelerate
```

2. Text generation example:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='gpt2')
result = generator('Artificial intelligence', max_length=100)
print(result[0]['generated_text'])
```
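Generation can be tuned beyond `max_length`. A minimal sketch, using the same `gpt2` pipeline with sampling parameters (`max_new_tokens`, `do_sample`, `temperature`, and `num_return_sequences` are standard `generate()` options passed through the pipeline):

```python
from transformers import pipeline

generator = pipeline('text-generation', model='gpt2')
results = generator(
    'Artificial intelligence',
    max_new_tokens=20,        # cap on newly generated tokens (excludes the prompt)
    do_sample=True,           # sample instead of greedy decoding
    temperature=0.7,          # lower values make output more deterministic
    num_return_sequences=2,   # return two candidate continuations
)
for r in results:
    print(r['generated_text'])
```

`max_new_tokens` is usually preferable to `max_length`, since it counts only generated tokens rather than prompt plus output.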

3. Sentiment analysis:

```python
from transformers import pipeline

classifier = pipeline('sentiment-analysis')
result = classifier('This movie is amazing!')
print(result)
```
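Without an explicit model, the pipeline falls back to a default checkpoint and prints a warning. A sketch pinning the checkpoint that current Transformers versions use as the English sentiment default, which keeps results reproducible:

```python
from transformers import pipeline

# Pinning a specific checkpoint avoids the "no model was supplied" warning
# and keeps behavior stable across library upgrades.
classifier = pipeline(
    'sentiment-analysis',
    model='distilbert-base-uncased-finetuned-sst-2-english',
)
result = classifier('This movie is amazing!')
print(result)  # list of dicts with 'label' and 'score' keys
```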

4. Login to Hugging Face Hub:

```bash
pip install huggingface_hub
huggingface-cli login
```

5. Download a custom model (gated models such as Llama require accepting the license on the Hub and being logged in, see step 4):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B")
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.1-8B")
```
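Once the model and tokenizer are loaded, generation follows the same tokenize → generate → decode pattern. A sketch using the openly available `gpt2` checkpoint as a stand-in (the Llama checkpoint above is gated; swap the name in once your account has access):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# 'gpt2' stands in for the gated 'meta-llama/Llama-3.1-8B' checkpoint.
tokenizer = AutoTokenizer.from_pretrained('gpt2')
model = AutoModelForCausalLM.from_pretrained('gpt2')

# Tokenize the prompt into PyTorch tensors.
inputs = tokenizer('Artificial intelligence', return_tensors='pt')

# Generate up to 20 new tokens; GPT-2 has no pad token, so reuse EOS.
outputs = model.generate(
    **inputs,
    max_new_tokens=20,
    pad_token_id=tokenizer.eos_token_id,
)

# Decode the full sequence (prompt + continuation) back to text.
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```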

6. Share a model:

```python
# Requires being logged in (step 4); creates the repo on the Hub if it
# doesn't exist yet.
model.push_to_hub("username/model-name")
tokenizer.push_to_hub("username/model-name")
```