
I have a collection of news articles, and I want to produce new (unbiased) news articles using meta-llama/Meta-Llama-3-8B-Instruct. The articles are in a Hugging Face Dataset, and to feed the transformers library pipeline I am using a KeyDataset like this:

key_dataset = KeyDataset(content, "prompt")

where each prompt is "instructions for the LLM + article_content".

I want to produce new articles in batches to utilize better the GPU like this:

outputs = list(tqdm(pipeline(key_dataset,
                             batch_size=4,
                             max_new_tokens=2*2024,
                             eos_token_id=terminators,
                             do_sample=True,
                             temperature=1,
                             top_p=0.9),
                    total=len(key_dataset)))

The problem is that sometimes no text is generated at all for some articles. Why is that? Doesn't batching work for inputs of different lengths?

asked Jan 3, 2025 at 16:45
