I have a collection of news articles and I want to produce new (unbiased) versions of them using meta-llama/Meta-Llama-3-8B-Instruct. The articles are in a Hugging Face Dataset, and to feed them to the transformers pipeline I am using a KeyDataset like this:
key_dataset = KeyDataset(content, "prompt")
where prompt = "orders for LLM + article_content"
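For concreteness, the prompt column described above can be built by combining an instruction string with each article. A minimal sketch using plain dicts (the instruction text and field names here are assumptions, not taken from my actual code):

```python
# Hypothetical instruction text prepended to every article.
instructions = "Rewrite the following article in a neutral, unbiased tone:\n\n"

# Stand-in for the real Hugging Face Dataset rows.
articles = [{"article_content": "Some article text."}]

# Build the "prompt" column: orders for the LLM + article content.
content = [{**row, "prompt": instructions + row["article_content"]} for row in articles]
print(content[0]["prompt"])
```

With a real `datasets.Dataset`, the same transformation would typically be done with `.map()`.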
I want to generate the new articles in batches to make better use of the GPU, like this:
outputs = list(tqdm(
    pipeline(
        key_dataset,
        batch_size=4,
        max_new_tokens=2 * 2024,
        eos_token_id=terminators,
        do_sample=True,
        temperature=1,
        top_p=0.9,
    ),
    total=len(key_dataset),
))
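For context on what the pipeline sees: `KeyDataset` just wraps a dataset and returns a single column per example, so the pipeline receives a plain stream of prompt strings and tokenizes and pads each batch internally. A minimal pure-Python mimic of that behavior (a sketch, not the actual transformers implementation):

```python
class KeyDatasetMimic:
    """Mimics transformers.pipelines.pt_utils.KeyDataset:
    indexing returns dataset[i][key] rather than the whole row."""

    def __init__(self, dataset, key):
        self.dataset = dataset
        self.key = key

    def __len__(self):
        return len(self.dataset)

    def __getitem__(self, i):
        return self.dataset[i][self.key]


# Stand-in rows with a "prompt" column, as in the question.
rows = [
    {"prompt": "p1", "article_content": "a1"},
    {"prompt": "p2", "article_content": "a2"},
]
key_dataset = KeyDatasetMimic(rows, "prompt")
print([key_dataset[i] for i in range(len(key_dataset))])  # only the prompt column
```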
The problem is that sometimes no text is generated at all for some articles. Why is that? Doesn't batching work with inputs of different lengths?
asked Jan 3, 2025 at 16:45
Xhulio Xhelilai