From an AI assistant's answer to my question about why the order of retrieved documents (chunks) in RAG matters, I learned that Transformers like GPT have an inherent positional bias, meaning that earlier tokens in an input sequence have more influence on the model's output. Is this mentioned in the book? I have read through the book a few times and haven't found it.
Replies: 1 comment
It's an interesting phenomenon. Historically, the bias has been toward the beginning and end of a document. That's likely because those are also the places where humans put important information in documents, and the model may pick this up from the training data. I discussed it back then on my blog here: https://magazine.sebastianraschka.com/p/ai-research-highlights-in-3-sentences-738
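A minimal sketch (not from the book) of one practical consequence of this for RAG: since the beginning and end of the context tend to get the most attention, you can reorder retrieved chunks so the highest-scoring ones sit at those edges. The function name, chunk texts, and scores below are made up for illustration.

```python
def reorder_for_positional_bias(chunks_with_scores):
    """Interleave chunks so the most relevant ones land at the edges.

    chunks_with_scores: list of (chunk_text, relevance_score) tuples.
    Returns chunk texts ordered from the outside in:
    rank 1 first, rank 2 last, rank 3 second, rank 4 second-to-last, ...
    """
    ranked = sorted(chunks_with_scores, key=lambda x: x[1], reverse=True)
    front, back = [], []
    for i, (text, _) in enumerate(ranked):
        if i % 2 == 0:
            front.append(text)       # ranks 1, 3, 5, ... fill the front
        else:
            back.insert(0, text)     # ranks 2, 4, 6, ... fill the back
    return front + back


# Hypothetical retrieval results (chunk text, similarity score)
retrieved = [
    ("chunk A", 0.91),
    ("chunk B", 0.55),
    ("chunk C", 0.78),
    ("chunk D", 0.33),
]

context = "\n\n".join(reorder_for_positional_bias(retrieved))
print(context)
# Order: chunk A, chunk B, chunk D, chunk C
# -> the two strongest chunks (A and C) sit at the start and end of the context.
```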