Positional bias is not mentioned? #868

Unanswered
Jessen-Li asked this question in Q&A

From an AI's answer to my question about why the order of retrieved documents (chunks) matters in RAG, I learned that Transformers like GPT have an inherent positional bias, meaning that earlier tokens in the input sequence have more influence on the model's output. Is this mentioned in the book? I've read through the book a few times and haven't found it.


Replies: 1 comment


It's an interesting phenomenon. Historically, the bias has been toward the beginning and end of a document. That's likely because it's also where humans tend to put important information in documents, and the model may pick this up from the training data. I discussed it back then in my blog here: https://magazine.sebastianraschka.com/p/ai-research-highlights-in-3-sentences-738
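
As a side note (this is not from the book or the linked post), one common practical workaround in RAG is to reorder the retrieved chunks so that the highest-scoring ones end up at the beginning and end of the context, where the model tends to attend most. Below is a minimal Python sketch of that idea; the function name and the example scores are made up for illustration.

```python
# Minimal sketch of a "lost in the middle" mitigation for RAG:
# interleave retrieved chunks so the most relevant ones land at the
# edges of the context, where positional bias tends to be strongest.

def reorder_for_position_bias(chunks_with_scores):
    """Place the most relevant chunks at the edges of the context.

    chunks_with_scores: list of (chunk_text, relevance_score) tuples.
    Returns chunk texts ordered so rank 1 is first, rank 2 is last,
    rank 3 is second, rank 4 is second to last, and so on.
    """
    ranked = sorted(chunks_with_scores, key=lambda pair: pair[1], reverse=True)
    front, back = [], []
    for i, (chunk, _score) in enumerate(ranked):
        if i % 2 == 0:
            front.append(chunk)   # ranks 1, 3, 5, ... go toward the front
        else:
            back.append(chunk)    # ranks 2, 4, 6, ... go toward the back
    return front + back[::-1]     # reverse the back half so rank 2 ends up last


if __name__ == "__main__":
    retrieved = [
        ("chunk about topic A", 0.91),
        ("chunk about topic B", 0.85),
        ("chunk about topic C", 0.60),
        ("chunk about topic D", 0.42),
    ]
    # Prints A, C, D, B: the two strongest chunks sit at the start and end.
    for text in reorder_for_position_bias(retrieved):
        print(text)
```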


Category: Q&A · Labels: none yet · 2 participants
