
According to this Quora post,

It refers to two things:

  • A systems bottleneck, in that the bandwidth between the Central Processing Unit and Random-Access Memory is much lower than the speed at which a typical CPU can process data internally.
  • The 'intellectual bottleneck', in that programmers at the time spent a lot of time thinking about code optimisation to stop 'lots of words' being pushed back and forth between CPU and RAM.

But according to my class lecture, it is the problem that "Generic implementation of single, common instruction and data hierarchy makes it impossible to access data and instructions at the same time. This is traditionally called von Neumann bottleneck."

Which one is correct?

asked Mar 18, 2019 at 2:30
  • The not-quite-a-joke answer: The von Neumann bottleneck, also known as 80/20 rule, is that 80% of people are not 20% as intelligent as von Neumann :-( Commented Sep 7, 2024 at 12:26

2 Answers


Both are correct, though I'd say the first one more so.

A von Neumann architecture means the program is stored in memory along with everything else, rather than held in a separate store attached to the processor. This means that anything the computer tries to do, no matter what it might be, is bottlenecked by the connection between the processor and memory.

So "getting data from memory is slower than processing it" is a valid description, as is "you need to fetch instructions and you need to fetch data and both of those are slow". But the most important part is that fetching from memory is extremely slow compared to anything else the processor might be doing.

answered Mar 18, 2019 at 3:48

As languages evolved, more and more of the generated code had to follow indirections (pointer dereferencing), which meant that each pointer itself had to be fetched first, and then the address it held referenced another piece of data that also had to be fetched. Quite often, that value, too, would turn out to be a pointer.
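As a small illustration (my own sketch; the struct names are hypothetical, not from the answer), every extra level of indirection is another dependent memory fetch that must complete before the next address is even known:

```c
/* Sketch of an indirection chain: three dependent fetches before the
 * actual data (the balance) is available to the processor. */
#include <stdio.h>

struct account  { double balance; };
struct customer { struct account *account; };
struct order    { struct customer *customer; };

double order_balance(const struct order *o) {
    /* o -> customer -> account -> balance: each pointer must be loaded
     * from memory before the next one can even be requested. */
    return o->customer->account->balance;
}

int main(void) {
    struct account  a = { 42.0 };
    struct customer c = { &a };
    struct order    o = { &c };
    printf("%.1f\n", order_balance(&o));
    return 0;
}
```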

Changes to these data structures required maintaining those pointers, which itself was a tremendous load both on the programmer (using an imperative language) and on the machine.

At the time John Backus gave his Turing Award lecture on this topic, the languages of the day largely forced the developer to manage all of this entropy by hand. He envisioned functional programming languages that would avoid this complexity.

In general, dealing with global, mutable state is a high-entropy experience. That is the von Neumann model. That is the assembly model. That is the C model. That is even still the Java and C# model.
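A tiny C sketch of the contrast (my own example, not Backus's; the names are made up): the imperative version threads its result through a mutable variable that lives outside the function, while the expression-oriented version returns a value and touches nothing else:

```c
/* Sketch: word-at-a-time updates to shared state vs. a pure expression.
 * Both compute the same sum; the difference is what else can observe
 * or disturb the computation while it is in progress. */
#include <stdio.h>

#define N 5

static double total = 0.0;                 /* global, mutable state */

static void add_all(const double *xs, int n) {
    for (int i = 0; i < n; i++)
        total += xs[i];                    /* every step mutates shared state */
}

static double sum(const double *xs, int n) {
    /* Pure: the result depends only on the inputs; nothing outside
     * this call is read or written. */
    return n == 0 ? 0.0 : xs[0] + sum(xs + 1, n - 1);
}

int main(void) {
    double xs[N] = {1, 2, 3, 4, 5};
    add_all(xs, N);
    printf("imperative: %.1f  expression-oriented: %.1f\n", total, sum(xs, N));
    return 0;
}
```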

I'm not sure that John Backus was terribly concerned about the machine itself having to do so much work; I think that he was more worried about the cognitive load.

See: https://people.eecs.berkeley.edu/~necula/Papers/BackusFunctional.pdf

answered Mar 18, 2019 at 14:45
