At the same time, with our current binary computing (newtonian, as opposed to the still-in-its-infancy quantum computing) we quickly run into limits of space (to store data) and of speed (to communicate that information).
Hmm, digital computing is "quantum" too, in the sense that its values are quantized into discrete levels. Binary computing would just be the simplest form of that.
Analog computers would be "newtonian" unless their wave signals are interpreted using quantum physics laws instead of classical ones.
[Image: Polish analog computer AKAT-1 (1959)]
What I mean is that "newtonian" doesn't fit well with discrete maths (whether binary, ternary, quaternary, ...). So "newtonian" as a description of a binary computer doesn't make much sense to me.
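Just to illustrate what I mean by discrete representations (this adds nothing to the argument itself), here is a small Python sketch that writes the same value in the binary, ternary, and quaternary bases mentioned above; the to_base helper is only a name I made up for this example.

```python
def to_base(n: int, base: int) -> str:
    """Return the digit string of a non-negative integer n in the given base."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % base))  # least-significant digit first
        n //= base
    return "".join(reversed(digits))

value = 42
for base in (2, 3, 4):
    print(f"{value} in base {base}: {to_base(value, base)}")
# 42 in base 2: 101010
# 42 in base 3: 1120
# 42 in base 4: 222
```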
By contrast, an analog computer produces real-time results that can be read using either classical or quantum physics laws.
For those reasons, attempts are being made to develop both analog and digital quantum computers.