Date: Mon May 07 2001 - 15:22:26 PDT
Jeff Bone wrote:
> Well, while the argument has been somewhat byzantine, I think we're in perfect agreement. :-) You
> just restated my point pretty well...
I'm not sure caches have any point at all where you have many spatially
distributed agents (fancy that, since we can't make purely bosonic computers)
simultaneously mangling data which must remain consistent. The traffic and
delays created by the protocols necessary to ensure consistency actually get
worse if you assume the existence of a cache.
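To make the point concrete, here's a toy sketch (all names hypothetical, not any real protocol): two agents each keep a private cache over a shared store, and with no invalidation traffic at all, one of them serves stale data the moment the other writes.

```python
# Toy sketch: private caches over a shared store, with NO coherence
# protocol. Agent B serves a stale value after agent A has written.

class Agent:
    def __init__(self, store):
        self.store = store
        self.cache = {}

    def read(self, key):
        # Serve from the local cache if present: staleness creeps in here.
        if key not in self.cache:
            self.cache[key] = self.store[key]
        return self.cache[key]

    def write(self, key, value):
        # Write-through to the store, but nobody tells the other caches.
        self.cache[key] = value
        self.store[key] = value

store = {"x": 1}
a, b = Agent(store), Agent(store)
b.read("x")          # b caches x = 1
a.write("x", 2)      # a updates the store
stale = b.read("x")  # b still sees 1: inconsistent
```

The fix is exactly the invalidation/update traffic complained about above, which is the cost the cache smuggles in.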
Whenever there's just one agent doing the mangling, the situation is different.
Cache hierarchies in monoprocessor systems do make sense, assuming (now here's a
rather questionable assumption) that solid-state memories need to be organized
with different latencies. (This assumption is going to be nuked by embedded
memory systems, and eventually by cellular automata flavoured systems, which
make no distinction between memory and CPU, between code and data, where everything
is just bits, and resides in maximal proximity to the ALU, being part of it.)
Anyway, the assumption of absolute consistency is another human artifact.
For good reasons biology never relies on architectures which have that at their
core. Systems fail; if you need to operate efficiently, you're hugging system
noise, and the result is occasional failures. Your system does not need to behave
in a boolean way (either it works, or it breaks); it needs to display a continuum
between these two behaviours. This is also known as graceful degradation.
Current human handiwork wouldn't last an hour in a rainy night in the sticks.
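A minimal sketch of that continuum (names and fallback order are my own invention, not anybody's real API): instead of "works or breaks", each fallback step returns a poorer but still usable answer, tagged with its quality.

```python
# Hedged sketch of graceful degradation: a lookup that degrades through
# fresh -> stale -> guess instead of failing outright.

def lookup(key, fresh, cached, default=None):
    """Return (value, quality), where quality degrades instead of failing."""
    try:
        return fresh(key), "fresh"      # best case: live answer
    except Exception:
        pass                            # backend noise: don't break, degrade
    if key in cached:
        return cached[key], "stale"     # possibly outdated, still usable
    return default, "guess"             # worst case: a labelled guess

def flaky(key):
    # Stand-in for a backend that is currently down.
    raise TimeoutError("backend down")

value, quality = lookup("x", flaky, {"x": 41})
# value == 41, quality == "stale": degraded, not broken
```

The point is the label, not the mechanism: callers can decide how much staleness they tolerate, rather than seeing a binary success/failure.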
Anyway, the distillation of this crappy essence is the first rule of caching: don't.
Second: if you have to be consistent, always, or else, you're almost fux0red already.
Do we still agree? ;)
This archive was generated by hypermail 2b29 : Mon May 07 2001 - 13:33:29 PDT