On Tue, Apr 27, 2004 at 08:25:06PM +0200, Thiemo Seufer wrote:
> Andrew Suffield wrote:
> > On Tue, Apr 27, 2004 at 02:47:42AM +0200, Thiemo Seufer wrote:
> > > Don Armstrong wrote:
> > > [snip]
> > > > > The human brain is way too limited for that task. Just explain me
> > > > > how your brain interprets a bitstream which represents a JPEG of
> > > > > non-trivial size. :-)
> > > >
> > > > Not particularly well, but I'd imagine that there are people out there
> > > > who wouldn't have that much trouble with it. Moreover, while
> > > > difficult, it's not something that's inherently impossible.
> > >
> > > It is, simply because the human brain can't handle more than 7+-2 items
> > > at the same time ("The magical number seven, plus or minus two", may
> > > even show up some google results for it). So a human is rather limited
> > > WRT algorithmic complexity.
> >
> > That's bogus. A digital logic circuit can't handle more than two items
> > at the same time, but has little problem with algorithmic complexity.
>
> If it has to remember more than 2 items for that algorithm, it has
> the same problem.

No. Really. It just has to iterate. This is how computers work: the
output from one iteration becomes the input to the next. There is no
mystical storage for all the working data from earlier iterations; it
is discarded. This is the most basic of Turing machines, from which
all known algorithms can be constructed.

> > You only need to visualise one image at a time to interpret JPEG
> > anyway.
>
> For a useful result you need some sort of "external memory".

Nope. Precisely one image, which changes as you read more of the input
stream. Nothing else is needed. It's rather slow, but quite doable.

-- 
.''`.  ** Debian GNU/Linux **  | Andrew Suffield
: :' : http://www.debian.org/  |
`. `'                          |
  `-   -><-                    |
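[The iteration argument above can be sketched in code. This is a hypothetical illustration, not from the thread: a toy Turing machine that increments a binary number. Each step reads only the current state and the symbol under the head; the step's output (updated tape, state, head) is the input to the next step, with no other working storage.]

```python
# Minimal Turing-machine sketch (illustrative, assumed example):
# binary increment, head starting at the least significant bit.
# The only state carried between iterations is (tape, state, head);
# each step's output is the next step's input.

def run_turing_machine(tape, state, head, transitions):
    """Iterate until the machine halts; each step depends only on the
    current state and the symbol under the head."""
    tape = dict(enumerate(tape))
    while state != "halt":
        symbol = tape.get(head, "_")           # "_" is the blank symbol
        write, move, state = transitions[(state, symbol)]
        tape[head] = write
        head += {"L": -1, "R": 1}[move]
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, "_") for i in range(lo, hi + 1))

# Transition table for binary increment.
INCREMENT = {
    ("inc", "1"): ("0", "L", "inc"),   # carry: flip 1 -> 0, keep moving left
    ("inc", "0"): ("1", "L", "halt"),  # absorb the carry and stop
    ("inc", "_"): ("1", "L", "halt"),  # ran off the left edge: new digit
}

print(run_turing_machine("1011", "inc", 3, INCREMENT))  # -> 1100
```

Decoding a JPEG by hand is the same kind of process, only with a much larger transition table: one partial image, updated as each bit of the stream is consumed.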