Andrew Sackville-West wrote:
bah! I spit on your shiny new commodore. *my* commodore was a vic-20 with 3.5k ram. Now that taught you how to control resources. buncha new-fangled whipper-snappers.

seriously though, we used to max out that little machine. like this little trick: a '?' was shorthand for 'print' *and* it used less memory on the input line. but when you 'list'ed a line that contained it, the interpreter would expand the '?' into a 'print'. If you were really packing the code in, that expansion would make your line longer than the 80-char max, so you'd have to go back in and convert all your 'print's back to '?'s. yeah. those were the days...

A
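For anyone who never saw Commodore BASIC: '?' really was accepted as shorthand for PRINT. A sketch from memory (not re-checked on real hardware) of how the round trip bit you:

    10 ?"HELLO":?"WORLD":?"PACKED IN TIGHT"

    LIST
    10 PRINT"HELLO":PRINT"WORLD":PRINT"PACKED IN TIGHT"

Each '?' comes back as five extra characters on the listed line, so a line you packed right up to the editor's limit could no longer be edited and re-entered as listed without losing its tail.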
In fact, I've never figured out how they manage to consume all that memory. They must be working hard at it. :D
I cannot get away from the feeling that it is a great waste of resources, not computing resources, but the resources needed to produce new memory, processors, hard disk drives, etc. You spend not only material but also energy, and of course human work. On the other hand, you save programmers' time.
Is it really worth it, or is the computer industry just looking for a way to keep its business going? Probably a good question, but you don't have to answer, since it's a little off topic here. :)