
Re: respectful slightly dumb question about 64 bit computing.....

On Sun, Oct 29, 2006 at 01:22:49PM +0000, Michael Fothergill wrote:
> Dear Debian folks,
> I have been reading about the benefits of 64 bit computing on the web.  In 
> the old days I used to run some molecular dynamics calculations on a DEC 
> Alpha with a 64 bit chip in it and the developer there did get a definite 
> boost from it.
> I have been reading the discussion on wikipedia about 64 bit computing.  
> There is a little section in there that interested me and I wondered if I 
> could ask a question about it.
> Here is the section:
> "The emergence of the 64-bit architecture effectively increases the memory 
> ceiling to 2^64 addresses, equivalent to 17,179,869,184 gigabytes or 16 
> exabytes of RAM. To put this in perspective, in the days when a mere 4 kB 
> of main memory was commonplace, the maximum memory ceiling of 2^32 addresses 
> was about 1 million times larger than typical memory configurations. Taking 
> today's standard as 4 GB of main memory (actually, few personal computers 
> have this much), then the difference between today's standard and the 2^64 
> limit is a factor of about 4 billion. Most 64-bit consumer PCs on the 
> market today have an artificial limit on the amount of memory they can 
> recognize, because physical constraints make it highly unlikely that one 
> will need support for the full 16 exabyte capacity. Apple's Mac Pro, for 
> example, can be physically configured with up to 16 gigabytes of memory, 
> and as such there is no need for support beyond that amount. A recent Linux 
> kernel (version 2.6.16) can be compiled with support for up to 64 gigabytes 
> of memory."
> OK, here's the dumb question:
> Let's suppose that money was no object and we managed in some technical 
> feat to construct a computer that could have a 64 bit chip in it that would 
> be properly hooked up to 16 exabytes of RAM.
> If I had such a computer in my possession and I offered to donate to the 
> Debian community how would it respond?
> Would it say
> 1.  What a waste of money you idiot.  Why are you even posting this 
> question?  It just shows how stupid you are. This computer would have 
> memory capacity that no one could use for any purpose we can think of so 
> you should have donated the money building it to charity or to the Debian 
> community for work on developing its OS for  conventional computing 
> resources instead. Don't post anything else on this site for at least 
> another six months or until you have had a brain scan.
> OR
> 2. We would be delighted to receive the donated computer.  We think that we 
> could configure our Debian OS to run on it and yes, there would be serious 
> computing problems it could address.
> What sorts of problems would they be?  I suppose it could be one that would 
> require e.g. a huge database.
> The other question I have is:  how much performance increase in database 
> applications is typically seen using 64 bit computing?
> Regards,
> Michael Fothergill
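
As an aside, the arithmetic in that quoted passage checks out; a few lines
of Python (my own quick sketch, not from the article) reproduce its numbers:

```python
# Sanity-check the 64-bit address-space figures from the quoted passage.
addresses = 2 ** 64            # a byte-addressable 64-bit address space

gigabytes = addresses // 2 ** 30   # bytes -> gigabytes (2^30 bytes each)
exabytes = addresses // 2 ** 60    # bytes -> exabytes (2^60 bytes each)

print(gigabytes)               # 17179869184, as quoted
print(exabytes)                # 16, as quoted

# Ratio of the 64-bit ceiling to a 4 GB machine: 2^32, "about 4 billion".
ratio = addresses // (4 * 2 ** 30)
print(ratio)                   # 4294967296
```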

Many years ago, when I was a graduate student at Princeton, my thesis
advisor invited me to go along on a trip to IBM to look at a computer
that IBM and the Air Force were offering as a gift to Princeton. It
was a very large room full of specially oversized relay racks filled
with electronic gear. It was left over from a very secret project
whose name I have forgotten. It turned out that Princeton could not
afford to pay the electric bill for running it without a rather
generous gift from someone (the USAF?). Princeton must have said, no
thanks, because the 'gift' never happened.

Similarly, Debian probably could not afford to house such a monster as
you describe, nor find volunteers whose time would not be better spent
on some other project than on figuring out how to use it.

Such a thing is useful to think about for very large databases. It is
good for performance to have the whole database in RAM all the time.
But, for this use, you only need enough RAM to handle your database
and scratch space for transient stuff. And, you probably need a spare
copy of your monster in some other place as a backup for continuation
of business in the unlikely case that a bolide hits the first. And,
the second copy needs to be kept in sync with the master, etc.  (You
can't really bring up such a monster from tape backup in a few days.)
So, the possibility of very large RAM allows you to reorder your
check-list of technical problems to be solved on a major database
project. It might reduce the cost, but it doesn't really shorten the
list. And, then again, it might not reduce the cost. 

It will definitely require major revision of the PowerPoint
presentation ;-)

Paul E Condon           
