
Re: Building computer



Hi Catherine,

I haven't caught up with the rest of the thread but just wanted to
address a couple points here.

On 9/26/2013 11:12 AM, Catherine Gramze wrote:
> 
> On Sep 26, 2013, at 1:05 AM, Stan Hoeppner <stan@hardwarefreak.com> wrote:
> 
>>
>> What desktop applications are you using that require 8GB, let alone
>> 16GB, of RAM?  I'd think 4 would be plenty.  If you wish to over buy
>> DRAM, that's a personal choice.  It will likely not improve performance
>> in any meaningful way, for WOW in Wine, or anything else.
>>
> 
> I will be running more than one app at a time. For example WoW, a browser, a Ventrilo client, and a chat client at minimum.

4GB is more than plenty, unless WOW has turned into a complete and total
memory hog.  Obviously it eats more running through Wine emulation.  But
Wine and WOW combined shouldn't eat more than 2GB, so you have 2GB left
for the rest, which is plenty.
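
If you'd rather measure than guess, here's a quick Python sketch that
reads /proc/meminfo (present on any Linux system) and reports how much
RAM is genuinely in use.  Run it while WOW, the browser, Ventrilo, and
the chat client are all open.  Take it as a rough gauge, not gospel:

    # Rough check of RAM actually in use.  Buffers and page cache
    # don't count against you; the kernel reclaims them on demand.
    def meminfo():
        info = {}
        with open('/proc/meminfo') as f:
            for line in f:
                key, rest = line.split(':', 1)
                info[key] = int(rest.split()[0])   # values are in kB
        return info

    m = meminfo()
    used = (m['MemTotal'] - m['MemFree'] - m['Buffers'] - m['Cached']) / 1024
    print("In use: %d MB of %d MB" % (used, m['MemTotal'] / 1024))

If that number stays under ~3GB with everything running, 4GB of DRAM is
all you need.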

>>> onboard sound, Realtek ALC892
>>
>> I'm no Linux sound expert.  I don't know if stock Wheezy supports the
>> 892.  Maybe others can chime in.
>>
>>> onboard NIC, Realtek 8111E
>>
>> The 8111 is supported, with non-free firmware, IIRC.
> 
> I am no fan of non-free firmware. 

Thinking back, I believe this was a temporary issue with just the one
8110/8111 GbE firmware.  This may have been sorted out already.  Realtek
missed a deadline or something so it had to go into non-free.  All the
Realtek NICs have had free firmware for quite some time IIRC.  This one
was an anomaly.  We helped somebody fix this here a few months ago.
Hell, maybe it was a year...time flies.  Others here may remember this.

> Perhaps I need to look at different motherboards with different Lan and sound capabilities.

80% or so of new retail PCs, and mobos, have Realtek ethernet and
Realtek audio on board.  It's ubiquitous.  Fully supported in upstream
Linux.  The newest chips may or may not be supported by Wheezy.  I'd
guess you won't have any problems here.
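
For what it's worth, you can check which kernel driver is bound to the
NIC on any box you have handy (the 8111E normally gets the r8169
driver).  A small sketch reading sysfs, assuming a stock Linux /sys
layout:

    # Print each network interface and the kernel driver bound to it.
    import os, glob
    for dev in glob.glob('/sys/class/net/*'):
        drv = os.path.join(dev, 'device', 'driver')
        if os.path.islink(drv):   # 'lo' has no device/driver link
            print("%s -> %s" % (os.path.basename(dev),
                                os.path.basename(os.readlink(drv))))

If the output shows r8169 next to your ethernet interface, the kernel
already has you covered.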

>> TTBOMK, WOW doesn't require anywhere close to 2GB of VRAM for textures
>> and frame buffer, even at 1920x1080.  So your choice of video card seems
>> to be serious overkill.  
> 
> You would be wrong. 

I'll give you the opportunity to re-evaluate that conclusion. ;)

> With my current Radeon 6970 I can run only on medium level graphics, with light shafts turned off and ground clutter on low. If I put the settings any higher I get serious lag in raids and in crowded environments. The official minimum specs for games are always put ridiculously low; they are for running the game at the lowest graphic settings with all options turned off. Nobody wants to play a game that looks that bad.

A 6970 built to AMD's reference spec is a powerhouse:

880MHz GPU clock
1536 shaders (stream processors)
2GB GDDR5
256 bit bus
176GB/s bandwidth

This card is massive overkill for WOW.  This was a balls to the wall
first person shooter card when introduced, and it still is.  It should
run WOW on highest settings with aplomb.
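
In case you're wondering where that 176GB/s figure comes from, it's
just the bus width in bytes times the effective GDDR5 data rate.  Quick
sketch, assuming the reference card's stock 5.5GT/s effective memory
speed:

    # Memory bandwidth = bus width in bytes * effective data rate.
    bus_bits = 256
    data_rate = 5.5                      # GT/s, effective GDDR5 rate
    print((bus_bits / 8) * data_rate)    # -> 176.0 GB/s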

For comparison, the card I currently use is a GeForce GT240 with 96
shaders, 550MHz GPU clock, 1GB DDR3, 128bit interface, 22.4GB/s.  It
runs the Unigine Heaven benchmark at highest settings, 1440x900 full
screen, at over 15 FPS average, 22 FPS peak.  The polygon detail of this
benchmark is at least 100 times greater than what you'll see in WOW.
This card is more than plenty for running WOW smoothly in Windows.  Your
6970 should run Heaven at this resolution at 80+ FPS.  Download it and see:

http://unigine.com/products/heaven/

Run the native Linux version and the Windows version in Wine, windowed
and full screen.


The card is not the cause of your problems, unless:

1.  You bought a stripped down model from a 3rd party vendor w/128bit
bus, low clocked GDDR3 VRAM, and/or under clocked GPU core.  But TTBOMK
no such castrated 6970s ever shipped.

2.  It is broken.  Is the fan working properly?  Does it kick up to full
RPM when you launch a 3D app?  If not, you're losing 2/3rds of the
clock, and performance.

3.  It is a passive radiator heatpipe model and in a case with poor
airflow to the card, causing the same low clocking as #2.

Is this problem only with Linux/Wine or also running WOW in Windows?
I'll make an educated guess it's only with Wine.  At what resolution do
you run WOW?

Another thing to consider is whether you're running WOW full screen or
windowed.  Are you running it windowed?

All consumer based GPUs/drivers tend to drop frame rate when running an
OpenGL application windowed vs full screen.  Neither the drivers nor
hardware are optimized very well for what's called "2D overlay".

The reason for this is that there are 2 separate virtual frame buffer
regions in memory, the 2D buffer and the 3D buffer.  This is what allows
you to seamlessly switch between an OpenGL application window and the
desktop, minimize windows, etc, instantaneously.  But when running
windowed, the driver and the GPU hardware must do an additional merge
pass for every real frame in the physical frame buffer, overlaying the
virtual 3D buffer atop the 2D buffer.  This eats a tremendous amount of
CPU and GPU cycles, puts extra load on the ROPs, and eats a huge amount
of on card memory bandwidth.  This has been the case for 15+ years.
When you are in full screen OpenGL mode, this doesn't occur, because the
2D image is not part of the active FOV (field of view), thus no overlay
occurs.
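
A back of the envelope sketch of just the memory traffic from that
merge pass, assuming a 32-bit frame buffer and two buffer reads plus
one write per displayed frame (the exact pass count varies by driver
and hardware):

    # Extra on-card memory traffic from the windowed-mode merge pass.
    width, height, bpp = 1920, 1080, 4   # bytes per pixel
    fps, passes = 60, 3                  # 2 reads + 1 write, assumed
    mb_per_sec = width * height * bpp * passes * fps / 1e6
    print("%.0f MB/s of extra traffic" % mb_per_sec)   # ~1493 MB/s

Call it 1.5GB/s, and that's before counting the CPU cycles and ROP
load, which hurt far more on low end cards.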

The professional OpenGL cards don't have this problem because the
default application mode for 3ds MAX, Autocad, Pro/E, etc, is windowed,
not full screen.  The drivers and hardware are optimized for this mode.
This doesn't mean you can buy a professional card and avoid this
problem with games.  Because these cards are optimized for CAD their
gaming performance is horrible.  Much has been written on this subject
over the past 2 decades.  Google if you're interested in more detail.

If you're running WOW full screen then none of this should apply.  But
it's useful information, for you, and others here.

>> Overkill again.  A 400W PSU with a single 20A +12V rail is sufficient
>> for just about any desktop PC.  The Radeon 7870 consumes a peak of 6.25A
>> at +12V on the PCIe connector, plus a peak of 6.25A at +12V through the
>> x16 slot, 12.5A, 150W peak total.  A CPU + HDD will not consume anywhere
>> close to the remaining 7.5A of +12V power.  That would be 90 watts.  The
>> bulk of a CPU's power is drawn from the +3.3V rail, where most PSUs
>> rated for 30A+, or ~100W.  If one has more than 4 spinning HDs then a
>> 400W PSU might become borderline with the 7870.
>>
> And yet the Radeon website says the 7870 requires a 500 watt power supply minimum. I quote:
> 
> 	PCI Express® based PC is required with one X16 lane graphics slot available on the motherboard
> 	500W (or greater) power supply with one 75W 6-pin PCI Express power connector recommended
...
>> A 750W PSU can handle a 115W CPU, 32GB RAM, two 7870s in Xfire, and 8
>> spinning HDs, without breaking a sweat.  

>>>>>>> AMD (and nVidia) recommends
>>>>>>> over sized PSUs for a single reason:  fewer support calls to card
>>>>>>> vendors due to lack of power to the card.  

Their minimum wattage disclaimer is aimed at the uneducated, i.e. you
before this thread, hopefully not after.  Think of the annoying dinger
in your car reminding you to put on your seat belt.  Similar thing here.
The manufacturer is trying to protect you from yourself, and protect
itself and the card vendors, who are their direct customers.
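
To put hard numbers on the +12V budget I quoted above, the arithmetic
is simple (the amp figures are the 7870's peak draws; the CPU+HDD
allowance is deliberately generous):

    # +12V power budget: 7870 peak draw plus CPU/HDD headroom.
    pcie_plug_a = 6.25   # amps, peak, via the 6-pin PCIe connector
    pcie_slot_a = 6.25   # amps, peak, via the x16 slot
    gpu_w = (pcie_plug_a + pcie_slot_a) * 12   # 150 W peak
    cpu_hdd_w = 7.5 * 12                       # 90 W allowance
    print("GPU %d W, total %d W" % (gpu_w, gpu_w + cpu_hdd_w))
    # -> 150 W and 240 W, well inside a 400 W unit's 20 A +12 V rail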

> I chose 750 watts because I was unable to find anything between 500 and 750 on Amazon. It might be there. 

This, amongst other reasons, is why I don't shop Amazon's marketplace,
especially for computer gear.  Newegg, literally, carries just about
everything, from $0.99 Molex adapters, to $24,000 HP disk arrays.  They
carry 580 models of PC PSUs.  And another 109 models of server PSUs.
Every piece of personal electronics gear I've purchased since 2003 has
been from Newegg.  How did I learn of Newegg?  Someone mentioned them in
game chat while I was playing Counter-Strike. :)

> It may be possible to use a smaller PSU, but I have seen what happens when a video card is underpowered. I don't want to go there. I want more than enough power for my graphics card plus other components.

I'm arming you with the knowledge to acquire the right size PSU, without
"going there", and save money to spend on other more important
components, or on something else entirely.  How about an analogy?

I'll assume you're familiar with horsepower.  746 watts equals 1
horsepower.  If you were to attach a 1HP DC motor and a 12V car battery
to your mountain bike with the correct size sprocket, it would
accelerate the bike and you, if you could hang on, to over 30mph, and it
would climb a fairly steep hill without slowing down much.

A 750W PSU provides slightly more than 1 HP.  Now, if you believe that
the integrated circuit chips in your computer, no matter how Hi-Po or
how many, need anywhere close to the amount of power needed to propel
you on your bike past 30mph, then I've failed horribly in my attempt to
educate you.  Yes, some ICs and combos of ICs take a lot of power.  But
a desktop PC is never going to require 1 horsepower of electricity for
Pete's sake...

Most people over buy hardware because they simply lack the proper
knowledge to make informed decisions.  I'm simply trying to inform you,
and others who are reading.

> The H87-43G is an 1150 socket board.  http://us.msi.com/product/mb/H87-G43.html

Quite right.  How did that happen?  Somehow I pulled up the wrong model
in my search on the MSI site.  Hmm...  I'd normally guess typo, but I
copy/pasted.  I'll blame a cosmic ray, as Compaq started to do back in
2003.  See top of page 5:

http://www.compaq.com/alphaserver/download/html/dhb_ev7_alphaserver_delivers_061703.html

Compaq added cosmic rays to their literature as a source of errors, the
only server vendor to do so in history, after a customer's $2 million
Alphaserver crashed inexplicably.  I'm unable to locate the original
article I read then--been over 10 years.  Compaq (DEC) engineers spent
months analyzing what could have happened and found no fault in any
circuit, any transistor, in the machine.  They surmised, based on
previous IBM research[1], that a cosmic ray passed through a transistor
in the L2 cache module, causing a bit flip soft error condition from
which the machine was not designed to recover.

[1]
http://www.jai.com/SiteCollectionDocuments/Camera_Solutions_Application_Tech_Note/TechNote-TH-1087-CosmicRays.pdf

>> The socket vs upgrade concern is not valid.  It will be at least 10
>> years before desktop applications, especially Linux apps, are
>> sufficiently threaded to take advantage of today's 4 core CPUs, let
>> alone 6/8 cores.  New hardware doesn't make old hardware obsolete.  New
>> software does.  There are many people, including myself, who would say
>> this applies even to dual core CPUs.  And in fact, at the rate of
>> desktop software development today, WRT threads, one is better off
>> purchasing a higher clocked dual core CPU with lots of cache than a quad
>> core model at a lower clock, especially given power consumption.
>>
> This applies to everyone except gamers, Stan. Gamers upgrade far more often than other people, because game graphics are constantly requiring ever more advanced GPUs.  The industry is genuinely gamer-driven..

Heheh.  I was running dual Voodoo2s in SLI in 1998, probably a little
before your time.  No, not nVidia's horribly inefficient Scalable Link
Interface.  I'm referring to the original SLI, 3Dfx's Scan Line
Interleave, which actually scaled very well.  Sad to say, I spent over
$700 in 12 months on 3 video card upgrades alone at that time.  That's
1998 dollars, equivalent to over $2000 today.  For reference, the
Pentium II 400 was bleeding edge then, and the famously overclockable
Celeron 300A at 450MHz was the gamers' choice of CPU.

This was at the dawn of realtime 3D hardware rendering for PC games,
before Direct3D took hold, before Carmack made OpenGL popular, when no
standards had yet emerged.  3Dfx was the only choice for performance,
and perform they did.  It wasn't until late 1999 that nVidia came out
with the GeForce 256 and coined the term "GPU" that is now ubiquitous.
I've purchased over a dozen graphics cards since '98, not to mention all
the other gaming driven hardware purchases.  And I've spent far more
than that on server hardware...

I wish you'd not have mentioned this.  I've tried to forget the many
thousands of dollars I wasted on now useless hardware over the past 20
years, because I didn't possess back then the knowledge I've gained
since.  Spending thousands chasing perfectly smooth, hiccup free game
play.  Gaming, period.  Perceptions and priorities change over time.
You'll likely feel the same a few years down the road.

But to address your specific point, and this is not meant as an insult,
most gamers are completely ignorant of how 3D gaming software actually
works, how hardware actually works, and what is actually required to
achieve the desired performance level.  Due to their ignorance, and
basing purchase decisions on what other ignorant people recommend, most
gamers spend 2-4x the money they need to on hardware to achieve the
desired performance.  4/6/8 core CPUs, dual or more GPUs, are the prime
examples.  The sad thing is, in many instances with distributed online
gaming, network latency is the cause of lag, and no amount of hardware
can fix that.  That's an application level problem.

> I can deal with never having the most advanced GPU, but I can't deal with the abysmal graphics that the integrated chipsets provide. And those all share memory with the system, so the 4 gig you suggest would be wholly inadequate when playing games as well as running other apps.

You should realize by now that I actually know a little bit about that
of which I speak.  Integrated GPUs using system DRAM for frame and
texture buffer are not high performance.  But typically for non shooter
3D games, such as WC3, WOW, etc, it's more than adequate, because the
polygon rate is typically a couple of orders of magnitude lower, and
you're not dealing with as many textures, nor ones as complex.

The best advice you'll receive on this subject from a pure performance
standpoint follows:

WRT gaming hardware, in general:

1.  Spend your dollars where they have maximum impact, don't waste
    money where it won't make a difference.  I.e. buy a fast dual
    core CPU w/big cache, not a useless quad.  Spend the savings
    on the GPU, good mouse, case with good airflow, etc.

2.  Always spend on a single fast GPU.  SLI/Xfire is a massive waste of
    dollars because of the horrible inefficiency.  You will get better
    peak performance from SLI at high resolution, but the ratio of
    dollars to FPS is not linear, it's a very steep curve.

3.  Buy a real soundcard and avoid the AC97 stuff.  AC97 sound will
    always cause in-game lag at the worst, most critical times.

4.  With modern CPUs, don't bother with overclocking.  It won't make any
    significant gains in game play but may cause stability problems,
    which you'll have to waste time sorting out.


WRT setup/architecture:

1.  Ditch gaming on Linux with Wine emulation.  Convert your current
    gaming PC to Windows only.  Then you won't *need* to spend on new
    gaming hardware.  Your current 6970 GPU is plenty beefy.  You've not
    stated your current CPU that I recall, but I'd guess given the GPU
    you have that it's plenty beefy as well.

2.  Build/buy a little nettop style box to run Linux and your
    productivity apps, web, email, etc.  $300 or less, far less than
    a new gaming rig.

3.  Buy a good KVM and switch between the two systems.


If this is not a palatable option for you and you are determined to run
WOW in Wine, then a CPU with higher per core IPC, a faster clock and/or
more L2/L3 cache may help some, but more cores will not.  Neither will
more system RAM or GPU RAM, if what you currently have is adequate, and
it is.  A faster GPU won't help either.  With emulation, host CPU
instruction dispatch rate is your main bottleneck, but there are others,
such as interrupt passing, etc.
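
If you want a rough way to compare candidate CPUs on the one axis that
matters here, per core throughput, a crude single-threaded probe is
more telling than a core count on a spec sheet.  This is deliberately
simplistic; treat the number as relative between machines, nothing
more:

    # Crude single-core throughput probe; bigger number = faster core.
    import time
    n = 10000000
    t0 = time.time()
    i = 0
    while i < n:
        i += 1
    dt = time.time() - t0
    print("%.1f million increments/sec on one core" % (n / dt / 1e6))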

Hope this information is received in the spirit in which it is intended,
which is to educate, not argue.

-- 
Stan
