
Re: Building computer



On 9/28/2013 9:16 PM, berenger.morel@neutralite.org wrote:
> 
> 
> Le 28.09.2013 22:46, Stan Hoeppner a écrit :
>> On 9/28/2013 8:14 AM, berenger.morel@neutralite.org wrote:
>>>
>>>
>>> Le 28.09.2013 13:33, Stan Hoeppner a écrit :
>>>> Hi Catherine,
>>>>
>>>> I haven't caught up with the rest of the thread but just wanted to
>>>> address a couple points here.
>>>>
>>>> On 9/26/2013 11:12 AM, Catherine Gramze wrote:
>>>>>
>>>>> On Sep 26, 2013, at 1:05 AM, Stan Hoeppner <stan@hardwarefreak.com>
>>>>> wrote:
>>>>>
>>>>>>
>>>>>> What desktop applications are you using that require 8GB, let alone
>>>>>> 16GB, of RAM?  I'd think 4 would be plenty.  If you wish to over buy
>>>>>> DRAM, that's a personal choice.  It will likely not improve
>>>>>> performance
>>>>>> in any meaningful way, for WOW in Wine, or anything else.
>>>>>>
>>>>>
>>>>> I will be running more than one app at a time. For example WoW, a
>>>>> browser, a Ventrilo client, and a chat client at minimum.
>>>>
>>>> 4GB is more than plenty, unless WOW has turned into a complete and
>>>> total
>>>> memory hog.  Obviously it eats more running through Wine emulation. But
>>>> Wine and WOW combined shouldn't eat more than 2GB, so you have 2GB left
>>>> to the rest, which is plenty.
>>>
>>> I am only skimming the thread, and this is the second time I have seen
>>> "wine" associated with "emulation".
>>> As the name says, WINE Is Not An Emulator: it does not emulate a
>>> computer, it does not emulate the Windows kernel, it emulates nothing.
>>
>> <snip>
>>
>> Whether you call it an emulator, translator, simulator, or Santa Claus,
>> it 'provides' the Windows APIs to the application, the DLLs, etc.
>> Providing this does require additional memory.  It's not a large amount
>> by today's standards, but it is non negligible.  I made that case above
>> and previously in the thread.
> 
> So, do you think that an emulator is the same as a dynamic library? If
> so, well... why not. But then, since all dynamic libraries would be
> emulators, do not use that word to argue that they'll consume more
> resources, whether CPU or memory.
> 
>> So I'm unclear as to why you picked my reply for your rebuttal, given
>> we're on the same page.
> 
> The reason I replied is that an emulator emulates a complete system,
> and that has a huge overhead. WINE, as a dynamic library, could
> theoretically (I say theoretically because I have not run any tests,
> I'll be honest on that point; plus, it's impossible to have strictly
> the same costs) have the same overhead as Windows' API. In practice, it
> will have a small CPU overhead, but to claim it is not a small one, one
> should provide some valgrind analysis.
> 
> As for why I replied to your post and not another one: it was simply
> the second post saying that which I had read in this thread at the
> moment I replied :)
> It was not personal.
> 
>> The problem with 3D game performance under Wine
> 
> I will not say it does not cost more than running on Windows; I have no
> proof. But that is not because it is an emulator; it can only be
> because it is a worse implementation, or one with more layers.
> 
>> is not memory
>> consumption, but the CPU overhead,
> 
> CPU for 3D stuff? You might be right and I am probably wrong, but could
> it not be because Linux's 3D drivers are not as good as Windows' ones?
> This is a real question, not a troll, and the reasoning behind that
> opinion of mine is quite easy to understand, and so probably very
> simplistic: video games mostly target Windows users, and so more money
> and time have been spent on enhancements... on the Windows side.

This is absolutely a valid point.  And it surely depends on which GPU
one uses, and whether one uses an open source, reverse-engineered 3D
driver or a proprietary, binary-only vendor driver.

Much has been written in the technical press regarding the significant
resources both AMD and nVidia expend optimizing their MS Windows drivers
for specific games.  And in fact, in nVidia's case, the driver control
panel allows the end user to tune driver parameters on a per-game
basis.  Some of these optimizations reduce the load on the host CPU,
some reduce the load on the GPU's shaders, some reduce the load on the
GPU memory subsystem, etc.  So there's no question a given game can run
faster on Windows than on Linux, simply due to this level of
optimization.

> Well, again, I admit I have no benchmarks to back up my words.
> Of course, on a more technical point, I can agree that one more layer
> for OpenGL-related stuff might have a cost. But that cost might also be
> removed at compile time.

Recall the optimizations above?  One of the tricks the optimized Windows
drivers often perform is replacing an expensive inbound GL operation
with a less expensive one that may sacrifice detail for speed, or
replacing a complex operation with a series of less complex calls that
execute more quickly on the host CPU or in a particular GPU shader.
They can do this because they heavily profile the execution of each
Windows game, or at least a great many of them.  These optimizations, or
shortcuts, are not present in the Linux closed binary blob drivers, and
certainly nothing like this exists in the open source drivers.

I can't recall which outlet it was, whether Tom's Hardware or AnandTech,
etc, but they uncovered this kind of driver cheating by nVidia many
years ago.  There was an operation one of the benchmark game demos was
performing.  nVidia discovered via profiling the code that the operation
had almost no discernible visual impact, but a large impact on CPU/GPU
throughput.  To increase the FPS of this code vs ATI hardware, they
inserted a check in the driver for this specific game.  When the
operation passed through the driver they replaced it with a no-op.  This
increased the FPS of the nVidia solution by some 10%.
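To make that concrete, here's a very rough sketch of the kind of
interception involved.  This is NOT how the vendor drivers actually do
it (they patch inside their own driver, keyed off per-game profiles),
and the game name check and the glFinish-to-glFlush swap below are
purely hypothetical examples of mine.  But an LD_PRELOAD shim on Linux
illustrates the principle: detect a specific executable, then quietly
replace an expensive GL call with a cheaper one, or with nothing at all.

  /* glshim.c -- illustration only, not how real drivers do it.
   * Replaces the blocking glFinish() with the cheaper glFlush()
   * when the running executable looks like a known game.
   *
   * Build:  gcc -shared -fPIC glshim.c -o glshim.so -ldl
   * Use:    LD_PRELOAD=./glshim.so ./some_gl_game
   */
  #define _GNU_SOURCE
  #include <dlfcn.h>
  #include <string.h>
  #include <unistd.h>

  static int is_target_game(void)
  {
      /* Hypothetical per-game check: look at our own executable name. */
      char path[4096];
      ssize_t n = readlink("/proc/self/exe", path, sizeof(path) - 1);
      if (n <= 0)
          return 0;
      path[n] = '\0';
      return strstr(path, "benchmark_demo") != NULL;
  }

  void glFinish(void)
  {
      static void (*real_glFinish)(void);
      static void (*real_glFlush)(void);

      if (!real_glFinish)
          real_glFinish = (void (*)(void))dlsym(RTLD_NEXT, "glFinish");
      if (!real_glFlush)
          real_glFlush = (void (*)(void))dlsym(RTLD_NEXT, "glFlush");

      if (is_target_game() && real_glFlush) {
          real_glFlush();   /* cheaper: queue the commands, don't wait */
          return;
      }
      if (real_glFinish)
          real_glFinish();  /* everyone else gets the real, blocking call */
  }

Note this only catches calls that go through the dynamic linker; a real
game that fetches its GL entry points via glXGetProcAddress would bypass
it, which is exactly why the vendors do this sort of thing inside the
driver itself.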

The journalists caught it simply because they were using identical
hardware and the same game version as in a previous test.  The only
change was a newer nVidia driver.  They were suspicious that a slightly
newer driver could change performance so much for just one game, and
uncovered the "cheating".  This is no longer considered "cheating" by
the press, because both AMD and nVidia profile all the popular games and
add these optimizations.

But you won't see these optimizations in the Linux drivers.  There's
just not enough profit in the Linux space to allow for the profiling and
optimization work.  That, and AFAIK it's not possible to do this with
Wine in between the driver and the Windows game code.

> I can accept that my opinion here is wrong, I have no problem with
> that. I'm wrong about a lot of things after all, and am always happy
> when I learn that I was wrong about something else. But give me a
> reason, or a proof. Use a Linux kernel and a WINE-based environment,
> then show me benchmarks. That would be sufficient. Or give reasons why
> Wine should cost so much (so that I could research your words and my
> errors).

I can explain how/why the technology works the way it does, but asking
me to perform comparative benchmark testing for you is a little much,
don't you think?  Others have already done so.  Google is your friend.

>> which I also made clear previously.
> 
> This is exactly why I admitted to having only skimmed the thread.
> Sorry, but I did not notice that part. From memory, what I read, which
> might not have been your own words (I have also read you say that a lot
> of RAM is useless for most users, and I agree with that), was that Wine
> + WOW would take at least 2GB.

No, no.  Context is important here.  The OP stated she needed more than
4GB of RAM.  I stated that WOW on Wine shouldn't even require 2GB, but
for argument's sake, if it did need 2GB, then all of her other stuff
would easily fit in the remaining 2GB.

I stated previously that I've never played WOW, nor WOW on Wine.  So I
can't just fire it up and look at top to see what it uses.  I had some
friends that were playing WOW back when it first came out, almost 10
years ago.  They were playing on Win XP machines with 256MB RAM and
GeForce 4 MX cards with 32MB VRAM.  Obviously the game has evolved since
then, but no game increases in processing complexity/load by a factor of
10x in its lifetime.
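
For anyone who does run WOW under Wine and wants to settle the memory
question, the check is straightforward.  Below is a rough sketch of what
I mean by "look at top": sum the resident memory of every wine-related
process.  Matching on a process name containing "wine" is just my
assumption about how the processes show up (wineserver, wine-preloader,
etc.); adjust as needed.

  /* winemem.c -- rough sketch: sum VmRSS of processes whose command
   * name contains "wine".  Assumes a Linux /proc layout.
   * Build:  gcc winemem.c -o winemem
   */
  #include <ctype.h>
  #include <dirent.h>
  #include <stdio.h>
  #include <string.h>

  int main(void)
  {
      DIR *proc = opendir("/proc");
      struct dirent *de;
      long total_kb = 0;

      if (!proc) {
          perror("/proc");
          return 1;
      }
      while ((de = readdir(proc)) != NULL) {
          char path[512], line[256], comm[256] = "";
          long rss_kb = 0;
          FILE *f;

          if (!isdigit((unsigned char)de->d_name[0]))
              continue;                     /* only PID directories */
          snprintf(path, sizeof(path), "/proc/%s/status", de->d_name);
          if (!(f = fopen(path, "r")))
              continue;                     /* process may have exited */
          while (fgets(line, sizeof(line), f)) {
              sscanf(line, "Name: %255s", comm);
              sscanf(line, "VmRSS: %ld", &rss_kb);
          }
          fclose(f);
          if (strstr(comm, "wine")) {
              printf("%6s  %-20s %8ld kB\n", de->d_name, comm, rss_kb);
              total_kb += rss_kb;
          }
      }
      closedir(proc);
      printf("total resident: %ld kB (~%.1f MB)\n",
             total_kb, total_kb / 1024.0);
      return 0;
  }

Keep in mind RSS double counts pages shared between the wine processes
(the loaded DLLs, for instance), so the total will somewhat overstate
the real footprint; it still gives a reasonable upper bound for the
"does it need 2GB" question.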

> It may be true, but from
> what I remember reading, there was an implicit claim that it was due to
> Wine, when it was really because of the whole system.
>
> My reply was not aimed at anyone, my apologies if it seemed so. It was
> because an emulator is one thing, and an API is a different thing
> (though, with enough abstraction, we could say they are the same, since
> their purpose is always to make software run...) which has lower costs.

The problem is you jumped in because you perceived someone to have
misused the word "emulator".  Look the word up in a dictionary.  It does
*NOT* say "software that translates the instructions of one processor
architecture to execute natively on another."

Neither I nor anyone else used the word "emulator" in this context.
I/we did not say Wine was a "classic computer science emulator".  Wine
emulates Windows.  Using the dictionary definition of "emulate", this
statement is absolutely correct.

Context matters, always.  One needs to understand the context of a
discussion before jumping in to clobber someone over word misuse. ;)
Simply looking at the subject line of the discussion makes it pretty
clear this is not a discussion of the technical underpinnings of Wine.

-- 
Stan

