While users seem to love the concept, the amount of CPU that DreamScene uses varies wildly: from 12% or so on my ThinkPad T60 to nearly 50% on systems that are much faster.
The difference? So far, it can basically be laid at the feet of nVidia. Their drivers not only seem to crash PC games but also do double-duty by being very inefficient with DreamScene.
Below are the results on my laptop. Note the very low (as in not great) performance rating, and yet the relatively low CPU use while running a DreamScene. It has an ATI card.


Now, Microsoft is getting a taste of what we've gone through. Maybe this time, nVidia will be motivated to worry about desktop drivers rather than trying to show they can squeeze an extra frame per second out of a first-person shooter.
Still, if you have a good machine, nVidia or ATI, the CPU use isn't a big deal in actual use, especially if the .DREAM file's content is in MPEG rather than WMV (WMV is more CPU intensive). And hopefully nVidia's drivers will get better soon, because I want to get a GeForce 8800 but am not going to until they fix this stuff.
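If you want to compare the MPEG vs. WMV overhead on your own machine, here's a minimal sketch of how I'd sample it. It assumes the decoding cost shows up in the Desktop Window Manager process (dwm.exe, which composites the DreamScene background), uses the third-party psutil library, and is just a rough measuring aid, not anything official from Microsoft:

```python
import time
import psutil  # third-party: pip install psutil

def avg_cpu(process_name: str, samples: int = 30, interval: float = 1.0) -> float:
    """Average CPU% of the named process over `samples` intervals."""
    procs = [p for p in psutil.process_iter(["name"])
             if (p.info["name"] or "").lower() == process_name.lower()]
    if not procs:
        raise RuntimeError(f"{process_name} is not running")
    proc = procs[0]
    proc.cpu_percent(None)  # prime the counter; first call always returns 0
    readings = []
    for _ in range(samples):
        time.sleep(interval)
        readings.append(proc.cpu_percent(None))
    return sum(readings) / len(readings)

if __name__ == "__main__":
    # Assumption: DreamScene playback is composited by dwm.exe, so watch
    # that process once while an MPEG .DREAM plays, then again with WMV.
    print(f"dwm.exe average CPU: {avg_cpu('dwm.exe'):.1f}%")
```

Run it twice, once per .DREAM file, and compare the two averages; on a multi-core box psutil reports per-process CPU as a percentage of one core, so the numbers line up with what Task Manager shows per process.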