I've been trying to find a review on the net that has benchmarks of the ATI Radeon LE with and without HyperZ enabled, to see what kind of difference it makes in the real world.
So far, all the reviews seem to clock the card up to the same speed as the Radeon 32DDR, which isn't really useful.
What I want to find out is whether it really makes all that much difference.
Everybody is giving the Parhelia a hard time over its lack of hidden surface removal hardware, but we don't really know for sure how much difference it makes.
I do have a Radeon LE myself, but it's in the wife's computer, and she won't let me play with it (she's writing her thesis, so fair enough).
If anybody has a Radeon LE under Windows 9x and a tweaker like Radedit, could they give this a go? I seem to remember there was a way to enable/disable certain parts of HyperZ.
The Parhelia does have fast Z clear, so what we need is a benchmark comparing:
- a standard Radeon LE with only fast Z clear enabled
- a Radeon LE with full HyperZ enabled
The difference between these should show, percentage-wise, what sort of difference HSR makes in the real world.
Anybody keen?
Links: I'll update this as I find more.
PS: I found a little bit on the XBit site.
Merc Truck Racing, 1024x768x16:
  Radeon LE: 51.9
  Radeon LE + HZ: 51.5
UT, 1024x768x16:
  Radeon LE: 42.39
  Radeon LE + HZ: 41.78
And the interesting bit:
3DMark 2000:
Game 1, low detail:
  LE: 59.2
  LE + HZ: 77
Game 1, high detail:
  LE: 22.6
  LE + HZ: 24.8
Game 2, low detail:
  LE: 54.5
  LE + HZ: 69.1
Game 2, high detail:
  LE: 33.1
  LE + HZ: 36.2
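Working the gains out percentage-wise (just my own arithmetic on the figures quoted above), here's a quick throwaway program:

#include <cstdio>

// HyperZ gain as a percentage: (hz - base) / base * 100.
// Figures are the XBit results quoted above.
int main() {
    struct Result { const char* test; double base; double hz; };
    const Result results[] = {
        {"Merc Truck Racing", 51.9,  51.5},
        {"UT",                42.39, 41.78},
        {"3DMark Game1 low",  59.2,  77.0},
        {"3DMark Game1 high", 22.6,  24.8},
        {"3DMark Game2 low",  54.5,  69.1},
        {"3DMark Game2 high", 33.1,  36.2},
    };
    for (const Result& r : results)
        std::printf("%-18s %+5.1f%%\n", r.test, (r.hz - r.base) / r.base * 100.0);
    return 0;
}

That comes out to roughly -1% in the two real games, versus +27% to +30% at low detail and +9% to +10% at high detail in 3DMark.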
My conclusions:
From the XBit numbers, it looks like HyperZ makes next to no difference in real games, while synthetic benchmarks (3DMark) show a big difference at low detail settings and a small one at high settings.
This still isn't accurate, though, as we don't know how much of that difference comes from fast Z clearing rather than from HSR.
I would say that real games already do most of the HSR themselves in their engines, and only badly written games and synthetic benchmarks are going to see any major difference.
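By engines "doing the HSR themselves" I mean things like sorting opaque geometry front-to-back before submitting it, so the Z-buffer kills most hidden pixels before they're shaded. A minimal sketch of the idea (my own illustration, not taken from any particular engine):

#include <algorithm>
#include <cstdio>
#include <vector>

struct DrawCall {
    float depth;   // rough distance from the camera
    int   meshId;  // stand-in for whatever the engine really submits
};

// Drawing opaque geometry front-to-back fills the Z-buffer with the nearest
// surfaces first, so hidden pixels behind them fail the depth test before
// any expensive texturing happens -- a crude, software-side form of HSR.
void submitOpaque(std::vector<DrawCall>& calls) {
    std::sort(calls.begin(), calls.end(),
              [](const DrawCall& a, const DrawCall& b) { return a.depth < b.depth; });
    for (const DrawCall& c : calls)
        std::printf("draw mesh %d at depth %.1f\n", c.meshId, c.depth);  // API handoff goes here
}

int main() {
    std::vector<DrawCall> calls = {{40.0f, 2}, {5.0f, 0}, {12.5f, 1}};
    submitOpaque(calls);  // submits mesh 0, then 1, then 2: near to far
    return 0;
}

Reverse that sort and you get the pathological overdraw case instead.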
I would also say the Parhelia would suck at VillageMark, but that benchmark was deliberately written to render the scene in an inefficient way, which we (shouldn't) see in any real games.
Please feel free to add anything to this or correct my conclusions.
Ali