When we compare hardware we tend to put a large emphasis on benchmarks, but how valid are they?
Take graphics cards, for instance: the reviewer runs a demo in a game, the game reports the number of frames rendered per second, and this is taken as a gauge of the card's speed.
But how does the game measure the frames rendered? Can it be trusted?
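To make the question concrete, here is a minimal sketch (in Python, with made-up names, not taken from any real engine or benchmark) of how a typical in-game FPS counter plausibly works: it just counts how many frames finish inside each one-second window and reports that count.

```python
import time

# Hypothetical sketch of an in-game FPS counter: count the frames completed
# inside each one-second window and report that count as "FPS".
# render_frame is a stand-in for whatever actually draws a frame.
def run_benchmark(render_frame, duration=5.0):
    fps_per_second = []
    frames_this_second = 0
    start = time.perf_counter()
    window_start = start
    while time.perf_counter() - start < duration:
        render_frame()                  # draw one frame (stub in this sketch)
        frames_this_second += 1
        now = time.perf_counter()
        if now - window_start >= 1.0:   # one-second window elapsed
            fps_per_second.append(frames_this_second)
            frames_this_second = 0
            window_start = now
    return fps_per_second               # e.g. [35, 34, 36, ...] -> min/avg FPS
```

If the counter works anything like this, everything that happens inside the one-second window is averaged away: the reported number says how many frames were drawn, not when they were drawn.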
And CPU tests: sure, the test can tell you how the CPU/board/memory configuration performs on a predetermined combination of calculations, but is that a reflection of how the CPU will be used in real life?
Consider a graphics card that gets a minimum FPS of, say, 35; that should be good enough to be considered smooth. But those 35 FPS are just the total number of frames rendered in a second: it could very well be that the card renders 34 frames in the first half second and just 1 frame in the last half. That would be jerky, while a card giving a minimum of 20 FPS, with the 20 frames spaced evenly over the course of the second, would be a lot smoother, perhaps even very playable.
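A quick back-of-the-envelope calculation (purely illustrative numbers, hypothetical cards) shows how looking at individual frame times instead of the per-second count exposes exactly this difference:

```python
# Two hypothetical cards, both reporting "35 FPS", but with very
# different frame pacing. Frame times are in seconds.
card_a = [0.5 / 34] * 34 + [0.5]   # 34 frames in the first half second, 1 in the last half
card_b = [1.0 / 35] * 35           # 35 frames spread evenly over the second

for name, frame_times in [("A", card_a), ("B", card_b)]:
    fps = len(frame_times) / sum(frame_times)
    worst = max(frame_times)
    print(f"Card {name}: {fps:.0f} FPS, worst frame {worst * 1000:.0f} ms")

# Card A: 35 FPS, worst frame 500 ms -> a visible half-second stutter
# Card B: 35 FPS, worst frame  29 ms -> perfectly smooth
```

Same FPS number, completely different experience, and the benchmark as usually reported cannot tell them apart.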
But then again, if we cannot trust benchmarks, how can we evaluate hardware at all? By comparing the manufacturers' specs? I think not; two products with similar specs can perform very differently, both due to drivers and due to differences in the hardware not shown in the specs.
I think the only fair way to compare products is by using them. Unfortunately we, as consumers, rarely have the possibility to test all the different products on the market before making our choice. And as most hardware sites only test products by benchmarking them, often without the best drivers and optimizations, due to lack of time (I would guess), we are again left with the benchmarks.
I'm just ranting here, but it's bothered me for quite a while that there seems to be no valid way of determining how hardware compares to the competition.