Doom 3 vs. Half Life 2: Regardless of Which is Better, We All Lose
-
I think statements like these make the article a bit interesting...
"Did ATI gain from its Valve partnership? No. Will it, in the long run? Only if final performance figures vindicate the purchasing decisions of people who bought HL2-enabled cards as long as 18 months ago. If they don’t, ATI and Valve both end up looking like liars, if they do, we end up with a bifurcated system where you buy NVIDIA for Doom, ATI for Half Life, and suffer if you wanted both. "
He makes it sound like those who bought ATI cards expecting good HL2 performance will suffer!!!
Why are they going to suffer??? Is Doom 3 the only game on earth? So ATI plays Doom 3 a few frames short of NVIDIA but the image quality is identical; where's the problem here?
I guess we'll wait until HL2 is released on September 2 and find out what happens.
Regards,
Elie
Comment
-
It's a good article, but they're still stuck in the 3dfx way of thinking, namely that more FPS = better video card, which undercuts their own argument if both cards are otherwise equal.
Comment
-
I'm getting bit by stuff like this right now. I just bought Star Wars: KOTOR. It crashes all the time on ATI cards. I know it's the game's fault, and not ATI's, but damn it's annoying.
Comment
-
Originally posted by Elie
I think statements like these make the article a bit interesting...
"Did ATI gain from its Valve partnership? No. Will it, in the long run? Only if final performance figures vindicate the purchasing decisions of people who bought HL2-enabled cards as long as 18 months ago. If they don’t, ATI and Valve both end up looking like liars, if they do, we end up with a bifurcated system where you buy NVIDIA for Doom, ATI for Half Life, and suffer if you wanted both. "
He makes it sound like those who bought ATI cards expecting good HL2 performance will suffer!!!
Why are they going to suffer??? Is Doom 3 the only game on earth? So ATI plays Doom 3 a few frames short of NVIDIA but the image quality is identical; where's the problem here?
I guess we'll wait until HL2 is released on September 2 and find out what happens.
Regards,
Elie
And then, a year later, when the game finally releases, if the final numbers turn out to be decent, those buyers would think: gee, NV3x cards are actually playable, and now that I want better Linux drivers, I should have gotten an NV3x card instead of this R3x0 card.
ATI doesn't do badly in Doom 3 at all; the writer emphasized that many times in the article. It's just that IF (a big if) the game doesn't end up running so badly on NVIDIA hardware after all, there may be some disappointed customers who wanted the other things NVIDIA cards have to offer. And if that happens, it will haunt ATI's future sales (at least sales to those who have read this article).
Comment
-
Originally posted by GT98
It's a good article, but they're still stuck in the 3dfx way of thinking, namely that more FPS = better video card, which undercuts their own argument if both cards are otherwise equal.
One last thing: both NVIDIA and ATI suck. Matrox forever! (But then they did miss the speed category, whatever.)
edit: Thanks Dave! That was a very interesting read!
Comment
-
I guess the performance delta in the final HL2 will be about the same as it is in Doom 3, and all new cards, ATI and NVIDIA alike, will run it at a very playable level.
Since HL2 uses lots of shaders, but only short ones (well suited to ATI), we will most likely see an fps victory for ATI that comes down to the ATI cards' higher fillrate (30% at most, and probably smaller due to other things like AI, CPU dependency, etc.). That goes for the NV40 vs. the R420, since NVIDIA finally fixed their shader performance with the NV40.
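To put a rough number on that fillrate point, here's a minimal back-of-the-envelope sketch in Python. The pipeline counts and core clocks are the commonly quoted reference specs for the two flagship parts (my assumption, not figures from the article), and real games land well below the theoretical ceiling:

# Back-of-the-envelope check of the "30% at most" fillrate gap.
# Pipeline counts and core clocks are assumed reference specs for the
# flagship boards, not figures taken from this thread or the article.
cards = {
    "Radeon X800 XT PE (R420)": {"pixel_pipes": 16, "core_mhz": 520},
    "GeForce 6800 Ultra (NV40)": {"pixel_pipes": 16, "core_mhz": 400},
}

rates = {}
for name, spec in cards.items():
    # Theoretical peak pixel fillrate in Gpixels/s: pipelines * core clock.
    rates[name] = spec["pixel_pipes"] * spec["core_mhz"] / 1000.0
    print(f"{name}: {rates[name]:.1f} Gpixel/s")

advantage = rates["Radeon X800 XT PE (R420)"] / rates["GeForce 6800 Ultra (NV40)"] - 1
print(f"Theoretical R420 fillrate advantage: {advantage:.0%}")

That works out to roughly 8.3 vs. 6.4 Gpixel/s, i.e. about a 30% ceiling for the R420; in practice CPU load, AI and memory bandwidth cap the frame rate well before the fillrate limit.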
The older NV cards of the NV30-series will still most likely suck at HL2 because of their inferior shader performance.
So the writer is not fully correct: if you wanted to play HL2, it surely was a good idea NOT to get an NV3x-based card.
Whether the buyers from back then have a right to feel betrayed depends more on this question: how well will HL2 play on the old R300-based ATI cards?
If it doesn't play well on those cards, they would have every right to feel betrayed...
Comment
-
Matrox only lost me as a customer due to their crappy Linux support. Here it is, a little more than 8 months later, and they still haven't fixed their driver so that it works properly with the 2.6.x kernels.
The fact that some people DO buy a video card based on how FAST it's going to run a game compared to another card is sad but true. Personally, I figure any high-end card (especially those marketed for gamers; we should all burn Matrox for burning us on this) should be able to play games nicely. I guess I can't blame Matrox too much, since they did release the Parhelia more than 2 years ago. Doesn't really matter now; I wouldn't go back to them due to the bad Linux support.
Comment
-
A quick question on HL2... I've read in several places that at least Vampire: The Masquerade - Bloodlines (which is what I'm waiting for; HL2 may be cool, but I'm waiting on the Vampire game that uses its engine!) will have support for SM 3.0. So if that's the case, wouldn't that give the NV40 at least comparable performance to the R420? Or maybe even better? I'd guess it all depends on how much of the game uses it.
Comment
-
Depends on the shaders used. The rare examples of Shader Model 3.0 being used so far didn't bring great performance gains to the NV40, so I wouldn't expect too much.
Still, with longer shaders, the NV40 should gain performance compared to the R420.
Comment
-
Originally posted by Indiana
Depends on the shaders used. The rare examples of Shader Model 3.0 being used so far didn't bring great performance gains to the NV40, so I wouldn't expect too much.
Still, with longer shaders, the NV40 should gain performance compared to the R420.
Comment
-
Do you mean the 6800? If so, before the 1.2 patch all NVIDIA boards defaulted to a lower precision mode (if I'm not mistaken) that produced lower quality images than the latest patch that added the 3.0 shaders.
I haven't checked this myself though, as I own a Radeon.
Comment
-
Actually, from everything I've seen, the 1.2 patch (the one that was recalled), which added SM 3.0 support, actually ran a lot faster on the 6800 cards. The reason they recalled it is that a lot of ATI users complained it broke the game (I don't remember exactly what issues the ATI users were having, but they were pretty bad). I'm sure there are plenty of people over at nvnews.net who posted comparisons.
Comment
-
Originally posted by Novdid
Do you mean the 6800? If so, before the 1.2 patch all NVIDIA boards defaulted to a lower precision mode (if I'm not mistaken) that produced lower quality images than the latest patch that added the 3.0 shaders.
Comment