<font face="Verdana, Arial, Helvetica" size="2">I think NVIDIA is probably bending over backwards to support Apple and gain the OEM business from ATI.</font>
At $600.00 a shot??????
Joel
Libertarian is still the way to go if we truly want a real change.
While I agree that $600 is a lot of money for the card, I have to admit that what we saw in the demos was extremely impressive (especially seeing the Doom 3 engine at work), and building a graphics chip with 57 million transistors must have been quite a challenge as well.
This is the kind of card that's able to handle every single game that will be released in the next 2+ years without breaking much of a sweat.
Now I have just one question... can the G800 compete with it?
Not that I doubt that Matrox has the ability to (at least I hope they do), but it seems to me that even though they've had 2+ years to develop something new, going from a G400 to something that can pose a serious challenge to the GF3 in 3D performance seems like asking a lot.
note to self...
Assumption is the mother of all f***ups....
Primary system:
P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum, and SCSI all the way...
<font face="Verdana, Arial, Helvetica" size="2">This is the kind of card that's able to handle every single game that will be released in the next 2+ years without breaking much of a sweat.</font>
That's the dumbest statement I've ever heard. Do you have any clue what kind of games will be released in the next 2+ years???
lol
System 1:
AMD 1.4 AYJHA-Y factory unlocked @ 1656 with Thermalright SK6 and 7k Delta fan
Epox 8K7A
2x256mb Micron pc-2100 DDR
an AGP port all warmed up and ready to be stuffed full of Parhelia II+
SBLIVE 5.1
Maxtor 40GB 7,200 @ ATA-100
IBM 40GB 7,200 @ ATA-100
Pinnacle DV Plus firewire
3Com Hardware Modem
Teac 20/10/40 burner
Antec 350w power supply in a Colorcase 303usb Stainless
You see, I've been using a GF2 64 MB for about 9 months now, and to this day there isn't a single game out there that forces it to work at resolutions under 1024*768 32-bit in order to keep an acceptable frame rate (60+ fps), so even if I bought the GF3 it wouldn't be because my existing card can't handle the resolutions I like to play at anymore.
Even the most demanding games are at least 18 months behind the best hardware, mostly because developers want to see their games played on the biggest user base possible (which affects potential sales).
Last time I checked, apart from all the DX8 features the GF3 has built in, it's easily twice as fast (or more, with HSR) as my card in real-world fps performance as far as effective fill rate is concerned.
It could very well be the first card able to hit 100 fps in Q3 at 1600*1200 32-bit, so I don't really foresee any game in the next 2 years forcing that particular card down to 1024*768 32-bit or under (at 60 fps), and that includes Doom 3, which will probably be the most advanced engine available within the next two years.
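As a rough sanity check on that 100 fps at 1600*1200 target, the raw pixel throughput it implies is easy to work out. This is only a sketch: the overdraw factor below is an assumption for illustration, not a measured Quake 3 number.

```python
# Pixel throughput implied by 100 fps at 1600*1200.
# The overdraw factor is an assumed value, not a measured one.
width, height, fps = 1600, 1200, 100
visible = width * height * fps   # pixels per second with no overdraw
overdraw = 3                     # assumed average scene overdraw
required = visible * overdraw
print(f"{visible / 1e6:.0f} Mpixels/s visible, ~{required / 1e6:.0f} Mpixels/s with overdraw")
```

Even with generous overdraw, the target lands in the hundreds of megapixels per second, which is the ballpark fill rate top cards of that generation were advertising.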
You see, you need 8 times less fill rate at 1024*768 32-bit compared to 1600*1200 32-bit (assuming the same target fps).
I know there will be even faster video cards later on, but the point here is that the GF3 will be more than enough card for the next 2 years.
So even if a developer starts building a new game TODAY, after seeing what the GF3 is capable of, with the intention of making a game so demanding it will drive the card to its limits (where it will only handle the game at the lowest possible resolutions, 800*600 and below), the simple fact is that it usually takes 2+ years to build the game anyway.
So I ask again, in exactly what way was my statement dumb???...
note to self...
Assumption is the mother of all f***ups....
Primary system:
P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum, and SCSI all the way...
Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.
yea, i got 2.44, where'd u get 8?
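For anyone checking the math, the pixel counts at the two resolutions do give the 2.44 figure rather than 8. A quick sketch:

```python
# Pixel counts at the two resolutions under discussion.
# Color depth (32-bit) is the same at both, so it cancels out of the ratio.
high = 1600 * 1200   # 1,920,000 pixels
low = 1024 * 768     # 786,432 pixels
ratio = high / low
print(f"{ratio:.2f}")  # → 2.44
```

So at the same target fps, 1600*1200 needs about 2.44x the fill rate of 1024*768, not 8x.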
Although the graphics do indeed look nice in the Quake 3 shots, I do not think it will be worth the money. What do you guys think about the bang/buck ratio?
We've crossed into new frontiers with graphics technology.
Any new top-of-the-line card from any manufacturer will cost an arm and a leg. I do not expect ATI or Matrox to go below $400 USD for their best offerings.
nVidia can charge $600 because they are months ahead of anyone else releasing a product that will be able to compete. Once ATI/Matrox have something to show, I bet the GF3 cards will have come down about 40% in price. (I'll be optimistic and say June is the soonest we'll see a Radeon 2 or G800.)
nVidia is simply doing what any other technology company does: charging a premium for its best product (P4, Thunderbird when it was originally released, etc.).
One side note: the MSRP for the GF3 has been through rigorous market research. People *will* buy that card at that price.
[This message has been edited by isochar (edited 27 February 2001).]
Sure, some rich bastard will always buy the best of the best, but I doubt the majority will. That card will cost at least $1000 CAD locally, for one single component in your computer. I could almost build 2 whole computers for the cost of that.
Now let's all hope that a card will soon be released as a successor to the Radeon, the G400, or the Kyro to rival nVidia.