At the heart of NVIDIA's new G80 GPU are its new shading units, which NVIDIA dubs "stream processors".
Looks like NV is going GPGPU; adding the capability to run general-purpose software on the GPU using parallel processing.
Problem from that standpoint is that they'll be doing it on a card while ATI/AMD will be doing it right in the processor core with their Fusion chips. Some Fusions will have two GPU-style processing units on each core.
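The "stream processor" idea, boiled down, is a data-parallel map: the same small kernel applied independently to every element of a stream. A minimal Python sketch of that model (the `kernel` function is made up for illustration, and a process pool stands in for the GPU's many parallel units):

```python
from multiprocessing import Pool

def kernel(x):
    # A made-up per-element "shader": the same operation runs
    # independently on every element of the stream, so the work
    # parallelizes trivially across however many units you have.
    return x * x + 1.0

if __name__ == "__main__":
    stream = [float(i) for i in range(8)]
    with Pool(4) as pool:  # 4 workers standing in for many stream processors
        out = pool.map(kernel, stream)
    print(out)
```

That independence between elements is exactly what lets G80 scale to 128 of these units without the elements stepping on each other.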
That out of the way: 177W is just pure insane, even if it's the figure for SLI.
Dr. Mordrid ---------------------------- An elephant is a mouse built to government specifications.
I carry a gun because I can't throw a rock 1,250 fps
And it'll be loud like a leaf blower, and nobody cares except the FPS-kiddiez who "NEED" the latest card.
Most gaming is still done on 9600-class hardware. Hell, most of my friends still have one of my old cards. Lots of people still play games on GF3, Radeon 8500, etc.
So this is nifty, but... relatively pointless.
The Internet - where men are men, women are men, and teenage girls are FBI agents!
I'm the least you could do
If only life were as easy as you
I'm the least you could do, oh yeah
If only life were as easy as you
I would still get screwed
Actually, some of the reviews I've seen report that under load it can go up to 220 W (but that probably includes the cooling fans, etc.). I believe that the rumored power requirement for an 8800 SLI rig was in the neighborhood of 750 W, w/ 20 A per 12 V SLI rail. 1 kW was recommended.
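The per-rail spec squares with those wattages; a quick back-of-the-envelope using the figures above (20 A on each of two 12 V rails):

```python
# Back-of-the-envelope check of the rumored 8800 SLI power numbers.
volts = 12.0
amps_per_rail = 20.0                   # rumored requirement per 12 V SLI rail
watts_per_rail = volts * amps_per_rail
print(watts_per_rail)                  # 240 W available per rail
print(2 * watts_per_rail)              # 480 W across both rails, just for the cards
# Against a ~220 W per-card load figure, 240 W per rail leaves little headroom,
# which is why a 750 W (or even 1 kW) supply gets recommended for the whole rig.
```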
Jammrock
"Inside every sane person there's a madman struggling to get out"
–The Light Fantastic, Terry Pratchett
I've never understood the people who buy $800 video cards. It makes no sense at all. The most expensive card I've ever purchased was the G400, and over the years I've never found a game I couldn't play at reasonable frame rates and resolutions. I just upgrade to a low-to-midrange card every two years or so, and I'm good. I just picked up a Radeon X1900GT for like $220, and it's fantastic so far.
Lady, people aren't chocolates. Do you know what they are mostly? Bastards. Bastard coated bastards with bastard filling. But I don't find them half as annoying as I find naive, bubble-headed optimists who walk around vomiting sunshine. -- Dr. Perry Cox
And it'll be loud like a leaf blower, and nobody cares except the FPS-kiddiez who "NEED" the latest card.
Most gaming is still done on 9600-class hardware. Hell, most of my friends still have one of my old cards. Lots of people still play games on GF3, Radeon 8500, etc.
So this is nifty, but... relatively pointless.
And that's the point really: if Nvidia wants the average graphics card in mainstream PCs to progress past the 9600 class, then you need groundbreaking cards to establish new technology, and then wait a few cycles for all of its speed and features to filter down and be refined into a sensible $150-200 card.
Not many will buy these super high end cards today for $600, but I bet joe average will in 2 years' time in his Dell, whether he knows it or not.
Is a flower best picked in its prime, or greater withered away by time?
Talk about a dream, try to make it real.
I've never understood the people who buy $800 video cards. It makes no sense at all. The most expensive card I've ever purchased was the G400, and over the years I've never found a game I couldn't play at reasonable frame rates and resolutions. I just upgrade to a low-to-midrange card every two years or so, and I'm good. I just picked up a Radeon X1900GT for like $220, and it's fantastic so far.
You and me both! I'd rather get an inexpensive laptop at that price. Most I ever spent was $270, and that was on a GF2 Pro. Lasted about 2 years till I realized that I could get a faster card at $100.
Titanium is the new bling!
(you heard from me first!)
I've never understood the people who buy $800 video cards. It makes no sense at all. The most expensive card I've ever purchased was the G400, and over the years I've never found a game I couldn't play at reasonable frame rates and resolutions. I just upgrade to a low-to-midrange card every two years or so, and I'm good. I just picked up a Radeon X1900GT for like $220, and it's fantastic so far.
I used to be on that bandwagon, till the Radeon 9700 PRO came out... That card lasted me the longest, at two years or so. I got an X800XL to replace it in the system I built in 2005, but I never gamed quite as much on it due to the 24-inch widescreen monitor I got about 6 months later. I can finally game again with the new system I just got with the Crossfire X1950 Pros, and they only cost me $400 for the two cards.
Why is it called tourist season, if we can't shoot at them?
I've never understood the people who buy $800 video cards. It makes no sense at all. The most expensive card I've ever purchased was the G400, and over the years I've never found a game I couldn't play at reasonable frame rates and resolutions. I just upgrade to a low-to-midrange card every two years or so, and I'm good. I just picked up a Radeon X1900GT for like $220, and it's fantastic so far.
I did it sort of once but never again. Matrox Marvel G200 with dual 12MB Voodoo IIs added up to $800 US.
Looks like NV is going GPGPU; adding the capability to process software in the GPU using parallel processing.
Generally speaking, video cards could do this for years. It was just that AGP sucked for transferring the results back to the rest of the system.
Problem from that standpoint is that they'll be doing it in a card while ATI/AMD will be doing it right in the processor core with their Fusion chips. Some Fusions will have 2 GPU-style processing units on each core
I'd rather have the video processing done close to the video buffers, rather than closer to the CPU.
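The AGP complaint is about asymmetry: the bus was fast for pushing data up to the card but much slower at reading results back, which is fatal for GPGPU round-trips. A rough illustration (both bandwidth figures are ballpark assumptions for the sake of the example, not measurements):

```python
# Rough transfer-time estimate for reading GPGPU results back over the bus.
# The bandwidth figures below are ballpark assumptions for illustration only.
def transfer_seconds(megabytes, mb_per_second):
    return megabytes / mb_per_second

result_mb = 64.0        # hypothetical result buffer to read back
agp_readback = 100.0    # MB/s: AGP readback was notoriously slow (assumed figure)
pcie_readback = 2000.0  # MB/s: PCIe x16 readback, rough figure (assumed)

print(transfer_seconds(result_mb, agp_readback))   # ~0.64 s just to fetch results
print(transfer_seconds(result_mb, pcie_readback))  # ~0.032 s over PCIe
```

Even if the GPU computes instantly, a slow readback path dominates the total time, which is why GPGPU only got practical once the return trip stopped sucking.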
Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.
And that's the point really: if Nvidia wants the average graphics card in mainstream PCs to progress past the 9600 class, then you need groundbreaking cards to establish new technology, and then wait a few cycles for all of its speed and features to filter down and be refined into a sensible $150-200 card.
Not many will buy these super high end cards today for $600, but I bet joe average will in 2 years' time in his Dell, whether he knows it or not.
By your logic of 2 years behind, the average Dell should now be shipping with a Radeon X-class card, or AT LEAST Radeon 9600 class graphics, right? But it's not. The average Dell right now ships with onboard Intel graphics that can MAYBE rival an original Radeon (aka Radeon 7000) - a 6 year old graphics card. The "trickle down" effect is, in this case, nonexistent.
The average server ships with that, but the average consumer computer my friends have bought has an ATI X800 in it.