This technology will also be very expensive and is simply not aimed at the mainstream market. NVIDIA SLI is meant for a “hardcore only” market segment, but it will likely earn NVIDIA a tremendous amount of favorable press along the way. Implementing it will likely run over US$1300, since two GeForce 6800 Ultras and a motherboard to suit them will need to be purchased. Possibly a new power supply would be required as well, depending on the wattage of your current unit.
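As a rough sketch of where that figure comes from (the prices per card and per board below are my own assumptions; only the "over US$1300" total comes from the excerpt above):

```python
# Back-of-envelope cost of the SLI setup described above, assuming
# the widely reported ~US$499 launch price per GeForce 6800 Ultra
# and a rough US$300 for an SLI-capable workstation board (both
# assumptions, not figures from the article).

ultra_price = 499   # assumed launch MSRP per card
motherboard = 300   # assumed price of a suitable board

total = 2 * ultra_price + motherboard
print(f"Approximate outlay: US${total}")  # -> Approximate outlay: US$1298
# ...and that's before a possible power supply upgrade.
```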
Yeah, right, I'm gonna spend over $1300 on two video cards.
My V2 SLI with a G200 Marvel was $600 five years ago.
Why is it called tourist season if we can't shoot at them?
Originally posted by TdB: now the only difference lies in the framerate, and the NV40 can already run every game REALLY fast at any setting; that is, faster than the refresh rate most people run their monitors at.
my point is: back then you could see the effect SLI had in your games even without knowing much about computers; now you have to enable a framerate counter and disable v-sync just to see if SLI is enabled...
Seriously, do any of you expect a game this year (or the next) that ONE NV40 can't run perfectly with everything maxed out?
No, the difference isn't just in framerates. If it was just that, everyone would be happy with their Nvidia cards running at 1024x768. Now it is about AA and AF with every piece of eye candy enabled. Imagine running a flight sim at 1600x1200 with everything set to max. That would look sweet!
You are confused. There are plenty of games out now that can hurt the card. It'll only get worse over the next 6-12 months.
Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.
Originally posted by mmp121: Not to mention the 600 Watt PSU that will be required to run two NV 6800 cards...
the sad thing is that when you study the power draw of different video cards, the 6800 Ultra came out sucking only a few W more than the X800 XT when it was going full tilt playing games, and it drew a bit less when idling...
"And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz
Guys, I think when NVIDIA recommends a wattage, they take the CPU, HDDs, optical drives, mobo, memory, fans, etc. into consideration. I think the GF6800U takes around 150W by itself at full load (read that somewhere, though I forget where), so in an SLI config you'd need about 300W just for the gfx...
add everything together and a 480W will be more than enough...
oh, and I think it makes more sense to link two 6800GTs together, since they don't take up as much space (although it's always a good thing to leave space anyway), and it's only two Molex connectors altogether...
by the way, this is the 6800 Ultra that ASUS came up with:
Originally posted by Chrono_Wanderer:
add everything together and a 480W will be more than enough...
I doubt it, since an SLI setup can only go in high-end machines right now, and the latest Pentiums consume over 100W *by themselves*.
Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.
Currently the only motherboard you can use this on is the upcoming Intel AMD64-compatible workstation chipset, which, if I am not mistaken, is a dual-CPU setup. Considering how much juice a single Intel Prescott CPU takes, I would think that any serious PC user would have to be considering at LEAST a 550-600 Watt PSU. This "solution", while awesome, is NOT for us regular Joes.
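Pulling the thread's numbers together, here's a rough back-of-envelope power budget (a sketch only: the GPU and CPU figures are the ones quoted above; the remaining per-component draws are my own ballpark assumptions for a high-end system of that era):

```python
# Rough full-load power budget for an SLI rig, using the figures
# quoted in this thread: ~150W per 6800 Ultra and "over 100W" for
# a high-end Pentium. Everything else is an assumed ballpark.

components_watts = {
    "GeForce 6800 Ultra x2": 2 * 150,  # ~150W each at full load (quoted above)
    "high-end Pentium CPU": 110,       # "over 100W by themselves" (quoted above)
    "motherboard + memory": 50,        # assumption
    "hard drives + optical": 40,       # assumption
    "fans + misc": 20,                 # assumption
}

total = sum(components_watts.values())
print(f"Estimated full-load draw: {total}W")  # -> Estimated full-load draw: 520W
# A 480W unit would be cutting it close; the 550-600W figure
# suggested above leaves more comfortable headroom.
```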