GeForce FX Canned already?


  • #16
They shouldn't really panic; they aren't going to lose market share. Despite the 9700 being the fastest thing out there, ATI hasn't really increased their market share.
Remember what happened when AMD shook Intel's tree and caused them to panic.
    Chief Lemon Buyer no more Linux sucks but not as much
    Weather nut and sad git.

    My Weather Page



    • #17
      Originally posted by Novdid
Here in Europe the Ti4800 is named Ti4200-8x and the SE version is just a plain Ti4800. A bit less confusing, but still strange!
Hehe. There's too much confusion over these names. One of my coworkers is looking to upgrade and isn't sure what he'll be getting (though the 4200-8x is high on his list), so I told him to make sure to read the description before ordering to see whether it's a 4200 or a 4600 GPU on the card.
      "..so much for subtlety.."

      System specs:
      Gainward Ti4600
      AMD Athlon XP2100+ (o.c. to 1845MHz)



      • #18
        They shouldn't really panic they aren't going to lose market share.
        Tell that to Joe Typical (clueless) investor...
        "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

        "Always do good. It will gratify some and astonish the rest." ~Mark Twain



        • #19
          Originally posted by THE_Editor
          GeForce Booting
          What a FUGLY noise, sounds just as fugly as it looks.
          main system: P4 Northwood 2.0 @ 2.5GHz, Asus P4PE (LAN + Audio onboard), 512MB Infineon PC333 CL2.5, Sapphire/BBA Radeon 9500@9700 128MB (hardmodded), IBM 100GB ATA-100, 17" Belinea (crappy), and some other toys...ADSL (1,5mbit/s down, 256kbit/s up...sweeeeeet!)



          • #20
            Originally posted by Chrono_Wanderer
I think the reason TSMC will stop is probably because nV has an updated core of the GFFX that runs more efficiently (i.e. faster at a lower clockspeed = less heat / die shrink to 0.09(?)). If they don't sell the GFFX this month, people will probably start to say the NV30 is vapourware hehe (oh yeah, and the big nV shareholders probably made the company rush the chip out to the marketplace to compete with ATi).
            -They wouldn't lower the clockspeed
            -They waited THIS long to perfect .13, what makes you think they're moving to .09? And if they DID, they'd raise clock speed, not lower it.
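
As an editor's aside (none of this comes from the thread, and every number below is invented rather than a real NV30 figure), the disagreement here comes down to the first-order CMOS dynamic-power relation P = C·V²·f: a process shrink cuts effective capacitance and core voltage, which is why a shrunk chip can either run the same clock cooler or clock higher at the same heat. A minimal sketch:

```python
# First-order CMOS dynamic power: P = C * V^2 * f
# All figures are illustrative placeholders, not actual GPU specs.

def dynamic_power(c_eff, voltage, freq_hz):
    """Switching power in watts: effective capacitance (F) * V^2 * frequency (Hz)."""
    return c_eff * voltage ** 2 * freq_hz

# Hypothetical 0.13 um part: 1.5 V core at 500 MHz
p_013 = dynamic_power(2e-8, 1.5, 500e6)

# A shrink typically lowers capacitance and allows a lower voltage,
# so the same 500 MHz clock dissipates noticeably less power.
p_shrunk = dynamic_power(1.4e-8, 1.2, 500e6)

print(f"0.13 um: {p_013:.1f} W, shrunk: {p_shrunk:.1f} W")
```

Under this toy model the shrunk part dissipates less than half the power at the same clock, which is the headroom a vendor could spend on either lower heat or a higher clock.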



            • #21
I don't think the shipping GeForce FX will produce that much noise, and I expect the performance to be better once nVidia has released a new driver! nVidia won't sell that many GeForce FX (NV30) cards; their main sales will be from the NV31 & NV34.



              • #22
                nVidia moving to .09?

                C'mon, get real.

Intel, which has the best fabs in the world, hasn't moved to .09. If you care to compare the FX to the 9700, you can see that nVidia's (TSMC's) .13 process is nowhere near OK.

It took even AMD quite some time to refine the .13 process (T-Bred B).

Intel will move to .09 first, in Q3/Q4; AMD will move to it in Q1 04. After that (mid 04) come the graphics chip makers. You can also forget about Parhelia II on .09 before mid 04 (0.13 this year is theoretically possible, but .09, hell no).
                Last edited by UtwigMU; 6 February 2003, 18:58.



                • #23
                  Originally posted by Kooldino
                  -They wouldn't lower the clockspeed
                  -They waited THIS long to perfect .13, what makes you think they're moving to .09? And if they DID, they'd raise clock speed, not lower it.
There ISN'T a 0.09 to move to.

0.13 is all we'll get for another year/year and a half. MAYBE 2004 will show 0.09, but don't expect mass production too soon...



                  • #24
That GeFX fan noise is gonna be listened to by many hardcore gamers.

When you are king of the fill-rate you don't have to worry about the noise your fan makes. You put some impressive photos on the box, a couple of magazine adverts with a Quake 3 screenshot showing a 5-digit fps counter and some "friendly" reviews, and you'll have a lot of people complaining to ATI about why their cards are so silent. And probably some companies selling replacement fans for the R300 that look like the FX one.

Recently the "mighty" J. Carmack commented on the FX: he complained a bit about the noise, but he also said that he's installed it permanently in his main PC.
If Doom 3 starts up with a "GeForce FX, the way it's meant to be played" logo, nVidia'll have a killer product.
                    Last edited by drslump; 7 February 2003, 15:59.
epox 8RDA+ running an Athlon XP 1600+ @ 1.7GHz with 2x256MB Crucial PC2700, an Adaptec 1200A IDE-RAID with 2x WD 7200rpm 40GB striped + a 120GB and a 20GB Seagate, 2x 17" LG Flatron 775FT, a cordless Logitech Trackman Wheel and a banding-enhanced Matrox Parhelia 128 retail shining thru a Koolance PC601-Blue case window. And for God's sake pay my site (http://www.drslump.biz) a visit!



                    • #25
                      Originally posted by drslump
That GeFX fan noise is gonna be listened to by many hardcore gamers.

When you are king of the fill-rate you don't have to worry about the noise your fan makes. You put some impressive photos on the box, a couple of magazine adverts with a Quake 3 screenshot showing a 5-digit fps counter and some "friendly" reviews, and you'll have a lot of people complaining to ATI about why their cards are so silent. And probably some companies selling replacement fans for the R300 that look like the FX one.

Recently the "mighty" J. Carmack commented on the FX: he complained a bit about the noise, but he also said that he's installed it permanently in his main PC.
Sorry, no. If the playbacks are true, it's like having your mother hoovering next to you. You'll buy it and dump it, or buy my ear plugs.
                      Chief Lemon Buyer no more Linux sucks but not as much
                      Weather nut and sad git.

                      My Weather Page



                      • #26
I have a couple of friends who just use their PCs for gaming. They are not rich, so they bought nice CPUs and GFX cards but cheap DVDs, HDs and fans.
When we had the last LAN party at my home, their PCs' noise was unbelievable, but they were so happy to have more fps than me.
Have you ever listened to a cheap 16x DVD-ROM reading a game CD?
Or a completely fragmented cheap HD loading MOHAA?

I told them about the noise, but they answered that once they are playing they can't hear anything but the game sounds.

btw. that mp3 is very loud at boot time, but it relaxes a bit afterwards.
epox 8RDA+ running an Athlon XP 1600+ @ 1.7GHz with 2x256MB Crucial PC2700, an Adaptec 1200A IDE-RAID with 2x WD 7200rpm 40GB striped + a 120GB and a 20GB Seagate, 2x 17" LG Flatron 775FT, a cordless Logitech Trackman Wheel and a banding-enhanced Matrox Parhelia 128 retail shining thru a Koolance PC601-Blue case window. And for God's sake pay my site (http://www.drslump.biz) a visit!



                        • #27
just to clear things up a bit. i do understand that die shrink = more heat per surface area, but i didn't say that die shrink = cooler. probably i made some grammar confusion, but i said the following, quoting myself...

(i.e. faster at lower clockspeed = less heat / die shrink to 0.09(?)).

but of course die shrink = cheaper to mass produce a bunch of silicon (including fewer chips being damaged during manufacturing, i.e. each die is smaller, so when you make a dent on the wafer you damage fewer chips... i can't describe this properly today because i haven't had 2 nights of sleep while finishing up some project, so if anyone is kind enough to rephrase that it would be greatly appreciated)
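
The yield argument being gestured at here can be made concrete with the classic Poisson die-yield model Y = exp(-D·A): for a fixed defect density D per cm², a smaller die area A means both more candidate dies per wafer and a higher chance each one escapes a defect. All numbers below are hypothetical illustrations, not real TSMC data:

```python
import math

# Poisson die-yield model: fraction of defect-free dies for
# defect density D (defects/cm^2) and die area A (cm^2).
def die_yield(defect_density, die_area_cm2):
    return math.exp(-defect_density * die_area_cm2)

def good_dies(wafer_area_cm2, die_area_cm2, defect_density):
    candidates = wafer_area_cm2 // die_area_cm2  # crude count, ignores edge loss
    return int(candidates * die_yield(defect_density, die_area_cm2))

wafer = 706.0  # ~30 cm wafer, area in cm^2
d0 = 0.5       # made-up defect density, defects per cm^2

big = good_dies(wafer, 2.0, d0)    # large die, 2 cm^2
small = good_dies(wafer, 1.0, d0)  # shrunk die, 1 cm^2
print(big, small)  # shrinking the die more than doubles the good-die count
```

Halving the die area here more than triples the number of good dies, because the candidate count doubles and the per-die yield rises as well.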

and i don't really think the 4800 = 4200 in clock speed (although i think some 3rd-party company does that)... because according to their theoretical performance figures the 4800 HAS A HIGHER FILLRATE than the 4600! (and the 4800 will probably replace the 4600 as it has the AGP 8X marketing BS. It is probably also a counter for the Radeon 9100 too...)

                          edit: check it out on their web site:



                          • #28
                            Originally posted by drslump

Recently the "mighty" J. Carmack commented on the FX: he complained a bit about the noise, but he also said that he's installed it permanently in his main PC.
If Doom 3 starts up with a "GeForce FX, the way it's meant to be played" logo, nVidia'll have a killer product.
Carmack also stated that the GF FX and the R9700 Pro are about the same in Doom 3, both performance- and image-quality-wise (with the current drivers, that is).
                            But we named the *dog* Indiana...
                            My System
                            2nd System (not for Windows lovers )
                            German ATI-forum



                            • #29
                              Originally posted by Chrono_Wanderer
                              <SNIP>
and i don't really think the 4800 = 4200 in clock speed (although i think some 3rd-party company does that)... because according to their theoretical performance figures the 4800 HAS A HIGHER FILLRATE than the 4600! (and the 4800 will probably replace the 4600 as it has the AGP 8X marketing BS. It is probably also a counter for the Radeon 9100 too...)

                              edit: check it out on their web site:

                              http://www.geforce.com/view.asp?PAGE=geforce4ti

The 4800 is a faster-clocked 4200 with AGP 8x. It's not faster than the 4600, though. Some manufacturers may choose to name their products differently too (à la Gainward)...



                              • #30
The 4800SE is an AGP 8x-compliant card clocked at Ti4400 levels.
The 4800 is an AGP 8x card at Ti4600 speeds.
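
The naming dispute above is easy to settle with arithmetic: pixel fillrate is just core clock times the GeForce4 Ti's four pixel pipelines, so equal clocks mean equal fillrate. A quick sketch (clock figures are the commonly cited ones, from memory, so treat them as approximate):

```python
# Pixel fillrate = core clock (MHz) * pixel pipelines.
# The GeForce4 Ti line has 4 pixel pipelines; clocks below are
# the commonly cited retail figures, quoted from memory.
PIPES = 4

clocks_mhz = {
    "Ti4200":   250,
    "Ti4400":   275,
    "Ti4600":   300,
    "Ti4800SE": 275,  # Ti4400 clocks + AGP 8x
    "Ti4800":   300,  # Ti4600 clocks + AGP 8x
}

fillrate_mpix = {name: mhz * PIPES for name, mhz in clocks_mhz.items()}
for name, rate in fillrate_mpix.items():
    print(f"{name}: {rate} Mpixels/s")
```

On these numbers the Ti4800 matches the Ti4600 exactly and the Ti4800SE matches the Ti4400, i.e. the "-8x" rebrands add AGP 8x but no fillrate.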

