Sweet Jeebus! Unreal!

  • #16
    Quote: "I think NVIDIA is probably bending over backwards to support Apple and gain the OEM business from ATI."
    At $600.00 a shot??

    Joel
    Libertarian is still the way to go if we truly want a real change.

    www.lp.org

    ******************************

    System Specs: AMD XP2000+ @1.68GHz(12.5x133), ASUS A7V133-C, 512MB PC133, Matrox Parhelia 128MB, SB Live! 5.1.
    OS: Windows XP Pro.
    Monitor: Cornerstone c1025 @ 1280x960 @85Hz.



    • #17
      nah ... NVidia is just scared about the ProMax DH-Max and what might come after that.


      Despite my nickname causing confusion, I am not female ...

      ASRock Fatal1ty X79 Professional
      Intel Core i7-3930K@4.3GHz
      be quiet! Dark Rock Pro 2
      4x 8GB G.Skill TridentX PC3-19200U@CR1
      2x MSI N670GTX PE OC (SLI)
      OCZ Vertex 4 256GB
      4x2TB Seagate Barracuda Green 5900.3 (2x4TB RAID0)
      Super Flower Golden Green Modular 800W
      Nanoxia Deep Silence 1
      LG BH10LS38
      LG DM2752D 27" 3D



      • #18
        While I agree that $600 is a lot of money for the card, I have to admit that what we saw in the demos was extremely impressive (especially seeing the Doom 3 engine at work), and building a graphics chip with 57 million transistors must have been quite a challenge as well.

        This is the kind of card that's able to handle every single game that will be released in the next 2+ years without breaking much of a sweat.

        Now I have just one question... can the G800 compete with it?

        Not that I doubt Matrox has the ability to (at least I hope they do), but it seems to me that even though they've had 2+ years to develop something new, going from a G400 to something that can pose a serious challenge to the GF3 in 3D performance seems like asking a lot.

        note to self...

        Assumption is the mother of all f***ups....

        Primary system :
        P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum and SCSI all the way...



        • #19
          Quote: "This is the kind of card that's able to handle every single game that will be released in the next 2+ years without breaking much of a sweat."
          That's the dumbest statement I've ever heard. Do you have any clue what kind of games will be released in the next 2+ years?
          lol
          System 1:
          AMD 1.4 AYJHA-Y factory unlocked @ 1656 with Thermalright SK6 and 7k Delta fan
          Epox 8K7A
          2x256mb Micron pc-2100 DDR
          an AGP port all warmed up and ready to be stuffed full of Parhelia II+
          SBLIVE 5.1
          Maxtor 40g 7,200 @ ATA-100
          IBM 40GB 7,200 @ ATA-100
          Pinnacle DV Plus firewire
          3Com Hardware Modem
          Teac 20/10/40 burner
          Antec 350w power supply in a Colorcase 303usb Stainless

          New system: Under development



          • #20
            Not that I agree *any* card has a lifespan of longer than 2 years anymore...

            but, doesn't Doom 3 qualify as what we'll see in 2 years? (I remember hearing something about it being released end of 2002)



            • #21
              In what way is that statement dumb, durango?

              You see, I've been using a GF2 64MB for about 9 months now, and to this day there isn't a single game out there that forces it to work at resolutions under 1024*768 32-bit in order to keep an acceptable frame rate (60+ fps). So even if I bought the GF3, it wouldn't be because my existing card can't handle the resolutions I like to play at anymore.

              Even the most demanding games are at least 18 months behind the best hardware, mostly because developers want to see their games played on the biggest user base possible (which affects potential sales).


              Last time I checked, apart from all the DX8 features the GF3 has built in, it's easily twice as fast (or more, with HSR) as my card in real-world fps performance as far as effective fill rate is concerned.

              It could very well be the first card able to hit 100 fps in Q3 at 1600*1200 32-bit, so I don't really foresee any game in the next 2 years forcing that particular card down to 1024*768 32-bit or under (at 60 fps), and that includes Doom 3, which will probably be the most advanced engine available within the next two years.

              You see, you need 8 times less fill rate at 1024*768 32-bit compared to 1600*1200 32-bit (assuming the same target fps).

              I know there will be even faster video cards later on, but the point here is that the GF3 will be more than enough card for the next 2 years.

              So even if a developer starts building a new game TODAY, after seeing what the GF3 is capable of, with the intention of making it so demanding that it drives the card to its limits and only runs at the lowest resolutions (800*600 and below), the simple fact is that it usually takes 2+ years to build the game anyway.

              So I ask again, in exactly what way was my statement dumb?

              note to self...

              Assumption is the mother of all f***ups....

              Primary system :
              P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum and SCSI all the way...



              • #22
                8?
                Where did you get 8? I get less than 2-1/2
                Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                • #23
                  Yeah, I got 2.44; where'd you get 8?
                  Although the graphics do indeed look nice in the Quake 3 shots, I don't think it will be worth the money. What do you guys think about the bang/buck ratio?
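
                  The 2.44 the last two posters quote is just the ratio of pixels per frame at the two resolutions; a quick sketch (assuming fill-rate demand scales linearly with pixel count at a fixed frame rate):

                  ```python
                  # Pixels drawn per frame at each resolution.
                  high = 1600 * 1200   # 1,920,000 pixels
                  low = 1024 * 768     # 786,432 pixels

                  # Relative fill-rate requirement at the same target fps.
                  ratio = high / low
                  print(round(ratio, 2))  # → 2.44, not 8
                  ```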



                  • #24
                    We've crossed into new frontiers with graphics technology.

                    Any new top-of-the-line card from any manufacturer will cost an arm and a leg. I do not expect ATI or Matrox to go below $400USD for their best offerings.

                    nVidia can charge $600 because they are months ahead of anyone else releasing a product able to compete. Once ATI/Matrox have something to show, I bet GF3 cards will have come down about 40% in price. (I'll be optimistic and say June is the soonest we'll see a Radeon 2 or G800.)

                    nVidia is simply doing what any other technology company does: charging a premium for its best product (the P4, the Thunderbird when it was originally released, etc.).

                    One side note: the GF3's MSRP has been through rigorous market research. People *will* buy that card at that price.

                    [This message has been edited by isochar (edited 27 February 2001).]



                    • #25
                      Sure, some rich bastard will always buy the best of the best, but I doubt the majority will. That card will cost at least $1000 CAD locally for a single component in your computer. I could almost build 2 whole computers for the cost of that.
                      Now let's all hope a successor to the Radeon, the G400, or the Kyro is released soon to rival NVIDIA.



                      • #26
                        We are all hoping NVIDIA gets some real competition in the 3D performance area. That's what people see in the reviews.

                        But let's face it, they suck at almost everything else
                        (2D, TwinView, video, QUALITY).

