2D performance of G400 vs. GeForce?

  • 2D performance of G400 vs. GeForce?

    If I were to get a Creative Labs DDR GeForce card, would it be a major step down in 2D performance compared to my trusty old Millennium G200 or a G400?

    Does anyone have any experience using a G400 and a DDR GeForce on the same monitor (obviously not at the same time)?

    I'd hate to sacrifice 2D performance for 3D performance.

  • #2
    I think you know the answer to that one. The G-series is reviewed all over the web as the 2D king for a reason: because it is! If you want a GeForce card, wait until the NV15-NV20 parts are out.

    ------------------
    PIII-450@600, 128 HDSRAM, Asus P3BF, G400/32, SBLive!, brand stinkin' new Sony G400 19" (no DualHead), Nokia 447Xi 17", AOpen DVD-1040 10x slot, Plextor 8x4x32 ATAPI CD-RW, Promise Ultra66, and some fish.




    • #3
      Hello Dentaku,

      I have both a G400MAX and a Creative GeForce DDR, and I also have access to a G400 (vanilla) and a Creative TNT2 Ultra.

      I can honestly say there is very little difference in 2D quality between any of these cards.



      • #4
        My G400, overclocked to MAX speeds, scored about 30 points better than my GeForce DDR in Wintune 98's 2D bench. The GeForce was noticeably slower; I've since returned it because it wasn't worth the $300.


        ------------------
        ABIT BH6 rev 1.0, Pentium III SL35D 450MHz -> 600MHz, 192MB PC100 SDRAM @ 133MHz, Toshiba 6X DVD,
        Matrox Millennium G400 DH @ 160/200, Creative SBLive Value, 3Com Fast Etherlink XL PCI,
        Supermicro SC701A ATX, 300W TurboCool PC Power & Cooling PS, Panasonic Panasync S17
        Last edited by dneal; 20 May 2022, 09:08.



        • #5
          Forget the numbers for a while and compare the 2D image quality (crispness and vividness of colour) of both with a decent monitor at 1600x1200 32-bit.

          It's enough to wash the doubts away, if you own a decent pair of eyes, of course.

          ------------------
          INTEL CELERON 433, G200 SD 8MB AGP (BIOS 2.6), INTEL ATLANTA 440LX mobo (latest BIOS), SOUNDBLASTER PCI 128, 96MB SDRAM 66MHz, 4.3GB WESTERN DIGITAL EIDE CAVIAR HD, CREATIVE 24X CD-ROM, AOC 5VLR 15", EPSON STYLUS COLOR 200.
          ALWAYS (ahem, well, almost) the latest PowerDesk and DirectX.

          Remember folks:
          "Computers are useless. They can only give you answers"
          Pablo Picasso



          • #6
            Not only was it slower, there were obvious artifacts in video and 2D graphics at high res and supposedly 32-bit color. The G400 is far better for 2D, hands down. The G200 and G400 are basically equals here too.


            ------------------
            ABIT BH6 rev 1.0, Pentium III SL35D 450MHz -> 600MHz, 192MB PC100 SDRAM @ 133MHz, Toshiba 6X DVD,
            Matrox Millennium G400 DH @ 160/200, Creative SBLive Value, 3Com Fast Etherlink XL PCI,
            Supermicro SC701A ATX, 300W TurboCool PC Power & Cooling PS, Panasonic Panasync S17
            Last edited by dneal; 20 May 2022, 09:08.



            • #7
              You won't find much of a real world performance difference, and image quality is more a card issue than a chip issue: ELSA and Canopus cards have better filtering than Creative or Leadtek cards, for example. Not all GeForce cards are created equal.

              (By the same token - to my eye, anyway - the Gigabyte G400s don't look as sharp as the Matrox cards - which are invariably excellent.)

              One thing that all GeForce cards have that you won't get on any G400 is color controls for overlays - very important for some people...



              • #8
                Dentaku, I don't own a DDR GeForce, but I do have the SDR version in my machine at home, and I have a G400 DH in my machine at work. 95% of the time, I can see no difference in image quality. As for performance, take my word for it when I say 2D speed these days is so close that only the computer can tell them apart. This is a fact, not just babble.

                ------------------
                PIII 500@560 (2.05V) - ABit BH6 (NV) - 128MB Corsair CAS2 - 3D Blaster GeForce (135/183) - 12 MB Voodoo2 - SBLive! Value - ViewSonic PS775 - 8.4 + 4.3Gb Quantum Fireball CRs & so on.
                Look, I know you think the world of me, that's understandable, you're only human, but it's not nice to call somebody "Vain"!



                • #9
                  Hi,

                  I just got done doing a back-to-back-to-back comparison between my G400 SH 16, V3-3000, and a CL GeSpot. At resolutions above 1024, there is a marked difference; I used Visual C++ to compare the three cards. With the G400, the text is crisper, and there is virtually no screen wander. The V3 is better than the GeSpot but worse than the G400. The GeSpot shows more blurring and much more screen wander than either. Anyone who says you can't tell the difference must have some pretty poor eyesight.


                  Rags



                  • #10
                    Personally, I don't own a GeFarse card (I'll get one as soon as I get a hole in my forehead), but from what I've seen at my friend's house, picture quality is really sucky above 1280x1024... And as for speed, it really makes little difference these days, with processors on the rampage.
                    _____________________________
                    BOINC stats



                    • #11
                      The reason I was asking is that 2D quality REALLY matters to me (I can spot misconvergence in a monitor a km away).

                      I'd love to have a GeForce because of the 3D performance, but if it means sacrificing colour and text crispness I don't want to spend that much money.

                      I spend MUCH more time using Adobe Photoshop 5.5 and Illustrator 8 than I do playing games, so I would probably be better off spending half as much money on a plain G400 (to replace my quite-slow-in-3D G200) than spending WAY too much money (Canadian) on a great 3D accelerator with bad 2D.

                      Too bad they don't make motherboards with 2 AGP slots. I could keep my G200 and use the GeForce for D3D and OGL.



                      • #12
                        I've been messing with hardware again today and have had another close look at the picture quality at several different resolutions on an Iiyama 410Pro 17" monitor and a Mitsubishi 900u 19". Here are the cards I tested (best quality first):

                        1. Matrox Millennium G400Max
                        2. Creative TNT2 Ultra Gamer
                        3. Matrox Marvel G200
                        4. 3dfx Voodoo 3 2000
                        5. Creative GeForce 256 DDR
                        6. STB Velocity 4400 (TNT)

                        The only surprise was the quality of the TNT2 Ultra card - really noticeably better than the GeForce above 1024x768.

                        Chris.

                        [This message has been edited by RoGuE (edited 16 January 2000).]



                        • #13
                          And you people think I talk a load of propaganda!

                          I had a Voodoo 3 3000 a while back, as well as a G200; they've all been hooked up to the same Professional Series ViewSonic monitor with BNC cables, and run at the same resolutions and colour depths. I swear to you I cannot see a difference, and I have very good eyesight.

                          Dentaku, don't look here for an unbiased opinion; go and ask a computer shop for a demo. If you explain to them why you're asking, I'm sure they'll be understanding. Trust your own eyes, as you'll be the one using it.
                          Look, I know you think the world of me, that's understandable, you're only human, but it's not nice to call somebody "Vain"!



                          • #14
                            Real question here, what size/make monitor and what resolution are you testing at?
                            "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

                            "Always do good. It will gratify some and astonish the rest." ~Mark Twain



                            • #15
                              There is no propaganda about it. I just did a comparison between all of those cards and showed them to an nVidia-fan friend of mine, and he agreed that the difference in clarity is more than obvious. So, that leaves us with some possible causes of Agent not seeing the same thing as the rest of the hardware community:

                              1. We are all wrong (not likely).
                              2. Agent's eyes aren't as good as he claims.
                              3. Agent got the best-built GeSpot out there (from Creative... yeah, right).
                              4. He doesn't use a high enough resolution/colour depth for there to be any difference.
                              5. He just spent a large enough wad of cash on his GeSpot that he has no choice but to see what he wants to see.

                              No offense, Agent, I just find it hard to believe you cannot see the difference. When working at a high resolution/colour depth, running a text editor or CAD program, the difference really is marked. I know I did not just get a bad GeSpot, either; a friend of mine had his Asus GeSpot DDR here, and DAMN!!!! there was a difference. The V3 3000 looked better... NO KIDDING!

                              Rags
