GeForce3 and 2D quality...


  • #76
    For those of you stup-err... OBSTINATE enough to insist that this is not a problem (*cough*superfly*cough), please refer to the following link:

    http://www.visiontek.com/support/gra...troub:2d:ghost

    Some choice quotes:

    "Try another monitor"

    "It could be a dodgy card"

    "your card may be poorly designed and give a bad signal"

    WTF? Nice f***ing cards... NOT!

    - Gurm

    ------------------
    Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
    The Internet - where men are men, women are men, and teenage girls are FBI agents!

    I'm the least you could do
    If only life were as easy as you
    I'm the least you could do, oh yeah
    If only life were as easy as you
    I would still get screwed



    • #77
      I like the one about getting an amplifier. How many watts do you need, sunshine? Ah, the jitter is that bad? You'll need at least 500 watts for that one.

      [This message has been edited by The PIT (edited 24 May 2001).]
      Chief Lemon Buyer no more Linux sucks but not as much
      Weather nut and sad git.

      My Weather Page



      • #78
        Look... I'm only giving my opinion based on use of my CARD on my SYSTEM using my MONITOR, and I have absolutely no complaints whatsoever about the card's 2D output. And I'm not the only one of that opinion, since another MURCer (Ace) is also very pleased with the card's 2D on his system.

        I never said that everyone will get the same results as I or Ace are getting, especially considering the number of variables that can influence what the user sees on his/her screen.

        And yes, I've asked other GF3 users for their opinion on its 2D output, and for the most part they're all in agreement that it's pretty good overall. But again, that's only half a dozen users, not several thousand.

        Learn to make the distinction and can it with the smart*** remarks, Gurm...
        note to self...

        Assumption is the mother of all f***ups....

        Primary system:
        P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum, and SCSI all the way...



        • #79
          I just installed a Gainward GF3, and its 2D is perfect, much to my surprise. The memory is 3.8 ns, and it's the only GeForce3 card that has DVI, video in, and video out.
          A chubby price, but a great one...



          • #80
            Man. I like this bunch, and I'm really sorry that the rest of you can't seem to get a non-Matrox card that has decent 2D.

            That said, if you're ever in the Chicago area, feel free to come over and I'll let you compare the VisionTek's output against the G400Max's on my KDS 19" monitor. I understand it's not a top-of-the-line 52" monitor, but that's okay. It's still better than what most average users ever see (you know, those who buy Compaqs, HPs, etc., and those who only use the systems provided at the office). Combine that with the fact that this monitor is more than good enough for editing and outputting the digitized pics of my family (from my scanner or my digital camera), as well as for viewing and editing video (both more critical than what I spend the majority of my time doing anyway: gaming), and I'd say it's money well spent, and saved.

            Now, assuming that what many of you keep implying is true, that the Matrox has so much better 2D than ANY nVidia card, it would HAVE to be my monitor that makes them appear pretty well matched. If that's true, then I can't see buying a Matrox, even for the video quality, given what I do. Even though I'd save some money over the GF, I'd spend a great deal more to get a monitor that would show me, to my personal satisfaction, that my nifty-noodles Matrox is clearly superior in 2D to my GF3.

            Spend YOUR money however you see fit, and be happy with what you get for it, including the ability (or lack of it) to play relatively recent games at decent resolutions and color depths, and the ability to do video and photo editing to boot. For my money, and my requirements for my monitor's display quality, the nVidia is the better buy, since I CAN do all that work (with the same quality the Matrox gives me on THIS monitor) and still play those games I get so involved in, and really have an enjoyable time at it. I enjoyed the G400Max for a while, but I get MORE enjoyment from the GF3. The only thing the G400 still has over it is the dual-head output, and if nVidia's GF2MX is any indication, that will soon be a moot point with the release of GF3MX variants this fall.

            I'm really sorry for all of you who waited and waited forever to see Matrox release another card with great 2D and fairly decent 3D for the masses of us out there who aren't using them for business. This especially goes out to those of you who have constantly made snide comments about our decisions to buy the competition's products instead of fooling ourselves into waiting on vapor. While some of you have waited, I've enjoyed a GF2GTS for nearly a year and a half, and now I expect to do the same with a GF3, and this time I don't even have to give up all that image quality as I did for the last one.

            Keep waiting if you like, but please, give the rest of us a break with the comments. We all have to make our own decisions, and as far as I can see, all of us here would really like to see Matrox give us another card that competes as well on all fronts as the G400 series did when new. That doesn't mean we can't use something that is clearly better than the current big-M lineup in the meantime. And it surely doesn't mean my eyes are defective, nor that I spend all day worrying whether my screen scrolls just a tiny bit more smoothly in browser windows. Most of us just don't notice, nor need to notice, such things on our home PCs.

            End of rant.

            Sorry guys, I just think the bickering over what are basically our observations (now 3 strong, thanks ALBPM) is just plain silly, and starting to border on the immature. These forums have been one of the refuges of sanity that I come to in order to talk with others about things that interest me, especially things Matrox. That doesn't make me a traitor, stupid, or crippled because I don't choose one right now.

            ------------------
            System specs:
            AMD T-Bird 1.33 (o.c. to 1.5Ghz)
            Asus A7M266
            256MB Crucial PC2100 DDR RAM
            Promise Fastrak 100 EIDE Raid controller running
            2x IBM DTLA 60GB 7200 RPM UDMA 100 hdd, RAID 0 array
            "..so much for subtlety.."

            System specs:
            Gainward Ti4600
            AMD Athlon XP2100+ (o.c. to 1845MHz)



            • #81
              Ace, I fail to understand where you are coming from. I see no one saying anyone is stupid for the choice they are making. I agree with you that one should do what is best for them. You have decided that you will get (and do get) more enjoyment from your nVidia card than your Matrox card. Good for you. No one is saying you are stupid for it.

              The main reply is from Rags. Now, his opinion, like his hardware, is different. Clearly, to this point he sees no reason to keep his GF3. I have used a GF2 on my monitor, including playing games with it, and felt no compulsion to go buy one. That is just my position. I have friends who, after playing on the GF2 machine, went out and got one because it fulfilled their needs. I think you'll find that this forum has quite a few people who have equipment where the difference is clear.

              I haven't seen a GF3 in action, but I have seen video footage of the Xbox and GameCube in action. I expected the Xbox to look better than the GameCube; it didn't. Maybe it was just a bad demo, but I remember thinking the colours looked less bright than the GameCube's and reminiscent of the GF2 I had seen. I have no problem with using non-Matrox cards, if they satisfy MY needs.
              D3/\/7YCR4CK3R
              Ryzen: Asrock B450M Pro4, Ryzen 5 2600, 16GB G-Skill Ripjaws V Series DDR4 PC4-25600 RAM, 1TB Seagate SATA HD, 256GB myDigital PCIEx4 M.2 SSD, Samsung LI24T350FHNXZA 24" HDMI LED monitor, Klipsch Promedia 4.2 400, Win11
              Home: M1 Mac Mini 8GB 256GB
              Surgery: HP Stream 200-010 Mini Desktop,Intel Celeron 2957U Processor, 6 GB RAM, ADATA 128 GB SSD, Win 10 home ver 22H2
              Frontdesk: Beelink T4 8GB



              • #82
                My friend Dave called me up this morning.

                "Hey Jason, my new GF3 came in, but I have this problem... it's ghosting at 1024x768. My Voodoo3 never did that. What could it be?"

                So I told him to check the monitor (his Compaq special 17") and to check out VisionTek's web site for possibilities.

                HE is the one who gave me that link. I'm not saying your GF sucks. I'm saying that there seem to be definite quality issues with ALL nVidia-based cards.

                When one of nVidia's only/biggest GF3 card producers calls the cards "poorly designed" and "dodgy", there is something VERY WRONG. These people are trashing their OWN cards.

                If you got a good one, great! It's fast, you like the colors, it's nice and crisp - you lucked out and I'm glad you're happy.

                As for me, my nVidia experiences are mixed. I bought two TNT2 Ultras when they first came out. Identical cards. One was blurry, one wasn't. Same for the Riva128. Same for the TNT.

                Right now my machine at work has a TNT2M64 in it. Looks fine. My home machine is a Kyro2. My Matrox card is in limbo right now because it won't do what I want it to do any more. But I do miss the picture quality.

                Am I calling you stupid for getting a GeForce? No. I'm insisting that there are DEFINITE problems with the GeForce series.

                - Gurm

                ------------------
                Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
                The Internet - where men are men, women are men, and teenage girls are FBI agents!

                I'm the least you could do
                If only life were as easy as you
                I'm the least you could do, oh yeah
                If only life were as easy as you
                I would still get screwed



                • #83
                  I have no problem with that either; everyone picks hardware depending on what their needs are. But resorting to calling GF3 users blind (DURANGO) or stupid (GURM), because in at least some cases the GF3's 2D output nearly matches that of the G400 (which, if you don't include DualHead, was pretty much the last feature Matrox users could claim over everyone else's cards), is childish at best.

                  note to self...

                  Assumption is the mother of all f***ups....

                  Primary system:
                  P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum, and SCSI all the way...



                  • #84
                    Okay... for what it's worth, here's my report. I just picked up a VisionTek GeForce3 and am using it with my Sony GDM-F400. The 2D quality is greatly improved over the Asus 7700 GeForce2 it replaced: much better color, reasonably sharp, etc. No detectable artifacts. 3D also just looks better with the same settings I was using in games with the 7700. Of course, with everything cranked up now, Tribes 2 looks much better and isn't a slide show.

                    I sold my 7700 to a buddy at work for $125 (USD) and picked up the GeForce3 for $350, so for a net $225 I got a card I'm really happy with. Funny how I would never sell my Matrox cards (they're in various systems around the place), but have no problem "trading in" my nVidia cards.
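
                    For the bean counters, the upgrade arithmetic spelled out as a quick sketch (the prices are the ones quoted above; nothing is assumed beyond them):

                        # Net cost of the upgrade: new-card price minus resale of the old card.
                        # Prices are the ones from this post, in USD.
                        geforce3_price = 350  # VisionTek GeForce3
                        resale_gf2 = 125      # Asus 7700 GeForce2, sold to a coworker
                        print(f"Net upgrade cost: ${geforce3_price - resale_gf2}")  # -> $225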

                    John



                    • #85
                      Oh dear, looks like Gurm's hit a tender spot here. By the way, I was making a joke about the opticians, and I did find the comments from the link he provided rather amusing.
                      The G400's quality is outstanding, and other cards still haven't quite caught up. Many reviewers still use the G400 to check monitors, because it's recognised as having the best display.
                      People here have posted snapshots from various cards for comparison. I couldn't tell the difference, and neither could a colleague; a kid who walked in and had a look chose the G400 every time, though.
                      At the end of the day it comes down to personal preference, eyesight, fps, and how good your monitor is.
                      Chief Lemon Buyer no more Linux sucks but not as much
                      Weather nut and sad git.

                      My Weather Page



                      • #86
                        Before everyone thinks I've gone TOO soft, I _was_ saying that it was stupid to insist that there are no problems with GeForce3s.

                        nVidia seems to have very poor quality control. However, I will admit that when they are good, the GeForce3s seem to be substantially better than the GF2s and earlier cards.

                        The problem here is the "when they are good" issue. What is it, intrinsically, about nVidia chips that makes them so hard to design around? The TNT2Ultra cards I got weren't no-name. They were Guillemot Xentors. Top of the line. Hideously expensive at the time. Yet one of them was unreadably blurry.

                        Same thing with the Riva128. I wasn't buying some no-name. I was buying Hercules. And the same goes for the TNT. In that case it was two identical Diamond cards. And while Diamond's driver support has always been piss-poor, their hardware has always been good.

                        So what is it about nVidia chips that makes the 2D output a crapshoot? I think that is the underlying question. And to read VisionTek's comments, it's that the nVidia reference board design (which virtually every manufacturer starts from) is astonishingly poor, or at the very least difficult to work with.

                        It's not as if other manufacturers haven't had their share of problems. Voodoo3 3000s and 3500s had quality control problems; I got a blurry one of those too. Radeons (the first batch) seemed to have ISSUES with some Trinitron monitors.

                        But with nVidia, it's pretty much guaranteed that if you take their latest and greatest card (of ANY generation) and buy 100 of them, a LARGE number will exhibit 2D quality issues. And those that don't have historically not stacked up to the offerings from other manufacturers. That last bit has, at last, changed. But it doesn't eliminate the high percentage of "dodgy" cards.
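
                        To put the "crapshoot" into rough numbers, here's a back-of-the-envelope sketch. The defect rate is an ASSUMED figure for illustration only (nobody here has measured the real one); the last line ties into the return/exchange point below:

                            # "Crapshoot" math with an assumed, illustrative defect rate.
                            defect_rate = 0.15   # ASSUMPTION: fraction of cards with dodgy 2D
                            batch = 100          # the hypothetical "buy 100 of them" above

                            print(f"Expected dodgy cards per {batch}: {defect_rate * batch:.0f}")
                            print(f"Odds a single card is good: {1 - defect_rate:.0%}")
                            # With a decent return/exchange policy, a second try nearly
                            # guarantees a good card:
                            print(f"Odds of a good card within one exchange: {1 - defect_rate**2:.1%}")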

                        So all I was saying with my post was that people have to be stupid to NOT know that nVidia cards are a crapshoot.

                        Now, if you buy from a place with a good return/exchange policy, then you have no worries.

                        - Gurm

                        ------------------
                        Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
                        The Internet - where men are men, women are men, and teenage girls are FBI agents!

                        I'm the least you could do
                        If only life were as easy as you
                        I'm the least you could do, oh yeah
                        If only life were as easy as you
                        I would still get screwed



                        • #87
                          What if Matrox manages to somehow improve upon its current superior image quality? Then all this "my GF3's image quality is almost as good as the G400's" won't mean anything anymore, and the only thing nVidia will have going for it again is fps in FPS.

                          slyfox



                          • #88
                            Yeah, it's quite likely Matrox will be able to improve image quality over the G400. They did it with the G200-to-G400 transition, and with every other new product before that.

                            So I expect Matrox to stay ahead of the rest in 2D image quality for the next while.

                            ------------------
                            P3V4X | P3 650@897 | Matrox G400@171/233 | Viewsonic 17" E771 | 256 mb ram | Maxtor DiamondMax+ 40 | SB Live Value | Altec Lansing ACS45.2 | Win2k
                            Primary system specs:
                            Asus A7V266-E | AthlonXP 1700+ | Alpha Pal8045T | Radeon 8500 | 256mb Crucial DDR | Maxtor D740X 40gb | Ricoh 8/8/32 | Toshiba 16X DVD | 3Com 905C TX NIC | Hercules Fortissimo II | Antec SX635 | Win2k Pro



                            • #89
                              That bit about Matrox improving with each generation is not really true. Maybe if you start with the G200, but only if that's as far back as you go.

                              I was lucky enough to have a Matrox Millennium 1 in my first PC (I looked for THAT card specifically as my #1 choice), and the output from that card was the best I've seen yet in 2D. It also had some nice features that are no longer available to us, like the extended virtual desktop (on one monitor). Even my G400 doesn't seem to quite match THAT card, at least in the resolutions both have in common. (Granted, the G400 has a much higher resolution/refresh limit than the old Millennium had, but that's it.) In actuality, I look at the situation as if Matrox took a couple of steps back and is now working to get back to their old levels.

                              And in any case, even if Matrox were to announce such a card today, I'd still have at least six months of time with the GF3 before I could even think of getting the new Matrox card, if my past experiences are any indication.

                              That said, the above really belongs in a new post in the Crystal Ball forum, since the topic here is mainly the 2D output of the new nVidia GF3 cards and how it stacks up against the current competition (of which the G4xx is still top of the heap).

                              ------------------
                              System specs:
                              AMD T-Bird 1.33 (o.c. to 1.5Ghz)
                              Asus A7M266
                              256MB Crucial PC2100 DDR RAM
                              Promise Fastrak 100 EIDE Raid controller running
                              2x IBM DTLA 60GB 7200 RPM UDMA 100 hdd, RAID 0 array
                              "..so much for subtlety.."

                              System specs:
                              Gainward Ti4600
                              AMD Athlon XP2100+ (o.c. to 1845MHz)



                              • #90
                                I appreciate the clarification in your latest post, Gurm. Some of the previous posts (not just yours) struck a nerve, because they seemed to imply that if we think our cards are giving us nearly identical 2D results to our older G4xxs, we're either stupid, blind, or just silly. No offense meant on my part, and none taken. I just hate to see what started out as a useful discussion of the output of the latest nVidia cards vs. our old standby G4xxs turn into some sort of flamewar.

                                FWIW, Matrox probably has the best overall quality control of any video card manufacturer, and because of that you tend to find it's hard to get a G400 that doesn't live up to the same output quality standards. That is unlike the current state of nVidia cards, but based just on the responses seen in this topic, it looks like things ARE getting better in two different ways:

                                1) GF3 cards CAN output the same or nearly the same level of 2D as our G400s.
                                2) You may still be taking your chances on getting a good one when you buy, but at least NOW there is a chance.

                                On a side note, I ran 3DMark2K last night in demo mode, and it's safe to say that now that the image quality is so much improved, and now that nVidia cards actually support EMBM (with no noticeable slowdowns, btw), the demo looks better than ever! (The reflecting pool is still a pretty cool scene.)
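
                                For anyone curious what EMBM (environment-mapped bump mapping) actually does: each pixel's bump-map value nudges where the environment map gets sampled, which is what makes that reflecting pool ripple. Here's a toy sketch of the DX6-style lookup; it's my own illustration with made-up matrix values and stand-in textures, not anything from 3DMark or the drivers:

                                    import math

                                    def embm_sample(env_map, bump_map, u, v, m=(0.02, 0.0, 0.0, 0.02)):
                                        """Environment-map texel at (u, v), perturbed EMBM-style."""
                                        du, dv = bump_map(u, v)             # signed perturbation from bump map
                                        # 2x2 matrix (m00, m01, m10, m11) scales/rotates the perturbation:
                                        u2 = u + m[0] * du + m[1] * dv
                                        v2 = v + m[2] * du + m[3] * dv
                                        return env_map(u2 % 1.0, v2 % 1.0)  # wrap texture addressing

                                    # Stand-in "textures" (plain functions instead of real bitmaps):
                                    env = lambda u, v: int(255 * u)                              # gradient "sky"
                                    ripples = lambda u, v: (math.sin(40 * u), math.cos(40 * v))  # water ripples

                                    print(embm_sample(env, ripples, 0.5, 0.5))  # perturbed sample (132, not 127)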

                                ------------------
                                System specs:
                                AMD T-Bird 1.33 (o.c. to 1.5Ghz)
                                Asus A7M266
                                256MB Crucial PC2100 DDR RAM
                                Promise Fastrak 100 EIDE Raid controller running
                                2x IBM DTLA 60GB 7200 RPM UDMA 100 hdd, RAID 0 array
                                "..so much for subtlety.."

                                System specs:
                                Gainward Ti4600
                                AMD Athlon XP2100+ (o.c. to 1845MHz)

