Visual Quality test

#16
Dear isochar,
What on earth are you using that MONSTER system for?
I am obviously jealous...

~~DukeP~~



#17
Microsoft Word and browsing the internet.



#18
Originally posted by isochar
Well, after a few hours of using the Leadtek A250 TD Ultra, it's time to send the Parhelia back...

Actually, you probably need a new eyeglass prescription instead.

Joel
Libertarian is still the way to go if we truly want a real change.

www.lp.org

******************************

System Specs: AMD XP2000+ @ 1.68GHz (12.5x133), ASUS A7V133-C, 512MB PC133, Matrox Parhelia 128MB, SB Live! 5.1.
OS: Windows XP Pro.
Monitor: Cornerstone c1025 @ 1280x960 @ 85Hz.



#19
I can post the .tiffs of my desktop running at 1280x960 @ 32-bit for both cards. I had my friends try to tell me which looked better and they could not. (One has 20/20 vision.) IMHO, the text on the Leadtek is *very* close to what the Parhelia was putting out with its anti-aliasing on. Perhaps there is a larger difference at higher resolutions, but by the time I have a 21" monitor I'll be looking to replace this card.

The FAA (fragment anti-aliasing) is definitely better than anything the Leadtek can provide. The problem is that the Leadtek can put out 100+ fps in every FPS I play, while the Parhelia cannot - even with details, resolution, and color depth turned down. I'll trade the slight image quality difference in exchange for being able to play my FPSes. To top it off, I save $100.

PS. I just went to an optometrist last week.

*edit* One more thing. I realize that most of you guys see anything above 30fps as pointless. (Which is why I bought the Leadtek: to see if I could notice a difference speed-wise between the two.) My choice speaks for my preferences.
Last edited by isochar; 26 July 2002, 05:49.



#20
Originally posted by isochar
I can post the .tiffs of my desktop running at 1280x960 @ 32-bit for both cards.

Doing a frame capture from the buffer won't reveal differences in the analog output of a video card: a screenshot is read back from the framebuffer before the RAMDAC and output filters ever touch the signal, so nothing that happens on the way to the monitor shows up in the capture.
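To see why, here's a rough Python sketch (using the Pillow imaging library; the filenames are hypothetical, not files anyone in this thread actually posted). Diffing the two captures only compares framebuffer contents, which sit upstream of the DAC and filters:

[code]
from PIL import Image, ImageChops

# Hypothetical capture files from the two cards at identical settings.
a = Image.open("parhelia.tif").convert("RGB")
b = Image.open("leadtek.tif").convert("RGB")

# Pixel-by-pixel difference of the two framebuffer captures.
diff = ImageChops.difference(a, b)

if diff.getbbox() is None:
    # All-zero difference image: the captures are bit-identical.
    print("Identical framebuffer contents - nothing here measures the analog stage.")
else:
    print("Captures differ in region:", diff.getbbox())
[/code]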

Originally posted by isochar
Perhaps there is a larger difference at higher resolutions, but by the time I have a 21" monitor I'll be looking to replace this card.

Generally speaking, resolutions of 1600x1200 and up are the trouble spots for modern ATI and NVIDIA cards, and those are exactly the resolutions a 21" or larger monitor buys you. It makes for a rather irritating situation: you have to back down from the very resolutions you paid all that money for.
Last edited by Ryu Connor; 26 July 2002, 07:14.
Ryu's PCs: http://www.unspacy.com/ryu/systems.htm



#21
True about the resolutions, Ryu, but as he pointed out, he doesn't have a 21" monitor, so he doesn't use those resolutions anyway. He's more concerned - as many of us purchasing for home systems are - with overall performance in games, web browsing, word processing, and budget balancing, with just a smidgen of image/video editing, all on 19" or smaller monitors (usually) and at resolutions of most likely 1280x and below. At those resolutions, with the average person's equipment, many of the cards out now do a fine job, and as he mentioned with his Leadtek, some are for all intents and purposes indistinguishable from Matrox's IQ.

I'd never argue that Matrox doesn't have the best IQ available - just that for most common users, and many gamers, the difference isn't visible, or is so slight as to be outweighed by other considerations.

(I'd still personally like to be able to test a Parhelia in my rig, to see how it compares against my Gainward Ti4600 overall, especially in my games.)
"..so much for subtlety.."

System specs:
Gainward Ti4600
AMD Athlon XP2100+ (o.c. to 1845MHz)



#22
Originally posted by Snake-Eyes
True about the resolutions, Ryu, but as he pointed out, he doesn't have a 21" monitor, so he doesn't use those resolutions anyway.

I perceived his statement as a question, and I was simply providing an answer to it. I understand where he's at and where he's coming from.

Originally posted by Snake-Eyes
He's more concerned - as many of us purchasing for home systems are - with overall performance in games, web browsing, word processing, and budget balancing, with just a smidgen of image/video editing, all on 19" or smaller monitors (usually) and at resolutions of most likely 1280x and below.

At this point I'd almost have to say that the downsides of modern LCDs are worth facing in order to get a DVI connection. If gaming and gaming performance were my first priorities, I would have put the money I spent on two 21" monitors into a high-end LCD that doesn't ghost and scales well. That way I'd just hook it into the DVI connection of whichever 3D card is faster and leave the whole concern about analog image quality behind.

Of course, you don't have to spend $1500 to get a good LCD. You should be able to spend considerably less and find a 15-17" unit that would serve your needs well.
Ryu's PCs: http://www.unspacy.com/ryu/systems.htm



#23
I can honestly say that I will never consider an LCD until they substantially improve, or are replaced by plasma screens. Having the Samsung 191FP next to my Sony G400 was like night and day: whites were whiter, blacks were blacker, colors were more colorful. To top it off, at 100Hz my Sony doesn't hurt my eyes at all when reading text, so passing on the 3x-more-expensive screen was a no-brainer.



#24
Originally posted by Snake-Eyes
True about the resolutions, Ryu, but as he pointed out, he doesn't have a 21" monitor, so he doesn't use those resolutions anyway. He's more concerned - as many of us purchasing for home systems are - with overall performance in games, web browsing, word processing, and budget balancing, with just a smidgen of image/video editing, all on 19" or smaller monitors (usually) and at resolutions of most likely 1280x and below. At those resolutions, with the average person's equipment, many of the cards out now do a fine job, and as he mentioned with his Leadtek, some are for all intents and purposes indistinguishable from Matrox's IQ.

I'd never argue that Matrox doesn't have the best IQ available - just that for most common users, and many gamers, the difference isn't visible, or is so slight as to be outweighed by other considerations.

(I'd still personally like to be able to test a Parhelia in my rig, to see how it compares against my Gainward Ti4600 overall, especially in my games.)

Right on, Snake. (Although I do a little more than a smidgen of image/video editing.) For me, the "difference [in image quality] was so slight as to be outweighed by other considerations". If the Parhelia were in the performance range of a Ti4600 across the board, I would have stuck with the Matrox card. Oh well. :/



#25
Originally posted by isochar
I can post the .tiffs of my desktop running at 1280x960 @ 32-bit for both cards. I had my friends try to tell me which looked better and they could not. (One has 20/20 vision.) IMHO, the text on the Leadtek is *very* close to what the Parhelia was putting out with its anti-aliasing on. Perhaps there is a larger difference at higher resolutions, but by the time I have a 21" monitor I'll be looking to replace this card.

The difference in video signal clarity between them may be small, but the color contrast, vividness, and some other visual qualities of Matrox's and nVidia's output are not exactly the same. At least, my eyes can distinguish their output on both CRT and LCD monitors. I tested with the following equipment:

G400SH/32
G400DH/32
VisionTek GF2 GTS (video signal filters removed)
Abit GF420SD (GF4 MX 420 SDRAM)
CRT: Sony GDM-400PS (at 1280x960x32bpp)
LCD: 17.4" Fujitsu panel (at 1280x1024x32bpp)

The anti-aliasing algorithm Matrox's driver uses for edge smoothing of TTF (TrueType) and OTF (OpenType) fonts is somewhat different from nVidia's. Unfortunately, in my opinion only pure black text benefits from nVidia's algorithm; with Matrox's algorithm, text in any color other than pure black also benefits and renders quite well.

nVidia's smoothing does not look very good on an LCD monitor because the edges of the glyphs are not smooth. Displayed on a CRT it looks fine: the scanning electron beam cannot render two adjacent pixels with an abrupt color gap at their boundary, so the edge still comes out somewhat smoothed.
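To illustrate the distinction, here is a toy Python sketch of my own (not either vendor's actual driver code): per-channel coverage blending gives correct edges for a glyph of any color, while a shortcut that only darkens the background toward black - adequate for pure black text - produces the wrong edge colors for everything else.

[code]
# Toy model of coverage-based font edge smoothing. 'coverage' is the
# fraction of the pixel the glyph covers (0.0 to 1.0).

def blend(glyph, background, coverage):
    """Per-channel blend: correct for a glyph of any color."""
    return tuple(round(g * coverage + b * (1 - coverage))
                 for g, b in zip(glyph, background))

def darken_only(background, coverage):
    """Shortcut that just pulls the background toward black.
    It matches blend() only when the glyph is pure black."""
    return tuple(round(b * (1 - coverage)) for b in background)

WHITE = (255, 255, 255)
print(blend((0, 0, 0), WHITE, 0.5))    # (128, 128, 128): fine for black text
print(darken_only(WHITE, 0.5))         # (128, 128, 128): same result
print(blend((255, 0, 0), WHITE, 0.5))  # (255, 128, 128): correct pinkish edge
print(darken_only(WHITE, 0.5))         # (128, 128, 128): gray fringe on red text
[/code]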

If someone can provide around 1.8MB of space, I can put up two lossless screenshots generated by Matrox's and nVidia's Win2k drivers for comparison. (I cannot post them here, because each is around 8x0KB as a lossless JPEG file.)
Last edited by WayneHu; 27 July 2002, 15:30.
P4-2.8C, IC7-G, G550



#26
Originally posted by WayneHu
The color contrast, vividness, and some other visual qualities of Matrox's and nVidia's output are not exactly the same.

The issue is the analog output. This is a much more subjective assessment, since I know which card is installed and I am only viewing them one at a time (one monitor, one computer). However, I have been using a G400 MAX ever since they were released, as well as the Parhelia for a short time, and switching to the Leadtek card I didn't notice any difference in the accuracy of the image in front of me (blurriness, color, saturation, etc.).

I am just as impressed as FiringSquad was by the high-end Leadtek, and would wholeheartedly recommend that any of you first-person-shooter gamers do a head-to-head comparison as I did. (For $100 less, it may be worth your time.)

One thing I can definitely tell you you'll miss from the Parhelia is the FAA.



#27
Originally posted by WayneHu
The anti-aliasing algorithm Matrox's driver uses for edge smoothing of TTF (TrueType) and OTF (OpenType) fonts is somewhat different from nVidia's. Unfortunately, in my opinion only pure black text benefits from nVidia's algorithm; with Matrox's algorithm, text in any color other than pure black also benefits and renders quite well.

I can't bring myself to do screen captures under various cards, but Matrox's drivers antialias text and graphics in CorelDRAW (8, 9, 10) while others don't, or do it poorly.

Type a line of text in CorelDRAW and zoom out until it's about 4 pixels tall. On my G400 it's still legible, while on a GF2MX, a Rage128, and Intel 810 integrated graphics it was not. I don't know about other apps.

It looks the same in IE and on the Windows desktop.
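If you don't have CorelDRAW handy, here's a rough way to approximate the test in Python with the Pillow library (my own sketch, under the assumption that LANCZOS resampling stands in for antialiased drawing and NEAREST for drawing with no smoothing at all):

[code]
from PIL import Image, ImageDraw

# Render a line of text with Pillow's built-in bitmap font...
big = Image.new("L", (140, 12), 255)
ImageDraw.Draw(big).text((0, 0), "The quick brown fox", fill=0)

# ...then shrink it until it's about 4 pixels tall, two ways.
h = 4
w = big.width * h // big.height
smooth = big.resize((w, h), Image.LANCZOS)  # gray levels preserve the word shapes
jagged = big.resize((w, h), Image.NEAREST)  # rows of pixels drop out; shapes break up

smooth.save("smooth.png")
jagged.save("jagged.png")
[/code]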

