Help me find that Matrox interview !


  • #16
    To Dr. Mordrid:

    You are right, and PCs have had 64-bit color capability since Intel implemented MMX. The problem is the overhead of emulating colors higher than 8-bit when you can't display them on a monitor. I was hoping that with the new Matrox card it would be a no-brainer to display 10-bit color on the monitor, without the overhead associated with emulating higher than 8-bit color.
    AE and others utilize the MMX 64-bit color instructions, but have to resort to additional algorithms (read: overhead) to get something visual on the screen. When video cards can actually display higher than 8-bit color on a CRT, it will reduce overhead and make for better editing. This could especially be true for the direct-to-DVD and direct-to-broadcast markets. It would be nice to edit 10-bit source (Digi Beta) in 10-bit color space.
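    Roughly the kind of thing I mean, as a minimal C sketch (hypothetical code, not from any shipping app, with plain C standing in for the actual MMX intrinsics): the intermediate math runs at 16 bits per channel, and the extra step at the end is what squeezes the result down to the 8 bits per channel an ordinary display path accepts.

    code:
    --------------------------------------------------------------------------------
    #include <stdint.h>

    /* Sketch only: a 64-bit pixel, four 16-bit channels, the working
       format for higher-than-8-bit intermediate math. */
    typedef struct { uint16_t r, g, b, a; } Pixel64;

    /* Pixel math at full 16-bit precision; no rounding loss yet. */
    static Pixel64 blend64(Pixel64 x, Pixel64 y)
    {
        Pixel64 out;
        out.r = (uint16_t)(((uint32_t)x.r + (uint32_t)y.r) / 2);
        out.g = (uint16_t)(((uint32_t)x.g + (uint32_t)y.g) / 2);
        out.b = (uint16_t)(((uint32_t)x.b + (uint32_t)y.b) / 2);
        out.a = (uint16_t)(((uint32_t)x.a + (uint32_t)y.a) / 2);
        return out;
    }

    /* The overhead step: truncate to the 8 bits per channel that an
       8-bit display path can actually show. */
    static uint32_t to_display_argb32(Pixel64 p)
    {
        return ((uint32_t)(p.a >> 8) << 24) |
               ((uint32_t)(p.r >> 8) << 16) |
               ((uint32_t)(p.g >> 8) <<  8) |
                (uint32_t)(p.b >> 8);
    }
    --------------------------------------------------------------------------------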
    If Matrox comes out with a capture/editing card solution that actually exploits 10-bit color, I will buy it.

    From what I've read at the Matrox forums, your guess is as good as anybody's on how they implement alpha.


    As far as the others and their name-calling: the moment I dared to dis da Matrox, I sort of expected insults in return. Goes with the territory when one speaks their mind, and in truth, no hard feelings, nor do I take it personally. Jeez, you'd think I'd wandered into a Mac forum or something.



    • #17
      DW: There are no 2D applications that Matrox is aware of that actually use the alpha channel on the 2D desktop. Because of that, there should be no problem in 2D with cannibalizing 6 of the 8 unused alpha bits to store the extra color information. For 3D, as long as the app does not use destination alpha (almost no one does) or perform frame buffer locks (a well-behaved app shouldn't), there will be 'blind' support of GigaColor, requiring only that the end user enable it in the control panel.
      Also, GigaColor does not require that a game's artwork be formatted in 2:10:10:10 format. In the case of a 3D game, all of the final pixels in a scene are usually created by the 3D rendering hardware itself. Even if we read 8bpcc textures, we then filter them (introducing fractional colors), perform pixel math on them, fog them and blend them. Because all stages of the 3D engine have greater than 10 bits of precision, the Parhelia-512 will create a final render into the frame buffer which fully utilizes the 2:10:10:10 format. If a game were to use destination alpha and the end user had forced GigaColor on for 3D, screen color artifacts would appear. Because there is the possibility that this will happen in some apps, we are adding the GigaColor ON/OFF switch in our control panel.
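
      For illustration, a minimal C sketch of that 2:10:10:10 layout (the exact field order is an assumption; the post above doesn't specify it):

      code:
      --------------------------------------------------------------------------------
      #include <stdint.h>

      /* Sketch: 2 alpha bits plus three 10-bit channels in one 32-bit
         word. Putting alpha in the top bits is an assumption. */
      static uint32_t pack_2_10_10_10(unsigned a2, unsigned r10,
                                      unsigned g10, unsigned b10)
      {
          return ((uint32_t)(a2  & 0x3u)   << 30) |
                 ((uint32_t)(r10 & 0x3FFu) << 20) |
                 ((uint32_t)(g10 & 0x3FFu) << 10) |
                  (uint32_t)(b10 & 0x3FFu);
      }

      /* Widen an 8-bit source channel (e.g. from an 8bpcc texture) to
         10 bits by replicating the top bits, so full white stays full white. */
      static unsigned widen_8_to_10(unsigned c8)
      {
          return (c8 << 2) | (c8 >> 6);
      }
      --------------------------------------------------------------------------------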


      As someone said to me: this is much ado about very little.

      Joel
      Libertarian is still the way to go if we truly want a real change.

      www.lp.org

      ******************************

      System Specs: AMD XP2000+ @1.68GHz(12.5x133), ASUS A7V133-C, 512MB PC133, Matrox Parhelia 128MB, SB Live! 5.1.
      OS: Windows XP Pro.
      Monitor: Cornerstone c1025 @ 1280x960 @85Hz.



      • #18
        Pff, people here say an ATI has a better picture than a Matrox. How can that be, when I've still got my Matrox G400? If ATI were better, I would have one. So that's my statement, hehe.
        Specs: P4 2.8@3.2GHz, Gigabyte xnp bla bla, 2x80GB HD RAID 0, Creative Audigy, Iiyama Vision Master 502 (21 inch), a Logitech MX700.
        Video is an ATI 9700 Pro modded to 9800 speeds, volt mod etc. etc.



        • #19
          You are right, and PCs have had 64-bit color capability since Intel implemented MMX
          It's gotta be said: What. The. Hell. Are. You. Talking. About? These things are completely unrelated. MMX just gave the CPU 64-bit integer SIMD registers (handy for doing pixel math four 16-bit channels at a time); it says nothing about what color depth the video card can display.
          Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



          • #20

            quote:
            --------------------------------------------------------------------------------
            Originally posted by Greebe
            How many colors do you see? ...
            --------------------------------------------------------------------------------

            Well, it is a JPEG, and they don't compress in GigaColor.

            Seriously though, even if it were in some wacky 10-bit format, would I be able to tell the difference on a normal graphics card?
            I mean, wouldn't my graphics card round the color precision down to 8-bit?
            Last edited by TdB; 20 May 2002, 09:09.
            This sig is a shameless attempt to make my post look bigger.



            • #21
              Can the eye really tell the difference between 10-bit color precision and the regular 8-bit that most video cards display?
              System Specs:
              Gigabyte 8INXP - Pentium 4 2.8@3.4 - 1GB Corsair 3200 XMS - Enermax 550W PSU - 2 80GB WDs 8MB cache in RAID 0 array - 36GB Seagate 15.3K SCSI boot drive - ATI AIW 9700 - M-Audio Revolution - 16x Pioneer DVD slot load - Lite-On 48x24x48x CD-RW - Logitech MX700 - Koolance PC2-601BW case - Cambridge MegaWorks 550s - Mitsubishi 2070SB 22" CRT

              Our Father, who 0wnz heaven, j00 r0ck!
              May all 0ur base someday be belong to you!
              Give us this day our warez, mp3z, and pr0n through a phat pipe.
              And cut us some slack when we act like n00b lamerz,
              just as we teach n00bz when they act lame on us.
              For j00 0wn r00t on all our b0x3s 4ever and ever, 4m3n.



              • #22
                Originally posted by BuddMan
                Can the eye really tell the difference between 10-bit color precision and the regular 8-bit that most video cards display?
                Well, in some situations, yes!

                If all the color information is in one color channel, then you only get 8-bit color information. For example, in a bluescale photo you only get 256 shades of blue while the other color bits are zero, which IS a visible limitation. (Quick sketch below.)
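
                Just to make that concrete, a little C sketch (illustration only): quantize a smooth blue ramp to 8 bits and you get at most 256 distinct values, no matter how wide the gradient is.

                code:
                --------------------------------------------------------------------------------
                #include <stdio.h>

                int main(void)
                {
                    enum { WIDTH = 2048 };   /* a gradient wider than 256 pixels */
                    int bands = 0, prev = -1;

                    for (int x = 0; x < WIDTH; x++) {
                        double ideal = (double)x / (WIDTH - 1);   /* 0.0 .. 1.0  */
                        int blue8 = (int)(ideal * 255.0 + 0.5);   /* 8-bit blue  */
                        if (blue8 != prev) { bands++; prev = blue8; }
                    }
                    /* Only 256 distinct bands appear, however wide the ramp. */
                    printf("%d bands across %d pixels\n", bands, WIDTH);
                    return 0;
                }
                --------------------------------------------------------------------------------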

                I just thought of something: what if, instead of a fixed number of bits per color, we could distribute all 32 bits optimally between the different color channels for each pixel, so that an all-blue pixel would have a (0:32:0:0) bit allocation?
                Some sort of adaptive color-bit distribution. Would that work? (Rough sketch below.)
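
                As a thought experiment only, no real hardware works this way and all names here are made up, an encoding like that might look something like this in C (simplified to three channels):

                code:
                --------------------------------------------------------------------------------
                #include <assert.h>
                #include <stdint.h>

                /* Hypothetical "adaptive" pixel: payload bits are split between
                   R, G and B by per-pixel counts. An all-blue pixel could use
                   nr = 0, ng = 0, nb = 32 for 2^32 shades of blue. */
                typedef struct {
                    uint8_t  nr, ng, nb;   /* bits per channel, nr+ng+nb <= 32 */
                    uint32_t payload;      /* R in the top bits, then G, then B */
                } AdaptivePixel;

                /* Assumes each value already fits in its allotted bits. */
                static AdaptivePixel encode(uint32_t r, uint32_t g, uint32_t b,
                                            uint8_t nr, uint8_t ng, uint8_t nb)
                {
                    AdaptivePixel p = { nr, ng, nb, 0 };
                    assert(nr + ng + nb <= 32);
                    /* 64-bit intermediate avoids an undefined 32-bit shift by 32. */
                    p.payload = (uint32_t)(((uint64_t)r << (ng + nb)) |
                                           ((uint64_t)g << nb) | (uint64_t)b);
                    return p;
                }
                --------------------------------------------------------------------------------

                The catch is that the per-pixel bit counts have to be stored somewhere too, so the pixel really costs more than 32 bits, which is presumably part of why fixed layouts like 2:10:10:10 are what ships.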
                Last edited by TdB; 20 May 2002, 09:34.
                This sig is a shameless attempt to make my post look bigger.



                • #23
                  SMART



                  • #24
                    Well, it is a JPEG, and they don't compress in GigaColor.
                    Sure they can. Even the standard libjpeg6b does 12-bit-per-channel JPEG compression.

                    As far as us seeing 10-bit: yes, we can, in some situations, such as the green part of the spectrum. Also, it would be nice to have more than 256 choices when an RGB display renders something in black & white. I hate that pixelly look you get when black meets "almost but not quite black".
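
                    To put numbers on that near-black banding (illustration only): between 8-bit black and the next representable gray, a 10-bit display has three more levels.

                    code:
                    --------------------------------------------------------------------------------
                    #include <stdio.h>

                    int main(void)
                    {
                        /* Normalized intensity of the darkest representable grays. */
                        printf("8-bit : level 0 = %.5f, level 1 = %.5f\n",
                               0 / 255.0, 1 / 255.0);
                        for (int i = 0; i <= 4; i++)
                            printf("10-bit: level %d = %.5f\n", i, i / 1023.0);
                        /* The 8-bit jump from 0 to 1/255 (~0.00392) spans four 10-bit
                           steps, so a dark gradient bands four times less often. */
                        return 0;
                    }
                    --------------------------------------------------------------------------------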
                    Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                    • #25
                      jojolimited said:

                      From what I've read at the Matrox forums, your guess is as good as anybody's on how they implement alpha.
                      It's not a guess. I've beta-tested most of those products.

                      Also, the logic behind the disconnect between compositing software and the hardware still holds: it would cause too many compatibility issues, which tech support departments would rather avoid.

                      Dr. Mordrid
                      ----------------------------
                      An elephant is a mouse built to government specifications.

                      I carry a gun because I can't throw a rock 1,250 fps



                      • #26
                        Joel -
                        I use Vegas Video every day, and it uses DirectX as its plug-in interface. If the underlying video/graphics card has hardware-accelerated support for DirectX transforms (including alpha channel processing), then Vegas will use it, simply because it's DirectX. NewTek's recent upgrade to Aura (2.5) has started using DirectX and DirectShow as its standard for plug-ins and video interfaces. Proprietary plug-in interfaces are becoming passé, and none too soon.

                        As far as 10-bit, I think someone else here put it well. You really can tell the difference between 256 shades of red, green, or blue and 1,024 shades. It makes for much better color transitions in film and video. I think the card supports 1,024 levels of gray (between black and white). This will make for much better-looking video, so I may go ahead and purchase the card for that reason alone.

                        What I'm hoping is that software developers will find a way to take advantage of the new 10-bit color space and work around the 2-bit alpha limitation. Maybe the folks over at Matrox Video will surprise all of us. We can only hope.



                        • #27
                          To Mordrid

                          At this point in time, you are correct about the disconnect. But we have to move ahead somehow, someday, and get out of the 8-bit color world. If Matrox Video figures out a way to take full advantage of the card in a new line of products, they will also make the whole disconnect argument moot and make a lot of sales in the process. No reason they couldn't create 10-bit plug-ins with a 10-bit codec. I have full faith that somehow they will.

                          If so, they will own the desktop video/film editing market. And I mean own it.

                          Considering that Media100 just released a set of cards with 10-bit color support, hosted in a Compaq computer, for 'only' $60,000.00 US, Matrox could move in and blow them away with an affordable (in comparison) DigiSuite supporting the same features.



                          • #28
                            "Somehow, someday....."

                            Sounds like a '60s song.

                            Until and unless some way is found to implement the on-card alpha channel without causing trouble for tech support departments, it may be theoretically possible and yet universally ignored by the software writers. For now, that's a big 'if'.

                            As for putting the Parhelia into editing devices, who knows what the future holds?

                            Dr. Mordrid
                            Last edited by Dr Mordrid; 20 May 2002, 12:00.
                            ----------------------------
                            An elephant is a mouse built to government specifications.

                            I carry a gun because I can't throw a rock 1,250 fps

