'Fusion' cards


  • Doesn't Fusion mean fusing high-speed eDRAM with the graphics core, like the Bitboys did with their XBA technology?

    ------------------
    MSI K7TPro2 Duron 750@900
    256Ram G400DH32mb
    Pioneer SCSI 16xDVD
    DFI NFIIUltra 400
    756Ram ATI 9550 256mem
    Lite-On DVDR/RW/DL
    Windows XP pro
    msn messenger id: gchisel
    Be aware that a halo has to fall only a few inches to be a noose



    • System 1:
      AMD 1.4 AYJHA-Y factory unlocked @ 1656 with Thermalright SK6 and 7k Delta fan
      Epox 8K7A
      2x256mb Micron pc-2100 DDR
      an AGP port all warmed up and ready to be stuffed full of Parhelia II+
      SBLIVE 5.1
      Maxtor 40g 7,200 @ ATA-100
      IBM 40GB 7,200 @ ATA-100
      Pinnacle DV Plus firewire
      3Com Hardware Modem
      Teac 20/10/40 burner
      Antec 350w power supply in a Colorcase 303usb Stainless

      New system: Under development



      • Doesn't Fusion mean fusing high-speed eDRAM with the graphics core, like the Bitboys did with their XBA technology?

        The answer to what Fusion is has been given somewhere in this whole thread... that leaves over 300 messages with the possible answer. I can tell you that your guess wasn't the right one.



        • This may just be a mistake, but in the (German) PC magazine I'm reading, they write the following in the preview of next month's issue:

          modern 3D accelerators... blah blah... what nVidia, Matrox & Co. can do, you will read about next month.

          As I said, maybe they just wrote that for no reason, but maybe they wrote it for a reason.


          [This message has been edited by Topha (edited 01 February 2001).]



          • We're gearing up for an exciting March, so be sure to read all about it in the next issue of Matrox Graphics In-the-Loop!

            Have a great day!

            This is part of the Matrox In-the-Loop newsletter. Sounds like there is a lot of activity at Matrox right now, since they are gearing up for an exciting March.

            ------------------
            - [GDI]Raptor



            • "I love the smell of Napalm in the air ..."
              The world just changed, Sep. 11, 2001



              • OK, well here it is! They take a G200 Marvel, a GF2 Ultra, a "sample" Bitboys card, a Radeon 64MB ViVo, and 256MB of RDRAM @ 800MHz with a 64-bit interface, walk over to a CANDU reactor, toss all that shit in there, stick a monitor adapter on the thing, and hope for the best!!!
                1.221GHz, baby!



                • O.K. you graphics engineers ... what if Matrox developed a card that processed the DirectX APIs directly? Wouldn't this minimize the data transfers to the card? I'm not that familiar with the DirectX architecture.
                  The world just changed, Sep. 11, 2001



                  • I think it must be a scooter of some sort...

                    M.
                    year2000:Athlon500/MSI6167/256M/10GIBM/6GSamsung/18GSCSI IBM/CL2xDVD/RR-G/HPPSPrinter/G400DH32M/DeltaDC995/MX300/ADSPyro1394/AHA2940UW/3comXL100



                    • No responses to my DX card question yet? Also, how far down in the graphics architecture would one have to go to minimize data transfers to/from the graphics card? Is the GDI layer sufficient? Don't make me go and research this myself!
                      The world just changed, Sep. 11, 2001



                      • I'm not a graphics engineer, but I'll try to answer the questions with my current, albeit limited, understanding...

                        DirectX is a horrid, horrid implementation of a graphics engine: it is obfuscated, messy, and annoying. All that said, it is a great idea, and it supports the latest rendering techniques and eye candy.

                        Making a card that handles DirectX natively would be pointless and bad for several reasons:

                        1) For OpenGL you would have to write a DirectX wrapper (or at least that would be the easiest route). Matrox tried this and got seriously laughed at for it. It could also piss SGI off, as the big M and SGI are now in an agreement where SGI sells G450s with its workstations. Bad thing to lose.

                        2) You would lose Linux support because of the reason above, which would seriously damage any company making an attempt at a serious card. Also, Matrox was one of the first major graphics card makers to support Linux; this is important to them and I doubt they are going to threaten it.

                        3) Microsoft rewrites the engine of DirectX with every release, which would mean a new release could make your brand-new graphics card useless.

                        4) A lot of the DirectX calls are mostly software hacks anyway, even on the GeForce 2. Doing this in hardware would increase the die size and the cost of the final product, especially across all of the different techniques that exist.

                        5) How do you implement stuff like EMBM without the engine/card drivers figuring it out? Wouldn't you need to increase the amount of RAM and processing capability on the card, because you not only have to hold all the scene data, you then have to process it for effects and so on, and then render it? Last I checked, a lot of this DirectX will figure out automagically: you tell it to put a light at point A, and DirectX will work out whether it splashes on point B. Having the card do this would probably slow it down (unless you can fit a Duron onto the die of the video card too)...

                        The overhead of translating DirectX information into the native commands the graphics card uses is very minimal, especially when you consider just how many instructions per second your processor is capable of (see the sketch below).
                        "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



                        • I understand that Windows-specific hardware would have limited appeal to Matrox and probably wouldn't have the market to justify the support required. But Matrox could develop a card with an on-board processor that takes over most of the graphics kernel/driver processing currently done on the CPU, and load OS-specific code onto that board. I wasn't aware that OpenGL doesn't use the DirectX APIs; that makes this approach more difficult to support. Another problem with a coprocessor approach is that the on-board processor becomes a bottleneck as the CPU gets faster and faster. At what point in the architecture could one minimize data transfers yet still capture the multiple graphics APIs? The point here is to eliminate, or at least minimize, AGP usage.
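
                          As a purely hypothetical sketch of the "minimize AGP usage" idea (none of these names are a real Matrox or Microsoft interface): the host side batches commands in system RAM and the on-board processor pulls the whole batch in one transfer per frame, instead of the CPU pushing every state change and primitive across the bus individually.

                            #include <cstdint>
                            #include <cstdio>
                            #include <vector>

                            // Hypothetical per-frame batch that lives in system RAM.
                            struct FrameBatch {
                                std::vector<uint32_t> stream;
                                void add(uint32_t opcode, uint32_t arg) {
                                    stream.push_back(opcode);
                                    stream.push_back(arg);
                                }
                            };

                            // Stand-in for handing the coprocessor one pointer + length and letting it
                            // chew through the batch: one bus hand-off instead of thousands of small
                            // host writes over AGP.
                            void submitToCoprocessor(const FrameBatch& batch) {
                                std::printf("DMA %zu words in a single transfer\n", batch.stream.size());
                            }

                            int main() {
                                FrameBatch frame;
                                for (uint32_t i = 0; i < 5000; ++i)
                                    frame.add(/*opcode=*/0x10, /*arg=*/i);   // many tiny commands...
                                submitToCoprocessor(frame);                  // ...one AGP/DMA hand-off
                                return 0;
                            }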
                          The world just changed, Sep. 11, 2001



                          • "I love the smell of Napalm in the air ..."
                            Oh No!! Xortam has his infamous BBQ again !!
                            Hide with your wife and children, but do NOT hide in the kitchen (we saw Tylau go there last), not in the basement (DJ Roberts...) nor in the bush (Gore )

                            Jord.

                            ------------------
                            This cat, is cat, a cat, very cat, good cat, way cat, to cat, keep cat, a cat, fool cat, like cat, you cat, busy cat! Now read the previous sentence without the word cat...
                            Jord™



                            • Nope Jorden ... you missed the meaning ... try again.
                              The world just changed, Sep. 11, 2001



                              • I'm not talking about a silicon solution ... I'm talking about a coprocessor solution. The software running on the graphics card could be updated as easily as one updates drivers. I'm not talking about introducing new APIs, but about supporting existing APIs and moving some of the processing out to the graphics card, where system RAM <-> graphics board RAM transfers can be greatly reduced. The graphics vendor would need to work closely with the OS vendors to ensure support at the chosen interface. It's likely that MSFT doesn't want to support anything above the current VxD interface; maybe somebody with some driver experience could comment.
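
                                To make the "update it like a driver" idea concrete, here is a rough, entirely hypothetical sketch (no real Matrox, Microsoft, or VxD interface is being described): the host driver stays a thin forwarding stub, while the code that actually interprets the APIs lives in a firmware image that the driver uploads to the board at start-up and can replace with each driver release.

                                  #include <cstdint>
                                  #include <cstdio>
                                  #include <vector>

                                  // Hypothetical firmware image shipped alongside the driver package.
                                  struct FirmwareImage {
                                      const char*          version;
                                      std::vector<uint8_t> code;
                                  };

                                  // Stand-in for loading the image into the coprocessor's local RAM at boot;
                                  // changing API support then means shipping a new file, not new silicon.
                                  void uploadFirmware(const FirmwareImage& fw) {
                                      std::printf("uploading coprocessor firmware %s (%zu bytes)\n",
                                                  fw.version, fw.code.size());
                                  }

                                  // Host-side stub: the only thing crossing the bus per call is a tiny
                                  // opcode + argument record; the heavy lifting happens on the board.
                                  void forwardCall(uint16_t opcode, uint32_t arg) {
                                      std::printf("forward op=0x%04x arg=%u\n", opcode, arg);
                                  }

                                  int main() {
                                      FirmwareImage fw{"7.1-beta", std::vector<uint8_t>(64 * 1024)};
                                      uploadFirmware(fw);         // done once per boot / driver update
                                      forwardCall(0x0021, 42);    // per-call bus traffic stays small
                                      return 0;
                                  }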
                                The world just changed, Sep. 11, 2001

