Speculation


  • Speculation

    / mode ignore my bad English ON

    I was thinking about the G800, dual chips, T&L, FSAA and so on.

    This is my idea of what is to come; it is based on some considerations that I'll explain later:

    a dual-chip card;
    the chips will be a slight enhancement over the G400 (more fill rate);
    at the top of the pipeline, a new chip for T&L;
    the dual chips will handle FSAA well, or there will be a chip at the end of the pipeline that does all the FSAA work;
    it will be the best solution on the market.


    Now, here is how I came to this:

    For me, modularity is the key. I mean, if Matrox works on a multiple-chip solution, it can bring out a new card about every 4 months simply by upgrading one chip each time, without having to change too much of the rest (and without putting too much effort into updating the drivers).
    Owners of the old version will simply wait a year to get a totally new card, a lot more powerful and a lot more stable than nVidia's and 3dfx's.
    I think that in the coming years it will be foolish to completely rewrite the drivers every time a new card comes out.

    As I see it, nVidia will continue to release a brand-new card every 8 months; will they come out with a new driver each time?
    People who bought the GeForce have only now, after the release of the GeForce 2, got a stable set of drivers.
    I don't think nVidia can endure this pace for long. A few years (months?) ago, it was the gaming market that pushed the hardware market. Now the hardware market is racing far ahead... without any consideration from the gaming market!
    To me, nVidia is like an SR-71 that took off after a long run... and lost (or forgot?) its engines on the runway... ;-)

    Let's imagine, then, the way it could work for Matrox:

    - A few (very few) people working on enhancing the G400 chip. With a 0.18-micron process it will already be much improved by itself; a few people could work on improving the structure, so as to get a truly doubled fill rate. And I think that's really possible, because the cost reduction since the G400's release should make it feasible to improve the chip's rendering pipeline.

    - A few other people working on the board layout. Again, not a huge job, so those people will be available for the next task...

    - Engineering a T&L chip. This will be a very, very big job. But remember that engineers cost money to keep around: since the release of the G400, do you think Matrox has been paying them to do nothing? Naaaaaa..... ;-)

    - If the final power of the two enhanced G400s is not so high, work on an FSAA chip. Not too difficult either, but again a considerable job.

    - Really few people on driver work. This is the main advantage. By using something that already exists as the core of the new board, it will be unnecessary to completely rewrite the drivers. There's only one case in which the drivers would have to be rewritten: in the mind of the programmer.
    Yep, because we programmers tend to think that everything is better when rewritten...


    This is how I reached my hypothesis. And, I forgot, with a little bit of foolishness...

    ^_^
    Byez, Drizzt!
    Sitting on a pile of the dead, I enjoy my oysters.

  • #2
    Let me say that I don't believe Matrox is working on a multichip solution (at least for upcoming cards). Of course, I won't be too bitter if Matrox proves me wrong.

    Matroxusers.com is reporting on driver sets for a "Fusion" line of G800 and G450 cards. I think this is NOT multi-chip, but instead the T&L unit, i.e. a fusion of the renderer and the T&L unit. Why make a G450 with two chips? Why not just buy a G800? Since the G450 is an updated G400, the G400 would need to be scalable. Furthermore, the G450 Fusion has not yet been announced. I think Matrox is waiting for DX8 before completing the T&L unit.

    As far as getting cards out faster simply by adding more chips... well, Matrox has never been accused of putting out too many cards too quickly.



    • #3
      I think the "Fusion" is not a multi-chip card, but a card with memory better than DDR SDRAM (what is it called, FCRAM?) and therefore a higher-clocked processor, some sort of Max.

      Any multichip card at the present time would be a disaster under Win2k.



      • #4
        Why would that be? Can you give me a good reason?



        • #5
          I personally hope that the G800 will be a tile-based architecture with, say, 4 triple-texture pipelines at 200 MHz.

          Tile-based rendering has some important advantages:
          - Enormously less memory bandwidth is required, speeding up 32-bit color modes clock for clock.

          - Because a pixel is only calculated once (when all polygons are opaque, which most are in a game), the effective fillrate will be much higher than with a normal z-buffer-based renderer, as most games have an overdraw of about 3.

          - Instead of rendering polygon by polygon, you render pixel by pixel. That way you don't need to access the framebuffer in the case of transparent polygons. That will decrease memory bandwidth and will probably eliminate the performance hit for transparent rendering.

          - Because of the pixel-by-pixel rendering, the framebuffer is only accessed once per pixel. So you can render internally at a higher bit depth, which gives better quality with multitexturing and transparency, with no performance hit since input and output to memory are still 32-bit.
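          The bandwidth claims above can be checked with a back-of-envelope sketch. The resolution, frame rate, overdraw factor, and per-pixel byte counts below are illustrative assumptions for year-2000 hardware, not Matrox figures:

```python
# Rough framebuffer-traffic comparison: a traditional z-buffer renderer
# vs. an idealized tile-based one. All numbers are illustrative
# assumptions, not real G800 specs.

WIDTH, HEIGHT = 1024, 768   # hypothetical screen resolution
FPS = 60                    # hypothetical target frame rate
OVERDRAW = 3                # average times each pixel is covered
BYTES_COLOR = 4             # 32-bit color
BYTES_Z = 4                 # 32-bit depth

pixels = WIDTH * HEIGHT

# Traditional renderer: every covered fragment does a Z read, a Z write
# and a color write to external memory, so traffic scales with overdraw.
trad_bytes = pixels * OVERDRAW * (2 * BYTES_Z + BYTES_COLOR)

# Idealized tile-based renderer: visibility is resolved on-chip per
# tile, so each screen pixel is written out roughly once and the
# Z-buffer never touches external memory.
tile_bytes = pixels * BYTES_COLOR

print(f"traditional: {trad_bytes * FPS / 1e9:.2f} GB/s")  # 1.70 GB/s
print(f"tile-based:  {tile_bytes * FPS / 1e9:.2f} GB/s")  # 0.19 GB/s
print(f"reduction:   {trad_bytes / tile_bytes:.1f}x")     # 9.0x
```

          Under these assumptions the tile-based design needs about a ninth of the framebuffer traffic, which is where the "enormously less bandwidth" claim comes from.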

          Secondly, I want Matrox to be able to do per-pixel shading as on the GeForce 2, as this will be a very cool quality-enhancing feature for upcoming games.

          And third, of course, I want a very fast T&L engine.

          So Matrox, if this isn't what the G800 will be, then go back to the drawing board.



          • #6

            Check these rumours out:
            http://www.vr-zone.com/

            Twice the fillrate of the G450 doesn't sound all that impressive to me, but what do I know. I want to know how they plan to solve the RAM bandwidth issue: fillrate is great, but without incoming data it's just sitting there. Tiling would be ideal, but is Matrox into taking big risks? Also, does pin compatibility have any impact on the possible design of the card?
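            The fillrate-versus-bandwidth worry is easy to put in rough numbers. The fillrate, memory clock, and bus width below are hypothetical placeholders, not leaked specs:

```python
# Rough check of whether a given pixel fillrate can be fed by a given
# memory bus. All figures are hypothetical, not leaked G800 specs.

FILLRATE_MPIX = 700   # hypothetical fillrate, Mpixels/s
BYTES_PER_PIXEL = 12  # color write + Z read + Z write, 32-bit each
MEM_CLOCK_MHZ = 166   # hypothetical memory clock
BUS_BYTES = 16        # 128-bit bus
DDR_FACTOR = 2        # two transfers per clock (DDR)

needed_gbs = FILLRATE_MPIX * 1e6 * BYTES_PER_PIXEL / 1e9
available_gbs = MEM_CLOCK_MHZ * 1e6 * BUS_BYTES * DDR_FACTOR / 1e9

print(f"needed:    {needed_gbs:.1f} GB/s")     # 8.4 GB/s
print(f"available: {available_gbs:.1f} GB/s")  # 5.3 GB/s
print("bandwidth-limited" if needed_gbs > available_gbs
      else "fillrate-limited")
```

            With these made-up numbers the chip stalls on memory well before it runs out of fillrate, which is exactly why tiling, or much faster RAM, would matter.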

            I doubt that a G800 in Q1 2001 would mean a whole lot to most people without some serious specs.



            • #7
              I heard a rumor a while ago that pretty much said the G800 would use either a single-chip or a multichip configuration.

              The problem here is that with a traditional multichip configuration, you get a much higher fillrate than your CPU can feed.

              So, as the rumor went, the G800 was a segmented chipset. You had the traditional chip, something like the G400 or G450 with engineering enhancements, but no T&L. The next part is that on the higher-end boards you have a *separate* T&L engine chip, with the possibility of linking multiple T&L chips together and effectively doubling the power of the T&L engine.

              Thus, you can get a value board that has no T&L, a mid-range board that has one T&L chip, maybe offering performance comparable to the GeForce or GeForce 2, and a high-end board that uses 2 T&L chips (or more). It also allows the performance to be enhanced without a full redesign of the architecture.

              Another part of this is that the T&L chip would be fully compatible with the G450.


              Thus, maybe the G800 AGP has no T&L, the G450 Fusion has 1 T&L chip, and the G800 Fusion Plus has 2 T&L chips.
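              A trivial sketch of how that rumored lineup would scale, assuming a hypothetical per-chip triangle rate and perfectly linear (i.e. optimistic) scaling:

```python
# Idealized T&L throughput for the rumored tiers. The 15 Mtris/s
# per-chip rate and the linear scaling are both assumptions, not specs.

TRIS_PER_CHIP = 15_000_000  # hypothetical triangles/s per T&L chip

def tl_throughput(num_chips: int) -> int:
    """Peak triangle rate for a board with num_chips T&L chips,
    assuming perfect linear scaling (no linking overhead)."""
    return TRIS_PER_CHIP * num_chips

for board, chips in [("G800 AGP (no T&L)", 0),
                     ("G450 Fusion (1 chip)", 1),
                     ("G800 Fusion Plus (2 chips)", 2)]:
    print(f"{board}: {tl_throughput(chips) / 1e6:.0f} Mtris/s")
```

              In practice linking chips never scales perfectly, so take the doubling claim with the same salt shaker as the rest of the rumor.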

              As the topic of this thread indicates, it is speculation only.

              Don't take this with a grain of salt - use the whole salt shaker instead.

              -Luke



              [This message has been edited by DGhost (edited 17 June 2000).]
              "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

