Parhelia!!!


  • Originally posted by TDB

    Are we talking about 64-bit color?

    BTW, wasn't the G200 the first chip EVER that could do 3D in 32-bit color?
    Not really, but the key is "reasonable speed."

    Even my S3 ViRGE-based Diamond Stealth 3D 2000 could do 24- and 32-bit rendering, but speed? Well, let's change the subject...
    "Dippadai"

    Comment


    • greebe, I bow to your superior knowledge, but surely multi-chip solutions have a place in a graphics company's line-up. I mean, although it may be more difficult to implement multi-chip solutions, it can't be that hard: you can get dual CPUs that just need slightly special motherboards, so the graphics card would just need a slightly different circuit board?

      If that's so, I'm sure engineers could get the chips to address different memory banks simultaneously with a bit of research, and voila, you should have a card that operates at least 50% faster than the single-chip solution without having to develop a new core. It could be introduced when the single chip is looking a bit long in the tooth, to get revenue from an already existing core and memory architecture design.

      If it came down to a crude design, you could get each chip to render half the screen and merge the results for output, or get them to do alternate frames - see the sketch below.
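      (A minimal sketch of that crude split-screen scheme, assuming two hypothetical chips and an illustrative render_rows() entry point - none of this is a real driver API:)

        #include <stdio.h>

        #define SCREEN_H 480

        /* Hypothetical per-chip entry point: each chip rasterizes only
         * the scanlines it owns; the output stage merges the halves. */
        static void render_rows(int chip, int y0, int y1)
        {
            printf("chip %d renders scanlines %d..%d\n", chip, y0, y1 - 1);
        }

        int main(void)
        {
            /* Crude split-frame scheme from the post: chip 0 takes the
             * top half of the screen, chip 1 the bottom half. */
            render_rows(0, 0, SCREEN_H / 2);
            render_rows(1, SCREEN_H / 2, SCREEN_H);
            return 0;
        }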
      Is a flower best picked in its prime or greater withered away by time?
      Talk about a dream, try to make it real.

      Comment


      • Originally posted by borat
        greebe, I bow to your superior knowledge, but surely multi-chip solutions have a place in a graphics company's line-up. I mean, although it may be more difficult to implement multi-chip solutions, it can't be that hard: you can get dual CPUs that just need slightly special motherboards, so the graphics card would just need a slightly different circuit board?
        No no no, no no no no no NO. You don't magically get a dual-CPU machine just by getting a new motherboard. There's a lot of circuitry on a CPU designed around talking to other CPUs; entire areas of silicon sit dark on the die in a uniprocessor machine. It takes a ton of research and testing to make an SMP-capable processor.

        Most CPUs are made with SMP in mind; most graphics chips are not.
        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

        Comment


        • The Radeon 8500 GPU was designed from the outset to be multi-GPU capable.
          Let us return to the moon, to stay!!!

          Comment


          • Ah yes. And we know how good ATi is with multi-chip video cards. Hardware is useless without the drivers to run it.
            Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

            Comment


            • Multiple chips can work very well (scan-line interleave is quite efficient for doubling performance), but you have the BIG problem of shared textures. Most dual-chip cards don't share textures but keep a separate copy for each chip, which is quite a waste of expensive memory chips - see the rough numbers below.

              And if they do share their texture memory, they incur a big bottleneck...

              As for outsourcing, I think Matrox would be OK if they outsourced OpenGL driver development to SGI. In fact, it would be a very good idea.
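              (Back-of-the-envelope illustration of that duplication cost - the 32 MB figure is made up for the example:)

                #include <stdio.h>

                int main(void)
                {
                    /* Hypothetical dual-chip card where each chip has its own
                     * local texture memory, as described above. */
                    const int chips       = 2;
                    const int mb_per_chip = 32;   /* local memory per chip, MB */

                    int fitted    = chips * mb_per_chip;  /* what you pay for     */
                    int effective = mb_per_chip;          /* unique texture space */

                    printf("memory on the card:      %d MB\n", fitted);
                    printf("effective texture space: %d MB (%d MB duplicated)\n",
                           effective, fitted - effective);
                    return 0;
                }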

              Comment


              • SLI only helps your performance if you are fillrate-limited. It doesn't help much, if at all, with the many other things that can put a ceiling on performance - see the toy model below.
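                (A toy frame-time model of that point, with made-up numbers: splitting the fill work helps only until some other stage becomes the ceiling.)

                  #include <stdio.h>

                  static double frame_ms(int chips, double fill_ms, double other_ms)
                  {
                      /* Fill work splits across chips; geometry, CPU and bus
                       * work ("other") does not, so it sets the floor. */
                      double split_fill = fill_ms / chips;
                      return split_fill > other_ms ? split_fill : other_ms;
                  }

                  int main(void)
                  {
                      /* Fill-bound scene: a second chip nearly halves frame time. */
                      printf("fill-bound:  %.1f ms -> %.1f ms\n",
                             frame_ms(1, 20.0, 5.0), frame_ms(2, 20.0, 5.0));
                      /* Geometry/CPU-bound scene: a second chip buys almost nothing. */
                      printf("other-bound: %.1f ms -> %.1f ms\n",
                             frame_ms(1, 6.0, 15.0), frame_ms(2, 6.0, 15.0));
                      return 0;
                  }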
                Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

                Comment


                • Also, ATI's multi-chip implementation is not quite like 3dfx's was. With ATI, each chip renders every other frame, which is more like a hack added in at the last moment. In fact, last I checked they used an AGP bridge on the Rage Fury MAXX; the card actually detected as two devices, and Windows 2000 gibbed horribly with it.

                  3dfx's multi-chip architecture was much more sophisticated...
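                  (For contrast, a minimal sketch of that alternate-frame scheme - hypothetical code, no real driver involved. Whole frames are dealt to the two chips in turn, which is also why the card can show up as two devices:)

                    #include <stdio.h>

                    int main(void)
                    {
                        /* Alternate-frame rendering: even frames go to chip 0,
                         * odd frames to chip 1. */
                        for (int frame = 0; frame < 6; frame++)
                            printf("frame %d -> chip %d\n", frame, frame % 2);
                        return 0;
                    }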
                  "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

                  Comment


                  • Re: Who knows?

                    Originally posted by Gix
                    There are some guys outside Matrox who definitely know about this chip. I mean the manufacturers... Who are they?
                    I said the chip manufacturers. In the days of the Millennium II and G100 it was NEC. Am I wrong?

                    Comment


                    • I'm gonna git me one of them Paranoias and I don't care how much they are.

                      dj
                      My Packurd bell 166Megahurtz runnin at 233 on a ABIT ITH5 muther board,
                      128MB EDO ECC RAM and a hole bunch of other cool stuff.

                      Comment


                      • Funny how DJ can't spell "get," but "paranoia" is no problem

                        Comment


                        • Ahhh.....

                          Here you all are!

                          I was wondering where you all were hiding out!
                          If there's artificial intelligence, there's bound to be some artificial stupidity.

                          Jeremy Clarkson "806 brake horsepower..and that on that limp wrist faerie liquid the Americans call petrol, if you run it on the more explosive jungle juice we have in Europe you'd be getting 850 brake horsepower..."

                          Comment


                          • Did you get my PM?

                            MK
                            Celeron II 700 @ 1,1 GHz
                            ASUS CUSL2-C, Bios 1009 final
                            Alpha 6035MFC, 60 -> 80mm adapter
                            2 x 80mm Papst Cooler 19/12dB
                            256 MB PC133 Crucial 7E (CAS2)
                            Maxtor Diamond MAX VL40
                            ATI Radeon 8500 64MB @ Catalyst 3.0
                            Hauppauge WinTV TV-Card
                            Iiyama Vision Master Pro 400
                            Plustek Optic Pro U12B
                            HP Deskjet 959C
                            Plantronics LS1 Headset
                            all on W2k Professional SP2

                            Comment


                            • MAXX v1 was poor for the reasons stated - but be aware that ATI designed an entirely new bridge chip for the 8500 and beyond. ATI simply hasn't released a MAXX because of what it would cost - an 8500 MAXX would be at least as fast as the new GF4 and at least as expensive.

                              The rumored specs of the new Matrox card remind me of those of the upcoming ATI R300 and GF5 (with the R300 having the better rumored specs, it being built by former ArtX employees who designed the Flipper chip in the GameCube).

                              There are many ATI users who would buy Matrox in a heartbeat (I'm a former G200 user and had it paired with a Voodoo2 until I replaced it with a Voodoo3) - if only because it isn't nvidia. Many ATI customers want what Matrox users want - 2D and 3D quality.

                              Here's hoping....

                              Comment


                              • Why does everybody seem to think that small companies (Bitboys, ArtX, etc.) can make wonderful chip designs, yet a company like Matrox that's been around for 25+ years, has (probably) several hundred million dollars more invested in R&D over the years, and has proven engineers working for it can't?

                                My faith is in Matrox to do everything themselves. Remember, they made VRAM (or was it WRAM?) back in the day. What's to stop them making their own RAM now?

                                If a G400 chip costs around US$40 to manufacture, why not design some RAM that fits your needs 100% and have it custom manufactured? It might cost another US$40, but is that a problem if you would be spending around US$40 (a guess) on 300MHz DDR RAM anyway? DDR probably doesn't suit video memory as well as a memory chip designed specifically for the task.

                                You could have a single RAM chip with pins that line up exactly with your GPU, and then put it on the back of the PCB. Just run very short (<1mm) traces through the PCB straight to the RAM. Then you have no hard trace-routing issues, and if the RAM is designed specifically, you shouldn't have any timing issues.

                                You could then have the RAM running at a very low clock (100MHz or so) and just have a monster-wide datapath to it - rough arithmetic below. You could use old .35-micron technology for the RAM, saving more money, and not have to worry about heat, thus saving more $$ on heatsink design.
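                                (Rough peak-bandwidth arithmetic for that trade-off - the bus widths are my own guesses, not anything Matrox announced:)

                                  #include <stdio.h>

                                  /* Peak bandwidth in MB/s: clock (MHz) x transfers
                                   * per clock x bus width in bytes. */
                                  static double bw_mb_s(double mhz, int xfers, int bus_bits)
                                  {
                                      return mhz * xfers * (bus_bits / 8.0);
                                  }

                                  int main(void)
                                  {
                                      /* Conventional part: 300 MHz DDR on a 128-bit bus. */
                                      printf("300 MHz DDR, 128-bit:  %5.0f MB/s\n", bw_mb_s(300, 2, 128));
                                      /* The idea above: slow 100 MHz SDR, monster-wide path. */
                                      printf("100 MHz SDR, 512-bit:  %5.0f MB/s\n", bw_mb_s(100, 1, 512));
                                      printf("100 MHz SDR, 1024-bit: %5.0f MB/s\n", bw_mb_s(100, 1, 1024));
                                      return 0;
                                  }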

                                The only downside is having lots of pins on the GPU itself. That could be overcome if you went to a partially serial design.

                                But then again, as long as it has a dual-channel DVI output capable of supplying 1600x1200 to my flat panel, I'll be happy.

                                Ali

                                Comment
