Absolutely Amazing


  • #16
    RC put an addendum on their post:

    "Addendum:

    According to the information published, NV30 does not support such functions as texture lookup in the vertex shader, which is one of the main features 3.0 can provide. Moreover, NV30 does not support branching in Pixel Shaders. In this case we doubt NV30 will support all the functions of 3.0, and we also want to beg your pardon for the misleading information that CineFX corresponds to Pixel Shaders 3.0.

    As for Pixel Shader 2.1, we want to point out that this "standard" is quite "flexible" in terms of functionality, because support for it can be declared, for example, if a chip supports 1024 instructions without branching or 512 instructions with static branching."
    Dr. Mordrid
    ----------------------------
    An elephant is a mouse built to government specifications.

    I carry a gun because I can't throw a rock 1,250 fps



    • #17
      G250 was only used by HP as I recall, and who cares about HP

      But, on you go, was it a die shrink of the G200? Me runs off to eBay to see if there are any of them around (or were they all onboard?)

      P.
      Meet Jasmine.
      flickr.com/photos/pace3000



      • #18
        I believe they were a shrink. I remember that it ran with just a heatsink.
        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



        • #19
          The G250 was a G200 at 0.25µ, hence it was 'only' a die shrink.

          AFAIK, there were a few sold as OEM cards, not only to HP.
          Despite my nickname causing confusion, I am not female ...

          ASRock Fatal1ty X79 Professional
          Intel Core i7-3930K@4.3GHz
          be quiet! Dark Rock Pro 2
          4x 8GB G.Skill TridentX PC3-19200U@CR1
          2x MSI N670GTX PE OC (SLI)
          OCZ Vertex 4 256GB
          4x2TB Seagate Barracuda Green 5900.3 (2x4TB RAID0)
          Super Flower Golden Green Modular 800W
          Nanoxia Deep Silence 1
          LG BH10LS38
          LG DM2752D 27" 3D



          • #20
            There are technical differences between the G400 and the G550, even though they are basically the same core.
            They changed the memory interface from 128-bit SDR to 64-bit DDR.
            Surely they could keep all the same functions and just add some more pipelines? With a die shrink to 0.13µ (while you wait...), it would be possible to use the extra space...
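
            For reference, that interface change is bandwidth-neutral on paper: halving the bus width while doubling the transfers per clock gives the same peak figure. A quick Python sketch to show it; the helper name and the 166MHz clock are just my own example values, not an official Matrox spec:

            Code:
            # Peak bandwidth of a 128-bit SDR bus vs. a 64-bit DDR bus at the same clock.
            # The 166 MHz clock is only an example value, not an official spec.
            def peak_bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock):
                """Peak memory bandwidth in decimal GB/s."""
                return bus_bits / 8 * clock_mhz * 1e6 * transfers_per_clock / 1e9

            clock_mhz = 166
            print(peak_bandwidth_gb_s(128, clock_mhz, 1))  # 128-bit SDR -> ~2.66 GB/s
            print(peak_bandwidth_gb_s(64,  clock_mhz, 2))  # 64-bit DDR  -> ~2.66 GB/s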
            PC-1 Fractal Design Arc Mini R2, 3800X, Asus B450M-PRO mATX, 2x8GB B-die@3800C16, AMD Vega64, Seasonic 850W Gold, Black Ice Nemesis/Laing DDC/EKWB 240 Loop (VRM>CPU>GPU), Noctua Fans.
            Nas : i3/itx/2x4GB/8x4TB BTRFS/Raid6 (7 + Hotspare) Xpenology
            +++ : FSP Nano 800VA (Pi's+switch) + 1600VA (PC-1+Nas)



            • #21
              Originally posted by Wombat
              I believe they were a shrink. I remember that it ran with just a heatsink.
              With only a heatsink? So did the G200. And the G400 too, if a larger one. AFAIK, the G400Max was the first one to get a fan, due to its overclocking.

              Why, back then chipsets didn't have heatsinks, and CPUs often came with large passive heatsinks...



              • #22
                Evil, you should look over your sig.



                • #23
                  Originally posted by Evildead666
                  There are technical differences between the G400 and the G550, even though they are basically the same core.
                  They changed the memory interface from 128-bit SDR to 64-bit DDR.
                  Surely they could keep all the same functions and just add some more pipelines? With a die shrink to 0.13µ (while you wait...), it would be possible to use the extra space...
                  The problem is, and this has been stated many times, that with a move to 0.13µ for the P the layout of the chip would have to be totally changed, and the fab making the P is going to 0.09µ in 2003-4.
                  Why is it called tourist season, if we can't shoot at them?



                  • #24
                    Evil, you have no idea how much work you just glossed over.
                    Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                    • #25
                      Originally posted by Wombat
                      Evil, you have no idea how much work you just glossed over.
                      I can try to guess that... but I will not say it aloud.

                      and yep, I am back from lurk mode.

                      And btw, 8 pipelines with one TMU each is pretty optimal for a 256-bit memory interface running at approximately 300MHz... more TMUs would most likely cause pipeline stalls in some cases, fewer would leave some of the bandwidth unused.

                      In Parhelia's case, the memory controller efficiency could be better, and those 16 TMUs are now sucking up all the bandwidth the interface can offer:

                      - 4 texture samples x 4 bytes each x 16 TMUs x 220MHz ≈ 53 GB/s, and with a texel cache hit ratio of 66% (the share of texture samples found in the cache) it still makes: 53 GB/s x 0.33 ≈ 17 GB/s.

                      - Remember that you need one write to the Z-buffer for every pixel per clock (4 bytes each with 32-bit Z) and also 4 pixel writes to the framebuffer per clock, so at 1280x1024x32bpp, 32-bit Z and approx. scene complexity (overdraw) of 7, running at 60 fps:
                      -- needed fillrate: 1280 x 1024 x 60 x 7 ≈ 550 Mpixels/s
                      -- framebuffer bandwidth needed to achieve that: 550 Mpixels/s x 4 bytes for the colour value to the back buffer x 2 to fill the Z ≈ 4.4 GB/s
                      -- Total memory bandwidth needed with the previously calculated texture bandwidth: 17 GB/s + 4.4 GB/s ≈ 21 GB/s

                      Of course that example doesn't even include geometry overhead, which is probably quite significant nowadays.

                      As you can see, they would need to push that memory clock up quite a bit more, or make the caches much more efficient than in my example... (they probably already are, but even more wouldn't be a bad thing.)
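
                      For anyone who wants to check the arithmetic, here is the same back-of-the-envelope estimate as a small Python sketch. The clock, hit ratio and overdraw are the figures assumed above, the variable names are just for illustration, and the 275MHz memory clock in the comparison line is an assumption of mine, not a spec-sheet value:

                      Code:
                      # Back-of-the-envelope Parhelia bandwidth estimate using the assumptions above:
                      # 220 MHz core, 16 TMUs, 4 texture samples of 4 bytes each, 66% texel cache hit
                      # ratio, 1280x1024 @ 60 fps with 32-bit colour, 32-bit Z and overdraw ~7.
                      GB = 1e9  # decimal gigabytes

                      core_clock_hz   = 220e6
                      tmus            = 16
                      samples_per_tmu = 4
                      bytes_per_texel = 4
                      cache_hit_ratio = 0.66

                      # Raw texture fetch demand, then only the part that misses the texel cache.
                      raw_texture_bw = samples_per_tmu * bytes_per_texel * tmus * core_clock_hz  # ~56 GB/s (rounded to 53 above)
                      texture_bw     = raw_texture_bw * (1 - cache_hit_ratio)                    # ~19 GB/s (~17 GB/s above)

                      # Fill-rate side: one 4-byte colour write plus one 4-byte Z write per drawn pixel.
                      width, height, fps, overdraw = 1280, 1024, 60, 7
                      pixels_per_s   = width * height * fps * overdraw   # ~550 Mpixels/s
                      framebuffer_bw = pixels_per_s * (4 + 4)            # ~4.4 GB/s

                      total_bw = texture_bw + framebuffer_bw             # the step-by-step rounding above lands at ~21 GB/s

                      # For comparison: peak of a 256-bit DDR interface at an assumed 275 MHz memory
                      # clock (an illustrative value, not an official spec-sheet figure).
                      peak_bw = 256 / 8 * 2 * 275e6                      # ~17.6 GB/s

                      print(f"texture      ~{texture_bw / GB:.1f} GB/s")
                      print(f"framebuffer  ~{framebuffer_bw / GB:.1f} GB/s")
                      print(f"total needed ~{total_bw / GB:.1f} GB/s vs peak ~{peak_bw / GB:.1f} GB/s")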
                      "Dippadai"



                      • #26
                        Hiya Nappe

                        Nice to see you back. Where have you been?


                        Regards Michael
                        Interests include:
                        Computing, Reading, Pubs, Restaurants, Pubs, Curries, More Pubs and more Curries



                        • #27
                          Well, here and there...

                          Bitboys at least have now shown their own working hardware, though their first actual product is a Scalable Vector Graphics chip for low-end PDAs and cellular phones.

                          Yesterday they officially launched their new website with all-new stuff (including the 'Axe' chip information that I was already talking about a year ago).

                          If last autumn looked great for the graphics market, this one doesn't look good at all. The two big horses are pulling way ahead of the others, and the smaller companies have their problems.

                          But one thing is a fact: I am cursed. Three times now I have picked a product to support and stand behind that looked like nothing but a winner, and every time I have been proven wrong. G800: never came out. The G550 has some of it left, but I am definitely sure the G800 would have been much more. Axe/Avalanche: shutting down the proprietary manufacturing line killed this one; it will never be found in stores. Parhelia: most likely huge delays (and I really mean it) caused Matrox to get it out a bit unfinished. (It really was quite a lot late. I won't say how much, but pretty much.)

                          My personal life has been pretty much going downhill, and I am now starting my 5th year at a 4-year school, with almost no funds (because of the very lousy summer job I had) and no place to do my master's thesis in software engineering. So it's pretty much obvious that I am stuck with the AIW Radeon. To get a Parhelia, they would need to push the price down from 450 Euros to under 150 Euros to be cheap enough for me.

                          So not much... but at least I had a chance to meet Bitboys IRL at this year's Assembly.

                          I am so tired of all this shit. I visit the same boards over and over again, stating the same things again and again to people who will forget my words within 5 minutes. No one really remembers when I have been right about something (I hate people posting "WHAT DID I TELL YA!!!11§12§§2"), but everyone remembers when I have been wrong. If I weren't so broke, I would leave all this technology stuff behind and head off to the wilds of Lapland for an undecided time (probably until I ran out of money again).

                          Or maybe I am just having a bad day again... (it would be the 3rd in a row.)
                          Last edited by Nappe1; 5 September 2002, 03:19.
                          "Dippadai"



                          • #28
                            Originally posted by Nappe1
                            Parhelia: most likely huge delays (and I really mean it) caused Matrox to get it out a bit unfinished. (It really was quite a lot late. I won't say how much, but pretty much.)
                            Hey, I know what you're talking about. If the P had come out a bit sooner than it did, people would be calling it the next great thing instead of totally panning it now. We could place the blame on someone well known/hated around here... but I'm not going into specifics... just read between the lines, and if you're an old-timer you know who I'm talking about.

                            Nappe... also remember that bad things happen in 3s, so your luck will hopefully turn around soon.
                            Why is it called tourist season, if we can't shoot at them?



                            • #29
                              Nappe
                              Don't let things get you down. I appreciate your enthusiasm.

                              As for Parhelia, I think the truth is that Matrox may not have enough engineers. Most have been gobbled up by the big 2.

                              Regards MD
                              Interests include:
                              Computing, Reading, Pubs, Restaurants, Pubs, Curries, More Pubs and more Curries



                              • #30
                                98 G200 DX5? (first of the G-core based cards)
                                99 G400 DX6 (teh win, what the G200 should have been, on par with the competition, FSAA and aniso promised in the core factsheet, EMBM ...)
                                00 G450 (die shrink, 128-bit SGRAM -> 64-bit DDR, value business)
                                01 G550 partial DX7 (slower version of the G800)
                                02 Parhelia partial DX9 (first of the Parhelia-core based cards)
                                03 ?

                                What happened after the G400? Matrox had lots of OEM deals with the G200 and was leading desktop 3D; why didn't they bring on the G600 and G800?

                                From the above pattern one can safely assume that Matrox will release another card next year. It's likely in the lab stage now.
                                Last edited by UtwigMU; 9 September 2002, 12:31.
