OK! PCI Express is announced

  • #46
    Originally posted by spadnos
    Well - it makes sense (to me at least).

    The on-board RAM runs at 333-500 MHz, possibly more. That's a 128-256 bit wide bus as well, so these cards are getting exclusive access (barring the actual CRTC in the chip) to 5 GB/s to 15 GB/s or more. Anything that accesses main memory will compete with the CPU(s) for it, slowing down other processing. Also, the transfer of data across the PCI-Express bus is slower - even 16 lanes equates to "only" 3.2 GB/sec (which is still wicked fast).

    I guess it was a valid rant then, just like now.

    - Steve
    You're not competing with the CPU. The CPU runs from its cache 97-99% of the time. RAM largely sits idle. Basics of system memory design. So no, they won't be competing for the RAM. Even if they were, you haven't cited any potential problems that AGP doesn't/wouldn't also have.
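    For reference, here's a rough sketch of the bandwidth figures being thrown around in both posts. It uses only the round numbers quoted above plus the commonly cited ~250 MB/s per first-generation PCI-Express lane, so treat it as illustrative rather than exact board specs:

    # Back-of-the-envelope bandwidth arithmetic (figures from the posts above, not exact specs)
    def local_vram_gbps(bus_width_bits, effective_clock_mhz):
        # Peak local memory bandwidth = bus width in bytes * effective memory clock
        return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

    def pcie_gbps(lanes, mb_per_lane=250):
        # First-generation PCI Express is commonly quoted at ~250 MB/s per lane, per direction
        return lanes * mb_per_lane / 1000

    print(local_vram_gbps(128, 333))   # ~5.3 GB/s  (low end of the range quoted above)
    print(local_vram_gbps(256, 500))   # ~16 GB/s   (high end of the range quoted above)
    print(pcie_gbps(16))               # ~4 GB/s    (the quote above says 3.2; either way, well below on-card RAM)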
    Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



    • #47
      At least one advantage to PCI-Express would seem to be a more uniform software interface: no more AGP GART crap, sideband addressing, or other junk - just the same thing for either a video card or a dial-up modem (other than bandwidth, of course).



      • #48
        Not true. Witness nVidia's newly announced "TurboCache" cards. I bet this is just the start of the new "GART" nightmare. I hope it isn't, but I bet....
        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



        • #49
          I would be willing to bet that one of the TurboCache cards could run entirely out of the on-card RAM, and that its memory addressing scheme is almost entirely platform independent (unlike the GART).

          From my understanding, PCI Express gains a lot of ground over AGP in being able to 1) stream data, and 2) stream non-linear data. As I currently understand it, the only thing that *current* (and optimized) architectures really gain an improvement on is geometry and texture uploads to the card, which are not usually done on the fly. One of the promising applications of PCI-E in graphics cards right now appears to be VMR - but that is going to rely on newer, better architectures that have better implementations of it.
          "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



          • #50
            Originally posted by Wombat
            You're not competing with the CPU. The CPU runs from its cache 97-99% of the time. RAM largely sits idle. Basics of system memory design. So no, they won't be competing for the RAM.
            Well - actually, you would still be competing on most of today's processors. The CPU fetches from cache 97% of the time, but it also runs 10-15 times as fast as the memory bus, so that 3% of the time in the CPU works out to between 30% and 45% of the system memory bandwidth (rough arithmetic sketched below). I suspect that these numbers are a little unrealistic, or there wouldn't be real advantages to increasing memory bus speed, which I think there are. There would be no need for DDR2, PC4000, or whatever - everyone should be happy with PC133 (which I'm not) or DDR266.
            Originally posted by Wombat
            Even if they were, you haven't cited any potential problems that AGP doesn't/wouldn't also have.
            That was my point - the rant is just as valid now as it was before. The promise that AGP would allow really cheap yet still very fast video cards never panned out. Instead of getting a video card with no memory, we ended up going from 4-16MB on a high-end card to 128-256MB or more.
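            A rough sketch of that arithmetic, using only the round numbers above (97% hit rate, core running 10-15x faster than the memory bus); it's illustrative, not a model of any real chip:

            # If the core could issue roughly one memory reference per cycle and misses 3% of
            # the time, each miss costs a bus access, and the bus runs 10-15x slower than the
            # core, so the share of bus bandwidth those misses can claim is about miss_rate * ratio.
            miss_rate = 0.03
            for clock_ratio in (10, 15):
                bus_share = miss_rate * clock_ratio
                print(f"core {clock_ratio}x faster than bus -> up to {bus_share:.0%} of bus bandwidth")
            # -> 30% and 45%, the range quoted above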

            - Steve



            • #51
              It's not the bandwidth that's the issue (for processors), it's the latency. And a 97% cache hit rate is quite low.
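              To illustrate why latency, not bandwidth, is the worry for the processor, here's a quick average-memory-access-time sketch; the cycle counts are made-up round numbers, not measurements of any particular CPU:

              # AMAT = hit_time + miss_rate * miss_penalty (all in core cycles)
              # Illustrative numbers only: 2-cycle cache hit, 200-cycle trip out to DRAM
              hit_time, miss_penalty = 2, 200
              for miss_rate in (0.03, 0.01):
                  amat = hit_time + miss_rate * miss_penalty
                  print(f"{1 - miss_rate:.0%} hit rate -> ~{amat:.0f} cycles per access on average")
              # 97% -> ~8 cycles vs 99% -> ~4 cycles: the few misses dominate, which is
              # why a 97% hit rate counts as "quite low"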
              Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



              • #52
                Originally posted by spadnos
                That was my point - the rant is just as valid now as it was before. The promise that AGP would allow really cheap yet still very fast video cards never panned out. Instead of getting a video card with no memory, we ended up going from 4-16M on a high end card to 128-256M or more.
                AGP sadly was never designed to scale to the demands that video games and 3D graphics in general now place on it. The complexity of the models and the resolution of the textures constitute *many* orders of magnitude of growth compared to what was envisioned when AGP was being developed. Remember that the first AGP chipset was the Intel 440LX, which was *announced* in mid-1997. Quake was released in 1996 and Quake II at the end of 1997. That is the level of graphics that came to market around the time AGP did. Quake III came out another two years later, and was probably the first graphics engine that took advantage of the benefits AGP offered. Doom 3, from my experience, pushes about 50 times the number of polygons on screen at any given point in time, in addition to the texture data.

                Originally posted by Robert Duffy in his .plan
                To put things in perspective, most production levels in DOOM 3 contain more media assets than all of Quake 3: Arena. When we started working on the memory footprint, our goal was a 256MB system. In most cases loading up an area of the game on a 256MB system works fine, the problems arise when you start to transition from one area to the next (successive map loads). Memory fragmentation starts to really work against us and it ultimately made it just not feasible for a reasonable play experience to support 256MB.
                Originally posted by Robert Duffy in his .plan
                In Ultra quality, we load each texture; diffuse, specular, normal map at full resolution with no compression. In a typical DOOM 3 level, this can hover around a whopping 500MB of texture data. This will run on current hardware but obviously we cannot fit 500MB of texture data onto a 256MB card and the amount of texture data referenced in a given scene per frame (60 times a second) can easily be 50MB+. This can cause some choppiness as a lot of memory bandwidth is being consumed. It does however look fantastic :-) and it is certainly playable on high end systems but due to the hitching that can occur we chose to require a 512MB video card before setting this automatically.

                High quality uses compression ( DXT1,3,5 ) for specular and diffuse and no compression for normal maps. This looks very very close to Ultra quality but the compression does cause some loss. This is the quality that for instance the PC Gamer review was played in.

                Medium quality uses compression for specular, diffuse, and normal maps. This still looks really really good but compressing the normal maps can produce a few artifacts especially on hard angled or round edges. This level gets us comfortably onto 128MB video cards.

                Low quality does everything medium quality does but it also downsizes textures over 512x512 and we downsize specular maps to 64x64 in this mode as well. This fits us onto a 64MB video card.
                Edit: if they had had the same level of experience optimizing hardware and software for AGP texturing back then as they have now, you might have seen it work out for a little while - at least until these last few generations of games/hardware.
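                To put rough numbers on those quality levels, here's a small sketch of how DXT compression and downsizing shrink a single diffuse/specular/normal texture set. The bits-per-pixel rates are the standard DXT figures; the 1024x1024 resolution is illustrative, not actual DOOM 3 asset data:

                # Footprint of one diffuse + specular + normal map set in MB, ignoring mipmaps
                # (which add roughly another third).  RGBA8 = 32 bpp, DXT5 = 8 bpp.
                def map_mb(size, bpp):
                    return size * size * bpp / 8 / 2**20

                def set_mb(size, diffuse_bpp, specular_bpp, normal_bpp):
                    return map_mb(size, diffuse_bpp) + map_mb(size, specular_bpp) + map_mb(size, normal_bpp)

                print("Ultra  (1024^2, no compression):   ", set_mb(1024, 32, 32, 32))  # ~12 MB
                print("High   (DXT diffuse/specular):     ", set_mb(1024, 8, 8, 32))    # ~6 MB
                print("Medium (DXT normal maps too):      ", set_mb(1024, 8, 8, 8))     # ~3 MB
                print("Low    (DXT, downsized to 512^2):  ", set_mb(512, 8, 8, 8))      # ~0.75 MB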
                Last edited by DGhost; 16 December 2004, 20:46.
                "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



                • #53
                  Having multiple high-speed slots for multiple video cards, or video plus RAID, sounds like a good thing to me.

                  Also, look at this - it seems PCI-E may be able to do things AGP was *supposed* to be able to do.



                  • #54
                    I'm really disappointed they didn't get rid of the fan.
                    no matrox, no matroxusers.



                    • #55
                      Well... bearing in mind that Matrox employees have informally 'told' this forum that they think Matrox will have a good card ready for Longhorn, and yet this card brings nothing (really interesting) new compared to the years-old Parhelia (four lifecycles' worth of ATI/Nvidia cards!), what are the chances of that Longhorn card being any better received than even the Parhelia was (good reviews except by gamers, i.e. 75% of users)?

                      I have to admit, I wonder why they really bothered. Even some of the old G400 owners can't be convinced to upgrade to the much more powerful P8X, and while I appreciate that this is the architecture of the future, M should probably be working on getting some nice juicy new features to their customers (DX9?).



                      • #56
                        Originally posted by Whirl-Secret
                        Well... bearing in mind that Matrox employees have informally 'told' this forum that they think Matrox will have a good card ready for Longhorn, and yet this card brings nothing (really interesting) new compared to the years-old Parhelia (four lifecycles' worth of ATI/Nvidia cards!), what are the chances of that Longhorn card being any better received than even the Parhelia was (good reviews except by gamers, i.e. 75% of users)?

                        I have to admit, I wonder why they really bothered. Even some of the old G400 owners can't be convinced to upgrade to the much more powerful P8X, and while I appreciate that this is the architecture of the future, M should probably be working on getting some nice juicy new features to their customers (DX9?).
                        Not very likely. NVidia has already gone through no fewer than two major design families for their graphics cards since Matrox introduced the P. Between now and Longhorn there will be at least one or two more major design revisions...

                        With every major design change these companies learn how to improve things and how to optimize. NVidia learned lessons from the GeForceFX series of cards and implemented optimizations and design changes in the GeForce6 series to improve performance. ATI underwent less radical changes between the 9700 and the X800 series, but they are still there and they do have an impact on performance/stability/features.

                        Sadly, Matrox has done very little in terms of core changes for their cards except for disabling features so that they can sell cheaper cards. They introduced a new platform which had its issues (like, say, the GeForceFX), only instead of owning up to their responsibility as the manufacturer, designer, and primary support provider and getting off their asses and fixing the problems, they clammed up, withdrew, and completely shot themselves in the foot.

                        If Matrox had been as aggressive as, say, NVidia about fixing problems with their cards and drivers, they would really not be having a problem. But... they didn't...

                        And now they lack the necessary design experience to be a true competitor... or at least to avoid another Parhelia...
                        "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



                        • #57
                          Yeah. Only Matrox could start with something like the G400, and end up with the G450, and then the G550. WTF?
                          Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                          • #58
                            Originally posted by Wombat
                            You know, every time you post, you just become a bigger *****le. Kruzin, Greebe, myself, and others here have some VERY reliable sources. We were right about the 550, about Parhelia, about the Px50s. A long tradition of knowing what we're talking about.
                            Perhaps you don't like my posts or me, but, in contrast to you, I NEVER INSULT ANYONE.
                            If you guys have 'reliable sources' that you can't (are not allowed to?) mention here, please say that. But nobody has said that in so many words, and to someone without your 'sources' (like me), it looks like some people around here are just using every opportunity to bash Matrox, sometimes reasonably and sometimes not...
                            If somebody says 'I have my sources' and does not explain what (or who?) they are, you won't believe him, right?

                            We have always had different opinions here, but this is a shame...
                            P IV 3,06 Ghz, GA-8ihxp i850e, 512 MB PC-1066 RDRam, Parhelia 128 mb 8x, 40 + 60 gb IBM 7200 upm/2048 kb HD, Samtron 96 P 19", black icemat, Razer Boomslang 2100 krz-2 + mousebungee, Videologic sonic fury, Creative Soundworks



                            • #59
                              It's a funny group round here... you just have to go with the flow and hope Matrox pull through...
                              Asus P4C800-E Deluxe, Pentium 4 3GHz, 2Gb DDRRAM, Gainward BLISS GeForce 7800 GS+ 512MB, Matrox TripleHead2Go Digital, 3x Iiyama 4637 18.1" TFTs, Audigy 2 ZS, Matrox RT.x100, Silentmaxx Acoustic Case



                              • #60
                                Originally posted by Enak
                                It's a funny group round here... you just have to go with the flow and hope Matrox pull through...
                                You're damned right, Enak - as always, there's nothing we can do except hope for the best
                                P IV 3,06 Ghz, GA-8ihxp i850e, 512 MB PC-1066 RDRam, Parhelia 128 mb 8x, 40 + 60 gb IBM 7200 upm/2048 kb HD, Samtron 96 P 19", black icemat, Razer Boomslang 2100 krz-2 + mousebungee, Videologic sonic fury, Creative Soundworks

