Dual graphics cards from Alienware

  • Dual graphics cards from Alienware

    And they don't even have to be the same card!

    Take a look at the news release distributed by PR Newswire.
    Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.

  • #2
    Does one card render into the framebuffer of the other over the PCI-Express bus??

    I can't see how this is going to be very efficient.



    • #3
      I seem to recall a company marketing a similar technology a while ago...
      "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



      • #4
        I would like to see a PCI Express version of Parhelia (8x core) combined with this to beef up surround gaming a bit. That would rock.



        • #5
          The press release tells me nothing about what they're even offering. I can't even tell what advantages it's supposed to have over current technology.
          Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



          • #6
            I was guessing this would happen with the announcement of PCI Express. My guess would be that the mobo will contain multiple X16 slots (at least I think that's the AGP replacement type). You'll use an independent video card for each monitor you use. That way you could, in theory, do surround-gaming, CAD, graphic design, etc. without suffering performance hits that a single video card processing such a high resolution would have.

            That's what I would assume Alienware is talking about: a multiple video card, multiple monitor solution.

            Jammrock
            “Inside every sane person there’s a madman struggling to get out”
            –The Light Fantastic, Terry Pratchett



            • #7
              So nothing special besides the multiple graphics ports? That's revolutionary? Maybe they should patent it and sue everyone else.



              • #8
                Did anyone bother to read the press release?

                It says:

                significantly enhance the performance of graphics intensive applications including extreme gaming, professional design and engineering, real-time rendering and animation, and
                flight training and simulation modules.
                I don't know about you, but I would buy a system that would allow two graphics cards. The benefits are huge.

                I know it doesn't go into detail, but even a 35-50% increase in performance on all fronts would be nice.



                • #9
                  I think boards with multiple high-speed PCI-E slots will not be uncommon soon; it's pretty much on every mobo maker's roadmap.

                  But if you had two PCI-E cards doing alternate frames and some kind of genlocking mux, you should get nearly double the performance of a single card.

                  But I guess sending the completed frame to the framebuffer of the "output" card should work; maybe they have an onboard framebuffer and DAC/DVI interface that the completed frames are sent to.
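                  The alternate-frame idea above is simple to sketch. A minimal illustration in Python (all names here are hypothetical; real AFR happens in the driver, not application code): even frames go to one card, odd frames to the other, and a mux presents them in order.

```python
def afr_dispatch(num_frames, num_cards=2):
    """Assign frame n to card (n % num_cards), round-robin, as in
    alternate-frame rendering (AFR)."""
    schedule = {card: [] for card in range(num_cards)}
    for frame in range(num_frames):
        schedule[frame % num_cards].append(frame)
    return schedule

# With two cards each handles half the frames, so the theoretical
# ceiling is ~2x throughput -- before sync and transfer overhead.
print(afr_dispatch(6))  # {0: [0, 2, 4], 1: [1, 3, 5]}
```

The catch, as the posts below note, is that the ceiling is rarely reached: frames must still be genlocked and delivered in order, and any inter-frame dependency stalls both cards.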



                  • #10
                    Gaah.

                    ~~DukeP~~ runs screamingly away, waving his hands in the air shouting: It's coming - the bad days of SLI have risen again!!!

                    ATI MAXXX anyone?
                    Voodoo?

                    Since modern cards aren't even fillrate limited, I'm _very_ interested in seeing how they will make this work.
                    Shared shader execution?

                    Not likely to work.

                    ~~DukeP~~



                    • #11
                      and have both cards process graphic commands in parallel
                      I really don't see how you're gonna gain much performance this way. I think of it as the way dual CPUs work in systems. I smell 15% MAX performance increase IF you run two of the exact same cards. (I can foresee this crashing and burning if two different cards are used. Especially since you're gonna be using DIFFERENT cards.)

                      Are any of the games gonna be compatible with this new technology? What about OS compatibility? DRIVER compatibility?

                      And IF you can put in two different cards, then what kind of performance should we really expect?

                      I'm with DukeP, I'm VERY interested to see how they pull this stunt off. I'm very skeptical about this.

                      OH!!! Imagine putting TWO power-hungry video cards in one; it's gonna be very VERY hot inside the computer case. PSU upgrade, and maybe they'll add a water cooling system, or sell it as a fancy fandangled heater to Inuits?
                      Last edited by ZokesPro; 12 May 2004, 23:59.
                      Titanium is the new bling!
                      (you heard from me first!)



                      • #12
                        or sell it as a fancy fandangled heater to Inuits?
                        LOL
                        _____________________________
                        BOINC stats



                        • #13
                          I wouldn't touch anything from Alienware anymore, not even with a 100 foot pole.

                          It might be OK if you stick with their catalogued products only, but as soon as you dare to ask for customization, such as an additional HD or a localized keyboard, you're lost.
                          Despite my nickname causing confusion, I am not female ...

                          ASRock Fatal1ty X79 Professional
                          Intel Core i7-3930K@4.3GHz
                          be quiet! Dark Rock Pro 2
                          4x 8GB G.Skill TridentX PC3-19200U@CR1
                          2x MSI N670GTX PE OC (SLI)
                          OCZ Vertex 4 256GB
                          4x2TB Seagate Barracuda Green 5900.3 (2x4TB RAID0)
                          Super Flower Golden Green Modular 800W
                          Nanoxia Deep Silence 1
                          LG BH10LS38
                          LG DM2752D 27" 3D



                          • #14
                            Originally posted by Marshmallowman

                            But if you had two PCI-E cards doing alternate frames and some kind of genlocking mux, you should get nearly double the performance of a single card.
                            Unfortunately, I doubt it. It's called AFR, and ATI tried it a little while back, dual-core on a card.
                            You get a small improvement, certainly not double.



                            • #15
                              Just read this on Yahoo news:

                              The "Video Array" system uses a customized computer
                              motherboard and what the company calls a "video merger hub"
                              to allow a computer to have two graphics cards essentially
                              working in parallel, with one card handling the top half of
                              the monitor and the other card the bottom half.

                              Special software controls the load, so that if one half is
                              not particularly intensive that graphics card can do some of
                              the work for the other half. Both graphics cards connect to
                              the hub, which in turn connects to the monitor.

                              The liquid cooling system needed to keep the computer from
                              all but melting down dissipates around 800 watts of heat, the
                              company said. But it is looking at the possibility of a 1
                              kilowatt supply to ensure the machine is adequately powered.

                              The new "ALX" systems, expected to sell for at least $4,000,
                              will deliver about a 50 percent boost to graphics
                              performance, Gonzalez said, compared to the best of the
                              current single-card graphics solutions.
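                              The load-balancing scheme the report describes - shifting the horizontal split line toward whichever card finished faster - can be sketched roughly like this. This is a hedged guess at the idea, not Alienware's actual algorithm; all names are hypothetical.

```python
def adjust_split(split_y, time_top, time_bottom, height, step=8):
    """Nudge the split scanline so the slower card gets less screen.

    split_y: scanline where the top card's region currently ends.
    time_top / time_bottom: last frame's render times (ms) per half.
    Returns the new split line, clamped inside the frame.
    """
    if time_top > time_bottom:
        split_y -= step   # shrink the top card's workload
    elif time_bottom > time_top:
        split_y += step   # shrink the bottom card's workload
    return max(step, min(height - step, split_y))

# Example: the top half took longer last frame, so the split moves up,
# handing more scanlines to the bottom card.
print(adjust_split(540, time_top=12.0, time_bottom=8.0, height=1080))  # 532
```

Run each frame, this converges toward a split where both cards finish at roughly the same time, which is presumably what the "video merger hub" software is doing.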

