AMD-ATI Plans COMBINED CPU/GPU In 2008!


  • AMD-ATI Plans COMBINED CPU/GPU In 2008!

    Great EETimes.Com news story!

    Check this out:



    Aiming to leapfrog archrival Intel Corp., Advanced Micro Devices will deliver a wide range of merged x86 CPUs with on-board graphics accelerators starting in late 2008. AMD announced its so-called Fusion program Wednesday (Oct. 25) upon the formal completion of its $5.4 billion acquisition of graphics and chip set designer ATI Technologies Inc. The merged company will ship versions of the combined processors for laptops, desktops, workstations, servers and consumer electronics devices geared for emerging markets.
    Jerry Jones

  • #2
Hope the lawsuit doesn't throw a monkey wrench into their plans.

    Kevin



    • #3
It'll be settled like all the others: a license and a tithe. There's no benefit in it going any other way.
      Dr. Mordrid
      ----------------------------
      An elephant is a mouse built to government specifications.

      I carry a gun because I can't throw a rock 1,250 fps



      • #4
Bloody ridiculous! No choice of graphics; if the graphics go west, replace the CPU (or vice versa); double the cooling problems. Sounds parallel to MS bundling media players, browsers and other unnecessary junk into the operating system. Great boost for Intel!
        Brian (the devil incarnate)



        • #5
          Erm. I think it is sensible. I like the concept.

What I see is a small integrated core to handle simple desktop/video tasks. The optional main graphics card can then be shut down to save power, and your Nvidia G80 etc. fired up when needed. IMHO good for notebooks. All you need is some kind of video switch to make it work, I imagine.

I don't think it is intended to replace high-end graphics. Plus you will probably have better access to onboard memory.
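
A minimal sketch of that video-switch idea, assuming an invented policy layer (none of these names are a real driver API): power the discrete card only above a load threshold, with hysteresis so it doesn't flap between the two.

```python
from dataclasses import dataclass

@dataclass
class GpuState:
    discrete_powered: bool = False

def select_gpu(state: GpuState, gpu_load_pct: float) -> str:
    """Pick which GPU drives the display for the current load."""
    HIGH, LOW = 60.0, 10.0  # hysteresis thresholds; values are assumptions
    if gpu_load_pct > HIGH and not state.discrete_powered:
        state.discrete_powered = True   # fire up the G80-class card
    elif gpu_load_pct < LOW and state.discrete_powered:
        state.discrete_powered = False  # shut it down to save power
    return "discrete" if state.discrete_powered else "integrated"

state = GpuState()
for load in (5, 75, 80, 4):
    print(f"load {load}% -> {select_gpu(state, load)}")
```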
          ______________________________
          Nothing is impossible, some things are just unlikely.



          • #6
Just as there were significant performance gains to be had by integrating the memory controller into the CPU (it was so efficient with DDR that the AM2 upgrade to DDR2 shows only ~5% improvement), I'm sure there are significant performance gains to be had integrating the GPU as well.

            --wally.



            • #7
For starters, easier/tighter implementation of GPGPU processing, especially if each core or die gets its own GPU. In a multi-Socket F system that could make onboard SLI possible.

              Note also that they plan on using upper scale GPU cores. If that means their X1600 Mobility core, wow. Just wow.
              Dr. Mordrid
              ----------------------------
              An elephant is a mouse built to government specifications.

              I carry a gun because I can't throw a rock 1,250 fps



              • #8
Silly question: where would the HDMI/DVI encoder go? On the motherboard, or could it be integrated into the socket?
                ______________________________
                Nothing is impossible, some things are just unlikely.



                • #9
So with a Mobility X1600 core it would not be DX10-capable... seems like a bad choice.
                  We have enough youth - What we need is a fountain of smart!


                  i7-920, 6GB DDR3-1600, HD4870X2, Dell 27" LCD



                  • #10
Yep, it would be a bad choice for Vista... but what about STBs and Apple, etc.?
                    ______________________________
                    Nothing is impossible, some things are just unlikely.



                    • #11
Probably through the socket to a motherboard connector.

IMO what we're looking at with Fusion:

If you don't need high-end graphics you've got an all-in-one solution: cheap, low-power and perfect for mini PCs and HTPCs.

If you want/need high-end graphics you install a PCIe card, and the onboard graphics becomes a physics/floating-point co-processor on the die with a very high-speed interconnect. Full-time GPGPU.
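
A toy illustration of that routing decision; there was no public Fusion API in 2006, so every name here is made up. Small batches stay on the CPU cores, large data-parallel batches go to the on-die GPU block:

```python
def cpu_step(bodies):
    # scalar fallback on the general-purpose cores
    return [(x + vx, vx) for x, vx in bodies]

def gpu_step(bodies):
    # stand-in for a data-parallel kernel on the on-die GPU block
    return [(x + vx, vx) for x, vx in bodies]

def physics_step(bodies, gpu_available, batch_threshold=1000):
    # route big, data-parallel batches to the GPU; small ones aren't worth it
    worker = gpu_step if gpu_available and len(bodies) >= batch_threshold else cpu_step
    return worker(bodies)

print(physics_step([(0.0, 1.0), (2.0, -0.5)], gpu_available=True))
```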

Socket F (no CPU pins, just 1,207 contact pads that mate with spring contacts in the socket)



News sez the merger is finished, and the press release says this:

                      "With the anticipated launch of Windows Vista, robust 3D graphics, digital media and device convergence are driving the need for greater performance, graphics capabilities, and battery life," said Phil Hester, AMD senior vice president and chief technology officer. "In this increasingly diverse x86 computing environment, simply adding more CPU cores to a baseline architecture will not be enough. As x86 scales from palmtops to petaFLOPS, modular processor designs leveraging both CPU and GPU compute capabilities will be essential in meeting the requirements of computing in 2008 and beyond."

                      NOTE: GPGPU

                      Fusion processors are expected in late 2008/early 2009, and the company expects to use them within all of the company's priority computing categories, including laptops, desktops, workstations and servers, as well as in consumer electronics and solutions tailored for the unique needs of emerging markets.
                      Dr. Mordrid
                      ----------------------------
                      An elephant is a mouse built to government specifications.

                      I carry a gun because I can't throw a rock 1,250 fps



                      • #12
                        FUSION - More than we thought



                        Turns out it's not only integrated graphics w/GPGPU but a move towards rapid modular processor design. Pick your modules according to the task, sometimes including >1 GPU if need be.

Bottom line: designer CPUs

                        Cool Fusion: AMD's plan to revolutionise multi-core computing

                        AMD's Fusion CPUs will not be mere system-on-a-chip products, the chip company's chief technology officer, Phil Hester, insisted yesterday. Instead, the processor will be truly modular, capable of forming the basis for a range of application-specific as well as generic CPUs, from low-power mobile chips right up to components for high-performance computing systems.

                        Start equipping CPUs with their own graphics cores and it certainly sounds like you'll have an SoC part on your hands. Not so, says Hester. Fusion will not just link monolithic components on a single die - the traditional SoC architecture - but will break these components down into more basic parts that can be mixed and matched as needed then linked together using AMD's Direct Connect technology.

                        According to Hester, Fusion's roots lie much further back than discussions between AMD and ATI about graphics technology more tightly connected to the CPU, let alone the more recent takeover negotiations. When 'Hammer', AMD's original 64-bit x86 architecture, was in development, AMD designed the single-core chip to be able to be equipped with a second core as and when the market and new fabrication technologies made that possible. Going quad-core will require a suitably re-tailored architecture, as will moving up to eight cores and beyond to... well, who can say how many processing cores CPUs will need in the future?

                        Or, for that matter, what kind of cores? According to Hester, this is a crucial question. AMD believes building better processors will soon become a matter not of the number of cores the chip contains but what specific areas of functionality each of those cores deliver.

                        To build such a chip, particularly if you want to be able to change the mix of cores between different products based on it, you need a modular architecture, he says. So Fusion will break down the chip architecture into its most basic components: computational core, clock circuitry, the memory manager, buffers, crossbar switch, I/O, level one, two and three cache memory units, HyperTransport links, virtualisation manager, and so on - then use advanced design tools to combine them, jigsaw-fashion, into chips tailored for specific needs. Need four general purpose cores, three HT links but no L3 cache? Then choose the relevant elements, and let the design software map them onto the die and connect them together.
[Image: Example Fusion core components and 2 possible designs]
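
To make the pick-your-modules idea concrete, here's a toy parts-list model; the module names come from the article's list, while the validation logic is invented for illustration:

```python
# Module names taken from the article; everything else is hypothetical.
CATALOG = {"cpu_core", "gpu_core", "ht_link", "l1_cache", "l2_cache",
           "l3_cache", "memory_manager", "crossbar", "io", "virtualisation"}

def design_chip(**modules):
    """Check a parts list against the catalog and return the design."""
    unknown = set(modules) - CATALOG
    if unknown:
        raise ValueError(f"unknown modules: {unknown}")
    return modules

# "Four general purpose cores, three HT links but no L3 cache":
print(design_chip(cpu_core=4, ht_link=3, l2_cache=4, memory_manager=1))
```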

                        Dr. Mordrid
                        ----------------------------
                        An elephant is a mouse built to government specifications.

                        I carry a gun because I can't throw a rock 1,250 fps





                          • #14
                            And...only for Windows?...



                            • #15
                              Originally posted by Nowhere
                              And...only for Windows?...
Both AMD and Intel have been supportive of Linux. ATI has been much less so than Nvidia, by a long shot. Although most Linux purists would insist that binary-only drivers are not real support. It'll be interesting to see what AMD makes ATI do.

But in my Linux image-processing application (two capture cards, each at 640x480, 30 fps) using the Nvidia binary drivers (Ubuntu 6.06, GeForce 5500), the X server only uses 7-10% CPU on an AMD 4800+, and my app uses about 35%. The X.org nvidia driver is broken on Ubuntu, and my app won't run, as the X server takes 85+% CPU when running my code. But with Fedora Core 5 the X.org nvidia driver works, taking 30-40% CPU. Obviously I choose to run the much better-performing Nvidia proprietary driver so my app has room to grow.
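
For scale, a back-of-envelope figure on the capture load described above; the 16-bit pixel format is an assumption, the post doesn't state it:

```python
width, height, fps, cards = 640, 480, 30, 2
bytes_per_pixel = 2  # assumed YUYV 4:2:2; the post doesn't give the format
rate_mb_s = width * height * fps * bytes_per_pixel * cards / 1e6
print(f"~{rate_mb_s:.1f} MB/s of raw video for the X server to move")  # ~36.9
```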

                              --wally.

