Last year there was great interest in an ATI prototype video compressor that ran not on the CPU but in the GPU's pixel shaders on ATI X1000-series graphics boards.
That little app could encode MPEG video up to 5x faster than most AMD or Intel CPUs. It was called the Avivo Transcoder. Impressive....and AMD took notice at last year's Computex in Taipei.
Avivo Transcode article from that time:
This kind of retasking, not only by ATI but by others as well, gave rise to a new term: the GPGPU.
GPGPU
General-Purpose Computing on Graphics Processing Units (GPGPU, also referred to as GPGP and, to a lesser extent, GP^2) is a recent trend in computer science that uses the Graphics Processing Unit to perform the computations rather than the CPU. The addition of programmable stages and higher-precision arithmetic to the GPU rendering pipeline has allowed software developers to use the GPU for non-graphics-related applications. Because of the extremely parallel nature of the graphics pipeline, the GPU is especially useful for programs that can be cast as stream-processing problems.
http://en.wikipedia.org/wiki/GPGPU
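To make the stream-processing idea concrete, here is a minimal sketch of the pattern. CUDA is used purely as illustration (the work described in this post predates CUDA; ATI's tools ran through pixel shaders), and every name in the sketch is my own choice:

```
// Minimal stream-processing sketch: y[i] = a * x[i] + y[i] (SAXPY).
// Each GPU thread handles one element of the stream independently;
// this is the data-parallel pattern that GPGPU work exploits.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                                  // guard threads past the end
        y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory, for brevity
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // one thread per element
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);                // expect 4.0
    cudaFree(x); cudaFree(y);
    return 0;
}
```

The whole trick is in the launch line: thousands of threads all run the same small program, each owning one element of the data stream.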
Now comes the merger.
AMD, ATI and the GPU
Breaking the monopoly
By Guy Kewney, Newswireless.net
Published Monday 24th July 2006 13:27 GMT
Comment "We may lose business on Intel boards, but we will break the Intel monopoly." With these words, AMD's CFO Bob Rivet announced the takeover of graphics chip maker, ATI, offering a future of joined-up shared processing, split between CPU and GPU.
The deal, announced today, goes back some time. Last year, at Computex in Taipei, it was apparent that ATI and AMD were falling in love with the idea of using the powerful graphics processor to run computer programs, not just for animating video.
At that show, software developers were invited to the launch of the new dual-core AMD processors, with prototype applications that ran, not on the x86 central processor, but on the graphics chip. Examples included video editors which could handle the output stream live, in real time.
This concept is probably beyond the grasp of the typical financial analyst, and in the short term, the City and Wall Street will probably panic, seeing only the probability that ATI will lose customers who make Intel motherboards, coupled with the possibility that end-users who want Nvidia graphics will have to buy Intel.
"In 2008 and beyond, AMD aims to move beyond current technological configurations to transform processing technologies, with silicon-specific platforms that integrate microprocessors and graphics processors to address the growing need for general-purpose, media-centric, data-centric and graphic-centric performance," said the official statement.
And the GPU isn't just for drawing pictures. Talk to any crypto expert and you'll find they are all trying to find ways of harnessing that extraordinary power. To quote Wikipedia: "Recent developments in GPUs include support for programmable shaders which can manipulate vertices and textures with many of the same operations supported by CPUs, oversampling and interpolation techniques to reduce aliasing, and very high-precision colour spaces. Because most of these computations involve matrix and vector operations, engineers and scientists have increasingly studied the use of GPUs for non-graphical calculations."
And: "Because all these applications exceed an actual GPU's usage target, a new term, GPGPU is usually employed to describe them. While GPGPUs are the same chips as GPUs, there is increased pressure on manufacturers from "GPGPU users" to improve hardware design, usually focusing on adding more flexibility to the programming model."
The black hole of the processor has, at last, started to attract the GPU and the GPGPU. AMD feels that it has to move, now, before it becomes part of Intel, rather than part of a generic processor platform. If it is right, then the question of "how much did you pay for ATI?" is irrelevant. It may be a question of "How can you expect to survive, without ATI?"
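The "matrix and vector operations" the quote mentions map onto the GPU the same way. A naive sketch, again in CUDA for illustration only (a tuned library would tile the matrix through shared memory; this version simply gives each row to one thread):

```
// Naive dense matrix-vector product out = M * v (M is rows x cols, row-major).
// One thread walks one row of M and produces one element of out.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void matvec(const float *M, const float *v, float *out,
                       int rows, int cols) {
    int r = blockIdx.x * blockDim.x + threadIdx.x;
    if (r >= rows) return;                 // excess threads do nothing
    float sum = 0.0f;
    for (int c = 0; c < cols; ++c)
        sum += M[r * cols + c] * v[c];
    out[r] = sum;
}

int main() {
    const int rows = 512, cols = 512;
    float *M, *v, *out;
    cudaMallocManaged(&M, rows * cols * sizeof(float));
    cudaMallocManaged(&v, cols * sizeof(float));
    cudaMallocManaged(&out, rows * sizeof(float));
    for (int i = 0; i < rows * cols; ++i) M[i] = 1.0f;
    for (int i = 0; i < cols; ++i) v[i] = 1.0f;

    matvec<<<(rows + 127) / 128, 128>>>(M, v, out, rows, cols);
    cudaDeviceSynchronize();

    printf("out[0] = %f\n", out[0]);       // expect 512.0
    cudaFree(M); cudaFree(v); cudaFree(out);
    return 0;
}
```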
Applications
The following are some of the non-graphics areas where GPUs have been used for general-purpose computing:
* Physically based simulation - Game of Life, Cloth simulation, Incompressible fluid flow by solution of the Navier-Stokes equations (see the Game of Life sketch after this list)
* Segmentation - 2D and 3D
* Level-set methods
* CT reconstruction
* Fast Fourier Transform
* Tone mapping
* Sound Effects Processing
* Image/Video Processing
* Raytracing
* Global Illumination - Photon Mapping, Radiosity, Subsurface Scattering
* Geometric Computing - Constructive Solid Geometry (CSG), Distance Fields, Collision Detection, Transparency Computation, Shadow Generation
* Neural Networks
* Database operations
* Lattice Boltzmann Method
* Cryptography
and more....
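As promised above, here is what the first item on that list can look like in practice: one generation of Conway's Game of Life, with one GPU thread per cell. A hedged sketch in CUDA (illustrative only; the grid size, the glider seed, and all names are my own choices):

```
// One generation of Conway's Game of Life on an n x n toroidal grid.
// Each thread updates a single cell; double-buffering (in -> out) avoids
// read/write races between neighbouring threads.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void life_step(const unsigned char *in, unsigned char *out, int n) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= n || y >= n) return;

    int neighbours = 0;
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            if (dx == 0 && dy == 0) continue;
            int nx = (x + dx + n) % n;     // wrap around at the edges
            int ny = (y + dy + n) % n;
            neighbours += in[ny * n + nx];
        }

    // Standard rules: a live cell survives with 2 or 3 neighbours,
    // a dead cell is born with exactly 3.
    unsigned char self = in[y * n + x];
    out[y * n + x] = (neighbours == 3) || (self && neighbours == 2);
}

int main() {
    const int n = 256;
    unsigned char *a, *b;
    cudaMallocManaged(&a, n * n);
    cudaMallocManaged(&b, n * n);
    cudaMemset(a, 0, n * n);
    cudaMemset(b, 0, n * n);

    // Seed a glider near the top-left corner.
    a[1 * n + 2] = a[2 * n + 3] = a[3 * n + 1] = a[3 * n + 2] = a[3 * n + 3] = 1;

    dim3 block(16, 16), grid((n + 15) / 16, (n + 15) / 16);
    for (int step = 0; step < 100; ++step) {
        life_step<<<grid, block>>>(a, b, n);
        cudaDeviceSynchronize();
        unsigned char *t = a; a = b; b = t;   // swap buffers
    }

    int alive = 0;
    for (int i = 0; i < n * n; ++i) alive += a[i];
    printf("live cells after 100 generations: %d\n", alive);  // a lone glider stays at 5
    cudaFree(a); cudaFree(b);
    return 0;
}
```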
My take is that, with this tech coming on strong not only in scientific apps but in pro and consumer apps (i.e., compressors like the Avivo Transcoder and, later, games, editing software, 3D, etc.), AMD/ATI want to get an open-source GPGPU platform into the market before Intel comes out with a proprietary one that would require licensing.