The AMD + ATI merge

  • The AMD + ATI merge

    From your point of view, is this merger so far a good or a bad thing? For the customers, for AMD, for ATI, and for the industry in general?
    Poll: 26 votes total
      • Good, good, it's good! - 8 votes (31%)
      • Nah, not so good, probably a bad move. - 6 votes (23%)
      • Too soon for me to decide / I don't really care - 12 votes (46%)
    "For every action, there is an equal and opposite criticism."

  • #2
    I think it's good, but AMD needs to recover financially from the effects of the merger. That will take time; once the dust settles you will see some cool stuff from AMD.

    I'm optimistic.

    Cheers,
    Elie

    • #3
      Good thing, which we'll really start seeing with Fusion.
      Dr. Mordrid
      ----------------------------
      An elephant is a mouse built to government specifications.

      I carry a gun because I can't throw a rock 1,250 fps

      • #4
        It's good if we continue to have real competition for Intel and NVIDIA across the range. ATI/Sapphire-made boards are as close as we'll get to Matrox quality in a gaming machine, so I'd hate to be resigned to NVIDIA as the only choice.

        Fusion? (Edit: Thanks az)
        Last edited by Pace; 30 August 2007, 04:37.
        Meet Jasmine.
        flickr.com/photos/pace3000

        • #5
          Bad for Canada - the profits will be going elsewhere now. Bad for the consumer - it's gonna be harder to mix and match CPU and GPU brands now. There is less incentive for NVIDIA to make AMD-based chipsets, and less incentive for Intel to make CrossFire-compatible chipsets. I hope I'm wrong (and I know some stubborn folk here will tell me I am).
          Q9450 + TRUE, G.Skill 2x2GB DDR2, GTX 560, ASUS X48, 1TB WD Black, Windows 7 64-bit, LG M2762D-PM 27" + 17" LG 1752TX, Corsair HX620, Antec P182, Logitech G5 (Blue)
          Laptop: MSI Wind - Black

          • #6
            Fusion = CPU-integrated graphics core.
            There's an Opera in my macbook.

            • #7
              AMD needed a partner for Fusion development, and ATI could very well need their resources... I don't believe anyone could say ATI influenced short-term AMD CPU plans; the question is more how AMD management influenced ATI. I guess the R600 was far along in development before the merger, and that is one "failure" ATI has to take the blame for. Maybe AMD can be blamed for not rushing the R650 to market and for keeping too quiet about the R700.
              I hope AMD has future plans for ATI and didn't buy them only for Fusion and motherboard chipset development.

              Will NVIDIA give up on the revenue from AMD chipsets? If they do we'll be left with ATI; maybe VIA will come up with something, and SiS will surely keep building entry-level chipsets. Should I bring up that rumor about NVIDIA making a CPU? Who knows - ATI and AMD merged, Intel intends to enter the graphics market, so what's to say NVIDIA won't build a CPU in the future?

              • #8
                AMD could help ATI get power consumption under control, and also help them with advanced processes and good contacts with external fabs (for volume).
                There's an Opera in my macbook.

                • #9
                  AMD paid far too much for ATI (good for ATI shareholders, bad for AMD shareholders), putting themselves at great financial risk. _IF_ they survive without needing to sell off too much, they will probably end up with a much smaller R&D budget than before. So all in all, I think it's very bad for AMD shareholders and bad for consumers, as I think this leads to Intel dictating prices and when/how to introduce new tech (i.e. milking the consumer), as they did before the AMD K7.

                  • #10
                    |Mehen|, interestingly (at least here) NVIDIA chipsets still totally own the AMD platform...and they seem more and more popular on the Intel side too...
                    And I really can't wait for AMD's Fusion/Intel's counterpart...I think I might even be able to hold off upgrading until it arrives (still on an Athlon XP 1700+ here...though low-end Core chips are tempting). I really wonder how Fusion etc. will influence the market...lower prices are a given, I guess... But how popular will this model of computer architecture be, with all its pros and cons? How will it influence the writing of 3D-intensive apps? Will we even need a separate GFX card in the more distant future?

                    • #11
                      Originally posted by Nowhere
                      Will we even need a separate GFX card in the more distant future?
                      Yes, if you want to game. Broad, fast memory buses on mainboards would be too expensive, and the heat of a fast CPU and GPU on one chip would be very hard to cool.
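                      Roughly, assuming 2007-era parts (the specific memory and board figures below are illustrative assumptions, not from this thread):
                      Code:
                      # Peak bandwidth (GB/s) ~= bus width in bits / 8 * effective data rate in GT/s
                      def bandwidth_gb_s(bus_bits, rate_gt_s):
                          return bus_bits / 8 * rate_gt_s

                      # Dual-channel DDR2-800 system memory (what a CPU-integrated GPU would have to share):
                      print(bandwidth_gb_s(128, 0.8))    # ~12.8 GB/s
                      # A high-end card's local memory, e.g. 512-bit GDDR3 at ~1.66 GT/s:
                      print(bandwidth_gb_s(512, 1.656))  # ~106 GB/s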
                      There's an Opera in my macbook.

                      • #12
                        Ahh, yes, but I'm wondering more whether we can expect some sort of paradigm shift in how games are written if most people end up with Fusion/etc. Perhaps different coding techniques... (separate GFX cards will indeed have far greater memory bandwidth...but a far, far slower link to the CPU; there's an article on Ars Technica comparing the PS2 to the PC that touches on similar issues). Rough link-versus-local-memory numbers are sketched below.

                        As far as cooling goes...perhaps look at it this way: instead of two smaller coolers you'd have one larger one, and I doubt it would be more than twice as expensive (bonus: larger coolers are usually quieter?)
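
                        As a sketch of that link-versus-local gap, using PCIe 1.x figures and an illustrative 2007 high-end board (my assumptions, not the poster's):
                        Code:
                        # PCIe 1.x moves ~250 MB/s per lane per direction, so an x16 slot gives ~4 GB/s each way to the CPU.
                        pcie_x16_gb_s = 16 * 0.25
                        # The card's own memory, e.g. 384-bit GDDR3 at 1.8 GT/s:
                        local_memory_gb_s = 384 / 8 * 1.8
                        print(pcie_x16_gb_s, local_memory_gb_s)  # ~4.0 GB/s over the slot vs ~86.4 GB/s on-board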

                        • #13
                          AFAIK, Fusion won't be in mobile chips for a while...
                          There's an Opera in my macbook.

                          • #14
                            Wiki (yeah, I know) claims something entirely different...
                            http://en.wikipedia.org/wiki/AMD_Fus...ry_Information (second point; also further down low power version is mentioned)

                            PS. Found something just now...there's also "Z-RAM" technology mentioned there...it seems it will be used for L3 cache, with ~5 (?) times the density of current cache memory. Might help memory issues, I guess?

                            And it seems Intel will also jump on the bandwagon... http://en.wikipedia.org/wiki/Nehalem_(CPU_architecture)
