
Radeon 9700 & 9000 launch


  • #46
Well, quoting from "ATI to launch RV250 and R300 chips on July 17":

    "However, considering the chip’s (R300) overly top-end market positioning and its pin counts, which reaches as high as 1,176, they plan to observe the market acceptance before entering mass card production."

So ATI might be having yield problems with the R300 chip, and they aren't planning on making a whole lot of 9700 cards until they know people are really willing to pay for them.

    I guess we will see just how hard it will be to get a Radeon 9700 if you want one.



    • #47
Gah... ATI's site is being crippled by hits. Looks like they brought their very own DoS attack upon themselves

      -[Ch]ams



      • #48
        Originally posted by GT98

Last time I checked (well, in the USA), the expected pricing was supposed to be the same as the Parhelia's: $399.
Someone already explained this one.

Hey, I'm slightly disappointed in the Parhelia myself, but you have to ask yourself this too: does it really matter if you get 100fps with a Parhelia or 350fps with a Radeon 9700 when the max refresh rate of a monitor at high res is usually 85Hz, which translates into 85fps? Also, it's not totally fair comparing the P and the R9700, since the P is more or less a transition card from DX8 to DX9, and it also meets all the specs for AGP usage without an external power supply. I think that external supply might scare off some customers (i.e. your major PC manufacturers). But time will tell.
External power supply? Who cares if it uses NUCLEAR POWER... it's a HIGH-END card, and only serious gamers will buy it. So it's just one more wire to plug in, nothing more than that.

And it's funny that 3.5x the power suddenly doesn't mean anything, when before the Parhelia was announced you guys gloated here about how the Parhelia was going to kill the GF4. You can always use higher resolution and better detail settings, AA and AF. And if it delivers 150fps, great... no more dips to 20fps if you get 80fps MIN. That's what I want: 30-60 MIN FPS. I couldn't care less about the MAX or AVERAGE FPS; only the minimum counts. UT2003 is released about the same time as the R300, so there's a game that REALLY shows what it's all about.
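The minimum-vs-average FPS point can be put as a quick back-of-envelope sketch; the frame times below are invented purely for illustration, not benchmark data:

```python
# Why minimum FPS matters more than the average: a one-second slice of
# gameplay with made-up frame times (milliseconds). Mostly fast frames,
# plus a handful of bad stutters.
frame_times_ms = [10] * 55 + [50] * 9  # 55 smooth frames, 9 stutters

# Average FPS: total frames over total time.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# Minimum FPS: the instantaneous rate during the worst single frame.
min_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.0f} fps")
# → average: 64 fps, minimum: 20 fps
```

The average looks perfectly playable while the stutters still dip to 20fps, which is exactly the gap the post is complaining about.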

And IMHO it's a TOTALLY fair comparison; if you're a Matrox fan you just won't like the result, but that's another matter...

A normal comparison of two HIGH-end cards, nothing more, nothing less. The Matrox card is final and ATI's cards are alphas/betas... so expect things to IMPROVE from here.

        Pe-Te

For the record, I currently own an R8500LE [285/295] and I'm VERY happy with it. It cost ~HALF of what those GF3s did, and we can all see who's on top now. The drivers work just great, and the frequent flow of drivers [official/leaked] will address any problems that might arise with NEW games.



        • #49
Seeing the lack of a second DVI connector makes me think they didn't spend too much time improving their dual-display features. I would still like to know if they will have independent DualHead up to at least 1600x1200 per display in Win2K (my main requirement). I would also like to know how this video passthrough will affect the need for hardware overlays on both displays.

It's interesting to see flip-chip packaging being used for GPUs. This will allow higher clock speeds, but it's going to be the end of replacing the stock HSF: since the core is exposed and the HSF will be epoxied onto the chip, there is a high chance of damaging the core when removing the HSF. Did anyone else notice the lack of mounting holes on the 9700 board?

          Looks like ATI will be in the same boat as Matrox as far as having to build their drivers from scratch.
          I should have bought an ATI.



          • #50
            Improve? Not necessarily. Those alpha Parhelias run faster than the retail ones.
            Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



            • #51
The performance of the 9700 is what I expected out of the Parhelia. The hype surrounding the P, along with the fact that Matrox made such a big deal out of it, ruined the card for me. I fully expected "next gen" performance out of the P and it wasn't there. Since I mostly play games, it just wasn't worth the $400. Looks like I'll get the 256MB 9700 AIW when it comes out. I am really looking forward to HDTV support so I can play my games on my 40" widescreen. Too bad the 9700 doesn't support surround gaming
              Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.



              • #52
                Originally posted by PeTe
External power supply? Who cares if it uses NUCLEAR POWER... it's a HIGH-END card, and only serious gamers will buy it. So it's just one more wire to plug in, nothing more than that.
Well, you have to remember not everyone is a gamer... the Parhelia offers a lot to people who use their video cards/PCs as more than a glorified console game machine.


And it's funny that 3.5x the power suddenly doesn't mean anything, when before the Parhelia was announced you guys gloated here about how the Parhelia was going to kill the GF4. You can always use higher resolution and better detail settings, AA and AF. And if it delivers 150fps, great... no more dips to 20fps if you get 80fps MIN. That's what I want: 30-60 MIN FPS. I couldn't care less about the MAX or AVERAGE FPS; only the minimum counts. UT2003 is released about the same time as the R300, so there's a game that REALLY shows what it's all about.
Well, with the paper specs we had at the time and the assumption of a higher clock speed, the Parhelia should have outperformed the GF4 with no problem. But with the clock speed being lowered for whatever reason, that isn't really the case unless you're running at higher resolutions with 16x FAA on. Also, why did Epic choose to run UT2003 on the Parhelia at E3? If it performed so horribly, why did they use it? Something is not adding up there.
                Why is it called tourist season, if we can't shoot at them?



                • #53
                  Originally posted by Helevitia
The performance of the 9700 is what I expected out of the Parhelia. The hype surrounding the P, along with the fact that Matrox made such a big deal out of it, ruined the card for me. I fully expected "next gen" performance out of the P and it wasn't there. Since I mostly play games, it just wasn't worth the $400.
I don't think Matrox made such a big deal out of it so much as the online community did. Coupled with it being Matrox's first real 3D product since the G400... well, expectations were high for it. With the bar set so high by everyone, it was damned from the start no matter how good it was.
                  Why is it called tourist season, if we can't shoot at them?



                  • #54
If I spend $400 on a card, I want it to last as long as possible. So if a card gets 300+ FPS, then it might last me 4 years; if it only gets 150 FPS, then I might have to replace it in 2 years.

That's what I get from FPS scores: a longevity rating.
Some of you might say that the Parhelia will improve with future games when they start using its features. But the R300 also has those features, and so it will proportionally outperform the Parhelia as time goes on.
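The longevity reasoning above can be sketched as arithmetic. The 75fps playability floor and the assumption that game demands roughly double every two years are mine, purely for illustration:

```python
import math

def years_of_service(fps_today, target_fps=75, doubling_years=2):
    """Years until a card's framerate, halved every `doubling_years` as
    game demands grow, drops below the target. Illustrative model only."""
    if fps_today <= target_fps:
        return 0.0
    return doubling_years * math.log2(fps_today / target_fps)

print(f"{years_of_service(300):.1f} years")  # 300 fps card → 4.0 years
print(f"{years_of_service(150):.1f} years")  # 150 fps card → 2.0 years
```

Under those assumptions the 300fps card lasts twice as long as the 150fps one, which is the "4 years vs 2 years" split in the post.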

And most of you agree that gaming matters. If you just cared about 2D, then you would keep using your G400s.

I love Matrox, but like Helevitia said, I was expecting the P to perform like this.



                    • #55
It will be interesting to see what the final shipping cards' clock speed will be. Also, hopefully we'll get a few screenshots to look at.
Anand's review also shows how hard it is to shake off a bad rep.
                      Chief Lemon Buyer no more Linux sucks but not as much
                      Weather nut and sad git.

                      My Weather Page



                      • #56
                        Originally posted by Buback
If I spend $400 on a card, I want it to last as long as possible. So if a card gets 300+ FPS, then it might last me 4 years; if it only gets 150 FPS, then I might have to replace it in 2 years.
That is spot on with my thoughts. I'm not even thinking of buying a Parhelia at its current price. I'd expect any card costing $400 to last 2 to 3 years before I considered it well past its use-by date (very much like my Max has done!). The Max managed to mix it with the big boys back then, and Matrox did a stunning job keeping the card well supported and running quicker.

The Parhelia only lands in the middle ground, which is a minefield of low-priced units (ATI 8500/GF4-4200) that manage to outpace the big P. Being outpaced is one thing; being WAY outpriced and outpaced is another thing entirely.

IMO, the Parhelia has ONE killer feature: 16x FAA. It's nothing short of stunning (going by what I've seen on the net so far). However, it's also the only extra feature that'll be used by the majority of its users. As long as the required fixes are done by Matrox, it will sell quite a few cards.

Sure, most of us dream of a TH setup, but not that many have the funds or desk space for such a dream. Excellent it may be; practical overall it is not.

The most the Parhelia will sensibly last is the current round of 3D game engines (including UT2K3). This should see 18 months of service, but past that point things will probably have moved on a tad too much for the Parhelia to offer 'smooth' framerates at 'expected' resolutions. The ATis could well offer a longevity option that should be considered a 'feature'. True, ATi's drivers haven't been the best, but they've improved - old stories are hard to put down!

If you're not into FPSes, then the Parhelia is still a compelling purchase, albeit too dear for a lot of people. The ATis would seem better value, though.
                        Cheers, Reckless



                        • #57
This Radeon is really stealing the spotlight...

The only chance of me buying a Parhelia now is if the 64MB version is at least 100 bucks cheaper and the clocks remain the same (and I'd look for some overclocking capability, mostly on the core).

The R9700 is 400 bucks now, but soon there'll be a 9500, somewhere in the $200-300 range... and for sure it'll kick some major arse.
"Dadinho my @ss, my name is Zé Pequeno now" - City of God
A64 @ 2.25 + 1GB + GT6600



                          • #58
                            It's hard to judge the "longevity" of cards. The G400 certainly seemed to hang around for a long time.
                            Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                            • #59
Why does everyone still rip on ATI's drivers? I've been running an 8500 for a few months now. While the original drivers back in the fall may have been bad news, nowadays this thing is near perfect as far as I can tell. It plays everything I throw at it, and plays it extremely fast without rendering problems.

If the Radeon 9700 can kick the asses of all the current cards with what are basically beta drivers, and do it without problems, I'd say ATI has cured its driver disease.

What really drives me up the wall is when I read Nvidiots raving about their drivers. All I know is my G400 and Radeon could play classic NFS4 while the fantastic GeForces couldn't. I didn't enjoy digging through a dozen GeForce driver revisions to get it going (I had to run the 5.25 drivers for it to work right). I guess they were optimized for Quake 3 or something. Ha.

Nvidia's marketing dept has stunningly turned many PC users into raving idiots with regard to their drivers. They are not remotely perfect. I think ATI has finally seen this and has now branded its drivers too. It's corny, but just listen to and read what people say: "Oh, my drivers are Catalyst", blah blah. Getting all dramatic (or is it romantic?) with video card drivers works!



                              • #60
                                Originally posted by Fogel
I don't think the GF4 gets 100FPS at 1600x1200 with 4x FSAA. I think it is more like 30-40FPS for the GF4 and 75-100FPS for the Radeon 9700. The Radeon 9700 seems to have AA as effective as the Parhelia's, but maybe not the same IQ.
I didn't say it did; I was just speaking in general. I was also picking up on the point you made: I doubt the 9700 offered 350% of the performance. It's actually 250%, then, with a 150% lead over the Ti4600.
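The "% of" versus "% lead" distinction in the paragraph above is easy to trip over; in round illustrative numbers (not real benchmark figures):

```python
# "% of" vs "% lead": 250 fps against a 100 fps baseline is 250% OF the
# baseline, but only a 150% LEAD over it. Round numbers for illustration.
baseline_fps = 100
new_fps = 250

percent_of = 100 * new_fps / baseline_fps                      # 250.0
percent_lead = 100 * (new_fps - baseline_fps) / baseline_fps   # 150.0

print(f"{percent_of:.0f}% of baseline, {percent_lead:.0f}% lead")
# → 250% of baseline, 150% lead
```

A card at "250% of" a rival is the same thing as a "150% lead" over it, which is the correction being made in the post.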
Originally posted by FanBoy™
does it really matter if you get 100fps with a Parhelia or 350fps with a Radeon 9700 when the max refresh rate of a monitor at high res is usually 85Hz, which translates into 85fps? Also, it's not totally fair comparing the P and the R9700, since the P is more or less a transition card from DX8 to DX9
                                The point is UT2003. This has been covered many times.

So, what do we compare the Parhelia with, then? The cards that were announced 6 months before it, or those announced 2 months after? Or no cards, because we can only compare it against DX8 cards (unfair to them) or DX9 cards (unfair to the Parhelia)?

Basically, the Parhelia should be compared against everything. It's meant to be an all-round card, so it should be judged on its poor gaming performance compared to the new cards, as well as on the extra features it has over them.

                                PeTe:
The Matrox card is final and ATI's cards are alphas/betas
But what is the purpose of those early cards? Matrox seems to be targeting stability/reliability; ATi is perhaps targeting performance. Maybe more reliable drivers from them might slow things down, while Matrox can optimise. All supposition, though. (And just read Wombat's post.)

                                P.
                                Meet Jasmine.
                                flickr.com/photos/pace3000

