Do believe the hype?


  • #46
    Tom Pabst didn't write that "review". It was written by the editor of an NVIDIA fan-site.

    Dr. Mordrid
    Dr. Mordrid
    ----------------------------
    An elephant is a mouse built to government specifications.

    I carry a gun because I can't throw a rock 1,250 fps

    Comment


    • #47
      Hi there.

      Will MATROX sell the Parhelia chip to third parties the way ATI and NVidiot do? I can remember that MATROX tried to do so in the past with the G400 (Gigabyte/external BIOS on the MoBo?).
      Work-Box:P4C3.0GHz; DFI LAN Party875Pro, GeiL Golden Dragon 512MB PC3500 DDRAM, ==>>PARHELIA 128+ZALMAN HEATPIPE MOD<<==, 2 x WD360 Raptor 36Gig RAID 0, MAXTOR 6Y080L0 80Gig, Plextor PX-W4824A, Toshiba SD-M1612, 2x BenQ FP767 17"TFT

      MEDIA-BOX:P4C3,2GHz; ASUS P4P800 Deluxe, GeiL Golden Dragon 512MB PC4000 DDRAM, Radeon9800XT, 2xHitachi HDS722512-VLSA80; RAID0, Plextor PX-116A, PX-708A, Plextor Premium/T3B

      Comment


      • #48
        Originally posted by lAmerZ
        Hi there.

        Will MATROX sell the Parhelia chip to third parties the way ATI and NVidiot do? I can remember that MATROX tried to do so in the past with the G400 (Gigabyte/external BIOS on the MoBo?).
        They have said they will not do this again.
        Cheers, Reckless

        Comment


        • #49
          I'm well impressed with the features / screenies / movies that I've seen of the card so far.

          It's given me a little hope after the disappointment of seeing Nvidia dominate the market so much (ATI is starting to challenge more now, but for a while it was bad..).

          I currently have a GF2mx and I'm eyeing up the GF4 Ti4200 .. I've been holding back waiting for Parhelia but I don't know if I can justify the cost..

          It looks superb, but when you translate $400 into English pounds, more often than not you're looking at £400... optimistically I'd say it would be £350 (though certainly nowhere near the direct currency conversion of £275).
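
          As a rough check on those numbers (my sketch; the ~1.45 $/£ rate and the 17.5% UK VAT are my assumptions, not figures from this thread), most of the gap between the direct conversion and the shelf price is tax plus retail margin:

          ```python
          # Rough sketch: why a $400 US card lands near 350 GBP in UK shops.
          # Assumed figures (circa 2002): exchange rate and UK VAT rate.
          USD_PER_GBP = 1.45   # assumed exchange rate
          UK_VAT = 0.175       # UK VAT at the time

          usd_price = 400.0
          direct = usd_price / USD_PER_GBP   # ~276 GBP, the "direct conversion"
          with_vat = direct * (1 + UK_VAT)   # ~324 GBP
          retail = with_vat * 1.08           # ~350 GBP with an ~8% retail margin

          print(f"direct: {direct:.0f}  +VAT: {with_vat:.0f}  retail: {retail:.0f}")
          ```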

          I'd wait for a 64MB Parhelia card if they could get the price below £200...

          Comment


          • #50
            Well, I have a Gainward Ti4600, and it's the first nV card that I can actually say has very good image quality. I don't know if this applies to all of Gainward's 4600's or not, just that this one is finally right. Add to that great support for everything coming out for at least the next six months (in the way of games), as well as great speed, plus the fact that I had to upgrade my video card by the end of April / beginning of May or not have the cash to do so for another year, and you can see why I won't have a Parhelia for a while.

            If the card had made it out just a bit sooner, I wouldn't have bought the 4600. I don't expect major driver bugs with the Parhelia, though I expect there to be some, especially where new features are concerned (i.e., the ones not in other chips besides Parhelia at the moment). In any case, this card should be fine for everything out now and in the next year or so.

            In fact, based on what I've read so far, while it may not have the highest framerates, I very much expect it will continue the tradition that Matrox established with the G400 series (I have had a G400Max since they first came out): not the highest framerates, nor the highest overall averages, but not the lowest dips either. Basically, I think the Parhelia will be very well suited to maintaining a stable overall framerate, even with most features enabled. If I could afford one right now, I'm certain I'd buy it for that fact alone (now that there's finally another M card that should be able to keep up with all other modern accelerators again).

            For those of you that questioned the need for high framerates, 30 doesn't cut it for most games I play. I LIKE 3D first-person shooters mostly, and while 60fps is considered smooth by most, I've found that even with my 4600 maintaining at least 61fps minimum, 150+ maximum, and 90 average in UT at 1024x768x32-bit color, 4xAA, 8-tap anisotropic filtering, all options on, the gameplay isn't always as smooth as I'd like. Why? My refresh rate is set at 75Hz, so when the card dips below that, the game perceivably stutters (most noticeable when you're in the middle of a hot fight, where it can make all the difference in determining which end of the weapon you find yourself on). In my opinion, the accelerator needs to maintain at least 75fps in this type of game (or whatever you consider an acceptable refresh rate), so that vsync can be enabled (eliminating tearing, which causes symptoms similar to the stutter) and the framerate can be absolutely smooth at all times. Until now, since most other manufacturers' chips have had a wider variance in framerates from minimum to maximum, higher performance has been a must, since it also implies a higher minimum, making it more likely that the card won't dip below that magic number when vsync is enabled.
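
            A quick simulation makes the stutter mechanism concrete (my sketch, not Snake-Eyes'): with vsync on, a finished frame is only shown on a refresh boundary, so a frame that takes even slightly longer than one 75Hz interval is held on screen for two of them.

            ```python
            # Sketch: why dipping below the refresh rate stutters with vsync on.
            # A frame that misses a refresh deadline occupies the next interval
            # too, so display time jumps from ~13.3 ms to ~26.7 ms instead of
            # degrading smoothly.
            import math

            REFRESH_HZ = 75
            INTERVAL = 1.0 / REFRESH_HZ  # ~13.3 ms per refresh

            def intervals_on_screen(render_time: float) -> int:
                """Refresh intervals a frame occupies (ceiling division)."""
                return max(1, math.ceil(render_time / INTERVAL))

            for fps in (90, 75, 70, 40):
                n = intervals_on_screen(1.0 / fps)
                print(f"rendering at {fps:3d} fps -> frame shown for "
                      f"{n * INTERVAL * 1000:.1f} ms")
            ```

            A card hovering around 70fps with a 75Hz refresh thus delivers a mix of 13.3ms and 26.7ms frames rather than a steady rate, which is exactly the perceivable stutter described above.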

            Again, I expect Parhelia will do better in this area overall (once you find settings that give you the average framerate you want, odds are good that the minimum will be closer to the average with this card, even though the maximum will be too).

            For now, I'm pretty happy with this Gainward, since it does everything I was hoping it would do. In the future, I look forward to Parhelia, once my next upgrade cycle hits, since I'd like to take advantage of the 10-bit-per-channel color (talk about awesome), and also, mainly, because though it felt for a while like Matrox had given up on us, I haven't really ever given up on them. (Don't be fooled because I have owned other cards; I buy what serves my needs best when I can afford to upgrade, but Matrox is always at the top of my list of products to look at when they have some that meet the criteria.)
            "..so much for subtlety.."

            System specs:
            Gainward Ti4600
            AMD Athlon XP2100+ (o.c. to 1845MHz)

            Comment


            • #51
              Reckless:
              Err, I don't understand guys! Matrox release a part that is (probably) performant enough and offers more features than any other current competitor and you're still going to wait.
              Perhaps I should have phrased my point differently... I have some other purchases I'd like to make first (digital camera, ...), and I have decided to wait till the end of this year (perhaps the beginning of next) to upgrade my computer (I don't play that many games, just occasionally, and for 2D it still manages). Basically, that is thinking approx. 7-8 months ahead, so my point should have been read bearing that in mind...

              Also, I have a policy of not buying the first of the crop (no matter what hardware/software it is); I just like to get some of the bugs sorted out before I get stuck with them (past experiences...)...

              Jörg
              pixar
              Dream as if you'll live forever. Live as if you'll die tomorrow. (James Dean)

              Comment


              • #52
                The card seems to be well rounded...
                Some will say it should have DX9 shaders, and it would be nice, but as Tom and Anand mentioned, it's easier to implement Vertex shaders and emulate them in software if a user doesn't have a compliant card..
                they are focusing on what technologies will be most important first.. or at least trying to..
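
                (A minimal sketch of the fallback pattern Tom and Anand allude to; all names here are mine, not any real engine's API. Vertex shading is per-vertex math, so a CPU path can stand in when the card lacks the feature, whereas per-pixel shading has no comparably cheap fallback.)

                ```python
                # Sketch: software fallback for vertex shading (all names mine).
                def mat_vec(m, v):
                    """4x4 matrix times 4-vector: the heart of a basic vertex shader."""
                    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))

                def gpu_transform(vertices, mvp):
                    raise NotImplementedError("stand-in for the hardware shader path")

                def transform(vertices, mvp, hw_vs_version=(0, 0)):
                    """Prefer hardware vertex shaders; otherwise emulate on the CPU."""
                    if hw_vs_version >= (1, 1):
                        return gpu_transform(vertices, mvp)
                    return [mat_vec(mvp, v) for v in vertices]  # software path

                identity = [[float(r == c) for c in range(4)] for r in range(4)]
                print(transform([(1.0, 2.0, 3.0, 1.0)], identity))  # CPU fallback
                ```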

                the only problem or possible stumbling block is the die process; the .15 process is a bit big for such a complex chip, and it may be a hot sucker...
                If the .13 were available I bet they would have included more features such as the Programmable Shaders of DX9...

                I like that they have stuck to their market and raised the bar again on Image Quality...
                10bit color - AA 2D fonts - 5th order Filters - Dynamic Depth Tessellation etc


                When can I pre-order ?!?!
                Craig
                1.3 Tualatin @1600 - Watercooled, DangerDen waterblock, Eheim 1046 pump, 8x6x2 HeaterCore Radiator - Asus TUSL2C - 256 MB Corsair PC150 - G400 DH 32b SGR - IBM 20Gb 75GXP HDD - InWin A500

                Comment


                • #53
                  Well, at least this leaves something for Matrox to improve on for their next iteration.

                  (I believe in the hype. However, my budget for this year is a bit tight. So no upgrading for me until next year... which should be just nice to see what the competition has to offer, and what Matrox can do for an encore.)

                  But I do encourage those with the means to go on a spending spree. This will at least help the rest of us get out of these economic doldrums.

                  Originally posted by Stringy
                  The card seems to be well rounded...
                  Some will say it should have DX9 shaders, and it would be nice, but as Tom and Anand mentioned, it's easier to implement Vertex shaders and emulate them in software if a user doesn't have a compliant card..
                  they are focusing on what technologies will be most important first.. or at least trying to..

                  the only problem or possible stumbling block is the die process; the .15 process is a bit big for such a complex chip, and it may be a hot sucker...
                  If the .13 were available I bet they would have included more features such as the Programmable Shaders of DX9...

                  I like that they have stuck to their market and raised the bar again on Image Quality...
                  10bit color - AA 2D fonts - 5th order Filters - Dynamic Depth Tessellation etc


                  When can I pre-order ?!?!
                  Craig

                  Comment


                  • #54
                    Nice write-up Snake-Eyes ... thanks.
                    The world just changed, Sep. 11, 2001

                    Comment


                    • #55
                      I don't quite understand what all the "Hype" about waiting till the 0.13 micron die shrink is about.


                      They can and "should" do that without changing or adding any new features. A die shrink is just that... you get more die per wafer and reduce your manufacturing costs......DOH!!!!!!
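
                      (For anyone who wants the arithmetic behind that cost claim, a back-of-the-envelope sketch; the ideal linear scaling is my assumption, and yield and wafer-edge effects are ignored:)

                      ```python
                      # Sketch: what a 0.15um -> 0.13um shrink buys, to first order.
                      # Assumes ideal linear scaling; ignores yield and edge effects.
                      old_node, new_node = 0.15, 0.13
                      area_scale = (new_node / old_node) ** 2  # area shrinks quadratically
                      print(f"die area: {area_scale:.2f}x")             # ~0.75x
                      print(f"dies per wafer: ~{1 / area_scale:.2f}x")  # ~1.33x
                      ```

                      More candidate die per wafer means a lower cost per chip; performance and features are a separate question, which is exactly the point made below.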


                      Oh!! No!!

                      You guys let "UncleDOH" get to you with all that Bullshit he was spreading about going to 0.13 processing and from PS1.3 to PS2.0 all at the same time......ROFL!!!!!!!!
                      Oh!! we'll have a Copper back-end....ROFL!!!!!

                      NO!! You can still do 0.13 with AlCu; you don't have to use Copper.

                      You guys really crack me up.....ROFLMAO!!!!!!!!!!!

                      Oh!! That Hurtz!!!!

                      Sorry !!!

                      There is nothing wrong with the current 0.15 micron geometry for the GPU. There's not really any significant performance increase that will magically occur with a die shrink to 0.13u. OK, maybe it will run a little bit cooler, but that's about all.

                      I work with 0.13u product yield/integration every day and have survived numerous die shrinks. They are a pain in the Arse but they do cut costs.

                      So, if you guys are going to let some clueless idiot convince you that you'd better wait ....

                      It's your loss.


                      Paul
                      "Never interfere with the enemy when he is in the process of destroying himself"

                      Comment


                      • #56
                        Nice one Paul! One Up.



                        ~~DukeP~~

                        Comment


                        • #57
                          There is one thing I don't really emotionally understand about the whole announcement and then the reaction. Almost every board I have visited in the last little while (particularly ARS, and not just MWNH) has had an overwhelming chorus of naysayers; and not just naysayers, they seem to be trying to poke any holes possible in this card. I know that there are many fans of nVidia's offerings, many of them blinded by their own brand loyalty. I know some of us here are too.

                          It is really odd; I've honestly never seen such a reaction to the announcement of a new product. There are the few who would like it or would like to see it, and there are the louder ones who seem to be trying to downplay its release and make it look bad. There is also this crap where the GL problems keep getting dragged out. Maybe I bought a G400 too late to experience them, but many people almost take it personally that the GL ICD was subpar (or was it not there at all?) and act as if that makes the card a worthless pile or something. I got my G400 in 2000 and have been pleased with it, even with most games that I play, though I concede that they are generally light and usually 2D.

                          It's probably just me though; I have had very few problems with ANY computer parts I have bought. The video card I had before the G400 was a TNT1 (in 1998, when it was the top single-card solution), and it definitely served my purposes, but it turned out blurry when I got a 19" monitor (420GS) in 1999. I also remember that nVidia announced the TNT1 would sport a 125MHz core and then backed it down to 90MHz. I did not take it personally, and I did not form a negative opinion of nVidia over it. At the same time, I don't see the people who don't usually own nVidia cards dragging that out every time a new GeForce is announced.

                          This is a pointless tangent, it seems, but I can logically understand the zealotry: every product, brand, and company has those who defend it jealously. Sometimes, though, it seems ridiculous, and I can't emotionally understand it. I guess the point is that there seems to be some real hostility that some have built up against Matrox for reasons I can't truly understand. At least I know what it feels like at the Macintosh end of things.

                          Comment


                          • #58
                            The sheer fact that guys like MWNH show up and spend so much time writing extended arguments against the Parhelia shows just how much of a threat it is. (IMHO)

                            Comment


                            • #59
                              I think the problem is that some people aren't satisfied with anything that doesn't have their personal stamp of approval on it. If they don't have one in their personal system, it *must* be a piece of crap. Obviously.

                              They are overlooking that their assessment of a product relies on numerous criteria that they have established for judging that product, all arbitrary. Those criteria might be the best for them, but *their criteria don't apply to everyone* -- which is these people's (fanboys, Nvidiots, whatever) number one mistake.

                              I value quality over quantity. I value features over performance. I feel that Matrox has always produced the best product they can, supported it to the best extent they could, and in general worked the hardest to give me the most for my money. They support their Linux customers not as an afterthought, but as an equal with their Windows customers. As a private company, they are not subject to the whims of the stock market, which is a factor in compromising many companies' long-term plans in favour of short-term returns to investors.

                              Anyway, those are the reasons why I support them. Not because I'm a fanboy, or because I just have a need to be different from the rest of the McNvidia world. It just seems fairly clear to me that Matrox is working the hardest, out of all the current players, to produce a top-notch product above all else, and doing the best job of listening to their customers while doing it.

                              Oh, and Haig rules.

                              Comment


                              • #60
                                As many of you know (one in particular, if he ever replies to tell me who amongst us he really is, heh), I frequent several forums, including nVNews. Even though that site is primarily nV focused, I've found that usually the people on their forums are fairly level-headed when it comes to hardware. But in the case of Parhelia, even folks over there seem to be a bit like elsewhere: knocking the card due to 'probable driver issues', and some even complaining that this was a 'paper launch with no benchmarks'.

                                That never ceases to amaze me either. None of the more recent video chipsets/cards out at the moment have had anything other than a paper launch either (my biggest gripe, obviously), instead having a big press rollout touting all the new features and capabilities of the chipset. So far this Parhelia release is true to that form, so I don't see the point the others are trying to make there.

                                As far as drivers go, the people in question keep referring back to the G200/G400 releases and using those as a basis for what kind of experience we'll have when Parhelia arrives. Again, I don't think that applies here: Matrox has had several card releases in the meantime, even if none of them was targeted as high as the Parhelia is (G450/G550, several flavors of each, etc.), and has continually updated drivers for the whole range. So Matrox obviously has had driver experience under each of the currently available APIs, even if it was for older hardware. I had to give them one point in their arguments: there will likely be bugs, but I expect those bugs to be mainly in the areas of the completely new, Parhelia-specific features, i.e., the 2-10-10-10 color, FAA, and DM, mainly because nobody's done anything with these before on consumer-grade video cards. Even so, Matrox has had alpha hardware for a while (and before that was running demos of some of this stuff using software emulation), so I expect that at least the basics of these new features will work out of the box and only need fine tuning and application-specific debugging.
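
                                (For the curious, the "2-10-10-10" format packs 2 alpha bits plus three 10-bit color channels into the same 32 bits as ordinary 8-8-8-8 color. A quick sketch of my own, not Matrox's format documentation; the channel order is assumed:)

                                ```python
                                # Sketch: packing 2-10-10-10 color into one 32-bit word.
                                # Channel layout (alpha high, then R, G, B) is assumed.
                                def pack_a2r10g10b10(a, r, g, b):
                                    """a in [0, 3]; r, g, b in [0, 1023]."""
                                    assert 0 <= a < 4 and all(0 <= c < 1024 for c in (r, g, b))
                                    return (a << 30) | (r << 20) | (g << 10) | b

                                word = pack_a2r10g10b10(a=3, r=1023, g=512, b=0)
                                print(f"0x{word:08X}")  # 1024 shades per channel vs. 256
                                ```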

                                All in all, it really does seem to be people feeling threatened that their new hardware (R8500, GF4, etc.) is no longer the top of the pile, and trying to rationalize to themselves the purchase of said products by way of criticizing the new kid on the block.

                                Don't expect everyone to behave the same way, though. In fact, some of us (er, look below again; I'll keep admitting I have a Ti4600) aren't afraid to admit that this card has great promise. Indeed, I just wish I could find a buyer for mine that would let me keep 75% of what I paid; I'll gladly be a guinea pig for the first batches of Parhelia (if that's what they think the first drivers will be).
                                "..so much for subtlety.."

                                System specs:
                                Gainward Ti4600
                                AMD Athlon XP2100+ (o.c. to 1845MHz)

                                Comment
