My parhelia to be replaced.


  • #31
    you're still not looking at the complete picture but are forcibly focussing on one small part, the hardware and its capabilities.

    Do you think anyone would pay a couple of hundred $ for drivers? I doubt it. And pro-level hardware often requires filters much better than the crap that's on stuff by default (as you experienced). Using high-end filters on consumer parts will just price them out of the market (like Matrox has done?). There might be other board-level differences too between the consumer and prosumer cards. But even if there weren't, it makes more sense to bundle the card and drivers together as a single package, if only to counter pirating of the high-end drivers on cheap hardware.



    • #32
      Originally posted by Ribbit
      Wow, where to begin.....

      About the differentiation thing:

      My point is, a Radeon 8500 and a FireGL 8800 are almost identical hardware (remember, the hardware is the bit you're actually buying), yet the latter one was several times more expensive. Drivers for one will specifically and deliberately not run on the other, even though they would surely work just fine. I don't think tweaked drivers are worth several hundred $/£ (talking about new hardware). It's like having better roads which only cars with sunroofs or certain paint colours are allowed on.

      I guess I'm particularly annoyed because the "low-end" drivers deliberately don't run on my "high-end" part.

      Are Matrox not doing this?

      Correct me if I'm wrong, but in Matrox' lineup I see a base 'prosumer' model (the Parhelia/P8X), and higher-end models which are very different hardware-wise (quad displays, different ramdacs, lots more memory, PCI-X). Now that's worth charging more for.

      DVI output quality

      Just to clarify for Wombat: The FireGL has a VGA plug and a DVI-I plug. The signal quality on the VGA plug is as good as on my G400 (actually it's a tiny bit worse, but you have to look very hard to notice). The analog signal quality on the DVI plug isn't as good as on the VGA port. Hope that clears that up, Wombat.
      only the base model FireGL has one DB-15 and a DVI port. The rest have dual DVI. That is, in and of itself, a substantial change.

      the difference between the different Matrox cards is not as much as you would suspect. Different PCBs with some more components. The GPU is the same, and in theory most of the R&D costs went into development of the GPU. The rest is pretty easy to add/debug/test. So yes, Matrox does do this: the Parhelia HR256 does *not* have $2k more manufacturing/development costs over the Parhelia 256, which in and of itself does not have $300 more hardware costs over the Parhelia 128.

      What you are paying for with the FireGL cards over the Radeons:

      1) Driver development/testing. FireGL drivers (and Quadro drivers) are designed and tested for numerous professional-level applications. They are supposedly hassle-free on them, and have been "certified" by the company as "good". In the professional market, this is a BIG thing. The drivers are also tested more, and oftentimes performance features are disabled if they impact visual quality or stability. While they may come from the same code base, they are aimed at a different market.

      2) Driver support. FireGL and Quadro drivers (and hardware) are better supported. As is most everything in the "professional" markets, assuming you paid for it.

      3) Product design. The product is usually tweaked again for stability - more so than the desktop counterpart. It is also tested more and certified for use with certain combinations of hardware. If you buy hardware from the HCLs and it doesn't work, you will have tech support doing all sorts of stupid human tricks trying to figure it out. While people say there is no difference between Quadro and GeForceFX chips, I am willing to bet that most normal everyday GeForceFX chips have a significant amount of silicon disabled, and that they were unable to pass verification (kind of like speed binning processors).

      It is the whole package. It is not just the drivers, and not just the hardware. When you buy the card, you get everything - lock, stock and barrel.

      Besides, having software not enable features does not make the hardware that much less useful. Do you *really* use antialiased lines in OpenGL? How about two-sided lighting? And OpenGL overlay planes? How about Gendac/Frame lock features? Seriously?
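      The kind of gating described above - one driver codebase enabling pro features only on pro parts - can be sketched in a few lines. This is a purely hypothetical illustration; the device IDs and feature names below are made up, not ATi's actual values:

```python
# Hypothetical sketch of driver-side feature gating by device ID.
# IDs and feature names are illustrative only, not real ATi values.

PRO_DEVICE_IDS = {0x5148}        # pretend "FireGL-class" ID
PRO_ONLY_FEATURES = {
    "antialiased_lines",
    "two_sided_lighting",
    "overlay_planes",
    "frame_lock",
}

def enabled_features(device_id, requested):
    """Return the subset of requested features the driver will enable."""
    if device_id in PRO_DEVICE_IDS:
        return set(requested)                  # pro card: everything
    return set(requested) - PRO_ONLY_FEATURES  # consumer card: gated

print(enabled_features(0x514C, ["antialiased_lines", "vsync"]))  # {'vsync'}
```

      The point of the sketch: the hardware could run everything, and the check is purely a software switch - which is exactly the complaint about the Radeon/FireGL split.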

      What about any of these applications?

      PROFESSIONAL CERTIFICATIONS: CAD
      • Alias AutoStudio Family
      • Ansys
      • Autodesk Architectural Desktop, AutoCAD,
      Inventor, Lightscape, Mechanical Desktop, VIZ
      • AVEVA: PDMS
      • Bentley Microstation
      • Co|Create OneSpace
      • Dassault CATIA
      • ESRI ArcGIS
      • ICEM Surf
      • MSC.Nastran, MSC.Patran
      • PTC Pro/ENGINEER Wildfire, 3Dpaint, CDRS
      • SolidWorks
      • UGS NX Series, I-deas, SolidEdge,
      Unigraphics, SDRC

      PROFESSIONAL CERTIFICATIONS: DCC
      • Adobe After Effects, Premiere
      • Alias Maya
      • Apple Shake
      • Avid Xpress, Xpress DV, Xpress Pro
      • discreet 3ds max, character studio,
      combustion
      • Kaydara MOTIONBUILDER
      • Maxon CINEMA 4D
      • Newtek LightWave 3D
      • Right Hemisphere: Deep Paint 3D
      • Side Effects Houdini
      • Softimage|XSI, Softimage 3D
      "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



      • #33
        Originally posted by dZeus
        you're still not looking at the complete picture but are forcibly focussing on one small part, the hardware and its capabilities.
        Well, yeah. I bought a piece of hardware, and I want to be able to use its capabilities as much as possible. This card spends the vast majority of its time doing serious desktop stuff at 1600x1200@85Hz, but I also want to play games on it. And it's a very capable gaming card, even if the drivers try to say otherwise.

        The Linux drivers treat it as what it really is - an R200. Why can't the Windows drivers do the same?

        Originally posted by DGhost
        only the base model FireGL has one DB-15 and a DVI port. The rest have dual DVI. That is, in and of itself, a substantial change.

        Actually it's to do with the age of the card. The 8700 had VGA+DVI-D, the 8800 had VGA+DVI-I, and newer ones have dual-DVI. Just a minor nitpick.

        What you are paying for with the FireGL cards over the Radeons: etc. etc. etc.
        Okay, [my] eyes opened a little, but I still think my original point stands. If I want to play games on my (when new) $1000 pro-version-of-a-gaming-card, why should ATi stop me (or at least, make it a bit less comfortable)? And let's face it, the Radeon (and GeForce) are gaming chips first, where the pro features are almost an afterthought.
        Blah blah blah nick blah blah confusion, blah blah blah blah frog.



        • #34
          Now that I have the Geforce FX 5900 in there, I am able to post comments:

          2D is not really noticeably worse than the Parhelia.

          3D is significantly faster than the Parhelia. I am now able to run 2x AA, 8x aniso, and run at higher res, all with double the framerate of the Parhelia with all that turned off.

          I should have a linux distro to test on shortly...
          Let us return to the moon, to stay!!!



          • #35
            Originally posted by PAugustin
            About the recruiting thing, someone asked me the source. It comes from the Matrox web site itself; I'm receiving their job offers.
            if I am reading the post correctly, they need people for the Imaging division, not MGA.



            • #36
              As for the Linux drivers, there is no comparison.

              The NVidia drivers were a pleasure to install under Mandrake 10.0. They just work.

              As for the Matrox drivers, man are they a pain in comparison...



              • #37
                This is the same thing I was thinking of doing... though I may just have to keep my Parhelia for those few games that I play in Triple-Head that are just too odd on ONE screen... X2: The Threat immediately comes to mind... along with Mafia.

                I just can't handle not having decent drivers in Linux... Though the one thing I worry about most (and I KNOW will happen) is that as soon as I buy an nVidia card to help with running Linux, Matrox will release GOOD drivers....

                Leech
                Wah! Wah!

                In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.



                • #38
                  I figure I'm even saving money by buying this card and reselling the Parhelia. I saw no reason not to make the switch, especially considering that I never even had 3 monitors hooked up to it.



                  • #39
                    I have the three monitors, and I love it... the problem I'm having right now with the set up though... is that it's too HOT! Lousy summer.... I have to leave two of the monitors off because they generate too much heat. And with my system opened, my harddrives are running around 100 F.... AND I have two 120mm fans blowing on them!!

                    Leech



                    • #40
                      Maybe you'd do better with the system closed, and better airflow direction.
                      Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                      • #41
                        38 C for HDDs is not very warm at all



                        • #42
                          Originally posted by leech
                          I have the three monitors, and I love it... the problem I'm having right now with the set up though... is that it's too HOT! Lousy summer.... I have to leave two of the monitors off because they generate too much heat. And with my system opened, my harddrives are running around 100 F.... AND I have two 120mm fans blowing on them!!

                          Leech
                          Go to any hardware store and buy a big fan. It will mess up the carefully (or not) designed cooling scheme of your tower, but who cares. Brute force all the way.

                          It might even benefit you...



                          • #43
                            It DOES benefit... lowers the heat of my harddrives by about 10 degrees each... Here's the full situation..

                            I have two 10,000 rpm Ultra160 SCSI hard drives. They are both full-height (got them pretty cheap at the time), so unfortunately they generate A LOT of heat. If the room is cool, I can leave the side of my case on (the case is an Antec P160) and the front 120mm fan will cool them sufficiently, and keep my whole system nice and cool (I have another 120mm fan on the back of the case blowing outwards). The CPU keeps fairly cool with or without the side on. But those hard drives... well, when the air on the outside is extremely hot anyhow, and then it blows hot air INTO the case, then it creeps out the back... I figured more air flowing directly onto the harddrives while they're hot is best... The rest of the PC is just running fine...

                            Now if I had half-height drives, I think they would be running quite a lot cooler, since they'd have more room separating the drives. Here I only have about 2.5-3.5 inches. Besides, if something else in my computer fried, it'd just cost me money... the information on my drives cost me a lot more than just money.

                            Leech



                            • #44
                              Originally posted by dZeus
                              38 C for HDDs is not very warm at all
                              I had one of them die on me twice; had to RMA the drive... both times due to heat. I've seen them float up to 105 F or so... one of them is rated to go as high as 75 C before dying, but the other one I think was only 45 C. So 38 C is pushing too close for my comfort....
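                              Mixing Fahrenheit and Celsius figures like this is easy to get wrong, so here is a quick sketch of the conversion and the margin against that 45 C rating, using the numbers from the posts above:

```python
# Fahrenheit/Celsius conversion and thermal margin, using the figures
# from the thread: drives seen around 100-105 F, one rated to 45 C max.

def f_to_c(f):
    """Convert degrees Fahrenheit to Celsius."""
    return (f - 32) * 5.0 / 9.0

observed_c = f_to_c(105)          # ~40.6 C
rated_max_c = 45
margin_c = rated_max_c - observed_c

print(round(observed_c, 1))       # 40.6
print(round(margin_c, 1))         # 4.4
```

                              So at 105 F the drive is only about 4-5 C under its rated ceiling, which matches the "pushing too close for comfort" worry; the earlier "38 C" figure is just 100 F converted.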

                              Leech



                              • #45
                                which HDD is specced to 45C?

                                that's like speccing a P4 to 50 C max.

