Parhelia and Geforce4?


  • #16
    Rimfaxe, as far as I know, Nvidia planned to release the MX420 with 128-bit SDR, but because SDR production had stopped, they switched to DDR at the last moment... it doesn't really matter anyway; 64-bit DDR is actually a little bit worse than 128-bit SDR!
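
    To put rough numbers on that (the 166 MHz memory clock below is an illustrative assumption, not a figure from this thread), the peak bandwidth works out identically for the two configurations:

      # Peak bandwidth = (bus width / 8) bytes * clock * transfers per clock.
      # 166 MHz is an assumed, illustrative GeForce4 MX-class memory clock.
      def peak_mb_s(bus_bits, clock_mhz, transfers_per_clock):
          return bus_bits / 8 * clock_mhz * transfers_per_clock

      print(peak_mb_s(128, 166, 1))  # 128-bit SDR -> 2656.0 MB/s
      print(peak_mb_s(64, 166, 2))   # 64-bit DDR  -> 2656.0 MB/s

    Same peak on paper; DDR just gives a little of it back to latency and burst overhead, which is the "little bit worse" above.
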
    All work and no play makes Jack a dull boy.

    • #17
      You *SHOULD* consider ATI. They are cheap, fast, and the drivers are great now.

      The Parhelia is overpriced; why even consider it for gaming when the competition is faster and cheaper?

      Not all MX420s are SDR.

      • #18
        Originally posted by Rimfaxe
        Will Matrox drivers/PowerDesk give me trouble, or just "feck" up my computer so it can't be used? Can a video card, a GeForce, be installed and work properly on a computer with a G400MAX in it? Does anyone know if I can choose which video card my games are played on?
        I sometimes run a G400 AGP with 2 NVidia PCI GF cards - triple-head Surround Gaming before, and without, a Parhelia.

        PowerDesk and nView coexist well enough, controlling their respective hardware.

        Many games, however, are not multimonitor-aware and default to the primary monitor - hence, if you wish to game on the GF card, you may find it necessary to set your mainboard to boot with the PCI card as primary and use the G400 as a secondary. Some older games will choke when they encounter more than one card, so you may need to temporarily disable the secondary altogether - easy enough to do, but a nuisance nonetheless.
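
        A rough sketch of the software side (an illustrative, Windows-only addition, not from the original post): enumerate the display adapters and see which one carries the primary flag that naive games default to.

          # Illustrative sketch: list Windows display adapters and mark the
          # primary one - the device a non-multimonitor-aware game will use.
          import ctypes
          from ctypes import wintypes

          DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

          class DISPLAY_DEVICE(ctypes.Structure):
              _fields_ = [
                  ("cb", wintypes.DWORD),
                  ("DeviceName", wintypes.WCHAR * 32),
                  ("DeviceString", wintypes.WCHAR * 128),
                  ("StateFlags", wintypes.DWORD),
                  ("DeviceID", wintypes.WCHAR * 128),
                  ("DeviceKey", wintypes.WCHAR * 128),
              ]

          dd = DISPLAY_DEVICE()
          dd.cb = ctypes.sizeof(dd)
          i = 0
          while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
              tag = " <- primary" if dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE else ""
              print(dd.DeviceName, "-", dd.DeviceString, tag)
              i += 1

        Whichever adapter is flagged primary is where such games land - hence the trick of making the PCI card primary in the BIOS.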

        • #19
          Originally posted by Ashley


          I sometimes run a G400 AGP with 2 NVidia PCI GF cards - triple-head Surround Gaming before, and without, a Parhelia.

          PowerDesk and nView coexist well enough, controlling their respective hardware.

          Many games, however, are not multimonitor-aware and default to the primary monitor - hence, if you wish to game on the GF card, you may find it necessary to set your mainboard to boot with the PCI card as primary and use the G400 as a secondary. Some older games will choke when they encounter more than one card, so you may need to temporarily disable the secondary altogether - easy enough to do, but a nuisance nonetheless.
          Thank you very much! That was a good answer - exactly the one I was hoping for.

          • #20
            Hi,

            Originally posted by BokChoy
            You *SHOULD* consider ATI. They are cheap, fast, and the drivers are great now.
            Drivers are great now? I disagree... and they never managed to deliver decent support for XFree86 + DRI. Ridiculous...
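
            The configuration side was never the hard part - a minimal, illustrative XF86Config-4 sketch (assuming the open-source "radeon" driver; the pain was the driver quality behind it, not the syntax):

              Section "Module"
                  Load "glx"
                  Load "dri"
              EndSection

              Section "Device"
                  Identifier "ATI Card"
                  Driver     "radeon"
              EndSection

              Section "DRI"
                  Mode 0666
              EndSection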

            Hard to admit, but these days a GF4-Ti 4200 has the best price/performance ratio.

            cu,
            kaasboer

            • #21
              Originally posted by kaasboer
              Hi,

              Drivers are great now? I disagree... and they never managed to deliver decent support for XFree86 + DRI. Ridiculous...

              Hard to admit, but these days a GF4-Ti 4200 has the best price/performance ratio.

              cu,
              kaasboer
              I think it's pretty safe to say that for gaming you wouldn't be running Linux. In fact, I don't see any indication that the original poster is going to be running any Linux derivative, so stick to the original subject.

              The drivers for Windows are great. The Ti 4200 is also a good card for the price, provided you get one with good 2D output filtering.

              The point is, looking for a Parhelia at that price/performance ratio is a waste of time.

              • #22
                Rimfaxe, as far as I know, Nvidia planned to release the MX420 with 128-bit SDR, but because SDR production had stopped, they switched to DDR at the last moment... it doesn't really matter anyway; 64-bit DDR is actually a little bit worse than 128-bit SDR!
                That's pretty much ridiculous. You don't just switch memory controllers at the last moment, and you have your partners contracted for components WAY early.
                Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

                • #23
                  And it's not just changing the mem controller. A switch of that magnitude would require a complete PCB redesign.
                  You don't just change bus width "at the last moment".
                  Last edited by Kruzin; 17 August 2002, 16:47.
                  Core2 Duo E7500 2.93, Asus P5Q Pro Turbo, 4gig 1066 DDR2, 1gig Asus ENGTS250, SB X-Fi Gamer ,WD Caviar Black 1tb, Plextor PX-880SA, Dual Samsung 2494s

                  • #24
                    ??? NVidia chips since the Riva 128 have been able to support both a single-bank 128-bit "wide" and a dual-bank 64-bit "deep" data bus. *Numerous* memory configurations have to be supported if you are a chip vendor, and multiple public board designs around the same chip are delivered to prospective customers (board manufacturers)...

                    • #25
                      Considering the GeForce4 MX 420, or whatever the hell the designation is, is just that - a designation of the chip design itself - does that entail that it could be DDR or SDRAM depending on the *manufacturer*?

                      As far as the original post goes, to keep on topic, my question is... does this game actually REQUIRE T&L? And if so, why not just wait 'til you can afford the Parhelia? I don't think it'll kill you to wait 'til you have the money. Better that way in the long run: you save yourself some hair (pulled out while fighting the weird configuration) and the money for the 'cheapo' g-fart card.

                      Leech
                      Wah! Wah!

                      In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.

                      • #26
                        Considering the GeForce4 MX 420, or whatever the hell the designation is, is just that - a designation of the chip design itself - does that entail that it could be DDR or SDRAM depending on the *manufacturer*?
                        Hell no. That makes about as much sense as saying that the same engine will burn gasoline if put in a race car, but diesel if it's installed in a big rig.
                        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

                        • #27
                          No, Wombat. 64-bit DDR or 128-bit SDR = the same. Well, actually the SDR will be slightly faster because of latency. To the video card or the end user it makes no difference; they are interchangeable due to cost and supply.
                          E.g. the GeForce MX used SDR; the GeForce MX200/400 used 64-bit DDR, while some of the early MX200/400s used 128-bit SDR. They just gradually shifted RAM type while maintaining the same core/board.
                          Oh my god MAGNUM!

                          • #28
                            No, Wombat. 64-bit DDR or 128-bit SDR = the same. Well, actually the SDR will be slightly faster because of latency. To the video card or the end user it makes no difference; they are interchangeable due to cost and supply.
                            No, they absolutely are not.
                            Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

                            • #29
                              Wombat is right in that 64-bit DDR and 128-bit SDR are very different from an implementation standpoint (you'd need 128 data traces for the 128-bit SDR versus 64 for the 64-bit DDR, for starters). The GeForce MX GPUs are designed to use either DDR or SDR, IIRC.
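
                              To quantify both halves of that (assuming an illustrative 166 MHz memory clock, not a figure from this thread):

                                # Same peak bandwidth, reached two different ways:
                                # half the data traces at twice the per-pin rate.
                                for name, bits, tpc in [("128-bit SDR", 128, 1), ("64-bit DDR", 64, 2)]:
                                    print(name, "-", bits, "data traces,", 166 * tpc, "MT/s per pin,",
                                          bits // 8 * 166 * tpc, "MB/s peak")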

                              AZ
                              There's an Opera in my macbook.

                              • #30
                                Originally posted by Wombat
                                Hell no. That makes about as much sense as saying that the same engine will burn gasoline if put in a race car, but diesel if it's installed in a big rig.
                                Actually, tank engines are built with different fuels in mind to be more shortage-proof. The T-55 certainly can use both.
