The Big CountDown thread

  • OK, OK!! Let's get back to this thread........

    Days since presentation at GDC: 33
    Days to VigilAnt (2 suggested months): 12
    Days to E3: 27
    Days to Ant's birthday (May 16th): 20
    Doom III fps: 36
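
    For anyone who wants to automate the countdown, it's just date arithmetic. A minimal Python sketch; the event dates (other than Ant's birthday, May 16th) are assumptions reconstructed from the counts above, with "today" taken as 2002-04-26:
    [code]
    from datetime import date

    today = date(2002, 4, 26)                      # assumed from the counts above
    events = {
        "presentation at GDC": date(2002, 3, 24),  # assumed
        "VigilAnt": date(2002, 5, 8),              # assumed
        "E3": date(2002, 5, 23),                   # assumed
        "Ant's birthday": date(2002, 5, 16),       # May 16th, per the post
    }

    for name, when in events.items():
        delta = (when - today).days
        label = "Days to" if delta >= 0 else "Days since"
        print(f"{label} {name}: {abs(delta)}")
    [/code]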

    • Originally posted by Greebe
      at least it's not a GF2 in disguise as a GF4
      Hey, we all know that the GF4MX is just a GF4 Ti with some die parts broken/disabled...
      But we named the *dog* Indiana...
      My System
      2nd System (not for Windows lovers)
      German ATI-forum

      • ROFL
        The only GF4Ti 4600 I came into contact with had everything disabled
        D3/\/7YCR4CK3R
        Ryzen: Asrock B450M Pro4, Ryzen 5 2600, 16GB G-Skill Ripjaws V Series DDR4 PC4-25600 RAM, 1TB Seagate SATA HD, 256GB myDigital PCIEx4 M.2 SSD, Samsung LI24T350FHNXZA 24" HDMI LED monitor, Klipsch Promedia 4.2 400, Win11
        Home: M1 Mac Mini 8GB 256GB
        Surgery: HP Stream 200-010 Mini Desktop, Intel Celeron 2957U Processor, 6 GB RAM, ADATA 128 GB SSD, Win 10 Home ver 22H2
        Frontdesk: Beelink T4 8GB

        • Didn't you know that's how nVidia names their chips? The number after the NV is the percentage of the chip that works.

          E.g. an NV25 has 25% of the core working, an NV17 has 17%.

          Gee, everybody knows that!

          I have a Riva128, an NV3 I think, so it only has 3% of a working die.

          Once they get to an NV100, nVidia will close down. Their chip will be as fast and feature-rich as it can be. Not so bad, since they designed it in 1985 or so.

          Ali

          • I wonder when they will enable more 2D quality

            • Probably when Matrox shows them what true 2D quality is again...!
              Then they'll wake up and smell the noise...

              PC-1 Fractal Design Arc Mini R2, 3800X, Asus B450M-PRO mATX, 2x8GB B-die@3800C16, AMD Vega64, Seasonic 850W Gold, Black Ice Nemesis/Laing DDC/EKWB 240 Loop (VRM>CPU>GPU), Noctua Fans.
              Nas : i3/itx/2x4GB/8x4TB BTRFS/Raid6 (7 + Hotspare) Xpenology
              +++ : FSP Nano 800VA (Pi's+switch) + 1600VA (PC-1+Nas)

              • I wonder when they will enable more 2D quality
                According to techchannel.de they want to do something about it.
                See the following topic:

                In an interview with techchannel, David Kirk said NVidia wants to improve the 2D quality of their cards. They blame the manufacturers for using cheap components when building gfx cards with NV chips. Kirk said there will be an 'NVidia certification logo' for manufacturers following the guidelines from NV.
                (Short translation, maybe someone can do better.)
                I wonder if the rumours about the new Matrox chip forced them to think about improving their 2D image quality?

                • It could also be because of the constant nagging about the sole feature Matrox (and ATI?) are better at?

                  I wonder if the rumours about the new Matrox chip forced them to think about improving their 2D image quality
                  Edit: question mark after ATI, never seen an ATI...
                  Join MURCs Distributed Computing effort for Rosetta@Home and help fight Alzheimers, Cancer, Mad Cow disease and rising oil prices.
                  [...]the pervading principle and abiding test of good breeding is the requirement of a substantial and patent waste of time. - Veblen

                  • Yes, ATi's are a tad better overall, with reservations for Gainwards, which seem to have good quality as well.

                    • ATi uses a 400MHz RAMDAC, don't they?

                      My G550 has a 360MHz RAMDAC.
                      From what I remember, before 3D cards really came out, it was the RAMDAC that made for good 2D quality, but probably just because a fast one can drive high refresh rates at high resolutions (rough numbers in the sketch below).

                      It would be hard to find an inexpensive monitor that could refresh 1600x1200 at 120 or 150Hz...
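
                      If anyone wants to check the arithmetic: a RAMDAC's MHz rating is roughly its maximum pixel clock, and the clock a mode needs is about active pixels x refresh rate x a blanking overhead. A minimal Python sketch; the ~1.32 overhead factor is an assumed typical CRT blanking figure, not a spec from this thread:
                      [code]
                      # Rough pixel-clock estimate for a CRT mode. The 1.32 blanking
                      # overhead is an assumed typical value, not an exact figure.
                      def pixel_clock_mhz(h, v, refresh_hz, overhead=1.32):
                          return h * v * refresh_hz * overhead / 1e6

                      for hz in (85, 100, 120, 150):
                          mhz = pixel_clock_mhz(1600, 1200, hz)
                          print(f"1600x1200@{hz}Hz needs ~{mhz:.0f} MHz")

                      # ~215, ~253, ~304 and ~380 MHz respectively: under these
                      # assumptions a 360MHz RAMDAC (G550) tops out near 1600x1200@140Hz.
                      [/code]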
                      PC-1 Fractal Design Arc Mini R2, 3800X, Asus B450M-PRO mATX, 2x8GB B-die@3800C16, AMD Vega64, Seasonic 850W Gold, Black Ice Nemesis/Laing DDC/EKWB 240 Loop (VRM>CPU>GPU), Noctua Fans.
                      Nas : i3/itx/2x4GB/8x4TB BTRFS/Raid6 (7 + Hotspare) Xpenology
                      +++ : FSP Nano 800VA (Pi's+switch) + 1600VA (PC-1+Nas)

                      • I know very, very little about these things, but I believe that the frequency of the RAMDAC does not actually determine quality.

                        True, it limits the amount of data one can transfer, but that says very little about the extent to which distortion comes into play.

                        I think (but would never dream of pretending to know) that colour saturation, sharpness etc. have little to do with the RAMDAC frequency (but then again, I do not know if a 350 MHz RAMDAC actually means its bandwidth is 350 MHz?).

                        All in all, I know nothing - just thinking that MHz isn't everything for the quality of a card.

                        Umf
                        Join MURCs Distributed Computing effort for Rosetta@Home and help fight Alzheimers, Cancer, Mad Cow disease and rising oil prices.
                        [...]the pervading principle and abiding test of good breeding is the requirement of a substantial and patent waste of time. - Veblen

                        • You are correct, Umfriend.
                          D3/\/7YCR4CK3R
                          Ryzen: Asrock B450M Pro4, Ryzen 5 2600, 16GB G-Skill Ripjaws V Series DDR4 PC4-25600 RAM, 1TB Seagate SATA HD, 256GB myDigital PCIEx4 M.2 SSD, Samsung LI24T350FHNXZA 24" HDMI LED monitor, Klipsch Promedia 4.2 400, Win11
                          Home: M1 Mac Mini 8GB 256GB
                          Surgery: HP Stream 200-010 Mini Desktop, Intel Celeron 2957U Processor, 6 GB RAM, ADATA 128 GB SSD, Win 10 Home ver 22H2
                          Frontdesk: Beelink T4 8GB

                          • I think the speed of the RAMDAC is just one small piece of the quality equation: if you run close to the RAMDAC's limits, it might put out worse quality than it would if the limit were much higher and you never came close to it. But, like Umf said, I believe other aspects play a much bigger role (which ones, I don't know).

                            AZ
                            There's an Opera in my macbook.

                            • The problem with image quality is that we have to enter the wonderful world of analogue (bleurgh!).

                              I suppose a good-quality picture has several elements - sharp edges, realistic and vibrant colours, decent contrast, and no flicker. The RAMDAC frequency only really affects the last of these. However, the quality of the DAC and the way it works can strongly influence the rest as well, most particularly the colour quality. If your DAC has a limited analogue range, or limited precision, your colours may be washed out, inconsistent or unrealistic (a toy illustration follows below).
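
                              (A toy Python sketch of that point - purely illustrative, not modelled on any real card's DAC: fewer effective bits quantise colours coarsely, and a limited analogue swing compresses contrast, washing colours out.)
                              [code]
                              import random

                              def dac(value, bits=8, swing=1.0, noise=0.0):
                                  """Map an 8-bit colour value to a 0..1 'voltage'."""
                                  step = 256 // (1 << bits)              # coarser steps if bits < 8
                                  volts = ((value // step) * step / 255.0) * swing
                                  return volts + random.gauss(0, noise)  # board noise -> shimmer

                              ramp = range(0, 256, 51)
                              print("ideal    :", [round(dac(v), 2) for v in ramp])
                              print("6-bit    :", [round(dac(v, bits=6), 2) for v in ramp])
                              print("0.7 swing:", [round(dac(v, swing=0.7), 2) for v in ramp])
                              [/code]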

                              Sharpness of the image is partly dependent on the DAC (response times etc.), but you can have a decent DAC and still end up with a poor picture because of bad board design or cheap components. Analogue means that errors are cumulative and relatively undetectable (and therefore uncorrectable), so any noise on the circuit (from other components on the board, from other parts of the computer, or even from things in another room) can add electrical noise to the signal, resulting in shimmering, fuzziness and even distorted colours. I remember having a soundcard that managed to introduce electrical noise into my graphics card - most annoying.

                              Of course even the best graphics card will still look bad on a bad monitor - a decent CRT monitor has a co-axial signal cable that reduces interference, a fine dot pitch, decent beam confinement and a good shadow mask/grille.

                              The best solution would probably be to go with a digital flat panel - no analogue signal noise to get in the way until the last moment! In fact, I'm surprised no one has done a digital-interface CRT monitor (or maybe they have - anyone know of one?) - it ought to mean better-quality pictures and cheaper graphics cards (although more expensive monitors).

                              Oh well, thank goodness I haven't had to seriously consider analogue noise since university. Digital all the way for me. And anyone who thinks that records give better quality than digital music... well...!

                              LEM

                              • Yeah, there is at least one with DVI. According to Tom it is some sort of fake, though.
                                no matrox, no matroxusers.
