FPS, can you tell the difference?


  • #46
    Conclusion to this thread: the human eye is imperfect and is always tricked, brown eyes are infinitely slower than blue eyes because the eye fairy cast a magic spell on the babies with blue eyes at birth, brunettes with brown eyes are supposedly the cutest (though I do have a weak spot for blue-eyed auburn women), and I have the biggest pecker around. Now, about those Parhelia 2 specs...
    What was necessary was done yesterday;
    We're currently working on the impossible;
    For miracles, we ask for 24 hours' notice ...

    (Workstation)
    - Intel Xeon X3210 @ 3.2 GHz on Asus P5E
    - 2x OCZ Gold DDR2-800 1 GB
    - ATI Radeon HD2900PRO & Matrox Millennium G550 PCIe
    - 2x Seagate B.11 500 GB SATA
    - ATI TV-Wonder 550 PCI-E
    (Server)
    - Intel Core 2 Duo E6400 @ 2.66 GHz on Asus P5L-MX
    - 2x Crucial DDR2-667 1GB
    - ATI X1900 XTX 512 MB
    - 2x Maxtor D.10 200 GB SATA

    Comment


    • #47
      Well, I'm glad to see at least one person learned something
      Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

      Comment


      • #48
        Re: FPS, can you tell the difference?

        Originally posted by Helevitia
        I can tell the difference....
        Me too...

        Comment


        • #49
          OK... brown eyes here. I notice anything under 85 Hz (or 84, as my monitors report): 60 is horrible, 75 is still bad but much better. I notice flickering the most with a white image.

          I really always thought "those FPS zealots! 25 MUST be enough, it's enough for TV too", but since I've played a little on my new PC, I see that 25 clearly is NOT enough... I don't really know when it's enough though, since I can't get any decent FPS out of my Kyro 2.

          I notice flickering in the cinema on white walls etc., when the picture is not moving much.

          And I think 3dfx really had something in that T-Buffer and their motion blur; it would have made games much more playable, the way films need fewer "fps" than games to look smooth... it seemed like the stupidest idea to me at the time though *LOL
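
          For the curious, here's a rough sketch of what accumulation-style motion blur does in principle (just illustrative Python with made-up numbers, nothing to do with 3dfx's actual T-Buffer implementation): render the scene at several sub-frame times and average the results, so fast motion gets smeared the way film smears it.

          def render(t):
              # Hypothetical "renderer": a bright 1-pixel object sweeping across a
              # 10-pixel scanline at 4 pixels per frame, sampled at time t (in frames).
              row = [0.0] * 10
              row[int(t * 4) % 10] = 1.0
              return row

          def motion_blurred_frame(frame_index, subsamples=4):
              # Accumulate several sub-frames spread across the frame interval and
              # average them, roughly what an accumulation buffer does.
              acc = [0.0] * 10
              for s in range(subsamples):
                  sub = render(frame_index + s / subsamples)
                  acc = [a + b for a, b in zip(acc, sub)]
              return [a / subsamples for a in acc]

          print(motion_blurred_frame(3))  # the object's energy ends up smeared over 4 pixels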

          AZ
          There's an Opera in my macbook.

          Comment


          • #50
            Hey guys, I've noticed something, and maybe the doc or someone else can explain it to me: at low resolutions (1024x768 or less) I can't really see the difference between 85 and 100 Hz, but at higher resolutions like 1440x1080 or 1600x1200, I can easily see the difference between 85 and 100 Hz... At those resolutions, 85 Hz looks like 75 Hz does at lower res, and 100 Hz looks like 85 Hz does at lower resolutions... I've noticed this on 5 different monitors (and 3 video cards), so I don't think it's monitor-related.
            What was necessary was done yesterday;
            We're currently working on the impossible;
            For miracles, we ask for 24 hours' notice ...

            (Workstation)
            - Intel Xeon X3210 @ 3.2 GHz on Asus P5E
            - 2x OCZ Gold DDR2-800 1 GB
            - ATI Radeon HD2900PRO & Matrox Millennium G550 PCIe
            - 2x Seagate B.11 500 GB SATA
            - ATI TV-Wonder 550 PCI-E
            (Server)
            - Intel Core 2 Duo E6400 @ 2.66 GHz on Asus P5L-MX
            - 2x Crucial DDR2-667 1GB
            - ATI X1900 XTX 512 MB
            - 2x Maxtor D.10 200 GB SATA

            Comment


            • #51
              Here's a nice little program that lets you see the difference between 60 fps and 30 fps: http://sdw.arsware.org/FPSCompare/
              You can adjust the fps as well.
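
              If you'd rather see the mechanics, a bare-bones frame limiter is all such a tool really needs. Here's a minimal sketch in plain Python (standard library only, names made up, not related to the program above): it paces a loop at a chosen fps and reports the spacing it actually achieved.

              import time

              def run_at(fps, frames=60):
                  # Pace a loop at roughly `fps` frames per second and measure the spacing.
                  target = 1.0 / fps
                  times = []
                  last = time.perf_counter()
                  for _ in range(frames):
                      # A real tool would draw a moving object here; we just wait out
                      # the remainder of the frame interval.
                      while time.perf_counter() - last < target:
                          pass
                      now = time.perf_counter()
                      times.append(now - last)
                      last = now
                  avg = sum(times) / len(times)
                  print(f"{fps:>3} fps target -> {1.0 / avg:6.1f} fps actual, {avg * 1000:5.2f} ms/frame")

              run_at(30)
              run_at(60)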

              Comment


              • #52
                FPS has been so drastically overrated for so long that it is all people care about. Tweaking a card to get 10 fps more is worthless in my opinion and a waste of good gaming time. Simply being able to say you get 110 fps instead of your buddy's 100 fps is worthless, since you cannot physically see any difference past 60-80 fps. Note that most XBOX and PS2 games aim for 60 fps, and they all look great.
                Fenrir(AVA)
                "Fearlessness is better then a faint-heart for any man who puts his nose out of doors.
                The length of my life and the day of my death were fated long ago"
                Anonymous lines from For Scirnis

                Comment


                • #53
                  I have blue-green eyes. I find anything under 75 Hz hard on the eyes with my monitor (though I like 85 Hz much better than even 75).

                  Also, as far as fps goes, I can easily spot when my 4600 drops from 200+ fps to 150 fps in my games. It generates a perceptible stutter in gameplay that is almost as annoying as constant stuttering at framerates below 30 fps (in first-person shooters; other game types, like role-playing games, may not be as bad). It's not that the card isn't keeping up, it's that stuttering. I've found that using vsync with a 75 Hz refresh rate, after having tweaked my game settings to make as sure as humanly possible that the card's framerate won't drop below that in any situation I'm likely to see (i.e. 75 fps in an extremely intense situation), I get absolutely smooth gameplay, barring hard drive hits or other such nuisances (rare, but these do sometimes happen).
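
                  To spell out why that minimum matters with vsync on: the card can only swap buffers on a vertical blank, so any frame that takes even slightly longer than one refresh interval gets held on screen for a whole extra refresh. A quick back-of-the-envelope sketch (plain Python, made-up render times, assuming simple double buffering):

                  import math

                  REFRESH_HZ = 75.0
                  VBLANK = 1.0 / REFRESH_HZ  # ~13.3 ms between swap opportunities

                  def displayed_durations(render_times):
                      # With vsync and simple double buffering, a frame stays on screen
                      # until the first vblank after the next frame is ready, so its
                      # on-screen time is the render time rounded UP to whole vblanks.
                      return [math.ceil(t / VBLANK) * VBLANK for t in render_times]

                  steady = [0.012] * 5                           # always under 13.3 ms
                  hiccup = [0.012, 0.012, 0.014, 0.012, 0.012]   # one frame just misses

                  for label, times in (("steady", steady), ("hiccup", hiccup)):
                      shown = displayed_durations(times)
                      print(label, [f"{1.0 / d:.1f}" for d in shown])
                  # steady: every frame displays at 75.0 (fps); in the hiccup run the slow
                  # frame is held for two refreshes, i.e. 37.5, which is the stutter.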

                  As far as what is perceived as smooth, it does matter exactly what medium I'm enjoying. Movies and such on a TV don't need the framerates I expect from an interactive high-motion game on my non-interlaced monitor.
                  "..so much for subtlety.."

                  System specs:
                  Gainward Ti4600
                  AMD Athlon XP2100+ (o.c. to 1845MHz)

                  Comment


                  • #54
                    I don't understand this at all. If your card drops from 200 to 150 fps, how can that be observable, given that 150 is most likely way more than your monitor refreshes at anyway? It can't be the drop itself that affects what you see; more likely it's the lack of frames for some period of time while the crappy card sorts itself out again, i.e. 0.2 of a second without a frame, for example. How can a card produce such results, though? Could it be that the drivers were designed only to give benchmark fps and not real-world performance?
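
                    A toy calculation shows the point (plain Python, made-up frame times): the average over a second can still look great while one long gap does all the visible damage.

                    # One second of made-up frame times: mostly 5 ms frames (200 fps
                    # territory) plus one 0.2 s gap while the card "sorts itself out".
                    frame_times = [0.005] * 160 + [0.2]

                    total = sum(frame_times)                 # 1.0 second
                    avg_fps = len(frame_times) / total       # ~161 fps on average
                    worst = max(frame_times)                 # the 200 ms gap

                    print(f"average: {avg_fps:.0f} fps over {total:.2f} s")
                    print(f"worst single frame: {worst * 1000:.0f} ms (momentarily {1.0 / worst:.0f} fps)")
                    # The average still reads ~161 fps, yet for a fifth of a second
                    # nothing new reached the screen -- that gap is what you notice.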
                    Is a flower best picked in its prime or greater withered away by time?
                    Talk about a dream, try to make it real.

                    Comment


                    • #55
                      Dunno, borat; in the few (fairly rare) instances where my G400 dropped from, say, 60 fps to 35 fps while playing UT, the effect was the same, even though at the time 35 fps was what I considered acceptably smooth.

                      All I can say for sure is that this card is maintaining a framerate in the 70s as a minimum no matter what situation I'm in, and with vsync enabled, my play is smooth. I couldn't care less about maximum framerate results, so long as I get a minimum that allows me to run a refresh rate that doesn't hurt my eyes and still have the game feel smooth. (I think most people, and therefore the manufacturers, have tended to stress max fps too much anyhow; all I really look at benchmarks for is to see whether I got an improvement in results, and higher framerates in general mean my lowest will be higher too.)
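
                      For what it's worth, checking that minimum is easy if the game can dump frame times. A small sketch (plain Python, assuming a hypothetical log with one frame time in milliseconds per line) that reports the minimum and the slowest-1% figure instead of the headline average:

                      def summarize(path, refresh_hz=75.0):
                          # Hypothetical log format: one frame time in milliseconds per line.
                          with open(path) as f:
                              ms = sorted(float(line) for line in f if line.strip())
                          fps = [1000.0 / t for t in ms]            # fastest first
                          avg = len(ms) / (sum(ms) / 1000.0)
                          low_1pct = fps[int(len(fps) * 0.99)]      # boundary of the slowest 1%
                          minimum = fps[-1]
                          print(f"avg {avg:.0f} fps, 1% low {low_1pct:.0f} fps, min {minimum:.0f} fps")
                          print("should stay vsync-smooth" if minimum >= refresh_hz
                                else "expect the occasional stutter")

                      # summarize("frametimes.log")  # hypothetical log file name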
                      "..so much for subtlety.."

                      System specs:
                      Gainward Ti4600
                      AMD Athlon XP2100+ (o.c. to 1845MHz)

                      Comment
