
Half-Life: Counter-Strike


  • #16
    Back when I used to play Counter-Strike (~1.5 years ago), my P3-700 and G400 MAX would run at roughly 40-60 fps @ 1024x768. I'm going to attach the autoexec.cfg file I used to use, so you guys can look at it and see what settings I was running.

    I agree that running HL at 16-bit is definitely better (speed-wise) than running at 32-bit.
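
    The attachment itself isn't reproduced inline here, but as a rough sketch of the kind of performance-oriented settings such a config contained at the time (these are standard GoldSrc cvars; the values below are illustrative, not the actual attachment):

        // hedged example -- typical GoldSrc/CS performance cvars, values illustrative
        fps_max "100"        // cap the frame rate
        rate "9999"          // net bandwidth, a common broadband value of the era
        cl_cmdrate "30"      // client commands sent to the server per second
        cl_updaterate "30"   // world updates requested from the server per second
        gl_playermip "1"     // reduce player-model texture detail
        max_shells "0"       // don't render ejected shell casings
        max_smokepuffs "0"   // don't render bullet-impact smoke puffs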
    Attached Files
    Last edited by mmp121; 18 February 2003, 21:19.
    Go Bunny GO!


    Titan:
    MSI NEO2-FISR | Intel P4-3.0C | 1024MB Corsair TWINX1024 3200LLPT RAM | ATI AIW 9700 Pro | Dell P780 @ 1024x768x32 | Turtle Beach Santa Cruz | Sony DRU-500A DVD-R/-RW/+R/+RW | WDC 100GB [C:] | WDC 100GB [D:] | Logitech MX-700

    Mini:
    Shuttle SB51G XPC | Intel P4 2.4Ghz | Matrox G400MAX | 512 MB Crucial DDR333 RAM | CD-RW/DVD-ROM | Seagate 80GB [C:] | Logitech Cordless Elite Duo

    Server:
    Abit BE6-II | Intel PIII 450Mhz | Matrox Millennium II PCI | 256 MB Crucial PC133 RAM | WDC 6GB [C:] | WDC 200GB [E:] | WDC 160GB [F:] | WDC 250GB [G:]

    Comment


    • #17
      Yeah, I run every game in 16-bit on all graphics cards; I just can't tell a worthwhile difference.

      Comment


      • #18
        Originally posted by crow8k3a
        Yeah, I run every game in 16-bit on all graphics cards; I just can't tell a worthwhile difference.
        You have obviously never played BF1942.

        Comment


        • #19
          I run every game at 32-bit, and I can tell the difference.

          HL in 32-bit looks noticeably better on an ATI 8500. I know the textures etc. are only 16-bit, but with lighting and alpha stuff like smoke and rain, it is very noticeable.

          I don't know how well it works on the Parhelia.

          Comment


          • #20
            Originally posted by crow8k3a
            Yeah, I run every game in 16-bit on all graphics cards; I just can't tell a worthwhile difference.
            Either it's your monitor or your eyes that aren't working properly.

            Comment


            • #21
              I can tell a slight difference, but to me it's not a worthwhile difference. If you like it, then cool, but to me it doesn't matter at all. Actually, yeah, I do have Battlefield 1942; it plays great and looks great.

              Comment


              • #22
                Well, I was going to switch to 32-bit color in CS to compare the performance hit. It seems I just assumed I was running in 16-bit because that's what I usually run for all games, but I can't even find a color-depth option in CS. Anyone know how to change it, or to see what it's set at?

                Comment


                • #23
                  Use -32bpp as a command-line parameter in the shortcut.
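
                  For example, assuming a default Sierra install path (the path here is hypothetical; -game cstrike and -32bpp are standard GoldSrc launch options), the shortcut's Target line would look like:

                      "C:\Sierra\Half-Life\hl.exe" -game cstrike -32bpp

                  Use -16bpp instead to force 16-bit color.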

                  Comment


                  • #24
                    Yeah, 32-bit is slower, but I took screenshots and I can't tell any difference between 32 and 16 at all, so I'd just run at 16.

                    Comment


                    • #25
                      I wanted to change my refresh rate, but I can't find the option, so I'll just run at 60.

                      Comment


                      • #26
                        The game takes it from Windows. Set the refresh rate for the resolution the game runs in, and it's supposed to pick it up. In XP, you may have to use a utility such as RefreshLock or RefreshForce to set the rate correctly - although it seems that the Parhelia just ignores you anyway.
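
                        As a side note, GoldSrc also accepts a -freq launch option to request a refresh rate directly, which is worth trying if the Windows route fails (the path and the 85 Hz value below are just illustrative):

                            "C:\Sierra\Half-Life\hl.exe" -game cstrike -freq 85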
                        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

                        Comment


                        • #27
                          You know, if there's one thing I didn't have any problems with on the Parhelia in XP, it was refresh rates. To be honest, I've never had refresh-rate problems with any graphics card in XP. In fact, it was easier on the Parhelia, since it had its own built-in tool in PowerDesk.

                          Comment


                          • #28
                            Yeah, my Parhelia ignored that too. It always ran Half-Life at 60 Hz. It handled D3D games properly, but not HL.
                            Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

                            Comment


                            • #29
                              Not to be a smartass or just plain mean... but why would you buy a Parhelia and run it in 16-bit color? Hell, one of the major selling points of the card was that it had *better* than 32-bit color (and yes, I can even notice a difference going to GigaColor in some games).

                              A GeForce2 should be able to get the same level of performance in those games under 16-bit. Perhaps even a GeForce256 DDR.

                              Hell, a GeForce3 is a two-year-old graphics card now, and it provides better performance than a brand-new $400 card in 3D apps... and does it at a higher color depth... that's rather insulting to me...

                              Personally, I think any of the Unreal-engine games look bad in 16-bit color, and even 32-bit color doesn't fix it all... especially with fog effects or transparent sprites. Deus Ex is a wonderful example of this; GigaColor actually provides a noticeable improvement in quality. Hell, I remember forcing 32-bit color with the Voodoo5 (the only video card at the time that could do it besides the G400s... and Glide was way faster) on Unreal-engine games and marveling at how much better it looked than 16-bit color.

                              Quake 3-engine games definitely look better than Unreal-engine games in 16-bit color; however, 32-bit mode still improves quality.
                              "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

                              Comment


                              • #30
                                ghost,

                                Well, to me the color is more important for work than for gaming.

                                The main selling point to me is that it supports three displays; this helps tremendously when working in two or three programs at once, especially if you are trying to replicate something viewed in another program or two.

                                I love gaming, but if I can make a game run faster, the first thing I'm going to do is turn down the color depth. If I look at a still image I can see the difference in games, but when you're running around, 32-bit color is the first thing I'll turn down, and to be honest, it doesn't even bother me. In Unreal 2, I don't notice much difference between 16-bit and 32-bit. It comes down to personal preference: if you don't mind the game running slower and want better dithering of colors etc., then leave it on, but I usually turn it off unless the game runs superbly.

                                I won't disagree: turning on 32-bit does increase the quality.

                                Comment
