Ack! Star Trek Voyager uses T&L!!!


  • Ack! Star Trek Voyager uses T&L!!!

    Hey,

    nVidia just released a list of games that use T&L. Most of them I don't care about... except one: Star Trek Voyager: Elite Force. I'm really looking forward to this game. It uses the Quake 3 engine, and the modelers have mapped out pretty much all of Voyager, and you get to play on a Borg cube.

    I really don't want a GeForce... but I'm really looking forward to this game. Hmmm... maybe the G400 OpenGL drivers will be really optimized by the time this game comes out.

    later

    ------------------

    Abit BH6
    Celeron 450
    Matrox G400 32mb "MAX"
    256MB PC100 RAM
    IBM 10GB 7200rpm HDD
    Creative Labs DVD 5x
    Mitsumi 4x/2x/8x CD-RW
    Monster Sound MX300
    USR 56K Modem
    ADi 6P (19" Monitor)
    Windows ME

  • #2
    I think the makers of the game would be kicking themselves in the rear end if they made their game T&L-only (GeForce and S2000); that would mean it would only work on two cards, and that's just not good business sense... Some Q2-based games work on the TurboGL, so maybe this game will too, seeing as it will be based on Q3.

    ------------------
    "Who could have fragged you without even hearing him, that would be me!!!"

    -Me
    "Who could have fragged you without even hearing him, that would be me!!!"

    -Me



    • #3
      Don't be too worried about this claim, guys. OpenGL inherently supports any T&L hardware, so any game that uses OpenGL should be able to take advantage of the GeForce's hardware. That means nVidia is likely to stick any OpenGL-based game on the list if they expect it to be extremely popular. It's just hype, though. Take a look at the actual benchmarks for the current Q3Test 1.08 on the GeForce. They're okay, but not really sky-high compared to TNT2s or G400s.
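
      To illustrate (just a rough sketch, not from any actual game; plain GL 1.x calls with made-up values): an app that hands OpenGL untransformed vertices and lets GL do the matrix and lighting work gives the driver the chance to push that work onto T&L hardware, without the game changing at all.

          /* Rough sketch: let OpenGL do transform & lighting instead of
             the CPU. Fixed-function GL 1.x; all values are illustrative. */
          #include <GL/gl.h>

          void draw_lit_triangle(void)
          {
              /* The modelview matrix is applied by GL (and by T&L
                 hardware behind the driver, if present), not by us. */
              glMatrixMode(GL_MODELVIEW);
              glLoadIdentity();
              glTranslatef(0.0f, 0.0f, -5.0f);
              glRotatef(30.0f, 0.0f, 1.0f, 0.0f);

              /* Per-vertex lighting is also done by GL, not the app. */
              glEnable(GL_LIGHTING);
              glEnable(GL_LIGHT0);

              /* Submit raw, untransformed object-space vertices. */
              glBegin(GL_TRIANGLES);
                  glNormal3f(0.0f, 0.0f, 1.0f);
                  glVertex3f(-1.0f, -1.0f, 0.0f);
                  glVertex3f( 1.0f, -1.0f, 0.0f);
                  glVertex3f( 0.0f,  1.0f, 0.0f);
              glEnd();
          }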

      It's just nVidia hype!

      ------------------
      Ace

      D@$#n typos.

      [This message has been edited by Ace (edited 20 October 1999).]
      "..so much for subtlety.."

      System specs:
      Gainward Ti4600
      AMD Athlon XP2100+ (o.c. to 1845MHz)



      • #4
        Ace

        You're one of those guys who never gets the point! The GeForce Q3Test benchmarks are not much higher than the TNT2/G400 benchmarks? Right, and do you know why? Because the GeForce doesn't have a hugely increased fillrate compared with the TNT2/G400; its main improvement is the T&L. And you won't see any fps increase in games that don't support T&L... but you WILL see visual improvement in games that do support T&L! That's the point... Who the hell wants 200fps in Quake 2? Who the hell wants 150fps in Q3Test? Not me! But I DO WANT 3D worlds which look better and more realistic, running at about 50fps!

        btw: I've had a G200 for a year now, and Matrox pissed me off with its non-existent driver support. That's why I bought myself a TNT2 Ultra instead of the G400... there you go, Matrox, good job!

        (Being Swiss, my English may not be as good as yours, but is your French as good as mine?)

        greetings



        • #5
          Forgive me for offending your delicate sensibilities, NoClue, but you really need to get one. From everything I've seen, other than the pretty "Tree" demo, the GeFarce offers very little improvement in performance over the G400 series, and NONE of it is in the image-quality area. I already HAD a clue. The G400 series currently produces the best 2D or 3D images, bar none. The GeForce comes much closer, and is just a little faster in some situations, which is why I can get away with making a statement like my last response. What's your excuse?

          ------------------
          Ace
          "..so much for subtlety.."

          System specs:
          Gainward Ti4600
          AMD Athlon XP2100+ (o.c. to 1845MHz)



          • #6
            Sorry about that last response. It just pisses me off to get smarta$$ed comments to legitimate replies. If you reread my first response, you'd see that my point is that the Star Trek Voyager game is based on Q3's engine, hence it's OpenGL, and so will support any T&L acceleration offered by any accelerator, not just the GeForce. That doesn't mean the game will look or run any less well on someone's G400 or TNT2. In fact, given the current experience with Q3Test 1.08, I'd guess that most of us can run the game with the same image-quality settings used on the GeForce (I do) with our G400s and still get 30+ FPS minimum framerates.

            What this means to me is exactly what you say I don't "get", NoClue: playable framerates, but with TOTAL eye-candy turned on. I don't care about astronomical framerates either, just playable ones. If I wanted those maximum rates, I'd have the V3 3500 right now. Instead I bought the Max, even though it may not hit 90 FPS in Q3, because I get the BEST picture of any card currently on the market, including the GeForce.

            ------------------
            Ace
            "..so much for subtlety.."

            System specs:
            Gainward Ti4600
            AMD Athlon XP2100+ (o.c. to 1845MHz)



            • #7
              OK Ace, I didn't mean to offend you in any way, but I'm really getting tired of all the people calling nVidia and the GeForce "hype-only"-ware even though they don't get the point of this card.
              You may be right in saying that the G400 has the best picture quality of all 3D cards available at the moment. But who cares about so-called "eye-candy" if your fps drops below 25fps all the time? You'd have to disable that extra eye-candy anyway (I'm talking about EMBM)!
              And when it comes to driver support, Matrox is just worth shxt! Let's talk about OpenGL (I know this is getting old in this BBS): OGL just sucks on the G400 series compared to the ICD of the TNT2s. OK, there's a TurboGL now, but it only works on P3/K7!! Goddammit Matrox, do you think I'll go and buy a new processor just because you're not able to write an OpenGL driver that works well with all processors??? Forget it! That's why I bought a TNT2U; it has rock-stable OpenGL, excellent D3D and even Glide emulation that works at an acceptable speed!
              To go back to the GeForce: you're right, with a P3/K7 + G400 you get about the same performance out of Q3Test 1.08 as with a GeForce. BUT THIS STILL IS NOT THE POINT! Q3Test 1.08 is still a TEST, not the final version. The final version of Q3 Arena will most probably have special T&L support for the GeForce cards that will make this game look just awesome on a GeForce system! And you have NO CHANCE of getting that extra eye-candy on your G400! So stop talking shit about the GeForce and wait till you see the first screenshots of T&L-optimized games; you'll drop your G400 immediately for such a HypeForce if you're really into eye-candy!

              greetings
              NoClue



              • #8
                NoClue,

                You don't seem to get it, do you? If there are no games out right now that can benefit from the GeSpot's features, then it is HYPE! Until these mystery games start appearing on our shelves, and the GeSpot is available, it is nothing more than an overpriced TNT2U. People are better off sticking with what they have rather than going out and buying one of these beasts right away, because we have no idea how well it will be supported and how well it will actually perform with their so-called super-enhanced Tilt and Lick games. Does that make any sense at all, or do you just go along with whatever a video card company tells you? Because right now that is all we have to go on. You'd think that, with how pissed off you are at Matrox over the whole G200 ICD thing, you would not be so quick to jump onto the hype bandwagon.

                Rags



                [This message has been edited by Rags (edited 21 October 1999).]



                • #9
                  Chill out, NoClue. Matrox is still improving their drivers, and so far, other than Tribes, there's no game that I can't play very well on my card right now, and with a better picture than any other card out there. It's probably going to be at least six months before titles that make my card obsolete (and the hardware to show it) hit the market. So those of us who don't share your "nVidia dropped the bomb" view of the GeForce can still be happy, and so can you. There isn't anything wrong with the GeForce, BTW; it's just not enough to justify changing cards so soon, that's all. nVidia's page makes it sound like those games will look better on their card than on anyone else's. That's just hype, for now. Only the tree demo they use shows any major differences between cards right now, and that likely won't change for a while.

                  Add to that the fact that the GeForce is turning out to be surprisingly CPU dependent, and you end up in the same sort of situation G400 owners are in now. Give Matrox another few months, and the G400s should be performing as well as, or better than, most TNT2's on most systems above Pentium class, IMHO. They're just behind in the OpenGL driver development arena is all. nVidia started sooner, so they have a more mature driver base to work from, but Matrox is slowly (actually, pretty quickly now, from what I've been seeing) getting there.

                  BTW: the original post as I started it was much more scathing than this, but I like to think we're all more adult than that. Let's just agree that both cards are okay, and that Matrox needs better drivers, while nVidia (at least with the GeForce) needs more games that actually have MUCH more detail to take advantage of their new card.
                  ------------------
                  Ace


                  [This message has been edited by Ace (edited 21 October 1999).]
                  "..so much for subtlety.."

                  System specs:
                  Gainward Ti4600
                  AMD Athlon XP2100+ (o.c. to 1845MHz)



                  • #10
                    Wow... sorry guys. Didn't mean to start such a big argument.

                    Anyway, I just wanted to say that I was at some webpage showing off the GeForce's ability to do "curves" in Q3. And I have to say that after looking at before-and-after pictures, I'm really, really not impressed. Honestly, I was flipping back and forth between these two pictures and it took me about 3 whole minutes before I saw the difference. And yes... there definitely were some curves there.
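
                    (For what it's worth, from what I've read, Q3's "curves" are Bezier patches that the engine tessellates into triangles, so the more it subdivides, the smoother the curve; that would explain why the difference is so subtle in stills. Here's a rough sketch of the same idea using plain GL 1.x evaluators; the control points are made up:)

                        /* Rough sketch: draw a quadratic Bezier patch with
                           GL 1.x evaluators. Q3 tessellates its patches in
                           the engine; this just shows the same idea. The
                           control points are made up. */
                        #include <GL/gl.h>

                        static const GLfloat ctrl[3][3][3] = {
                            {{-1,-1,0}, {0,-1,1}, {1,-1,0}},
                            {{-1, 0,1}, {0, 0,2}, {1, 0,1}},
                            {{-1, 1,0}, {0, 1,1}, {1, 1,0}},
                        };

                        void draw_patch(int subdivisions)
                        {
                            /* Hand GL the 3x3 grid of control points. */
                            glMap2f(GL_MAP2_VERTEX_3,
                                    0.0f, 1.0f, 3, 3,  /* u range, stride, order */
                                    0.0f, 1.0f, 9, 3,  /* v range, stride, order */
                                    &ctrl[0][0][0]);
                            glEnable(GL_MAP2_VERTEX_3);

                            /* More subdivisions = smoother curve (and more
                               triangles for the card to push). */
                            glMapGrid2f(subdivisions, 0.0f, 1.0f,
                                        subdivisions, 0.0f, 1.0f);
                            glEvalMesh2(GL_FILL, 0, subdivisions,
                                        0, subdivisions);
                        }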

                    But here's my thing: if I have to stare at a pic that long to see the difference... how much of it will I notice in a game of, say, Q3, running around trying to stay alive and kill everyone else? I probably won't notice much, if any, of it.

                    On this same page they rendered the same scene as if full-screen anti-aliasing were activated (a la Voodoo 4). Now, if this is what the Voodoo 4 is going to do... I'm kind of impressed. It was very, very noticeable. But at the same time... I didn't quite like it. It looked kinda... "cartoony". But I'll say this: no matter how many screenshots of a game I see on the web, it's never the same as seeing it in person. So maybe I'll like the anti-aliasing thing more when I see it in motion. Maybe I'll like the T&L thing more when I see games use it more.

                    But mostly I was thinking, "Man, I can't wait to get a card that has T&L AND full-scene anti-aliasing." :::droooool::: Hello? G-2000?

                    Later

                    ------------------

                    Abit BH6
                    Celeron 450
                    Matrox G400 32mb "MAX"
                    256MB PC100 RAM
                    IBM 10GB 7200rpm HDD
                    Creative Labs DVD 5x
                    Mitsumi 4x/2x/8x CD-RW
                    Monster Sound MX300
                    USR 56K Modem
                    ADi 6P (19" Monitor)
                    Windows ME



                    • #11
                      hmmmmm

                      "Chill out, NoClue. Matrox is still improving their drivers, and so far, other than Tribes, there's no game that I can't play very well on my card right now, and with a better picture than any other card out there. "

                      The TNT plays every game... and I'm sure you won't see a difference in picture quality.
                      btw: picture quality depends on the graphics card/monitor combination. I can't explain this in English because it's too technical (if you understand German, tell me and I'll explain it here). Some graphics cards produce a better picture on some monitors, while other cards produce a better picture on another monitor (and I'm not talking about size). So stop that "the G400 has the best picture quality" stuff, because it's just not true. The G400 may produce a sharp and clear picture, but other cards do that as well.

                      "It's probably going to be at least six months before titles that make my card obsolete (and the hardware to show it) hit the market. So those of us that don't share your "nVidia dropped the bomb" view of the GeForce can still be happy, and so can you. There isn't anything wrong with the GeForce BTW, it's just not enough to justify a change of my cards so soon, that's all. nVidia's page makes it sound like those games will look better on their card than anyone else's. That's just hype, for now. Only the tree demo they use shows any major differences between cards right now, and that likely won't change for a while."

                      I agree with you.
                      And I don't have a "nVidia dropped the bomb" view. Actually, I wouldn't buy this card at the moment either, because it is the first of its type and many others will follow with better specs. And by then there will be games that justify changing to a T&L card. But someone has to take the first step, and this time it was nVidia. And if they're a bit over-enthusiastic (does that word exist?), I don't really mind. You can call that hype or whatever you want; it doesn't change the fact that nVidia did a great job introducing a T&L card. I'm really looking forward to seeing the games that make full use of the T&L engine!

                      "Add to that the fact that the GeForce is turning out to be surprisingly CPU dependent, and you end up in the same sort of situation G400 owners are in now."

                      Not true! The GeForce is not as CPU-dependent as the G400. Actually, your CPU doesn't have to do the transform and lighting calculations if the game engine supports T&L. Correct me if I'm wrong here.
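
                      (What I mean, as a rough sketch with made-up data in plain GL 1.x: an engine can either burn CPU time transforming and lighting every vertex itself, or hand GL the raw vertices and normals and let the driver, plus T&L hardware if it's there, do that work.)

                          #include <GL/gl.h>

                          /* Software path: the CPU has already transformed
                             and lit every vertex; GL just passes the results
                             through (assumes modelview/projection are set up
                             as identity/ortho so GL leaves them alone). */
                          void submit_cpu_path(const float *verts,
                                               const float *colors, int n)
                          {
                              int i;
                              glBegin(GL_TRIANGLES);
                              for (i = 0; i < n; i++) {
                                  glColor3fv(&colors[i * 3]);  /* lit on the CPU */
                                  glVertex3fv(&verts[i * 3]);  /* transformed on the CPU */
                              }
                              glEnd();
                          }

                          /* T&L-friendly path: raw object-space vertices plus
                             normals; GL (and hardware T&L, if present) does
                             the transform and lighting instead of the CPU. */
                          void submit_tnl_path(const float *verts,
                                               const float *normals, int n)
                          {
                              int i;
                              glEnable(GL_LIGHTING);
                              glEnable(GL_LIGHT0);
                              glBegin(GL_TRIANGLES);
                              for (i = 0; i < n; i++) {
                                  glNormal3fv(&normals[i * 3]);
                                  glVertex3fv(&verts[i * 3]);
                              }
                              glEnd();
                          }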

                      "Give Matrox another few months, and the G400s should be performing as well as, or better than, most TNT2's on most systems above Pentium class, IMHO."

                      Now this really makes me laugh! Didn't I hear the same thing a year ago with the G200? Dammit, in a few months the G400 will be an OLD CARD! And if Matrox only gives its cards mature drivers when they're OLD... oh well, maybe I should put my G200 back in my system in about 3 or 4 months, because by then I'll get the final drivers and a stable OGL ICD for it?

                      greetings
                      NoClue



                      • #12
                        Well NoClue, I'm tired of arguing with you, so this is my last response on this subject. I'm saving the thing that really annoyed me in this whole conversation for last; the other stuff's just different opinions, as far as I'm concerned.

                        It's true that pairing a good card with a bad monitor (G400/Hyundai) and a poorer card with a better monitor (TNT/Sony) can cause the cards to look better or worse. However, connect both cards to the SAME good monitor (Sony F500, for instance), and the monitor becomes a non-factor. Deny it all you want, Matrox still has the best image quality going. Sorry, but I don't understand German, or I'd have you send me that explanation via e-mail. Not that important.

                        The TNT should play every game out there. nVidia's been making real 3D cards longer than Matrox (kudos to nVidia for this one), and the one game I can't play right now on the G400 was 'specially' written to run on the TNT series, since it wouldn't run before (Glide-only, at one time, dontcha' know).

                        I agree that since the GeForce does T&L in hardware, the card should be less CPU dependent. Funny thing is, from what I've seen of current benchmarks, it's way more CPU dependent than it should be.

                        I think your problems with the G200 and the ICD fiasco have really turned you anti-Matrox. In effect, it's created a custom set of blinders for you. Too bad. I used to feel the same way about another company, 'til I bought one of the cards based on one of their last chipsets. I was impressed. Holding grudges or prejudging these companies can make you miss out on some good stuff, and it's more your loss than the company in question.

                        Anyhow, here's what really irritated me in my discussion with you. You seem to be misunderstanding the point I was making to the original post, which was simply that the reason this game appears on nVidia's GeForce T&L support list is that it's OpenGL-based. It didn't need special work to get that support; that is provided courtesy of the engine / API it was based on. THAT makes it hype by nVidia. Not that it's false or anything; it just might sound a little like the game was especially modified for that chipset. I assure you, that's not the case.

                        You say you don't mind them being over-enthusiastic (yes, it's a word). Neither do I. I was just pointing out that just because a game appears on their list doesn't mean that other cards won't run it with just as pretty a picture. In fact, the only real advantage for the GeForce that I've seen so far involves games that support T&L (DX7 titles specially coded to include support, or any OpenGL title) running at ultra-high resolutions (1280+). The only problem with that is that you then need a very good monitor to get those resolutions at decent refresh rates. Oh, that and the fact that the G400 series also seems to like that same area (admittedly, not as well as the GeForce does).

                        Get all that? Now let's call a truce, okay? I like my Matrox, as I liked (and still like) my Diamond V2 12MB and my Creative TNT AGP (in my old box, in fact, waiting for me to buy a newer monitor for this system so I can put this one on the old box, as well as some memory). I think we all need to keep our minds open to new products. Just don't let the marketing sell you something that doesn't really benefit you.

                        ------------------
                        Ace


                        [This message has been edited by Ace (edited 22 October 1999).]
                        "..so much for subtlety.."

                        System specs:
                        Gainward Ti4600
                        AMD Athlon XP2100+ (o.c. to 1845MHz)

