GeForce4 info slips.

  • #31
    Yes, and quite a cool one, too. Well, I found it cool in its time, at least...
    But we named the *dog* Indiana...
    My System
    2nd System (not for Windows lovers)
    German ATI-forum

    • #32
      Yup, a true classic.

      I completed it 2 months ago with a C64 emulator, as well as 10 years ago on my dad's C64.

      Well, I completed the first difficulty; on the second and third runs the doors/keys are different, and after that it just gets darker.

      It gets a bit repetitive in the long run, though.
      This sig is a shameless attempt to make my post look bigger.

      • #33
        Wombat
        I could be wrong on this one, but didn't someone say that Matrox's next card will be fast without the frills? If so, maybe we will see a high-end gamers' card and nothing else (with DualHead, of course).

        Regards, MD
        Interests include:
        Computing, Reading, Pubs, Restaurants, Pubs, Curries, More Pubs and more Curries

        • #34
          Hey, I want the frills. Remember, DualHead is a 'frill' when you get right down to it.

          Matrox mucked up with HeadCasting, but they hit the nail on the head with DualHead. You can never get it right 100% of the time.

          If Matrox comes out with something as cool as DualHead again, I'll be happy. They seem to have one big 'frill' with each release (not counting the G450).

          I am also hoping they will up the supported resolution on the DVI connection. About time for a dual-link one, I think.
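
          (For context: single-link DVI tops out at a 165MHz pixel clock, which is what caps the resolution. A rough back-of-the-envelope check in Python; the 165MHz limit is the DVI spec figure, while the ~25% blanking overhead and the mode list are just illustrative:)

              # Which modes fit through single-link DVI (165 MHz max pixel clock)?
              # The 1.25 factor approximates blanking overhead; real timings vary.
              SINGLE_LINK_MHZ = 165

              def pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
                  return width * height * refresh_hz * blanking / 1e6

              for w, h in [(1600, 1200), (1920, 1440), (2048, 1536)]:
                  clk = pixel_clock_mhz(w, h, 60)
                  verdict = "fits" if clk <= SINGLE_LINK_MHZ else "needs dual link"
                  print(f"{w}x{h}@60Hz -> ~{clk:.0f} MHz ({verdict})")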

          Since the big M also makes (made?) network cards, I wouldn't be surprised to see an RJ45 connector on the back of my next graphics card. Haig mentioned something about guessing what connectors would be on the card, and that's my guess.

          Ali

          • #35
            I would expect that Matrox will at least provide component video output if they're still providing TV-out support. I'm more interested in seeing them continue to enhance their dual-monitor and TV-out support than in briefly claiming the title of fastest gaming card.
            The world just changed, Sep. 11, 2001

            • #36
              Back to the subject.

              There were some specs reported at http://www.xbitlabs.com/news/story.html?id=1011575285 but apparently they have already been removed.

              But here they are again:

              As we (and not only we) have already told you, NVIDIA will launch its new solution for hardcore gamers – the GeForce4 chip, aka NV25 – in the beginning of February: to be more exact, on February 5 in the USA and some time on February 6 in Europe. Today we have the opportunity to tell you a bit more about this new product.
              Just as with the Titanium chip family, graphics card makers are most likely to launch several NV25-based products at once, which will differ in chip and memory clock frequencies as well as in the amount of graphics memory used. We will focus here on the fastest model of this family, as we believe it deserves the most attention and showcases all the peculiarities of the NV25 architecture.
              So, the new NVIDIA graphics chip, just like GeForce3, will be manufactured on a 0.15-micron process. However, the number of transistors will reach 63 million, and the clock frequency will be increased to 300MHz. Here a natural question arises: how did NVIDIA manage to increase the chip frequency so greatly without shifting to a finer manufacturing process, especially since NV25 should be architecturally more complex than GeForce3? The answer is pretty original: NV25 will come in a new package – PBGA with an integrated heatsink. In other words, NVIDIA decided to enhance heat dissipation in order to be able to speed up its chips. For the same purpose, NVIDIA will equip its NV25 chips with a special thermal control system.
              NVIDIA will increase the memory (DDR SDRAM) frequency as well: it will run at an effective 650MHz on the fastest solutions, putting the memory bus bandwidth at 10.4GB/sec. Memory chips used on the graphics cards (at least those following the reference design) will feature BGA packaging. NV25-based graphics cards will get up to 128MB of DDR SDRAM.
              NV25 will feature 4 rendering pipelines with 2 TMUs each, which makes the new chip similar to GeForce3 in this respect. However, due to the higher clock frequencies and improved technologies, the fillrate is claimed to reach 4.9 billion AA samples/sec.
              Besides the significantly higher clock frequencies, NV25 will boast some fresh new features and technologies, though it will remain very much like its predecessor, GeForce3. First of all, we have to mention Lightspeed Memory Architecture II, a technology aimed at unloading the memory bus. It includes lossless 4:1 Z-buffer compression, 2nd-generation Occlusion Culling, and a Quadcache architecture, which saves even more memory bandwidth.
              Thanks to another significant increase in performance and memory bus efficiency, NVIDIA managed to implement a new, more advanced anti-aliasing method in NV25. It is called Accuview AA. This method uses a new subpixel location mask and NVIDIA's patented algorithm, optimized for the best performance and quality.
              As for the T&L unit, it has changed significantly compared with that of GeForce3. It is now called nFinite FX II and features 2 vertex shader pipelines. Together with the higher clock frequency, this will triple vertex shader performance in applications that use the second pipeline, making NV25 three times as fast as GeForce3 there. Moreover, the increased chip frequency will speed up pixel shader operations by 50%.
              Besides that, NVIDIA also claims that its new solution will support a Z-Correct bump mapping feature, which is still a mystery to all of us. We will not speculate here, as you will learn everything very soon: on February 5.
              NVIDIA's performance solutions have now also got dual-monitor support: NV25 will boast nView technology, the next stage in the TwinView evolution. It will have richer features and will be much easier to use.
              Attention! This news story will soon be removed at NVIDIA's request.
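
              (For what it's worth, the quoted figures are internally consistent. A quick sanity check in Python; note that the 128-bit bus width is an assumption carried over from GeForce3, since the article only gives the clocks, and the samples-per-pixel number is derived, not stated:)

                  # Sanity check of the quoted NV25 numbers.
                  # The 128-bit bus width is assumed (same as GeForce3); the
                  # article only states the effective memory clock.
                  MEM_DDR_MHZ = 650       # effective DDR data rate from the article
                  BUS_WIDTH_BITS = 128    # assumption, not from the article

                  bandwidth = BUS_WIDTH_BITS / 8 * MEM_DDR_MHZ * 1e6 / 1e9
                  print(f"Memory bandwidth: {bandwidth:.1f} GB/s")   # -> 10.4 GB/s

                  # 4 pipelines at 300MHz give 1.2 Gpixel/s; the quoted 4.9 billion
                  # AA samples/sec then implies roughly 4 AA samples per pixel.
                  pixel_rate = 4 * 300e6
                  print(f"Pixel fillrate: {pixel_rate / 1e9:.1f} Gpixel/s")
                  print(f"Implied AA samples/pixel: {4.9e9 / pixel_rate:.1f}")
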
              Main: Dual Xeon LV2.4Ghz@3.1Ghz | 3X21" | NVidia 6800 | 2Gb DDR | SCSI
              Second: Dual PIII 1GHz | 21" Monitor | G200MMS + Quadro 2 Pro | 512MB ECC SDRAM | SCSI
              Third: Apple G4 450Mhz | 21" Monitor | Radeon 8500 | 1,5Gb SDRAM | SCSI

              • #37
                Take a look at this: [linked image/attachment not preserved]

                This actually looks like it could be true. Imagine the performance of that FireGL 8800, if the info is correct.

                • #38
                  On some sites (like Beyond3D) most people think that Reactor Critical is the No. 1 rumour source, but does it have any reliability?

                  The answer is simple: NO!

                  So, no need to worry.

                  EDIT: and even if it turns out to be true, that 256-bit memory bus will make it too expensive for gamers. It will be competing against the Quadro DCC (which costs about 1000 € here in Finland).

                  And I think that Matrox will have about the same memory bandwidth, but at a cheaper price.
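
                  (To put rough numbers on that: bandwidth is just bus width times effective memory clock, so a 256-bit bus needs only half the clock to match a 128-bit one. A minimal Python sketch; the clock figures are illustrative, not leaked specs:)

                      # bandwidth (GB/s) = (bus_bits / 8) bytes * effective DDR rate
                      def bandwidth_gb_s(bus_bits, ddr_mhz):
                          return bus_bits / 8 * ddr_mhz * 1e6 / 1e9

                      print(bandwidth_gb_s(128, 650))  # fast DDR, narrow bus -> 10.4
                      print(bandwidth_gb_s(256, 325))  # slower DDR, wide bus -> 10.4
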
                  Last edited by Nappe1; 21 January 2002, 16:35.
                  "Dippadai"

                  • #39
                    I know it will cost that much; it is, after all, aimed at workstation-level graphics.

                    But a 256-bit bus would be very special, although expensive.

                    • #40
                      Originally Posted by Ben6 at Beyond3D thread: http://www.beyond3d.com/messageview....&threadid=1915
                      Got the answer from a buddy of mine who's on the FireGL driver team at ATI. The FireGL 8800 has a 128-bit DDR memory bus. Their FireGL 4 has the 256-bit DDR bus...
                      So that's what it was all about. AFAIK, the FireGL 4 is based on some very fast rendering core and doesn't have much to do with ATI's current chips.
                      "Dippadai"

                      • #41
                        I guess I fell into the rumour trap. That's the first time in a loooong time.

                        • #42
                          Originally posted by cbman
                          Something about, "A whole new XPeriance, It will hit you harder and faster than Montezuma's Revenge will on your first trip to Mexico"
                          Montezuma's revenge is a nickname for the illness also known as traveler's diarrhea... but no doubt that catches you pretty hard on your first trip to Mexico, too.

                          • #43
                            I'm still not sure that Matrox can get a product through their R&D pipeline unscathed after recent fiascos, but I don't doubt what Matrox R&D is capable of, and I think they may have outdone themselves this time. Only time will tell.


                            Originally posted by Novdid
                            Ant, it sounds like you have very high expectations on the new chip from M, considering what I've seen in your posts lately.

                            • #44
                              Why do you call the G550 a fiasco, Ant?
                              It was a business card, not a card for gaming.

                              I personally think we will have the TNT2 vs. G400 feeling again.
                              Raw system specs vs. performance, quality & longevity.
                              Or can someone tell me which card other than the G400 lived *that* long in a gaming machine?...

                              ATI is cheap crap and will always be cheap crap (except in a Macintosh, that is)! And NVIDIA joined them some time ago.

                              And both their drivers are bad. I've tried both!

                              I just hope the new card has DualHead and 400+ fps in Quake3 (the most stupid game of all time)!
                              The kids love this.
                              There is no weakness, but to cringe and despair because one thinks oneself weak.
                              For so long as one's will is undefeated one is strong, for so long as the desire for revenge still endures.
                              -Tom Holland, Deliver us from Evil

                              • #45
                                I personally think we will have the TNT2 vs. G400 feeling again.
                                I think that is a good view. At that time Matrox was behind NVIDIA in the performance stakes (although not by as much as now), and Matrox came up with a great new product with the best performance and lots of new features, including one revolutionary one (DualHead).

                                Let's hope Matrox can do it again.

                                Personally, I'll put my money on a DX9 card with displacement mapping, about GF4 performance, and improved DualHead. I also expect a surprise on the features list.
                                Dell Inspiron 8200
                                Pentium4m 1.6
                                640mb pc2100
                                64mb gf440go
                                15" uxga ultrasharp
                                40gb 5400rpm hdd 16mb cache
