Matrox & Creative Labs = a couple HALF-ASS driver packages!


  • #31
    Thundrchez, you lost me with that bit about NT being fundamentally a one-person system. NT has had terminal-services facilities since Citrix started playing with NT 3.5. Since then, MS has done a version of NT 4 Server that operates as a terminal server, and W2K has Terminal Services as a core service on all flavours of server.

    But that is beside the point: a network-aware application can report and display its results to any client on the network if it is written to do so. It's just a Unix quirk that applications drop the "client" out of the client-server design and make everyone log on to the server to see what's happening. Sure, that has some advantages from a support point of view (only one computer to keep working), but there is no magic about it.

    Paul



    • #32
      "no LOGICAL reason to be upset with Microsoft TECHNOLOGICALLY"

      I don't know if "upset" is the right word. Windows is annoyingly legacy-bound (DOS shouldn't even be part of the lexicon by now, and neither should FAT); it's the result of a lack of a no-guts/no-glory attitude on the part of MS. They should have pushed NT a lot harder, IMO - that's one thing among a million others (the use of MFC42.dll as a game of drawing straws to see who has the smallest version, etc.).

      I don't know if you consider legacy to be a technological issue, but there is lots of room to be critical of MS; they lack leadership beyond what it takes to make a buck. I get the feeling sometimes that they are simply writing code for use inside MS headquarters, and if it fits the rest of the world, it heads to marketing with a different number added to the word Windows. I mean, if you are sitting on a dragon's hoard of cash and you still don't like taking any risks, what's wrong with you? About the only interesting thing I've seen from them lately has been on the hardware front with the Xbox; they are wimps when it comes to the OS.



      • #33
        I agree, Himself. The problem with Windows lies in its slavery to supporting EVERY POS piece of hardware under the sun... By having to support everything, the OS is bloated and buggy. From what I hear, Windows is a joy to program for, and personally I find it fast. But the stability just sucks (duh). If we could just stop using old 16-bit and 8-bit tech, we might get some stability, and probably a TON more speed.



        • #34
          You make it sound as though testing software is so easy and testing silicon is so difficult.

          Although I cannot discount the difficulty of testing "silicon", as you call it, it is, as you said, like comparing apples to oranges.

          Let's just say you're designing a nice little chip, and just to make it more fun, let's make it meet MIL specs, just to be sure it meets stringent standards. Even though you may build it with a few million transistors, that's a nice small number compared to some of the other figures we're dealing with.

          Aside from any internal problems that must be tested/dealt with, meaning problems associated with design/fab/etc., there is a finite and rather small set of situations the chip will encounter in use in the real world.

          In contrast, OS programmers have to deal with just as many millions of lines of code as you have transistors, but they must also deal with a finite yet much, much larger set of scenarios that can occur when the OS is put to use in the real world. The list of scenarios is enormous compared to that of the nice little piece of "silicon" you designed.

          It is very reasonable to expect programmers to try to minimize the number of bugs in their own software, but the fact of the matter is that you're dealing with an incredibly large number of possible hardware/software combinations that must all work 100% ideally. Statistically speaking, it is impossible to account for all of this. Even the most rigorous tests cannot find every possible bug.
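
          To put a rough number on that, here's a back-of-envelope sketch in C. Every count in it is invented purely for illustration; the point is only how fast the product grows:

              /* Toy sketch of the hardware/software configuration explosion.
                 All of these counts are made up for illustration. */
              #include <stdio.h>

              int main(void)
              {
                  /* Hypothetical counts of interchangeable PC parts. */
                  long long cpus = 20, boards = 150, video = 60, sound = 40,
                            nics = 30, driver_revs = 5, os_builds = 10;

                  long long configs = cpus * boards * video * sound * nics
                                      * driver_revs * os_builds;

                  printf("distinct configurations to test: %lld\n", configs);
                  /* 10,800,000,000 - before even counting installed apps. */
                  return 0;
              }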

          Many bug fixes are in fact workarounds, because an actual fix is not exactly practical. Practical - such a nice word, isn't it? That's the real beef of the matter here. We can only do what is practical, just as it is not practical or pragmatic for MSFT to go and ditch FAT and DOS support. Although that might make some of their users really happy, it would really upset some of their other customers. It's a double-edged sword, so MSFT plays it safe and goes with legacy "support".

          And once again, if you think hardware is so perfect, go around and start reading hardware errata. You'll find extensive lists of problems with the hardware (and what does that mean? Zero bugs?).

          If we could all just be realistic about the issues here: the higher the complexity of your hardware or software, the more likely you are to get bugs and other "features". It's a natural side effect of complexity. You can demand all the perfection you want, but take a number and have a seat, because you'll be waiting a long time for it (that is, of course, assuming it ever arrives).

          Cars, trucks, airplanes, trains, hardware, software, people, all have "bugs" that cause malfunctions, unexpected situations, disease, etc. None of it's perfect.

          But don't let me discourage you from holding on to your ideals.
          Why do today what you can put off until tomorrow? But why put off until tomorrow what you can put off altogether?



          • #35
            I'm not saying that testing software is easy. I'm saying that a lot of software companies do not test as exhaustively as they should. Yes, hardware has errata. However, because of the cost and time it takes to fix hardware problems, hardware designers do very thorough testing and simulation of their product. Software design tends to be a lot sloppier, relatively speaking.

            When you find a software bug, you update a few lines of code, recompile, and rerun the program. Very quick turnaround time. When you find a hardware bug, you have to wait weeks to tape out new masks, manufacture reticles, send the updated reticles to the fab, get a hand-carried hot wafer lot manufactured, package it, and then mount it on a PCB. This costs a lot of money and time. Hardware companies cannot afford to get it wrong or they miss the market window. Therefore, they test the crap out of their product.
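
            Just to make that asymmetry concrete, here is a trivial C sketch; every figure in it is an assumption picked for illustration, not real fab data:

                /* Toy comparison of bug-fix turnaround, software vs. silicon.
                   Every figure below is an assumption, not real fab data. */
                #include <stdio.h>

                int main(void)
                {
                    double sw_hours_per_fix  = 2.0;      /* edit, recompile, rerun        */
                    double hw_weeks_per_spin = 8.0;      /* masks, reticles, hot lot, PCB */
                    double hw_cost_per_spin  = 500000.0; /* rough mask-set cost in USD    */

                    double ratio = (hw_weeks_per_spin * 7 * 24) / sw_hours_per_fix;
                    printf("one respin takes ~%.0fx as long as a software fix,\n", ratio);
                    printf("plus about $%.0f in hard costs.\n", hw_cost_per_spin);
                    return 0;
                }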

            If software design carried the same kind of overhead that hardware design does, companies would have to design and test much better than the current standard, because they could not afford not to. That will never happen, though. Everybody is just stuck with buggy software. Just remember that the next time you're in the hospital and you see your life-support machine boot Windows.



            • #36
              spoogenet,

              I think MS has shown its tendency to develop multiple OS products to do what they want to do, so DOS/FAT could have been continued in one line and a new line started without it. That's just one way they could have handled it; I'm sure with all the manpower they have, they could come up with even better solutions. They chose not to: legacy is where they make their money, and if there were any real competition in the PC OS market, they would have been forced to do things like this and many more.

              The next time your software doesn't work the way you'd like, it's most likely because the programmer had to deal with Windows and work around things that make no sense but are there for legacy reasons. Drivers not working properly? Same thing: legacy passed on through the last decade of PC hardware and OS architecture. Can it be fixed overnight? No. Do you want the same legacy issues around 10 years from now? Also no. At some point somebody has to clean house and get rid of the legacy. Call that idealism if you want, but I'd like to see a real fix before I die. I don't think MS will ever do it; legacy is why they are the practical monopoly they are today, and how they keep raking in the bucks.

              As for programming under Windows, yes, it is easy enough when it is documented properly and the code fits the documentation and vice versa, but it all depends on what you want to do. If you want multiple threads operating as if they were concurrent, forget Win9x; it's not going to happen. That's where WinNT is so much better. With an ancient platform like the Amiga it was intrinsic to the design. No, MS could have done much better, but they were making too much cash from supporting legacy to bother.
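
              For anyone curious, this is roughly what it looks like at the API level - a minimal sketch using the documented Win32 calls (CreateThread, WaitForMultipleObjects), which exist on both platforms; whether the threads behave truly concurrently under Win9x is exactly the point at issue:

                  /* Minimal Win32 threading sketch. The calls are the same on
                     Win9x and WinNT; the schedulers behave very differently. */
                  #include <windows.h>
                  #include <stdio.h>

                  static DWORD WINAPI worker(LPVOID arg)
                  {
                      printf("thread %d running\n", (int)(INT_PTR)arg);
                      return 0;
                  }

                  int main(void)
                  {
                      HANDLE threads[2];
                      DWORD tid; /* Win9x requires a non-NULL thread-id pointer */
                      int i;

                      for (i = 0; i < 2; i++)
                          threads[i] = CreateThread(NULL, 0, worker,
                                                    (LPVOID)(INT_PTR)i, 0, &tid);

                      /* Block until both workers finish, then clean up. */
                      WaitForMultipleObjects(2, threads, TRUE, INFINITE);
                      for (i = 0; i < 2; i++)
                          CloseHandle(threads[i]);
                      return 0;
                  }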



              • #37
                Himself,

                Where is the legacy code in NT? There's no 16-bit code, only a DOS VM to allow some well-behaved DOS and 16-bit Windows apps to run - completely isolated from the OS.

                There are plenty of redundant API functions, but every OS has those - as soon as a new version comes out, some things are handled differently and APIs are altered.

                I don't think you guys know what you are asking for! At this time the only way Windows can achieve super stability is to completely stop supporting 3rd party vendors and clone manufacturers and have a single hardware specification and supplier.

                Thundrchez, have you any idea of the sort of testing Windows 2000 went through?

                Paul



                • #38
                  I wrote a software program called Navigate that allows you to navigate at sea using electronic charts. To this day I don't know if the program is 100% bug-free. On the outside it seems simple, but on the inside there was stuff I had to create that I'd never done before. It was very difficult - it took from '92 to '95, and bugs would crop up. A programmer is the worst person to test for bugs, because they'll never catch all of them or do what an end user might do. Some bugs were found months later. And this is only a 25k-line program in Pascal.

                  So I say it's impossible to write a 100% bug-free program that is huge.
                  Also, sometimes fixing a bug causes a new bug.

                  Writing software is a bitch
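
                  To illustrate the point (in hypothetical C, not Galvin's actual Pascal), here's how the "fix" for one bug can quietly become the next one:

                      /* Hypothetical example, not Galvin's code: the fix for one
                         bug quietly introduces another. */
                      #include <stdio.h>

                      #define POINTS 100
                      static double depth[POINTS];

                      double depth_at(int i)
                      {
                          /* Original bug: i == 100 read past the end of the array.
                             The clamping "fix" stops the crash, but now a request
                             for point 100 silently returns point 99's depth - the
                             kind of bug an end user only stumbles on months later. */
                          if (i < 0)       i = 0;
                          if (i >= POINTS) i = POINTS - 1;
                          return depth[i];
                      }

                      int main(void)
                      {
                          depth[POINTS - 1] = 7.5;
                          printf("%.1f\n", depth_at(100)); /* no crash, wrong answer */
                          return 0;
                      }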



                  • #39
                    Galvin,

                    There's a little song, goes something like "100 bugs in your programming code, 100 bugs in your code, fix one up, compile it again, 101 bugs in your programming code."


                    Himself,

                    I think you just hit the nail on the head. Yes, you did. You said that MSFT is making too much money from supporting legacy.... Correct me if I'm wrong, but isn't MSFT a company that is interested in...what's that word? Profit???

                     If MSFT just one day decided to drop all legacy support, you might well hail it as one of the greatest decisions they could have made (perhaps short of going out of business). However, there would be quite a few more people who would be rather upset with the idea.

                    When you make a product, and hence try to sell it, the best way to make any money off of it is to provide as many customers as possible with what they want. By maximizing your customer base you can become a monopoly, the true economic goal of all profit-driven firms. The best way to reduce your customer base is to tell them "sorry, gotta go buy from a competitor, because we don't support that option!"

                    If everybody wanted to ditch legacy, then we'd all be ditching legacy. And don't even lay all the blame on MSFT. Ever wonder about hardware legacy? ISA? PCI is so much better. PCI existed well before Intel decided to incorporate it into their bag of tricks.

                     x86? That's legacy that goes back just as far as Windows/DOS. x86 is a horribly crappy instruction set - slow and rather inefficient. Other architectures such as MIPS are much faster and have proven it time and time again. So just why does Intel hang onto the x86 instruction set? One reason is most likely compatibility. No x86 means no Windows (not necessarily bad) and no end of other things. Ever wonder why Intel is pumping money into Linux development? Different OS, chance for a different architecture.

                     Computers always face a tradeoff of performance vs. compatibility. Intel could be making much faster (in performance, not clock rate) processors if it ditched x86 and went with a more advanced architecture. The cost of that, however, is immense. Enter Itanium.

                     If you really want to move forward and advance the technology, EVERYBODY is going to have to do it: new processors, new motherboards, new OS, new programs to do everything you already have programs for, etc. etc. etc.

                    Although legacy is horrible for performance, it is highly economical.

                    Just a few thoughts.

                    [This message has been edited by spoogenet (edited 26 March 2000).]
                    Why do today what you can put off until tomorrow? But why put off until tomorrow what you can put off altogether?



                    • #40
                      This is too good to pass up.
                      http://www.dilbert.com/comics/dilber...0031347889.gif



                      • #41
                        I was just struck by the idea that you can't be critical of MS on the technological level. Hey, if I had the same situation put in front of me, I might do the same thing to make the millions, but I wouldn't pretend it was beyond reproach technically - well, as long as I wasn't in marketing.

                        PaulS,

                        "Where is the legacy code in NT? No 16 bit code, only DOS VM to allow some well behaved DOS and 16 bit Windows Apps to run - completely isolated from the OS."

                        I think I said that MS should have pushed WinNT more, not that WinNT was bogged down in legacy. Win98 should have been WinNT+, not Win95+, IMO. (It's probably full of legacy as well - I don't use it much - but probably not in the same way; there's a reason MS couldn't upgrade it past a certain point, and it probably has something to do with the NT kernel. WinNT might as well be MacOS for all it has in common with Win9x. It's nuts trying to make one executable compatible with both at the same time, I know that much, yet it is oddly reminiscent of Win95 in a lot of ways; Win98 is the odd man out, really.) Thing is, if they had gone in a WinNT+ direction, there might have been a .001-degree opening in the door that might have allowed some other operating system to get .00002% of the market. So, they ain't the technological wonders of the world, just the money-grubbing scum that we all have to put up with.



                        • #42
                          Too funny.

                          Please don't mistake my postings for a defense of those unworthy of it. I'm all for more software testing, and it also sometimes appears as though some companies may inject bugs (aka features) into their software just to keep you coming back for more. Defense contractors have done this in the past and called it job security.

                          On the other hand, though, it's not exactly cost-effective to fix every single bug, especially when they're just caused by other sloppy programmers, etc etc etc.

                          Anyway, keep reading those Dilberts, there are some other good ones along the lines of our discussions.
                          Why do today what you can put off until tomorrow? But why put off until tomorrow what you can put off altogether?



                          • #43
                            Himself, the Win9x line exists precisely because MS was not able to get people to abandon legacy code and move to NT. The NT kernel was the break from DOS, segment:offset addressing, cooperative multitasking and non-protected memory models, and it came BEFORE Win95. Win95 came out after the market steadfastly refused to move to NT and decided to stick with WfWG 3.11. Win95 was the carrot on a stick to get people running enough 32-bit applications that an eventual shift to NT technology would be possible. Win98/ME are just by-products of the length of time it has taken to produce NT 5. That's why I question the credibility of anybody who asserts that W2K was released without proper testing.

                            It is also why those people who say that MS never intended W2K for games and consumer desktops are wrong (including the MS marketing people). The intention, for more than six years, has been for MS to have a single microkernel-based platform for its OSes.

                            Paul

