And if the picture was ultra-sharp at 1600x1200 at 150 FPS, they would either get motion sickness or finally be able to hit something.
If there's artificial intelligence, there's bound to be some artificial stupidity.
Jeremy Clarkson "806 brake horsepower..and that on that limp wrist faerie liquid the Americans call petrol, if you run it on the more explosive jungle juice we have in Europe you'd be getting 850 brake horsepower..."
<font face="Verdana, Arial, Helvetica" size="2">Originally posted by WaR-ped: Am I the only one in this forum who has decided that this "G550" that is "coming out this summer" is not the type of card that was wanted, and as a result, will not make the purchase?
Somehow, I have come to the conclusion that I have no way of justifying the cost (I'm sure more than $200 USD) of purchasing a new video card that will only outperform my present video card marginally. Somehow I don't see the advantage of adding features, such as displacement mapping, that will only slow down the card even more.
What's everyone's general thoughts on this conclusion? I'm not a hardcore gamer, but I'd like to be able to get respectable frame rates at 1280x1024 resolution. I'm not a hardcore graphics guy, but I'd like to be able to render in Pro Engineer without the video card choking. Somehow I don't see this new card being able to fulfill this.
My other option is to overclock like there's no tomorrow. What's the highest clock anyone has ever gotten on their G400 MAX? I'm out to beat it. Let's see what happens. </font>
(side note... it's very weird to quote one's self... sorta a déjà vu type of thing.. hehe)
I hate being a pessimist, I really do. When I posted my frustration about a month ago, I really hoped I was wrong. Aside from my price estimate, I was right, in my opinion. Honestly, I think Matrox made a blunder in spending so much time and effort on the product they have announced. It's all fluff with no real content.
Here is what one individual said that i have talked with about it:
As far as 2D goes, the card is one of the best.
The problem is that a majority of business users could use an 8-year-old ISA video card at 640x480@60Hz and not know they were missing anything.
The 3d on the new card is supposedly better than their old cards, but with their memory bottleneck, it'll probably run like a Geforce2 MX, or worse, so that's a waste.
They wasted R&D money on the HeadCasting crap. Do they actually expect business people to buy a card for systems on both ends, figure out how to get their pics converted into the 3D images, get the connection up and running, and then look at lame-o pseudo talking heads? What advantage does this REALLY have over a regular phone conversation? The point of video conferencing is so that you can see the person's gestures and expressions! Also, many video conferencing systems allow you to toss up presentations, which you then talk over... in other words, there are already tons of companies with hardware out there that do it better.
They spent the money to build in a T&L unit that does this ONE thing. If ANYTHING but headcasting is done, the T&L unit is disabled. Duh!!
And finally, it's only partially DirectX 8 compliant, which means the upcoming DirectX 8 games will run like crap on it. (And for those of you who'll use this argument: at least the GeForce 2s have enough memory bandwidth that they have SOME hope of running them.)
I think his opinion is right on the money.
I think that Matrox has 3 different markets to shoot for with this card:
1) Upgraders such as myself. Fat chance! You think I'm going to spend another 130 bucks on a card that, for all intents and purposes, runs the same, but comes with extra software?
2) Business folks that are looking into video conferencing. (The quote above comments on this.)
3) Idiots.
I honestly think the ONLY thing this card has going for it is the price. They are damn lucky they are able to sell it for their ideal price of around 125 bucks. If they tried for 200 or more, I'd be surprised if they sold even one to anyone.
If they are shooting for large corporations and places like airports and stock market computers, then that's fine, but why even bother making a press release when you are trying to impress large businesses? Just send them a proposal.
Lastly, as a rant: Matrox, get off of the "we rule in 2D!" crap. Most office people don't run their monitors past 800x600 anyway, and at that resolution even a GeForce card can do decently. Sheesh.
Is it really that hard to put more power into the card? No, it's not. Is it extremely costly to provide 128-bit-wide DDR bandwidth and to improve the chip yields so that the chips run at a decent speed? No, it's not. If ATI can do it, so can you.
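To put some rough numbers on why bus width matters: peak memory bandwidth is just bus width times effective transfer rate, and DDR moves data on both clock edges. A quick back-of-the-envelope sketch (the 166 MHz clock below is an illustrative assumption, not the official spec of any particular card):

```python
# Back-of-the-envelope peak memory bandwidth: bus width (bits) times
# effective transfers per second. DDR doubles the transfer rate per clock.
# The clock figure used below is illustrative, not a quoted spec.

def bandwidth_gbps(bus_bits: int, clock_mhz: float, ddr: bool) -> float:
    """Peak theoretical bandwidth in GB/s (1 GB = 10^9 bytes)."""
    transfers_per_sec = clock_mhz * 1e6 * (2 if ddr else 1)
    return bus_bits / 8 * transfers_per_sec / 1e9

# Hypothetical 64-bit vs 128-bit DDR bus, both at 166 MHz:
narrow = bandwidth_gbps(64, 166, ddr=True)
wide = bandwidth_gbps(128, 166, ddr=True)
print(f"64-bit DDR @ 166 MHz:  {narrow:.2f} GB/s")
print(f"128-bit DDR @ 166 MHz: {wide:.2f} GB/s")
```

At the same clock, the 128-bit bus simply doubles the peak figure, which is why a narrow bus can bottleneck a chip whose fill rate would otherwise be much higher.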
It's extremely aggravating to know that there is nothing on the market that I want to upgrade to.
If ATI's Radeon 2 gets the same driver improvement over the Radeon that the Radeon got over its predecessor, then it might be a viable option.
I love my Matrox card, and the PowerDesk features, but when all of a sudden my bottleneck becomes my video card, there is something wrong with that, and loyalty and features only carry so far. At this rate, my Ethernet card will process images faster than my video card.
Maybe Matrox should get out of the video card business and just make add-on cards. They'd probably sell a lot more of them.
What I'd really like to see is a Matrox card like the nVidia Quadro2 in performance and market approach, but at this rate, to get something that performs like that, I'll have to wait 5 years, and by then it'll have some wasted feature like BodyCasting or something.
First Love:
Lite-On FS020 enclosure w/4 120mm Panaflos and soon a 172mm Nidec
MSI 694D Pro w/ BIOS 1.6
2x800E cC0 Pentium 3 w/ 2xVolcanoII
SyncMAX(NEC) PC166 VCM SDRAM 4x128mb w/ CAS = 1
nVidia Quadro2 Pro, but Matrox at heart
And other non-important stuff like hard drives and a dvd drive
<font face="Verdana, Arial, Helvetica" size="2">most office people dont run their monitors past 800x600 anyway</font>
I don't know what type of company you work for, but at my company of about 1500 in-house employees and about 1800 PCs, with over half of them running 21" monitors, resolutions of 1280x1024 to 1600x1200 are very common. And with our major apps being fairly graphics-intensive, image quality is very important to us. As a tech support person who works with all these PCs, I can say without reservation that the machines with the best image quality are the ones with the Matrox brand cards.
Joel
Libertarian is still the way to go if we truly want a real change.
I don't doubt that, Joel... but also, coming from the perspective of selling those PCs, I notice that most people don't care about the monitor size. 17" is fine for them, and when I try to tell them about image quality, they don't care... even after I show them the difference (Radeon vs. GeForce2). "Yeah, I guess it is a bit nicer. What's the price? ... Oh, no, I'll take the other one since it costs less."
When referring to "office people," I am referring to the people that use their PCs as typewriters and communications booths. They use MS Word and MS Outlook primarily. These people don't notice the advantage that is Matrox, so it is my belief that Matrox is wasting its time by trying to sell to these people with its "HeadCasting" and "superior image quality."
Hmm... is this forum actually read by business people at all?
My assumption would be that we are definitely the few that are more into extras and performance, so in the end, this card really is not for us...
It is a broad consumer OEM card, nothing more. Who uses it, I don't care; I was hoping to get a better card to suit my needs, and that has failed all across the board.
I just want to buy something that is not outdated a month from now.
<font face="Verdana, Arial, Helvetica" size="2">Hmm... is this forum actually read by business people at all?</font>
I know of several, and each and every one of us has the power to influence the buying decisions of our companies when it comes to PCs and the components that go in those machines.
The G550 is perfect for its targeted market. And those who think the G550 is all that Matrox has been working on for the past two years are, IMO, idiots. Individuals at Matrox have indicated otherwise, as have several hardware sites.
Patience.
Joel
Libertarian is still the way to go if we truly want a real change.
Sirhardi
Maybe you missed my reply in the other thread due to the heavy flaming going on there.
But if you want Matrox 2D quality and DualHead and working overlays on both heads, well, that's easy:
Get yourself a G400 (or a MAX, if you can get one); this card CAN do overlays on both heads. You just have to stick with a PD5 driver (preferably the 5.52/5.55 ones), since Matrox for strange reasons disabled this feature in the newer PD6 drivers.
No, this isn't true for the G450, for two reasons:
1. There are no PD5 drivers for the G450; when the G450 came out, the drivers were already at version PD6.x.
2. The coincidence of support for this being dropped on the whole G4x0 line of cards at roughly the same time the G450 surfaced brought up the speculation that by integrating the "second head" (RAMDAC, MAVEN, ...) this functionality was lost. And maybe, to ease the pain for tech support of explaining that a newer card has fewer features than its predecessor, it was disabled for the whole line (simply stupid, yes - if it's true...).
I found this out when trying to help a G450 owner with this exact problem. Matrox tech support, however, simply states that this (no overlay on the 2nd head) is just the way it's supposed to work - you know, it's not a bug, it's a feature.
P.S. The cost-reducing integration on the G450 seems to have had other adverse effects, especially on TV-out (high CPU usage, limited resolution/bit depth even on 32MB models, worse quality).
[This message has been edited by Indiana (edited 20 June 2001).]