Originally posted by ElDonAntonio Doesn't Matrox allow you to exchange the card for this reason?
Nope. You can't exchange a card for a problem that they won't even admit exists.
Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.
Originally posted by leech So which is the banding? I occasionally get that effect where different chunks of the screen are rendered at the wrong times (kind of a weird way to explain it, but it fits). But I had thought that banding was more of a refresh-type problem, where it looks like there are lines going up and down the screen (kind of like the effect you get when a TV is too close to your monitor). Is this right?
Leech
That's a vsync mismatch. It's what's called tearing, and it's what you get when you disable vsync.
Originally posted by leech So which is the banding? I occasionally get that effect where different chunks of the screen are rendered at the wrong times (kind of a weird way to explain it, but it fits). But I had thought that banding was more of a refresh-type problem, where it looks like there are lines going up and down the screen (kind of like the effect you get when a TV is too close to your monitor). Is this right?
Leech
The original 'banding issue' is a kind of interference. Only the analog (RGB) outputs of the Parhelia are affected; any signal that doesn't have to pass through the RAMDACs is clean. It's definitely not a rendering problem: a rendering problem would show up in a frame grab, and rendering problems are normally software or driver anomalies (or other issues...). It's not really a 'refresh-type problem' either. The 'pixel lines' aren't changing or jumping; they are stable.
It's more like filming a running monitor with a camcorder. On the footage you will usually see lines or bands (or flickering) on the monitor, because the monitor's refresh isn't synchronized to the sampling frequency of the camcorder's CCD. Only when the two are in sync do the bands disappear...
With this issue the size, direction and speed of the bands vary constantly, because the number of frames displayed in a given interval keeps changing (rough numbers below).
So it's like watching one of those 'monitor movies' in which the bands are constantly, often rapidly, on the move...
What varies is the intensity of the lines, depending on the PC system or maybe on the individual card.
Hey, everyone, please correct me if you don't agree.
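To put rough numbers on the camcorder analogy (illustrative figures only, nothing measured on a Parhelia): the bands drift at the beat between the two rates. Film a 60.00 Hz monitor with a camcorder sampling at 59.94 fields per second and the beat is |60.00 - 59.94| = 0.06 Hz, so a band rolls through the picture roughly once every 17 seconds. The further apart the two rates, the faster the bands move, which fits the observation that the bands change speed as the frame rate changes.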
Vsync tearing and banding are totally different. Vsync tearing is something any card can do when vsync has been turned off, and is caused by the scene changing while the monitor is still scanning out the picture. It can happen on a Parhelia at the same time that you're getting banding, so that may be causing some confusion. Here are some pictures that will hopefully clear it up a little.
Normal still image, or moving image with vsync on:
Moving image with vsync off:
Image with Parhelia Banding (patent pending)(tm)(c)
Vsync tearing happens on all cards and certainly on Parhelia's digital output, too.
PS: That confused me yesterday with the crypt scene, too...
PPS: Another 'feature' of Parhelia Bandings (TM) is that they appear across all (analog) displays / CRTs. They aren't limited to the display running the 3D application, or to the 3D scene's window. Vsync tearing, by contrast, is limited to the displayed 3D scene.
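For anyone who wants to see plain vsync tearing on demand (any card will do, nothing Parhelia-specific), here is a minimal sketch. It assumes SDL2 and OpenGL are available (anachronistic for this thread's era, but the same idea works with any API that exposes a swap interval, e.g. wglSwapIntervalEXT); the window title and the black/white flip are just illustrative choices, the relevant call is SDL_GL_SetSwapInterval.

/* Toy tearing demo: flips the whole frame between black and white as fast
 * as possible. With the swap interval at 0 (vsync off) any swap that lands
 * mid-scanout shows up as a hard horizontal seam; set it to 1 and the seams
 * vanish. Sketch only -- error handling trimmed for brevity. */
#include <SDL2/SDL.h>
#include <SDL2/SDL_opengl.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("tearing demo",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            800, 600, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* 0 = swap immediately (vsync off, tearing possible),
       1 = wait for vertical retrace (vsync on, no tearing). */
    SDL_GL_SetSwapInterval(0);

    int frame = 0, running = 1;
    while (running) {
        SDL_Event ev;
        while (SDL_PollEvent(&ev))
            if (ev.type == SDL_QUIT) running = 0;

        float c = (frame++ & 1) ? 1.0f : 0.0f;  /* alternate black/white */
        glClearColor(c, c, c, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        SDL_GL_SwapWindow(win);
    }

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}

Note the difference from the banding described above: this tearing is a presentation-timing artifact confined to the 3D window, while the Parhelia banding rides on the analog outputs and spreads across every analog display.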
Originally posted by JaG
What varies is the intensity of the lines, depending on the PC system or maybe on the individual card.
It's card dependent. A couple of us here have tried changes to our systems, or switching cards around. The best anecdote we have is a guy whose business ordered about 10 Parhelia cards, and put them in identical workstations. Some banded, some didn't. Moving the P cards around, the problems tracked with the Ps.
Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.
I don't think the behavior changes over time. I noticed it the first week (though I blamed it on my monitor at first), and it hasn't changed since, and that was many months ago.
This sig is a shameless attempt to make my post look bigger.
Originally posted by ElDonAntonio You should double-check; I remember Haig telling a guy in the Matrox forums that he could RMA the card because of that. Or maybe I'm delirious...
Now that you mention it, I think I read that too, but I can't find it any more...
He said something like: get an RMA, take the new card, and if the issue turns up on that one too, repeat the whole thing... until you get a fixed card or your money back... Well, that was half a year ago, and whether others actually tried it that way, or how Matrox would react to it today, I don't know...
But again, Wombat, have you really seen cards without this issue, or only cards with milder symptoms?
Originally posted by JaG But again, Wombat, have you really seen cards without this issue, or only cards with milder symptoms?
The only card I've seen with my own eyes is my own card. But we've got a large enough pool of trustworthy users here that I know some people can't even see the problem on their video cards. Mine was pretty bad. It also wasn't psychosomatic - my Parhelia was one of the first retails out the door, and I was talking with BBz about the problems I was seeing before it really caught on publicly.
Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.
The reason I put this question to you is that you told the story about the guy with the different behaviour on identical workstations. So you had already noticed the issue on your own card and knew how to spot it.
In my case I wouldn't have noticed the issue for a long time, perhaps a year or more, if I hadn't stumbled across something about it, by chance, in the Matrox support forum (about half a year ago), even though I preordered my own card too.
The cards I bought for business later I could only check superficially. I wouldn't have noticed it there if I hadn't known what I was looking for.
It's clear now that there are different degrees of this interference, or whatever it is. So perhaps I was lucky, but that's exactly what makes the question difficult.
If I asked around (and people were nice enough to answer), there would probably be quite a few who have never clearly seen what we call the 'banding issue'. So it's a bit confusing (especially for those who haven't followed this long thread or the earlier threads on the subject). Some would say no, there is no banding, even though they have a slightly affected card. Others would think we mean vsync tearing and would say yes, we see these bands, even though their card may be issue-free...
It's difficult. What makes it so murky is that there are no official statements. But then again, if there were, the consequences might be too expensive for the company to survive.
Where do you draw the line between 'yes, you are eligible for an RMA' and 'no, you are not'?
If it turned out that perhaps 99 % of cards are affected, but 60 % of those only show a very slight form of the issue, would you send those cards back to the users, perhaps at their own cost?
It's difficult. The Pentium bug cost Intel a huge chunk of its profit when it surfaced, but Intel's Pentiums sold themselves, so it didn't do permanent damage to Intel (and it wasn't the last bug)...
Matrox is a fine small, creative and competent firm, but it already lacks too much market share and profit. So for Matrox it might well cause permanent damage...
On the other hand, what does e.g. nVidia do? They've had masses of bugs in their chips (undocumented ones as well). Same with ATi; in some generations whole units were defective. Did they take their cards back? I don't think so.
The difference is that, until now, I always viewed Matrox as a company that focused more on quality than their competition (at least in the consumer market). They have always had a great reputation for high 2D image quality, and they're also one of the few companies I wouldn't suspect of degrading image quality in order to win benchmarks in reviews.
In short: other graphics-card companies have speed, extra features, or even cheap prices as their trademark; Matrox has stability and image quality as theirs.
I have always viewed Matrox cards as a luxury item: the thing to get when you are picky about quality and ready to pay extra for that quality.
I also think the typical Matrox user has a better eye for faults like this, because someone who spends a small fortune on a graphics card from a company famed for image quality will probably also have a very good monitor.
I actually expected Matrox to admit the bug and deal with the consequences, because of their reputation.
nVidia is a completely different company; their trademark has always been speed and compatibility. Their typical customers are ready to pay extra for speed and expect their cards to be fast. Their bugs haven't hurt them as much, because no matter how faulty their cards have been, their customers have been more concerned with speed and compatibility than with faults. They paid for compatibility and speed, and as long as they got what they expected they were happy, to a certain degree of course. Note: this is beginning to change because of the high prices graphics cards command nowadays.
Right now both nVidia and Matrox have released cards that IMO don't live up to their respective reputations: the Parhelia has a bug that affects image quality and didn't deliver the quality that was expected, and nVidia's NV30 didn't deliver the speed that was expected (unless you think their extreme cooling is an acceptable design choice).
ATI, on the other hand, previously had a rather shady reputation for their drivers, so people didn't really expect a completely flawless product from them. AFAIK they are doing quite well at changing that reputation, because their latest products have been surprisingly impressive.
Matrox CAN still survive: their reputation for image quality has been damaged, but their reputation for stability has not. And entering the pro-3D market automatically (if you are successful) improves the stability/quality reputation (yeah, I know it's silly, but ignorant people assume that if you have a ridiculously expensive product line for professional users, you must be able to make very good quality products; sometimes the high price alone makes a product look like an impressive flagship model).