Hi, I'm pretty concerned about the benchmark results that have been published so far, especially the 3DMark2001 ones. 11K 3DMarks using the "fastest available Athlon XP and KT333" seems pretty low to me (hey, I get that score with my LAN box... man, my CPU + mobo + video card cost about the same as a Parhelia board). I know 3DMark should sport the "Tweaked for nVIDIA" certification logo, but even with a Radeon 8500 64 MB I'm able to get into the 11K range... Is it driver immaturity? Or maybe 3DMark has problems using all of Parhelia's functions (the additional vertex shaders and texturing units)?
Another thing that bothers me is that John Carmack chose ATI's R300 over nVIDIA's NV28 AND MATROX PARHELIA to demo Doom 3 at E3. He does praise the speed, driver maturity and ease of use of nVIDIA's cards a lot (and I have to say he is right about those), but he is still a very objective fellow... If Parhelia had been faster, he would have used it.
I'm also concerned about the threat the R300 poses: it will have 8 pipes with 4 texture units, a 256-bit memory bus (some rough math on that below) and full DirectX 9.0 support. The highest-end card (there will be no All-in-Wonders anymore), with vid-in, vid-out and 128 MB of RAM, will be priced the same as the 128 MB Parhelia (which is supposed to be around $400, I've read somewhere) and should be available in September. Which leaves Parhelia only 2 months as *performance* king... Troubling, isn't it?
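Some rough math on why a 256-bit bus is scary, a minimal sketch only: the 300 MHz DDR memory clock is purely my assumption (no memory clock has been published for the R300), and bandwidth_gb_per_s is just a name I made up:
[code]
# Back-of-the-envelope memory bandwidth. The 300 MHz DDR clock is an
# assumption for illustration only -- no R300 memory clock is public.

def bandwidth_gb_per_s(bus_width_bits, mem_clock_mhz, ddr=True):
    # bytes per clock = bus width in bytes, doubled for DDR signalling
    bytes_per_clock = (bus_width_bits / 8) * (2 if ddr else 1)
    return bytes_per_clock * mem_clock_mhz * 1_000_000 / 1e9

print(bandwidth_gb_per_s(256, 300))  # rumored 256-bit bus -> 19.2 GB/s
print(bandwidth_gb_per_s(128, 300))  # a 128-bit bus at the same clock -> 9.6 GB/s
[/code]
Double the bus width, double the bandwidth at the same clock, which is exactly why everyone keeps bringing up that 256-bit number.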
What was necessary was done yesterday;
We're currently working on the impossible;
For miracles, we ask for 24 hours' notice ...
(Workstation)
- Intel Xeon X3210 @ 3.2 GHz on Asus P5E
- 2x OCZ Gold DDR2-800 1 GB
- ATI Radeon HD2900PRO & Matrox Millennium G550 PCIe
- 2x Seagate B.11 500 GB SATA
- ATI TV-Wonder 550 PCI-E
(Server)
- Intel Core 2 Duo E6400 @ 2.66 GHz on Asus P5L-MX
- 2x Crucial DDR2-667 1 GB
- ATI X1900 XTX 512 MB
- 2x Maxtor D.10 200 GB SATA
-
"Another thing that bothers me is that John Carmack chose ATI's R300 over nVIDIA's NV28 AND MATROX PARHELIA to demo Doom 3 at E3. He does praise the speed, driver maturity and ease of use of nVIDIA's cards a lot (and I have to say he is right about those), but he is still a very objective fellow... If Parhelia had been faster, he would have used it."

Why is it called tourist season, if we can't shoot at them?
-
"Which leaves Parhelia only 2 months as *performance* king... Troubling, isn't it?"
Bigger does not always equate to better.
Joel
BTW: Mine is the biggest around here so don't even go there.

Libertarian is still the way to go if we truly want real change.
www.lp.org
******************************
System Specs: AMD XP2000+ @ 1.68 GHz (12.5x133), ASUS A7V133-C, 512 MB PC133, Matrox Parhelia 128 MB, SB Live! 5.1.
OS: Windows XP Pro.
Monitor: Cornerstone c1025 @ 1280x960 @ 85 Hz.
-
I think 11k 3DMarks is pretty good for a non-tweaked system with VERY early drivers; it's right on par with a GF4 Ti, which leads me to believe that 3DMark2001 isn't video-card limited anymore but limited by CPU/RAM/FSB etc...
I'm more interested in a score at 1600x1200x32 or with 4x FSAA / 16x FAA; I never expected Parhelia to outrun a GF4 Ti at low res.
BTW, do I win a prize?

This sig is a shameless attempt to make my post look bigger.
-
Some info (probably rumors) about the R300. But I wonder what they mean by 16 textures: if it's across all the pipelines, that's 2 textures per pipeline.

Hey! You're talking to me all wrong! It's the wrong tone! Do it again... and I'll stab you in the face with a soldering iron
-
I found this on anandtech.com:
"While talking to one of ATI's board partners we were informed that they weren't keen on building R300 boards because of the fact that they are expecting the design to require a 10-layer PCB. We could not get confirmation as to whether or not ATI's R300 board running in VIA's suite was a 10-layer board or not but according to this board manufacturer, they will only be distributing R300 boards and not producing them."
"The need for a 10-layer PCB comes from increased power consumption where separating power and ground layers becomes difficult and thus requiring the use of more board layers to route traces through. With the amount of power that the 8-layer GeForce4 Ti 4400/4600 boards draw, we wouldn't expect much less from a next-generation part from ATI that is supposed to significantly outperform the GeForce4."
-
But to the uninformed John Doe, bigger is better (well, also for Jane Doe). About the Inq article, I believe Mcgeek means 16 textures per pixel per clock cycle, using 4 pixel pipelines (the same way the Radeon 8500 is able to do 8 texture passes on a pixel in one clock cycle). I'm concerned because I don't want Parhelia to become a V5 5500...
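To make the two readings of "16 textures" concrete, a small sketch; the pipe and unit counts are the thread's rumors, not confirmed specs, and both helper names are my own invention:
[code]
import math

# Hypothetical helpers only -- the pipe/unit counts are thread rumors.

def textures_per_clock(pipelines, units_per_pipe):
    # Reading 1: "16 textures" = total sampled chip-wide in one clock.
    return pipelines * units_per_pipe

def clocks_for_one_pixel(n_textures, units_per_pipe):
    # Reading 2: "16 textures" = layered onto a single pixel via loopback,
    # which costs several clocks on one pipeline.
    return math.ceil(n_textures / units_per_pipe)

print(textures_per_clock(8, 2))      # 8 pipes x 2 units -> 16 per clock
print(clocks_for_one_pixel(16, 2))   # 16 textures on one pixel -> 8 clocks
[/code]
Same marketing number, very different hardware, which is why the wording matters.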
BTW: Joel, you are definitely not the biggest around; I'm the reason they invented the measuring tape...
-
1. It makes no sense to have a frame rate faster than your monitor's refresh rate. It does increase frame rates in benchmarks, but in games it causes tearing and quality losses (a minimal limiter sketch follows the list).
2. GF cards are limited to a 60 Hz refresh in their drivers unless a reg hack is done.
3. Since most users' monitors are running refreshes < 100 Hz, those 150-200 fps benchmarks for NVIDIA are all smoke and mirrors.
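Point 1 in code form, a minimal sketch assuming an 85 Hz monitor (pick your own number); render_frame is just a stand-in, not any real API:
[code]
import time

REFRESH_HZ = 85               # assumed monitor refresh rate
FRAME_BUDGET = 1.0 / REFRESH_HZ

def render_frame():
    """Stand-in for the game's real rendering work."""
    time.sleep(0.002)         # pretend a frame takes 2 ms to draw

# Minimal frame limiter: any frame finished faster than the budget just
# waits, because the monitor cannot display it any sooner anyway.
for _ in range(100):
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
[/code]
Everything above the refresh rate is wait time in disguise; without the limiter it just becomes tearing.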
Dr. Mordrid
----------------------------
An elephant is a mouse built to government specifications.
I carry a gun because I can't throw a rock 1,250 fps
-
You're right, but having FPS to spare means two things: either higher res @ good fps with everything turned on, or a longer usable lifespan...
Helevitia, I would not switch places with you for anything in the world because, "equipped" like you are, the only thing you can have "relations" with is an overgrown pink elephant... and I personally prefer to have such "relations" with a being I'm able to lift and manage to hold in my arms... Besides, I already have such a being, and overgrown pink elephants are on the expensive side these days
-
"You're right, but having FPS to spare means two things: either higher res @ good fps with everything turned on, or a longer usable lifespan..."
It can also mean that you can enjoy high res and good quality without a big performance hit, without the need for "spare fps" in the first place.
Think about it: I really don't want ultra-high fps in situations that don't stress the GPU (I really don't), but I do want acceptable fps in situations that DO stress the GPU.
And neither 3DMark (if it gets 11k+ then I wouldn't consider it stressful) nor Quake 3 puts a lot of stress on the GPU, and that is why you don't see impressive framerates in those benchmarks.
But in situations where a GF4 Ti drops below 15 fps, you will actually notice the difference between a GF4 Ti and a Parhelia (even without a framerate counter).
In general (this goes for just about everything), you can choose to optimize for a best-case scenario or a worst-case scenario.
I believe Matrox has optimized for the worst-case scenario, but the benchmarks you have seen test the GPUs in a best-case scenario (a quick sketch below shows why the average hides this).
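A toy illustration with completely made-up frame times: two hypothetical cards can post nearly the same average fps while feeling totally different in play.
[code]
# Made-up frame times (ms) for two hypothetical cards.
steady = [10] * 100               # 10 ms every frame
spiky  = [5] * 90 + [60] * 10     # great best case, ugly worst case

def avg_fps(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def worst_fps(frame_times_ms):
    return 1000 / max(frame_times_ms)

for name, ft in (("steady", steady), ("spiky", spiky)):
    print(f"{name}: {avg_fps(ft):.0f} fps average, {worst_fps(ft):.0f} fps worst case")
[/code]
The "spiky" card still averages ~95 fps, but its worst case is under 17 fps, which is exactly the stutter you notice without any counter running.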
-
"... and I personally prefer to have such "relations" with a being I'm able to lift and manage to hold in my arms... Besides, I already have such a being"

Gigabyte P35-DS3L with a Q6600, 2 GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512 MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.
-
From what I have heard and seen, Parhelia's speed and features seem to be right on the mark: at least 60 fps at high res with reasonable eye-candy levels, and that's with prerelease drivers etc.
It has more features than I need (but I want them), and by the time DX9 comes out, I expect its performance will improve 20%-50%.
And as Matrox says, it won't be the Q3 fps king... because Q3 barely uses the new hardware; e.g. at 160 fps it will be running cool, and if you enable AA your fps will stay at 160 and it still won't break a sweat (a toy cost model below shows why edge-only AA can be nearly free).
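That AA claim makes sense if, as with Parhelia's fragment antialiasing, only edge pixels get extra samples. A toy sketch, where the 5% edge fraction and both function names are my own invention, not measured data:
[code]
# Toy cost model for antialiasing, in units of "1.0 = no-AA fill work".
# Assumes fragment AA resamples only edge pixels; edge fraction is made up.

def supersampling_cost(samples):
    # Full-scene supersampling: every pixel is rendered `samples` times.
    return float(samples)

def fragment_aa_cost(samples, edge_fraction):
    # Edge-only AA: non-edge pixels cost 1x, edge pixels cost `samples`x.
    return (1 - edge_fraction) * 1.0 + edge_fraction * samples

print(supersampling_cost(4))          # 4.0  -> fps roughly quarters
print(fragment_aa_cost(16, 0.05))     # 1.75 -> fps barely moves
[/code]
If only a small fraction of the screen is edges, even 16 samples per edge pixel costs far less than 4x supersampling the whole frame.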
I am waiting for the first displacement-mapped game before I really form a proper opinion (that and the price list).
PS: how much is the developer discount?