Author |
New ATi card knocks Nvidia back |
BackSlash Marshal Galactic Navy
Joined: March 23, 2003 Posts: 11183 From: Bristol, England
| Posted: 2005-10-08 17:09  
Click, read, and if you have an Nvidia card, weep
Read the info on the pages (so you know that ATi having fewer pipelines isn't BAD in this case; it's also an interesting read if you're into this sort of thing).
For those who can't be bothered to click...
This is my personal favourite...
X1300 beating a 7800GTX....
And just to annoy RevenG
[ This Message was edited by: BackSlash *Jack* on 2005-10-08 17:17 ]
_________________
|
-RevenG-
Raven Warriors
Joined: March 03, 2004 Posts: 2673
| Posted: 2005-10-08 17:18  
PWNED.
_________________
|
Ulric Winters Fleet Admiral
Joined: February 21, 2004 Posts: 198 From: Somewhere
| Posted: 2005-10-08 17:27  
OMGWTFBBQSAUSAGEPWNT!
_________________
|
JackSwift Cadet Sundered Weimeriners
Joined: October 30, 2002 Posts: 1806 From: Where the Sun dont Shine (Seattle-ish)
| Posted: 2005-10-08 18:36  
I'm not surprised. The new ATi card has 512MB of memory as opposed to 256MB on the 7800GTX. Of course it's going to get higher 3DMark scores.
_________________ (too lazy to rehost that old sig)
\"Errare Human Est.\"
|
Philky!
Joined: July 19, 2004 Posts: 90
| Posted: 2005-10-08 21:09  
I knew you were the poster of this thread before I even opened it. This thread proves nothing. ATI releases a card after Nvidia and it is better. No surprise there. Xbox was released after PS2, which one is better?
_________________
|
Pegasus Grand Admiral Pitch Black
Joined: August 02, 2005 Posts: 434 From: Eleventh galaxy on the right!
| Posted: 2005-10-08 21:36  
Not much of a lead if it has 512MB compared to the Nvidia's 256MB; that's actually a very disappointing result for ATI.
And who on earth plays at 1024x768 these days? *boggles*
_________________ Retired K'luth Combateer
|
Jar Jar Binks Grand Admiral
Joined: December 25, 2001 Posts: 556
| Posted: 2005-10-08 22:17  
Why would I weep? I actually don't care if ATI scores 5 points better in whatever; I'm happy with my Nvidia, and unlike ATI it never screws up on me. So cry more.
_________________
|
DOM700 [-IMO-] Fleet Admiral
Joined: July 26, 2001 Posts: 3175 From: Eckental, Germany, Sol-System
| Posted: 2005-10-09 01:59  
Quote:
|
On 2005-10-08 21:09, Philky wrote:
I knew you were the poster of this thread before I even opened it. This thread proves nothing. ATI releases a card after Nvidia and it is better. No surprise there. Xbox was released after PS2, which one is better?
|
|
PS2
At least if the Xbox stays legal
_________________ If the buildings on your planets disappear, guess who was there....
Never forget what you fight for
I have earned my betatester badge for being part of the open beta
|
Tbone Grand Admiral
Joined: July 21, 2001 Posts: 1756 From: Vancouver
| Posted: 2005-10-09 03:21  
Quote:
|
On 2005-10-08 21:09, Philky wrote:
I knew you were the poster of this thread before I even opened it. This thread proves nothing. ATI releases a card after Nvidia and it is better. No surprise there. Xbox was released after PS2, which one is better?
|
|
PS2. It's a faulty comparison really; consoles are all about opinion, whereas video card performance isn't something you can debate.
_________________
|
BackSlash Marshal Galactic Navy
Joined: March 23, 2003 Posts: 11183 From: Bristol, England
| Posted: 2005-10-09 07:19  
The 512MB of RAM just means the card can keep more textures in video memory without having to thrash system RAM or the hard drive to fetch them. In 90% of today's apps, the full 256MB is never even used. The only reason the X1800XT has 512MB is that ATi want it to cater to everyone's needs.
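To put some rough numbers on that (my own back-of-the-envelope figures, not from the article, and assuming plain uncompressed RGBA8 textures; real games mostly use compressed formats, so treat this purely as a sketch):

```python
# Rough texture-budget sketch: how many 2048x2048 uncompressed RGBA8
# textures (with full mipmap chains) fit in a given amount of VRAM.
# Toy assumption for illustration only; compressed formats fit far more.
BYTES_PER_TEXEL = 4                            # RGBA8 = 4 bytes per texel
base_level = 2048 * 2048 * BYTES_PER_TEXEL     # 16 MiB for the top mip level
with_mips = base_level * 4 // 3                # full mip chain adds ~1/3

for vram_mib in (256, 512):
    budget = vram_mib * 1024 * 1024
    print(f"{vram_mib} MiB VRAM: ~{budget // with_mips} such textures resident")
# Prints roughly 12 vs 24: double the VRAM, double the resident textures,
# but it only matters once a game actually pushes past the 256 MiB budget.
```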
The real boost in performance comes from its custom-designed architecture. It's a total from-scratch approach, and it's working (as you can see). It has fewer pipelines than the Nvidia card, but each pipeline (pixel and vertex) is designed for a certain kind of work. When a given operation needs to be carried out, it goes to the pipe that was designed to handle it faster, instead of general-purpose processing spread across all pipes. This actually works out faster in programs that issue a lot of calls to the same handful of functions.
If you read the article, you would actually see that ATi have developed a new memory management system: like AMD, they moved the memory controller closer to the chip (on-chip), which allows for faster access and processing times. ATi have also moved to a 90nm process, which means they can effectively double their power, and they have: the X1800XT has an effective memory clock of 1500MHz. That's not down to arriving late or spending extra time on the card (the X1800 was actually ready BEFORE the 7800s, but ATi were doing driver optimisations because the new driver series has to support brand-new features in the new cards; testing, basically). It's simply because they worked hard on a new architecture, and it works very well.
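For a sense of what that memory clock buys you, here's a quick bandwidth estimate (the 256-bit bus width is my assumption for the arithmetic; it isn't stated above):

```python
# Peak memory bandwidth = effective transfer rate x bus width.
# The 1500 MHz effective figure is from the post; the 256-bit bus is an
# assumption made here purely to illustrate the calculation.
effective_rate_hz = 1500e6           # 1500 MHz effective (DDR) rate
bus_width_bits = 256                 # assumed memory bus width
bandwidth_gb_s = effective_rate_hz * (bus_width_bits / 8) / 1e9
print(f"~{bandwidth_gb_s:.0f} GB/s peak memory bandwidth")   # ~48 GB/s
```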
In some tests, the X1300 Pro actually outdoes the 7800GTX because of this new architecture. Programmers have actually turned from wanting to use Nvidia cards to ATi cards, and market-leading professional graphics development companies are moving over to ATi (because of the native 512MB support, and because the cards are cheap compared to Nvidia's professional line).
ATi decided that instead of just giving the card more pipelines to deal with more functions and more shaders, they would give it fewer, but dedicate each pipeline to certain things, thereby focusing several pipes on the most-used calls. This actually works out better in shader-heavy programs (especially pixel- and vertex-shader-heavy ones).
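To illustrate just the scheduling argument (a toy model with made-up numbers, not the real R520 or G70 hardware): fewer pipes can still finish a shader-heavy workload sooner if each one handles its favoured operation in fewer cycles.

```python
import math

# Toy comparison with made-up figures: many general-purpose pipes that take
# 2 cycles per shader op vs. fewer specialised pipes that take 1 cycle for
# the op type they were tuned for. Ops are spread evenly across the pipes.
def total_cycles(num_pipes: int, cycles_per_op: int, num_ops: int) -> int:
    return math.ceil(num_ops / num_pipes) * cycles_per_op

num_ops = 100_000                                   # pretend pixel-shader ops
general = total_cycles(num_pipes=24, cycles_per_op=2, num_ops=num_ops)
special = total_cycles(num_pipes=16, cycles_per_op=1, num_ops=num_ops)
print(f"general pool: {general} cycles, specialised pipes: {special} cycles")
# The smaller pool of specialised pipes finishes in fewer cycles here,
# which is the (very simplified) point being made above.
```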
I suggest you do some research before saying "it came out later" or "it has 512MB of memory, of course it's going to win". Those who say ATi cards "foul up" more are also wrong: their drivers are more stable and more reliable than Nvidia's. In fact, looking at the latest quarter for ATi and Nvidia, Nvidia had more card failures and returns than ATi. ATi cards are also notably quieter and use less power, meaning they hold up better in the long run (the fan won't break down, and the card runs at a lower temperature thanks to the lower power consumption).
So I suggest you do your reading before posting replies like the ones above, and that you look at both companies, their cards, and what they're doing.
- Jack
[ This Message was edited by: BackSlash *Jack* on 2005-10-09 07:23 ]
_________________
|
Fatal Rocko Willis Fleet Admiral Fatal Squadron
Joined: March 01, 2003 Posts: 1336 From: Kentucky
| Posted: 2005-10-09 13:57  
I am with you, Backy... I have an old PCI ATI 7000 in my cranky 1.3GHz, 256MB RAM HP that I got three years ago to play DS when I was stationed out in California. That card still resides in that machine (it doesn't have an AGP slot, only PCI) and handles all my gaming needs on that computer. I know the computer can't handle modern games like HL2 and its competitors. Anyway, to make a long story short, I tried a 64MB Nvidia PCI card (my ATI is 32MB) and had nothing but problems with it after the first weekend of gaming... The card looked nice for three days, then everything went down the tubes... haven't had it installed since.
Love ATI, and will stay ATI.
Me
_________________
|
Shigernafy Admiral
Joined: May 29, 2001 Posts: 5726 From: The Land of Taxation without Representation
| Posted: 2005-10-09 14:07  
On the other hand, I have a four-year-old Nvidia card that still works like a champ. I can't play Rome: Total War or the like, but I can play KotOR 2... albeit with a sometimes reduced framerate. Still, it gits 'r dun.
No problems on my end. Not that that gives me any rabid loyalty to Nvidia, but I've never had an issue with them and have generally been impressed with my card's performance.
_________________ * [S.W]AdmBito @55321 Sent "I dunno; the French had a few missteps. But they're on the right track, one headbutt at a time."
|
Ramius Fleet Admiral Agents
Joined: January 12, 2002 Posts: 894 From: Ramius
| Posted: 2005-10-09 14:44  
I play HL2 at 1024x768 with 70 fps on a GeForce 4. It's fine for me.
_________________
|
Diabo|ik Grand Admiral
Joined: August 16, 2002 Posts: 327 From: Quebec, Canada
| Posted: 2005-10-10 09:35  
Quote:
|
On 2005-10-09 14:07, Shigernafy wrote:
On the other hand, I have a four-year-old Nvidia card that still works like a champ. I can't play Rome: Total War or the like, but I can play KotOR 2... albeit with a sometimes reduced framerate. Still, it gits 'r dun.
No problems on my end. Not that that gives me any rabid loyalty to Nvidia, but I've never had an issue with them and have generally been impressed with my card's performance.
|
|
Quote:
|
I play HL2 at 1024x768 with 70 fps on a GeForce 4. It's fine for me.
|
|
Truth be told, I own a three-generation-old video card and don't feel the need to get a better one yet, simply because today's games scale much better across the video card range than across the processor range. Even so, computers with relatively slow processors (1-2 GHz range) and a decent video card (GeForce 4 to Radeon 9800 range) can play most of today's games at a good framerate if you're willing to sacrifice a bit of eye candy. Back then it was much harder to tweak a new game to run decently on a relatively slow system. Furthermore, in recent years processing power has skyrocketed in the two main components needed to deliver enough frames for enjoyable gameplay, and prices have come down. Also, by adjusting the resolution you can get new games to run on very old hardware, provided they implement the required standard shaders and there's enough RAM for the textures (which can usually be dialled down with a texture resolution slider...).
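The resolution point is easy to put rough numbers on (my own crude arithmetic, ignoring overdraw, vertex work and CPU limits):

```python
# Pixels the card must shade and fill per second at different resolutions.
# Crude arithmetic only, but it shows why dropping the resolution helps
# so much on old hardware.
resolutions = [(1600, 1200), (1280, 1024), (1024, 768), (800, 600)]
target_fps = 60

for w, h in resolutions:
    megapixels_per_sec = w * h * target_fps / 1e6
    print(f"{w}x{h} @ {target_fps} fps: ~{megapixels_per_sec:.0f} Mpixels/s")
# Going from 1600x1200 down to 800x600 cuts the fill work by a factor of 4.
```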
We're living in a golden age on the gaming hardware side; the software side isn't all that good, though.
The old performance enthusiast in me still craves a dual-card (CrossFire) X1800 setup, though.
_________________ Mostly Retired.
|