

AMD and nVidia


40 replies to this topic

#1 Shirou

    Humble darkspawn

  • Member
  • 3328 posts

Posted 25 June 2008 - 12:52

A review says it all..

Well, the new stuff is out, so if you need an upgrade, or are just an early adopter, grab your chance. It's also way more competitive now: nVidia is still the fastest, but ATi can match them easily with Crossfire setups...

Take your pick. My vote this round goes to ATi though, their HD4800 series is just awesome.

#2 G-sus

    batshit insane

  • Member
  • 802 posts
  • Projects: Coding Skynet

Posted 25 June 2008 - 12:58

Well, when the latest generation of Nvidia came out, it looked like the end of ATI.
But very surprisingly, the 4870 doesn't even have better tech inside, yet delivers more performance after all...
It's also quite a lot cheaper...
Thinking about getting some of them.
(Sig by The DR)

True beauty comes from heart and mind.
(but perfection has also big boobs)

#3 Eddy01741

    E-Studios Uber Computer Geek

  • Member
  • 2223 posts

Posted 25 June 2008 - 14:10

Well, looks like AMD/ATI finally caught up to nVidia after being behind since the nVidia 6xxx and ATi X8xx generation (basically, they have been behind for two generations). Well, if I ever get a new video card, I'd only have like a 150 dollar budget, so I'd still be looking at the cheaper 8800 series cards, or maybe an HD3850 or 3870.

#4 Libains

    Light up life.

  • Gold Member
  • 4950 posts

Posted 25 June 2008 - 22:09

I'm still an Nvidia guy personally - always used their cards and never seen fit to change, hence I won't. Although the ATi stuff does look extremely nice this time around.
For there can be no death without life.

#5 Shirou

    Humble darkspawn

  • Member
  • 3328 posts

Posted 26 June 2008 - 11:29

View PostEddy01741, on 25 Jun 2008, 16:10, said:

Well, looks like AMD/ATI finaly caught up to nVidia after being behind since the nvidia 6xxx and ati X8xx generation (basically, they have been behind for two generations). Well, if I ever gget a new video card, I'd only have like a 150 dollar budget, So i'd still be looking at the cheaper 8800 series cards or maybe a HD3850 or 3870.

Expect the 4850 to drop close to that price level within a month or two, really.

But alas, it all comes down to the PC you have. If the rest of your setup is crap, the performance gain of this 1-teraflop thing over an 8800GT will be minimal.

#6 Eddy01741

    E-Studios Uber Computer Geek

  • Member
  • 2223 posts

Posted 27 June 2008 - 15:57

Yep, I know. If I buy a new PC, I'm gonna build it from scratch again; my little 4200+ X2 A64 processor ain't up to the standards of today's Core 2 Duos (especially the Penryn models), not to mention that I'm still running DDR memory. Well, I might be getting a laptop instead anyway, a little more useful for daily use, and life is getting busy, school and all, so less time for gaming.

#7 -Xv-

    Teh Titelz

  • Member
  • 910 posts
  • Projects: ShockWave Balance

Posted 27 June 2008 - 22:39

ATi has definitely put out something good this time around... It can even beat EVERY nVidia card in some cases (BioShock/UT3), so it all depends on drivers/engines as to which card is better. I have a 3870, and I'm kinda sad to see that after 5 months of having this card it's at the bottom of the VGA charts again :( But I need a new mobo and a new PSU first, and then maybe I'll upgrade to a 4850/70. We'll see.

Much thanks to p4ved for siggy ;)

#8 Prophet of the Pimps

    Masters of Booty Strike Force

  • Gold Member
  • 11369 posts
  • Projects: ShockWave

Posted 30 June 2008 - 15:19

For that price you gotta go with ATI, but since I have to use Linux too sometimes, and ATI drivers are shit on it, I have to get an Nvidia 9800 series card.
Never underestimate a Resourceful Idiot

#9 Shirou

    Humble darkspawn

  • Member
  • 3328 posts

Posted 01 July 2008 - 21:02

Linux performance has improved greatly since the last Catalyst updates. I recommend searching for the most recent Linux performance figures before committing to nVidia.

#10 Beta9

    Semi-Pro

  • Member
  • 265 posts

Posted 02 July 2008 - 06:45

Five words: google the GeForce GTX 280.

It's nVidia's next-gen GPU, which is supposed to challenge Intel's to-be-released new graphics rendering system called Larrabee.
Also, while you're at it, check out the GTX 280's specs; it's a pure beast (btw, it was featured in the August '08 issue of Maximum PC).

Edited by Beta9, 02 July 2008 - 06:50.


#11 -Xv-

    Teh Titelz

  • Member
  • 910 posts
  • Projects: ShockWave Balance

Posted 05 July 2008 - 04:07

Actually, the GTX 280 is way too expensive for what it can do.

You can get 2 x 4850s for half the price, which can beat it in some cases, or 2 x 4870s for still less, which beat the shit out of it.

Even if you only get one 4870, which is only around 300 dollars compared to the GTX 280's 650 dollars, you would still only be 5-10% behind in performance. Then in July we get the 4870X2, which will truly beat the shit out of the 280 alone, for STILL LESS than the 280.

Take your pick.
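For what it's worth, that price/performance claim checks out on the back of an envelope. The sketch below uses the prices quoted above and assumes the 4870 sits in the middle of the 5-10% gap; the exact numbers are illustrative only:

```python
# Rough price/performance comparison using the figures quoted above
# (assumed: GTX 280 ~$650, HD 4870 ~$300, 4870 within 5-10% of the 280).

def perf_per_dollar(relative_perf, price):
    """Relative performance units per dollar spent."""
    return relative_perf / price

gtx280 = perf_per_dollar(1.00, 650)   # baseline: GTX 280 = 1.0
hd4870 = perf_per_dollar(0.92, 300)   # ~8% slower, middle of the 5-10% range

print(f"GTX 280: {gtx280:.5f} perf/$")
print(f"HD 4870: {hd4870:.5f} perf/$")
print(f"4870 gives {hd4870 / gtx280:.1f}x the performance per dollar")
```

Roughly double the performance per dollar, on these assumptions.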


#12 Shirou

    Humble darkspawn

  • Member
  • 3328 posts

Posted 09 July 2008 - 09:14

nVidia lowered prices ''due to pressure by competing products'':

GTX 280: around $450
GTX 260: around $300
9800GTX+: around $199

source (Dutch)

And it's true, as I can already see the GTX 260 in stores for around €250. The GTX 280 has dropped under €400.

Also note: while we thought nVidia was the worst sinner when it comes to megasize GPUs with extraterrestrial power consumption, Intel joins the fray with this:

[image: Larrabee slide]

With its 2 teraflops on stock, it has almost twice the potential of the current high-end GPUs from ATi and nVidia (and it has 16, 24 or 32 ''cores'').. and serious power consumption, up to 300W for the absolute high-end model. It all depends on how Intel implements this; it's different, and we all know that the second hardest part of GPU making is creating drivers..

Please note that this is something different from the competition: aside from the Intel GPU, the cards also come with an integrated extra CPU, which can for instance handle physics and AI tasks. This should remove the obvious CPU bottleneck that such a super GPU would create.

Take a last note that this power-hungry beast is based on the obsolete Pentium architecture, which may explain why it isn't as advanced as the recent Penryn architecture. It seems like an easy way to get something on the market instead of investing in time-consuming development to build a CPU/GPU combination with equal performance but without the extreme power consumption.

All this babble is mainly what I could gather for facts and what I could make of it, as I still don't really understand all this stuff.
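A quick FLOPS-per-watt comparison puts that 300W figure in perspective. The teraflop numbers for the current cards are approximate theoretical peaks, and the TDP wattages are assumed figures for illustration, not from this thread:

```python
# FLOPS-per-watt sketch, treating the post's Larrabee numbers as given
# (~2 TFLOPS at up to 300 W for the top model). The other entries use
# approximate theoretical peaks and assumed TDPs -- illustration only.

cards = {
    "Larrabee (rumoured top model)": (2.0, 300),   # (TFLOPS, watts)
    "HD 4870 (assumed ~160 W TDP)": (1.2, 160),
    "GTX 280 (assumed ~236 W TDP)": (0.93, 236),
}

for name, (tflops, watts) in cards.items():
    print(f"{name}: {tflops / watts * 1000:.1f} GFLOPS/W")
```

On these assumptions, twice the raw throughput does not mean twice the efficiency: the 300W top model lands in roughly the same GFLOPS-per-watt territory as today's cards.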

Edited by Aftershock, 09 July 2008 - 09:20.


#13 Beta9

    Semi-Pro

  • Member
  • 265 posts

Posted 11 July 2008 - 04:00

View PostAftershock, on 9 Jul 2008, 2:14, said:

Also note: while we thought nVidia was the worst sinner when it comes to megasize GPUs with extraterrestrial power consumption, Intel joins the fray with this: [...] With its 2 teraflops on stock, it has almost twice the potential of the current high-end GPUs from ATi and nVidia (and it has 16, 24 or 32 ''cores'')..



I heard about Larrabee a while ago, and from what the Intel engineers claim, its graphics processing speed will be nothing short of beastly. For that matter, Larrabee will supposedly be able to render graphics of near life-like quality.

#14 Prophet of the Pimps

    Masters of Booty Strike Force

  • Gold Member
  • 11369 posts
  • Projects: ShockWave

Posted 11 July 2008 - 08:06

Sweet. 9800GTX here i come.

#15 Shirou

    Humble darkspawn

  • Member
  • 3328 posts

Posted 11 July 2008 - 15:23

Mind you, there is the 9800GTX (currently on sale) and the 9800GTX+; the plus indicates it is made on a 55nm process, thus is smaller, which has allowed for higher clock speeds. That one is faster, but it isn't on sale yet.

Anyway, do not get a normal 9800GTX; it is inferior to the 4850 and a heatwave generator. The 9800GTX+ is on par, or a little better by a few FPS.

#16 Prophet of the Pimps

    Masters of Booty Strike Force

  • Gold Member
  • 11369 posts
  • Projects: ShockWave

Posted 11 July 2008 - 17:37

I got a slow-ass processor bottlenecking my whole system, so I have no use for anything much faster. I just need a decent price-to-performance ratio.

#17 Shirou

    Humble darkspawn

  • Member
  • 3328 posts

Posted 12 July 2008 - 11:04

You'd be better off with a 9800GT then.. $149 for a boosted 8800GT, and due to your bottleneck the difference from a 9800GTX will be minimal. Either way, I wouldn't recommend the 9800GTX+ over the ATi 4850: it has inferior technology and generates more heat, and just for the sake of reprimand I'd buy an ATi to keep the competition going and punish nVidia for slacking on development. [/fanboy]

Update on ATi's behemoth, the 4870X2:

[image: 4870X2]

Official slides:

[slide images]

source

Picture:

[image: 4870X2 card]

And for whoever wants to know: it is claimed and shown that microstuttering, the main problem of Crossfire setups, has left the building with this new thing: link
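For anyone wondering what microstuttering actually is, here is a toy sketch with made-up timestamps, showing how alternate-frame rendering can report a healthy average FPS while the actual frame intervals alternate short/long:

```python
# Illustration of microstuttering: two frame-time traces with identical
# average FPS but very different frame pacing. All timestamps (ms) are
# made up for illustration.

even    = [0, 17, 34, 51, 68, 85]   # single GPU: steady ~17 ms frames
stutter = [0, 5, 40, 45, 80, 85]    # dual GPU: frames arrive in bunched pairs

def intervals(ts):
    """Gaps between consecutive frame timestamps."""
    return [b - a for a, b in zip(ts, ts[1:])]

def avg_fps(ts):
    """Average FPS over the whole trace."""
    return (len(ts) - 1) / (ts[-1] - ts[0]) * 1000

print("steady  :", intervals(even),    f"{avg_fps(even):.0f} FPS avg")
print("stutter :", intervals(stutter), f"{avg_fps(stutter):.0f} FPS avg")
# Same average FPS, but the stuttering trace repeatedly waits ~35 ms,
# which feels closer to ~28 FPS to the eye despite the benchmark number.
```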

Edited by Aftershock, 12 July 2008 - 15:49.


#18 G-sus

    batshit insane

  • Member
  • 802 posts
  • Projects: Coding Skynet

Posted 12 July 2008 - 11:35

Well, the 4870X2 may have some problems in its first revisions, as all X2s have...
The 280 chip is mostly damn advanced, as nVidia learned from the problems they had with the 8800. The problem is just that the thing is really overpriced, as good as it may be.
The ATi cards... well, they're pretty cheap, and definitely have a very good price/performance rating.
But as usual, the drivers aren't finished yet and have some problems; also, nVidia is better supported by most games.

#19 Shirou

    Humble darkspawn

  • Member
  • 3328 posts

Posted 12 July 2008 - 15:28

Time to return to that comment. I don't really think you have the overview. What exactly were the problems with the 8800? That card blew everything apart at the time of its release. And what exactly are the driver problems for ATi? I'm guessing it's just another prejudice. And the other problems? Oh yes, there is the heat generation. Try turning the fan speed up; it's only at 10% or so to preserve noise levels.

Fourth, the graphics chips are not overpriced. They are produced at 65nm, with 1.4 billion transistors. The chip is twice as big as ATi's RV770, which has 956 million transistors at 55nm, and thus costs a lot more to produce. For the technological advancements, I suggest you read the review I posted in the first post. It's a very adequate and professional review which explains everything you need to know.
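The ''twice as big'' claim is easy to sanity-check with a crude model: treat die area as proportional to transistor count times the square of the process node. This ignores density differences between logic and SRAM, pad area, and so on, so it's a rough estimate only:

```python
# Crude die-size sanity check: area ~ transistor count * node^2.
# Rough model only -- real dies differ in logic/SRAM mix and layout.

def relative_area(transistors_billion, node_nm):
    """Relative die area in arbitrary units."""
    return transistors_billion * node_nm ** 2

gt200 = relative_area(1.4, 65)    # GTX 280 chip: 1.4B transistors at 65 nm
rv770 = relative_area(0.956, 55)  # HD 4870 chip: 956M transistors at 55 nm

print(f"GT200 / RV770 area ratio: {gt200 / rv770:.2f}")
```

The model lands right around 2x, consistent with the "twice as big" figure, and a die twice the size is disproportionately more expensive once yields are factored in.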

The 4870X2's performance in 3DMark Vantage, on the Extreme preset, is said to be above X5500; that was the long-time rumor. The average score for a GTX 280 is X4800, maybe around X5000 with better drivers I guess. So that's already a significant boost. Anyway, the source I am getting all this information from is the main insider CJ. He supplies info for the Dutch site Tweakers, and has revealed that it might go as high as X6920. Two of these 4870X2s in Crossfire are said to reach X12500, a score similar to what three GTX 280s in Tri-SLI can achieve.

This is probably very optimistic, and I doubt we'll see such results in games. Using two 4870X2s in Crossfire means using 4 GPUs, which means very variable performance in games, given the scalability issues such a setup will obviously have. In any case, one X2 will perform better than two default 4870 cards in Crossfire, and it may even pass the theoretical scaling of two of them (meaning: if one 4870 scores 50 FPS in game A, the X2 would do better than two of them literally multiplied, i.e. 100+ FPS. This is really unconfirmed though, and such performance ratios will probably be rare if achieved at all).
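Taking the rumored Vantage numbers above at face value, the implied multi-GPU scaling efficiency can be worked out. All scores here are the rumors quoted in this post, not measurements:

```python
# Scaling efficiency implied by the rumoured 3DMark Vantage Extreme scores.
# Efficiency = actual multi-GPU score / ideal linear scaling of the base score.

def efficiency(multi_score, single_score, n_gpus_multi, n_gpus_single=1):
    """Fraction of ideal linear scaling actually achieved."""
    ideal = single_score * n_gpus_multi / n_gpus_single
    return multi_score / ideal

# Two 4870X2s (4 GPUs, X12500) vs one 4870X2 (2 GPUs, X6920)
cf_eff = efficiency(12500, 6920, 4, 2)
# Three GTX 280s in Tri-SLI (~X12500) vs one GTX 280 (X4800)
sli_eff = efficiency(12500, 4800, 3)

print(f"4870X2 CrossFire scaling: {cf_eff:.0%}")
print(f"GTX 280 Tri-SLI scaling:  {sli_eff:.0%}")
```

Both setups would be scaling at roughly 85-90% of ideal, which is why the "two X2s equal three 280s" comparison works out despite the different GPU counts.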

But I hold on to it :rolleyes:


:P

Edited by Aftershock, 12 July 2008 - 15:42.


#20 G-sus

    batshit insane

  • Member
  • 802 posts
  • Projects: Coding Skynet

Posted 12 July 2008 - 16:18

Well, even if the 8800 had great performance, it had several technical ''bugs'' of greater impact.
Of course only in the first version; there was that memory leak, but that was just a driver issue.
The point is, it wasn't balanced. It had a bottleneck, which has been removed in the 280 chips: they are balanced in their components and interfaces.
Also, choosing a 280 gets you around the typical X2 problems like micro-lag due to synchronisation, etc.
On another note, the GPU of the 280 has much better thermal stability (no, I don't mean it doesn't get freaking hot), so it can withstand much higher temperatures; the ATi can't (and they get hot as shit too).
It's very obvious that you get more performance per money with the ATi solution, but nVidia is definitely the more advanced chip.
Just to let you know, I'm not trying to pick on ATi out of prejudice (I like them; my current card is an X1900XTX), but the 280 is clearly the best graphics card atm. Just too damn expensive, and it had to deal with unreachable expectations, like having double the performance of the 8800 chip.

#21 Shirou

    Humble darkspawn

  • Member
  • 3328 posts

Posted 13 July 2008 - 12:17

I think it's very obvious you didn't thoroughly read my posts :P

Quote

choosing a 280 would get you around that typical X2 problems like micro-lag due to synchronisation


Quote

And for whoever knows, it is claimed and proven that microstuttering, the main problem of Crossfire setups, has left the building with this new thing: link


Quote

the gpu of the 280 has much better thermal stability


Quote

Oh yes, there is the heat generation. Try turning fan speeds up, it's only on 10% or so to preserve noise levels.


And I really do ask myself whether AMD is stupid, running their fan speeds at 10% with their chips at temperatures exceeding 80 degrees under normal load, and using a single-slot cooler on a card that performs equal to a 9800GTX, if their chips are really thermally less stable; meanwhile the GTX 260 and 280 have the largest coolers seen on graphics cards to date (they even encompass the entire PCB, including the back). Anyway, I don't want to turn this into a flame topic; it's just that I can't stand ungrounded arguments. A GTX 280 dies as soon as it passes 110 degrees, and I don't think the ATi chips do worse than that; if they did, AMD wouldn't let them run so close to that failure border.

Another reason nVidia's chips are so expensive (or were so expensive; by now they probably don't make a cent on them and are just selling to keep market share) is that the chip has very low yields.

Here is one cool thing from nV though. Their new 9800GT reminds me of the 7900GT: small, cheap and performing. This card is equal to the 8800GT, with the difference that it uses 55nm process technology (just like the 9800GTX+). I like the looks of it, and I might just get one:

[image: 9800GT]

I don't really know about any ''bottleneck'' in the 8800 cards; could you enlighten me on that?

Edited by Aftershock, 13 July 2008 - 12:25.


#22 G-sus

    batshit insane

  • Member
  • 802 posts
  • Projects: Coding Skynet

Posted 13 July 2008 - 13:09

It was something like the interface between the shaders and the memory, iirc...
It doesn't slow anything else down, as it is always present; it just keeps the 8800s from reaching their true performance.
It's a design error.
Also, why is half of you on about the 9800? Both the 4870 and the 280 are better than that...

#23 Shirou

    Humble darkspawn

  • Member
  • 3328 posts

Posted 13 July 2008 - 13:21

Because we do not compare a $149 card with a $299 or $499 one. This is the GT, not the GTX+.

It's all about the price ranges, and in its range the 9800GT is currently a very good choice (until ATi launches the RV740, probably, but that won't be until September); that's why I posted it.

No chip is perfect. The ATi HD2900 had far more internal problems: it had way more processing power than the 8800, but it didn't have enough TMUs (texture units). It wasn't balanced at all, and thus was inferior to the 8800 (G80). No chip is perfect, and really, neither is the GT200.

Edited by Aftershock, 13 July 2008 - 13:30.


#24 G-sus

    batshit insane

  • Member
  • 802 posts
  • Projects: Coding Skynet

Posted 13 July 2008 - 13:27

Oh, that explains it. Maybe I didn't get the discussion then.
I was talking about the best performance regardless of price, since I won't care about that when I'm getting my new computer very soon... :P
(was thinking about maybe 2 x 280s, 8GB RAM and a new 24" TFT maybe... :lol:)

#25 Shirou

    Humble darkspawn

  • Member
  • 3328 posts

Posted 14 July 2008 - 18:29

First preview tests of the HD4870X2 are out. Pretty performance!

http://www.driverheaven.net/reviews.php?re...88&pageid=1

Interesting note:

The HD4870X2 sports 2 GB of GDDR5 memory. With this, there is absolutely no game that will suffer from memory limitations. Some tests show more than 100% scaling for the X2 compared to a single 4870. This is undoubtedly because the single 4870 runs out of memory at higher resolutions (it has only a 256-bit bus and 512 MB of memory, which keeps it cheap). The conclusion: the upcoming 1 GB version of the 4870 could still bring significant performance increases, so that's really worth waiting for as well if you have a big screen.
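Some rough arithmetic on why 512 MB runs out at high resolutions: the framebuffer sizes below follow from resolution and AA sample count, while the buffer count and texture budget are illustrative assumptions, not figures from the review:

```python
# Rough VRAM estimate for a high resolution with antialiasing.
# Buffer count and texture budget are illustrative assumptions.

def framebuffer_mb(width, height, aa_samples, bytes_per_pixel=4, buffers=3):
    """Approximate MB for multisampled colour buffers plus a depth buffer."""
    pixels = width * height
    colour = pixels * bytes_per_pixel * buffers * aa_samples
    depth = pixels * 4 * aa_samples
    return (colour + depth) / 1024 ** 2

fb = framebuffer_mb(2560, 1600, aa_samples=4)
textures_mb = 300  # assumed texture/asset budget for a demanding 2008 title

print(f"framebuffers at 2560x1600 with 4xAA: {fb:.0f} MB")
print(f"over 512 MB with ~{textures_mb} MB of textures: {fb + textures_mb > 512}")
```

On these assumptions a 30" panel with AA alone blows past 512 MB, which is exactly where the 2 GB X2 (and the 1 GB 4870) would pull ahead of the 512 MB card.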


