Hameyadea's comments

Edited By Hameyadea

Wow... such a drop in frames in DX 10? It's a small one, but still...

Edited By Hameyadea

Forget about Duke, because Gordon just script-kicked him to an unknown time!

Edited By Hameyadea

Generic_Dude: That wasn't a good idea. Everyone knew the GeForce 8 would be nVidia's first DX 10 cards (which, by logic, means the 7 series are the last DX 9 cards), so buying a high-end 7 series card (or an X1800+ from ATI) wasn't smart. I'm (maybe) going to get an 8900+ soon (which, by my estimate, means by Jan. '08). I'm glad I didn't upgrade my card. Now that was smart (actually... I was broke. But Half-Life 1: Anthology, HL2: GotY Edition and HL2: E1 are must-haves! LOL).

And to all of ATI's fanboys: ATI will surpass nVidia, that's for sure. But not for long. With the HD series' power requirements, ATI is literally handing nVidia its market share. And seriously, the HD 2900 beats the GTS (the 320 MB VRAM version) at 2048x1536, but who plays at that resolution? At a "simple" 1280x1024 the card is just broken. And with that delay? Too little, too late.

Edited By Hameyadea

Now imagine that in DX 10.

Edited By Hameyadea

SirWrinkles: Those results don't make any sense! Both the GTS (640 MB) and the GTX have done extremely well in Doom 3, F.E.A.R., etc... Maybe you're showing the results for the GTS (320 MB). Also, which drivers are those sites using (for all the cards)? GameSpot shows the drivers that were tested, but I searched Guru3D and Tom's Hardware and couldn't find them. You know it's easy to roll back to older drivers, right? Maybe they're using a pre-100.xx ForceWare and the 8.7.x Catalyst? Those results for the GTX aren't logical.

Also, which CPU(s) did they use? How much RAM? What were the RPMs of the hard drives? All of that can reduce frames. Try running F.E.A.R. at all-high settings, 4x AA, 8x AF, 2560x1920 with a GTX, 1 GB of RAM, a single-core CPU and a 5,000 RPM hard drive. How many frames will you get? Then try running F.E.A.R. again (same settings) with the HD 2900 on a dual/quad-core CPU, 2.5 GB of RAM and a 10,000 RPM hard drive. Unless I see the EXACT specs those games were tested on, I refuse to accept those results. And post links with your comments, so we'll know you're not making up those results.
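
The complaint generalizes: an FPS figure is only comparable if the full test configuration is recorded next to it. Here is a minimal sketch of that idea in Python; the field names, hardware names and FPS values below are made up for illustration and are not any review site's actual reporting format:

```python
from dataclasses import dataclass

@dataclass
class BenchmarkResult:
    """One benchmark run; every field below can change the FPS number."""
    game: str
    resolution: str     # e.g. "1280x1024"
    settings: str       # e.g. "all high, 4x AA, 8x AF"
    gpu: str
    driver: str         # the ForceWare / Catalyst version actually installed
    cpu: str
    ram_gb: float
    storage: str        # e.g. "10,000 RPM hard drive"
    avg_fps: float

    def comparable_to(self, other: "BenchmarkResult") -> bool:
        # Two FPS figures are only worth comparing when everything
        # except the GPU (and its driver) is held constant.
        return (self.game == other.game
                and self.resolution == other.resolution
                and self.settings == other.settings
                and self.cpu == other.cpu
                and self.ram_gb == other.ram_gb
                and self.storage == other.storage)

# Hypothetical example: same rig, different card -> a fair comparison.
a = BenchmarkResult("F.E.A.R.", "1280x1024", "all high, 4x AA, 8x AF",
                    "8800 GTS 640MB", "ForceWare 158.22",
                    "Core 2 Duo E6700", 2.0, "10,000 RPM hard drive", 72.0)
b = BenchmarkResult("F.E.A.R.", "1280x1024", "all high, 4x AA, 8x AF",
                    "HD 2900 XT", "Catalyst 7.5",
                    "Core 2 Duo E6700", 2.0, "10,000 RPM hard drive", 65.0)
print(a.comparable_to(b))  # True: only the GPU and its driver differ
```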

Edited By Hameyadea

Hack_Hack: With a gun, straight to the fan.

Edited By Hameyadea

Also, nVidia's logo (and slogan) appears in quite a lot of games (Grand Theft Auto: San Andreas and Hitman: Blood Money, to name a few), while ATI's logo and slogan... are on ATI's website. nVidia beats ATI. Don't believe me? Come back in July (2 months from now) and we'll compare the HD 2900's performance on today's drivers against its performance on July's drivers. The 8800 GTS will still win.

Edited By Hameyadea

pattepikk: nVidia created the original Xbox's graphics card, so why do you think ATI created the 360's graphics card while nVidia created the RSX (the PlayStation 3's graphics card)? nVidia and Microsoft had some arguments, so they aren't best buddies anymore. Why would Microsoft give nVidia an early look at Vista? ATI created the Xenos, the 360's graphics card. So, by logic, Microsoft would have given ATI the early look at Vista, am I right?

Edited By Hameyadea

Oh yeah... Nminator is right. When GameSpot reviewed the 8800s for the first time (in November), ATI's fanboys said ATI's first DX 10 card would blow the 8800s out of the water, with lower power requirements (and with GDDR4). Instead, the 8800 blew the HD 2900 out of the water, the 8800s use less power than the HD 2900, and only the HD 2600 uses GDDR4; the other HD cards use GDDR3, and the HD 2400 and lower even use DDR2! Meanwhile, the GeForce 8 series uses GDDR3 EVEN IN THE LOW-END CARDS (unlike ATI... LOL). I'm waiting to see ATI's fanboys throwing their hats, newspapers and chairs!

Edited By Hameyadea

I agree with emroy: a single nVidia card ($600) beats TWO ATI cards ($400 x 2 = $800). Long live the small difference.
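
For what it's worth, that arithmetic worked out explicitly (prices are the ones quoted in the comment; Python just for illustration):

```python
single_nvidia = 600   # one high-end nVidia card, in dollars (as quoted above)
ati_pair = 2 * 400    # two ATI cards in CrossFire, in dollars (as quoted above)

# The not-so-small "small difference": what the dual-card setup costs extra.
print(ati_pair - single_nvidia)  # -> 200
```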
