GTX 350 - hoax or legit leak?



#1 JP_Russell
Member since 2005 • 12893 Posts

Haven't seen any posts on this yet, so my apologies if I missed anything and it's already been posted. The news on this came yesterday.

I was checking the comparison of nVidia cards on Wikipedia today and noticed the GTX 350 had been added there. It had two references, Tweaktown (who allegedly got the news from Techpowerup) and Hardspell. Not sure if it's legit or not; what do you guys think? If it is, this looks like it could be one beast of a card: 2GB of GDDR5 ( :shock: ), 480 stream processors ( :shock: ), and clock speeds of 830MHz core ( :shock: ), 2075MHz shader ( :shock: ), and 3360MHz memory (I assume that's GDDR5-effective speed). It's just a single-GPU card, as far as I can tell, on a single 576mm² die, same as the GT200s.
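For scale, here's some quick back-of-envelope math on what those numbers would mean on paper (just a sketch assuming GT200's 3 FLOPs per shader per clock and its 512-bit bus; the leak doesn't actually state the bus width, so that part is a guess):

# Hypothetical peak figures for the rumored GTX 350 specs.
# Assumes MADD+MUL dual issue (3 FLOPs per shader per clock, as on GT200)
# and a GT200-style 512-bit bus (an assumption; the leak omits bus width).

shaders = 480                    # rumored stream processor count
shader_clock_hz = 2075e6         # rumored 2075MHz shader clock
mem_clock_effective_hz = 3360e6  # rumored 3360MHz GDDR5-effective clock
bus_width_bits = 512             # assumed, not in the leak

peak_tflops = shaders * shader_clock_hz * 3 / 1e12
bandwidth_gb_s = (bus_width_bits / 8) * mem_clock_effective_hz / 1e9

print(f"~{peak_tflops:.2f} TFLOPS peak shader throughput")  # ~2.99 TFLOPS
print(f"~{bandwidth_gb_s:.0f} GB/s memory bandwidth")       # ~215 GB/s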

Even with a 55nm fab process, that's going to be one power-hungry single GPU, though I suppose it might be a little more efficient than the 4870X2 (which is probably what it'd be designed to compete with).

So what do you think? Is it even possible they could fit double the TPCs on there, with stream processors clocked to the heavens and beyond? Do you think this is for real, or is it fake? I'm not sure what to think. I don't think Tweaktown would put this out there if they didn't themselves feel it could be real, but... the specs just sound incredible.


#2 X360PS3AMD05
Member since 2005 • 36320 Posts
Tweaktown, Hardspell? I'm putting my money on BS :|...............

#3 12345yon
Member since 2007 • 1073 Posts

O M G

I have an FX 5200... I'm a bit late, don't you think >.>

And if it's on Wikipedia, maaaybe it's legit O_0


#4 Elann2008
Member since 2007 • 33028 Posts
Thanks for the awesome news. I believe it's true; we all know that Nvidia has been stirring up something in their kitchen to counter the HD 4870X2, and the timing couldn't be better. I've been looking into Nvidia cards. I was going to get a GTX 280, but I guess I will wait now. :)

#5 12345yon
Member since 2007 • 1073 Posts
I prefer ATI HD cards, with CrossFire... yay!

#6 johnny27
Member since 2006 • 4400 Posts
I wouldn't doubt that Nvidia has something in the works to retake the graphics crown from ATI, although I wonder what mid-range offerings they have planned :? Perhaps they can pull an 8800GT but take it one step further and offer GTX 280 performance at a low-end price.

#7 nVidiaGaMer
Member since 2006 • 7793 Posts
Nvidia hasn't even made any cards that use GDDR4, so I call this one a hoax for now.

#8 covertgamer78
Member since 2005 • 1032 Posts
Maybe they skipped GDDR4 for the time being, at least. They may make a mid-range card with GDDR4 and use GDDR5 for their big guns. Two of these on one card would own for a few years, I think! I knew nVidia would counter the 4870X2 eventually; it would be awesome if this was it!

#9 X360PS3AMD05
Member since 2005 • 36320 Posts
GDDR4 isn't going to happen with Nvidia............

#10 nVidiaGaMer
Member since 2006 • 7793 Posts

Maybe they skipped GDDR4 for the time being, at least. They may make a mid-range card with GDDR4 and use GDDR5 for their big guns. Two of these on one card would own for a few years, I think! I knew nVidia would counter the 4870X2 eventually; it would be awesome if this was it!

covertgamer78

I still wonder why nVidia bypassed GDDR4. ATI has been using it for 2 years already (since the X1950XTX card).


#11 covertgamer78
Member since 2005 • 1032 Posts

[QUOTE="covertgamer78"]Maybe they skipped GDDR4 for the time being at least. They may make a mid range card with GDDR4 and use GDDR5 for their big guns. Two of these on one card would own for a few years I think! I knew nVidia would counter the 4870x2 eventually, would be awesome if this was it!nVidiaGaMer

I still wonder why nVidia by passed GDDR4. ATI has been using it for 2 years already (since the X1950XTX card).

nVidia is all about 1uping ATi, they won't lower themselves to mere GDDR4 unless they introduce a budget line


#12 JP_Russell
Member since 2005 • 12893 Posts

Tweaktown, Hardspell? I'm putting my money on BS :|...............

X360PS3AMD05

Tweaktown is usually one of the more reliable hardware sources out there. Never heard of Hardspell before now, though.

And if it's on Wikipedia, maaaybe it's legit O_0

12345yon

Like I pointed out in the OP, there were references for it on Wikipedia, and I'm basing the possibility of legitimacy on those, not on Wikipedia itself. Wikipedia merely brought it to my attention.


#13 LordEC911
Member since 2004 • 9972 Posts

With G200b coming before the end of the year, yes a mere shrink of G200, there is nothing new coming from Nvidia until mid 2009. Late Q2/Q3 we should see a 40nm slightly tweaked G200.

The thing is, with a 40nm shrink we will probably be looking at a chip between 300-400mm², which immediately means they have to drop the 512-bit bus; there isn't enough perimeter. I would imagine that at close to 400mm² a 448-bit bus would fit, but closer to 300mm² we should be looking at 384-bit. Nvidia linking the ROPs and the memory controller is the problem; they would have to increase the number of ROPs per 64-bit memory controller to raise the ROP count.

Memory-wise, 768MB and 1.5GB would obviously allow for separate bins. Depending on the bus size we still might be seeing 2.2GHz GDDR3, but 3.6GHz GDDR5 isn't out of the question, especially since I can see Nvidia wanting to hit 200GB/s of bandwidth before AMD/ATI.

Shader-wise, anywhere from 320-480 shaders seems doable. Obviously keeping the MADD+MUL for 3 FP ops per clock will save time compared to having to build a new architecture. I am expecting a new architecture from both Nvidia and AMD/ATI pretty soon, especially after the move to 40nm.

Clock-wise, we shouldn't see clocks as ridiculous as these claims, especially if this is a pretty big revamp of the architecture. I would imagine more like 600-700MHz for the core, 1500-1750MHz for the shaders, and 2.2-3.6GHz for the memory. With the above specs this would pretty easily put the transistor count around 1.8-2 billion, and with a transistor count that high even 700MHz would be insanely fast.

~1.5-2.5 TFLOPS
~173-202GB/s
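If anyone wants to sanity-check those two ranges, here's a rough back-of-envelope sketch (assuming the 3 FP ops per shader per clock from the MADD+MUL pair mentioned above; the shader counts, clocks, and bus widths are the guesses from the preceding paragraphs, not confirmed specs):

# Back-of-envelope math behind the ~1.5-2.5 TFLOPS and ~173-202GB/s ranges.

def tflops(shaders, shader_clock_mhz, ops_per_clock=3):
    # 3 FP ops per shader per clock from the MADD+MUL pair
    return shaders * shader_clock_mhz * 1e6 * ops_per_clock / 1e12

def bandwidth_gb_s(bus_width_bits, effective_mem_clock_mhz):
    # bytes per transfer (bus width / 8) times the effective memory clock
    return (bus_width_bits / 8) * effective_mem_clock_mhz * 1e6 / 1e9

print(tflops(320, 1500))           # ~1.44 TFLOPS (low end)
print(tflops(480, 1750))           # ~2.52 TFLOPS (high end)
print(bandwidth_gb_s(384, 3600))   # ~172.8 GB/s (384-bit @ 3.6GHz)
print(bandwidth_gb_s(448, 3600))   # ~201.6 GB/s (448-bit @ 3.6GHz)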


#14 JP_Russell
Member since 2005 • 12893 Posts

With G200b coming before the end of the year, yes a mere shrink of G200, there is nothing new coming from Nvidia until mid 2009. Late Q2/Q3 we should see a 40nm slightly tweaked G200.

LordEC911

So you think this GTX 350 is a fake?


#15 VersaEmerge
Member since 2008 • 61 Posts
[QUOTE="LordEC911"]

With G200b coming before the end of the year, yes a mere shrink of G200, there is nothing new coming from Nvidia until mid 2009. Late Q2/Q3 we should see a 40nm slightly tweaked G200.

JP_Russell

So you think this GTX 350 is a fake?

[QUOTE="LordEC911"]

With G200b coming before the end of the year, yes a mere shrink of G200, there is nothing new coming from Nvidia until mid 2009. Late Q2/Q3 we should see a 40nm slightly tweaked G200.

JP_Russell

So you think this GTX 350 is a fake?

As if you just asked that.


#16 wklzip
Member since 2005 • 13925 Posts
[QUOTE="LordEC911"]

With G200b coming before the end of the year, yes a mere shrink of G200, there is nothing new coming from Nvidia until mid 2009. Late Q2/Q3 we should see a 40nm slightly tweaked G200.

JP_Russell

So you think this GTX 350 is a fake?


He said that it's possible, but not this year.


#17 JP_Russell
Member since 2005 • 12893 Posts
[QUOTE="JP_Russell"][QUOTE="LordEC911"]

With G200b coming before the end of the year, yes a mere shrink of G200, there is nothing new coming from Nvidia until mid 2009. Late Q2/Q3 we should see a 40nm slightly tweaked G200.

wklzip

So you think this GTX 350 is a fake?


He said that its possible, but not this year.

I thought he meant that the 40nm lineup he did some spec guesswork on is what's coming later. In other words, he doesn't think this card with these specs is real at all; he was estimating what sort of specs we'll actually see when something new (the "40nm slightly tweaked G200") comes down the line.


#18 nooblet69
Member since 2004 • 5162 Posts
I hope it's legit; if so I will get a GTX 280 when the prices drop :D.

#19 JP_Russell
Member since 2005 • 12893 Posts
Bump

#20 Elann2008
Member since 2007 • 33028 Posts

I hope it's legit; if so I will get a GTX 280 when the prices drop :D.

nooblet69

I'm with you, man. Unless the new GTXs are reasonably priced like the HD 4870X2 was; in that case I'll get the new lineup. :D


#21 Lach0121
Member since 2007 • 11792 Posts
[QUOTE="nVidiaGaMer"]

[QUOTE="covertgamer78"]Maybe they skipped GDDR4 for the time being at least. They may make a mid range card with GDDR4 and use GDDR5 for their big guns. Two of these on one card would own for a few years I think! I knew nVidia would counter the 4870x2 eventually, would be awesome if this was it!covertgamer78

I still wonder why nVidia by passed GDDR4. ATI has been using it for 2 years already (since the X1950XTX card).

nVidia is all about 1uping ATi, they won't lower themselves to mere GDDR4 unless they introduce a budget line

funny same could be said about ati, is all about 1uping Nvidia..

#22 9mmSpliff
Member since 2005 • 21751 Posts
I believe they are mixing this up with the GTX 280 on 55nm, which will not be the 4870X2 killer. It takes a lot more than a die shrink and an increased core clock, even with GDDR5 memory. It needs a ton more stream processors. Not only does Nvidia have to fight the 4870X2, but also the upcoming 4850X2.

#23 Lach0121
Member since 2007 • 11792 Posts

I believe they are mixing this up with the GTX 280 on 55nm, which will not be the 4870X2 killer. It takes a lot more than a die shrink and an increased core clock, even with GDDR5 memory. It needs a ton more stream processors. Not only does Nvidia have to fight the 4870X2, but also the upcoming 4850X2.

9mmSpliff

Umm, the 4870X2 is not the card everyone is making it out to be...

And stream processors aren't the only thing that matters on a GPU...

If you know about GPUs, then you should know this...

Second, you are comparing a single-GPU card to a multi-GPU card; why do so many people forget this????


#24 JP_Russell
Member since 2005 • 12893 Posts

I believe they are mixing this up with the GTX 280 on 55nm, which will not be the 4870X2 killer. It takes a lot more than a die shrink and an increased core clock, even with GDDR5 memory. It needs a ton more stream processors. Not only does Nvidia have to fight the 4870X2, but also the upcoming 4850X2.

9mmSpliff

I agree, GT200b couldn't come close to the 4870X2, though you're forgetting it also has a pretty sharp shader clock increase (+25%). This is as opposed to the 9800GTX+, which only had a small one (about +8%). I expect a bit more of an improvement from it than the 9800GTX+ was over the 9800GTX, though still nothing to go crazy about. As for the 4850X2, the only problem with that card is that it won't always scale well and will occasionally only perform equally to one 4850, or barely better. That's the biggest reason I will never buy a multi-GPU card and why they hold no merit for me; I wouldn't buy a $400 card if I knew it might not pull the weight of its price in every game I played. That, and micro stuttering.
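For reference, here's the clock math behind those percentages (the 1688MHz and 1836MHz shader clocks for the 9800GTX and GTX+ and the GTX 280's 1296MHz are the retail figures; the +25% GT200b bump is from the rumors, not a confirmed spec):

# Shader clock uplift: known 9800GTX -> 9800GTX+ vs. rumored GT200 -> GT200b.

gtx_9800, gtx_9800_plus = 1688, 1836  # MHz, retail shader clocks
uplift_pct = (gtx_9800_plus / gtx_9800 - 1) * 100
print(f"9800GTX+ shader uplift: {uplift_pct:.1f}%")  # ~8.8%, the 'about +8%' above

gtx_280_shader = 1296  # MHz, GTX 280 shader clock
rumored_gain = 0.25    # the rumored +25% for GT200b
gt200b_shader = gtx_280_shader * (1 + rumored_gain)
print(f"Implied GT200b shader clock: {gt200b_shader:.0f} MHz")  # ~1620MHz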

Were it real, I think this GTX 350 could easily give the 4870X2 a run for its money with the kind of specs it has, especially taking into account instances of micro stuttering and poor scaling.


#25 Spartan8907
Member since 2006 • 3731 Posts
Wasn't this 2GB single-GPU card already proven fake a few weeks ago?

#26 wklzip
Member since 2005 • 13925 Posts

[QUOTE="9mmSpliff"]I believe they are mixing this up with the GTX280 on 55nm, which will not be the 4870x2 killer. It takes alot more than a die shrink and an increased core clock, even with the GDDR5 memory. It needs a a ton of more Stream Processes. Not only does nvidia have to fight the 4870x2, but the later to come 4850x2. Lach0121

umm the 4870x2 is not the card everyone is making it out to be...

and streaming processes isnt the only thing that matters on a GPU...

if you know about GPU's, then you should know this..

second you are comparing a single gpu card, to a multi-gpu card, why is it so many people forget this????

I agree, but it seems that's the only thing that matters to Nvidia; when the G200 was redesigned it gained a lot more stream processors while the texture units were neglected. Now they are even calling those stream processors "cores."

And second, because that dual-GPU 4870X2 is capable of beating two GTX 280s in SLI.


#27 Lach0121
Member since 2007 • 11792 Posts
[QUOTE="Lach0121"]

[QUOTE="9mmSpliff"]I believe they are mixing this up with the GTX280 on 55nm, which will not be the 4870x2 killer. It takes alot more than a die shrink and an increased core clock, even with the GDDR5 memory. It needs a a ton of more Stream Processes. Not only does nvidia have to fight the 4870x2, but the later to come 4850x2. wklzip

umm the 4870x2 is not the card everyone is making it out to be...

and streaming processes isnt the only thing that matters on a GPU...

if you know about GPU's, then you should know this..

second you are comparing a single gpu card, to a multi-gpu card, why is it so many people forget this????

I agree, but it seems that its the only thing that matters to nvidia, the g200 while being redesigned was added a lot more streaming processors forgetting about the texture units. Now they are even calling those streaming processors as "cores".

And second because that dual gpu 4870x2 is capable of beating 2 gtx280 in sli.

Capable, but not always, and not in all applications. I prefer Nvidia anyway; much more stable company than AMD/ATI. Though I have an AMD processor, which I will change given the opportunity and extra cash.

#28 Lach0121
Member since 2007 • 11792 Posts
[QUOTE="wklzip"][QUOTE="Lach0121"]

[QUOTE="9mmSpliff"]I believe they are mixing this up with the GTX280 on 55nm, which will not be the 4870x2 killer. It takes alot more than a die shrink and an increased core clock, even with the GDDR5 memory. It needs a a ton of more Stream Processes. Not only does nvidia have to fight the 4870x2, but the later to come 4850x2. Lach0121

umm the 4870x2 is not the card everyone is making it out to be...

and streaming processes isnt the only thing that matters on a GPU...

if you know about GPU's, then you should know this..

second you are comparing a single gpu card, to a multi-gpu card, why is it so many people forget this????

I agree, but it seems that its the only thing that matters to nvidia, the g200 while being redesigned was added a lot more streaming processors forgetting about the texture units. Now they are even calling those streaming processors as "cores".

And second because that dual gpu 4870x2 is capable of beating 2 gtx280 in sli.

capable but not always, and not in all aplications. i prefer nvidia anyway, much more stable company than amd/ati. though i have an amd processor, which i will change given the oppertunity and extra cash.

Also, by that merit a single 4870 should outperform a single GTX 280, but it doesn't.


#29 JP_Russell
Member since 2005 • 12893 Posts

Wasn't this 2GB single-GPU card already proven fake a few weeks ago?

Spartan8907

I don't know what card you mean, but the news and rumors on this card are pretty recent (22nd).

And second, because that dual-GPU 4870X2 is capable of beating two GTX 280s in SLI.

wklzip

Egh? Very rarely perhaps, as in only in applications that severely favor ATI hardware or have worse SLI scaling than Crossfire.


#30 Lach0121
Member since 2007 • 11792 Posts
[QUOTE="Lach0121"][QUOTE="wklzip"][QUOTE="Lach0121"]

[QUOTE="9mmSpliff"]I believe they are mixing this up with the GTX280 on 55nm, which will not be the 4870x2 killer. It takes alot more than a die shrink and an increased core clock, even with the GDDR5 memory. It needs a a ton of more Stream Processes. Not only does nvidia have to fight the 4870x2, but the later to come 4850x2. Lach0121

umm the 4870x2 is not the card everyone is making it out to be...

and streaming processes isnt the only thing that matters on a GPU...

if you know about GPU's, then you should know this..

second you are comparing a single gpu card, to a multi-gpu card, why is it so many people forget this????

I agree, but it seems that its the only thing that matters to nvidia, the g200 while being redesigned was added a lot more streaming processors forgetting about the texture units. Now they are even calling those streaming processors as "cores".

And second because that dual gpu 4870x2 is capable of beating 2 gtx280 in sli.

capable but not always, and not in all aplications. i prefer nvidia anyway, much more stable company than amd/ati. though i have an amd processor, which i will change given the oppertunity and extra cash.

also by that merrit a single 4870 should outpreform a single gtx 280, but it doesnt.

Now don't get me wrong, the 4870 is not a bad card by any means; it's just overhyped. But so is the GTX 280...

#31 gamerloks
Member since 2008 • 509 Posts
If this is true, then this is what's going to put ATI back in their misery and Nvidia back on top, like before with the 8800 Ultra.

#32 xialon
Member since 2007 • 593 Posts

Odd.

At Fry's Electronics I think I saw a GTX 350 graphics card sitting next to the other high-end graphics cards.

Or maybe I was seeing things and it was just a 3500FX Quad graphics card. But then why was it next to the 9800GX2s, the 4870X2, and the GTX 280???

I hope they (meaning nVidia) come out with a new graphics card despite the graphics card frenzy that has been going on as of late.

There needs to be another drive down in prices for the 9 series (by at least $25-$50) and the GTX 260 (I have seen one for only $200).


#33 JP_Russell
Member since 2005 • 12893 Posts

Odd.

At Fry's Electronics I think I saw a GTX 350 graphics card sitting next to the other high-end graphics cards.

Or maybe I was seeing things and it was just a 3500FX Quad graphics card. But then why was it next to the 9800GX2s, the 4870X2, and the GTX 280???

xialon

Interesting. Another possible sighting of the card at a different retail outlet. Here's hoping you were in fact not hallucinating.


#34 9mmSpliff
Member since 2005 • 21751 Posts
*shakes head at fanboys*

#35 Lach0121
Member since 2007 • 11792 Posts
*shakes head at fanboys*

9mmSpliff
irony?????

#36 LordEC911
Member since 2004 • 9972 Posts

[QUOTE="9mmSpliff"]*shakes head at fanboys*Lach0121
irony?????

Ehhhh... please show me anything Spliff has said that might imply he is a fanboy.


#37 Wesker776
Member since 2005 • 7004 Posts

*shakes head at fanboys*

9mmSpliff

I agree with that notion.

[QUOTE="9mmSpliff"]*shakes head at fanboys*Lach0121
irony?????

How is that irony? :| Do you even know the meaning of irony?

Capable, but not always, and not in all applications. I prefer Nvidia anyway; much more stable company than AMD/ATI. Though I have an AMD processor, which I will change given the opportunity and extra cash.

Lach0121

Erm, care to explain how Nvidia is a more stable company?


#38 JP_Russell
Member since 2005 • 12893 Posts

*shakes head at fanboys*

9mmSpliff

To whom are you referring? Not me, I hope.


#39 Lach0121
Member since 2007 • 11792 Posts

[QUOTE="Lach0121"][QUOTE="9mmSpliff"]*shakes head at fanboys*LordEC911

irony?????

Ehhhh... please show me anything Spliff has said that might imply he is a fanboy.

Hmm, not implying he is... Just think for a sec: me stating I prefer Nvidia (not that I'm against ATI) supposedly makes me a "fanboy," yet he can have his own views and not be a fanboy..... Matter of fact, I just got my GF an HD 3870 to go in her computer... Just because I prefer Nvidia myself does not mean I hate ATI....

#40 Lach0121
Member since 2007 • 11792 Posts

[QUOTE="9mmSpliff"]*shakes head at fanboys*Wesker776

I agree with that notion.

[QUOTE="9mmSpliff"]*shakes head at fanboys*Lach0121
irony?????

How is that irony? :| Do you even know the meaning of irony?

]capable but not always, and not in all aplications. i prefer nvidia anyway, much more stable company than amd/ati. though i have an amd processor, which i will change given the oppertunity and extra cash.Lach0121

Erm, care to explain how Nvidia is a more stable company?

First off, read the post above....

Secondly, I very much know the meaning of irony... which is why I didn't use the term hypocrite.

Third, Nvidia has always been more stable for me and my friends: drivers, game support, fewer compatibility issues... It has had the more stable run in my experience and theirs over my 22 years of gaming. But for the record, I do have to give ATI a lot of credit; these issues honestly have not been as bad this past year or two, and things are working better.

Fourth, that's not to say ATI hasn't done great things, nor that Nvidia hasn't F'd up from time to time; it's just that in my experience Nvidia is better for me... And honestly, I should have included the abbreviation IMO, or the phrase "in my opinion," in that second quote you have from me there, but I just wasn't thinking about it.

So a fanboy I am not; just because I prefer one company over the other does not make me a fanboy...

Now, if I were to say that ATI sucks, that ATI will never get ahead even once in its existence, and that I'd never buy any of their products, all from a blindly biased point of view, then yes, I would be one...

But that being said, I just bought my GF an ATI HD 3870 for her PC... because I figured it would be better for her to stay with ATI instead of switching to Nvidia.

But I still prefer Nvidia.


#41 LordEC911
Member since 2004 • 9972 Posts

"Stable company" not "stable drivers/cards."
Interesting retreat...


#42 nish2280
Member since 2006 • 489 Posts
Wow... that thing is gonna be REALLY expensive... like a centillion dollars.

#43 Lach0121
Member since 2007 • 11792 Posts

"Stable company" not "stable drivers/cards."
Interesting retreat...

LordEC911

lol retreat??? right...


#44 UnlimitedToast7
Member since 2008 • 54 Posts

Wow... that thing is gonna be REALLY expensive... like a centillion dollars.

nish2280

But don't forget, for that you will get a consistent 60 FPS in Crysis!... at 1280x712... when looking at the sky...


#45 JP_Russell
Member since 2005 • 12893 Posts

[QUOTE="nish2280"]wow...that thing is gunna be REALLY expensive..like a centillion dollarsUnlimitedToast7

BUt don't forget, that for that you will get a consistent 60 fps in crysis!... at 1280x712... when looking at the sky...

Dude, a card like the GTX 350, were it for real, could very easily run Crysis on all very high, 1920x1200 with very playable frames. Could probably even put some AA and AF on. As it is, my GTX 260 plays the game on all high at 1600x1200 with 2X AA, and the lowest I've ever dipped was 25FPS. Oh yeah, and I get 120FPS looking at the sky, so there. :P


#46 aoshi_shinumori
Member since 2006 • 791 Posts
Great! Now with that card I can play "CRAPSIS: Clear Sky"

#47 Sneaky_Gopher
Member since 2005 • 2440 Posts
[QUOTE="UnlimitedToast7"]

[QUOTE="nish2280"]wow...that thing is gunna be REALLY expensive..like a centillion dollarsJP_Russell

BUt don't forget, that for that you will get a consistent 60 fps in crysis!... at 1280x712... when looking at the sky...

Dude, a card like the GTX 350, were it for real, could very easily run Crysis on all very high, 1920x1200 with very playable frames. Could probably even put some AA and AF on. As it is, my GTX 260 plays the game on all high at 1600x1200 with 2X AA, and the lowest I've ever dipped was 25FPS. Oh yeah, and I get 120FPS looking at the sky, so there. :P

That's a great quote.