Is nVidia's Pascal generation the best GPU lineup in their history?

This topic is locked from further discussion.


Poll: Is nVidia's Pascal generation the best GPU lineup in their history? (64 votes)

Yes. 59%
No. 41%

Ever since I got the GTX 1060 6GB in my laptop, I have been impressed by its performance and the fact that I am getting desktop-level performance on my Alienware 17" gaming laptop (fingers crossed that it lasts a long time). Anyways, I am hitting 1900+ MHz on the core clock. That is insane for a laptop, making it essentially a desktop GTX 1060 6GB. That's without overclocking. I have practically stopped using my desktop for gaming.

I always thought that nVidia's 8800 series, launched in 2006, was the best GPU series in their history. The 8800 GTX brought leaps and bounds in performance. It wasn't until June 2008 that ATI had a GPU that could beat the 8800 GTX, with the release of the HD 4850/HD 4870. That means nVidia held the lead with a single generation for about a year and a half. Maxwell was good, but AMD already had the R9 290X, which was as good as the GTX 970, and its refresh in June 2015, nine months after Maxwell's release, was good enough to compete with the 970 and 980 in the form of the R9 390 and 390X. But judging by the performance numbers that have come out for the RX Vega 64, it will supposedly be as good as the GTX 1080 but not better, and the 1080 Ti will still reign supreme. The GTX 1080 launched last year, meaning RX Vega will be a year late and will consume 100 watts more. So Pascal will reign supreme until AMD releases Navi in 2018. That's almost two years with nVidia on top with Pascal. I don't think nVidia has ever had such a long lead in the GPU industry. That's not to say RX Vega will be bad (I welcome the fact that AMD finally has something to compete in the high end); it's about how Pascal compares to previous GPUs in nVidia's history versus the competition.

I think we can all thank the cuts to the GPU division that AMD's last CEO made back in 2011, and the departure of top-rated graphics talent in the years that followed.

When you include everything in terms of power and performance (maybe not price), the fact that nVidia was able to stick an entire desktop GPU inside a laptop and get the same performance as the desktop part, for the first time in their history, is indeed an engineering achievement.
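
For what it's worth, here's a rough back-of-the-envelope sketch of why the laptop part lands where the desktop one does: both are commonly listed with 1280 CUDA cores, so at the ~1900 MHz I'm seeing, the theoretical FP32 throughput actually edges past the desktop card's reference boost. The clock figures are assumptions (the commonly quoted desktop reference boost, plus my observed laptop boost), so treat it as a ballpark only.

```python
# Rough FP32 throughput estimate: 2 FLOPs per CUDA core per clock (FMA).
# Assumptions: 1280 CUDA cores for both variants, ~1709 MHz desktop reference
# boost, ~1900 MHz observed sustained boost on the laptop chip.
def fp32_tflops(cuda_cores: int, clock_mhz: float) -> float:
    return 2 * cuda_cores * clock_mhz * 1e6 / 1e12

desktop = fp32_tflops(1280, 1709)  # ~4.4 TFLOPS
laptop = fp32_tflops(1280, 1900)   # ~4.9 TFLOPS at the observed clock

print(f"desktop GTX 1060: {desktop:.1f} TFLOPS")
print(f"laptop GTX 1060:  {laptop:.1f} TFLOPS")
```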

So, I think Pascal is the best generation in nVidia's history. Agree?

See, I am not an AMD fanboy. Created an nVidia thread. ;)

#1  Edited By DrLostRib
Member since 2017 • 5931 Posts

lol, did you mean to link to the exact same thread you made on a different board?

#2 Yams1980
Member since 2006 • 2862 Posts

Ya, it's been the best. But second best was Maxwell... it was so good they had to nerf the cards because they were performing better than they needed to, changing where the branding would go compared to where it should have gone if they were an honest company. I'm still on a GTX 970 but might get a 1070 or 1080 by the end of the year if the price drops a bit, or maybe wait a bit longer for a newer gen.

Like their 680s were originally going to be the 670s but performed too well, so they bumped the 670 up to a 680 and sold it at that price... they had so much headroom, which is why the Titan was born that generation. The whole line got nerfed back one tier because they knew they didn't have to release it at full power to beat AMD. I bet AMD wish they had at least one of their engineers, because they haven't a clue how to compete with them.

Like it or not, it's like a deal with the devil: you know you should support AMD, but how can you buy GPUs that use more power and perform worse? You end up having to buy nVidia almost every time.

#3 appariti0n
Member since 2009 • 5013 Posts

Not even close. As far as I know, no generation had as large a leap forward in technology as the release of the 8800 GTX. IIRC, it was somewhere around 2.5x faster than the previous fastest single-GPU solution.

The reduction in thermals/TDP is impressive on the 10X0 series though.

#4 osan0
Member since 2004 • 17804 Posts

The 8800 GTX is probably the single best card they ever released (given the time of its release and so on; of course an 8800 GTX is no good for gaming now). For about 2-3 years various publications couldn't recommend any upgrade to 8800 GTX owners. The only reason I replaced mine was hardware failure.

Pascal wouldn't be far off though. I think the 1080 Ti is a bit of a classic in the making. Like the 8800 GTX it has very little compromise, and I think it will endure well as the years tick by. If you're the type of person who likes to just build a PC and then leave it alone for 4-5 years, I think the 1080 Ti would be the card to go for. It won't play games at 4K 60 FPS at max settings in 4-5 years' time, but it will play them well.

#5 R4gn4r0k
Member since 2004 • 46171 Posts

Pros:

  • Great performance
  • Great performance per watt
  • Really silent cards

Cons:

  • Prices since the 10XX series have been really crappy

#6  Edited By scatteh316
Member since 2004 • 10273 Posts

They've still not managed to, and probably never will, beat the evolution and sheer performance jump they got from G80.

#7 Gaming-Planet
Member since 2008 • 21064 Posts

It's really efficient.

On laptops, it's a huge improvement.

#8  Edited By GarGx1
Member since 2011 • 10934 Posts

Not so sure about laptops, it may well be, but for desktop PCs the 8800 GTX was and still is the biggest year-on-year leap, and it remained a valid GPU for at least 7 years. 1080 Tis will have disappeared into history long before they reach 7 years old.

@Xtasy26: See, I am not an AMD fanboy. Created an nVidia thread. ;)

:P

#9  Edited By Techhog89
Member since 2015 • 5430 Posts

8800 GTX is still a thing that existed.

#10  Edited By KungfuKitten
Member since 2006 • 27389 Posts

Well the 1060 is very good. The others are a little expensive for me...

#11 adamosmaki
Member since 2007 • 10718 Posts

The G80 series is still the best. Not only was the leap in performance huge, they also had the single fastest GPU by far for a couple of years, and they even had the best price/performance GPU ever, the 512MB 8800 GT.

Sure, Pascal with the 1080 Ti looks like it will keep the performance crown, but AMD competes at every level and their performance/price is better than nVidia's (not counting the ridiculous GPU prices we've had the last 4-5 months due to mining, that is).

And according to leaks, Vega 56 seems to be a bit faster than a 1070 with a similar price tag (of course it remains to be seen whether availability and normal prices are there).

#12 neo418
Member since 2003 • 586 Posts

Nvidia's CEO is an AMD alum btw... I don't think he is trying to kill them, at least not on the CPU front.

#13 scoots9
Member since 2006 • 3505 Posts

Maxwell was pretty impressive outside of the whole 3.5GB issue. Kepler was garbage. Fermi was a garbage fire. Tesla was pretty uncompetitive later on and really overstayed its welcome, but it did give us the 8800 GTX. So I'll say the 8000 series.

#14 neogeo419
Member since 2006 • 1474 Posts

8800gtx, and it's not even close imho

#15  Edited By EndofAugust
Member since 2017 • 812 Posts

Absolutely not, that would fall to the 6000 line of cards in 2004, specifically the 6800 Ultra. The jump in performance made at that point in time was massive. I'd say iterations are now getting closer together and more refined, but the days of doubling performance are over.

#16 ShepardCommandr
Member since 2013 • 4939 Posts

It's efficient, sure, but way overpriced.

$500+ for a baby card

#17  Edited By QuadKnight
Member since 2015 • 12916 Posts

They are a bit too expensive.

My 1070 cost me about $450. My 970 was quite cheaper.

I'm very happy with the performance though, the 1070 chews through any game I feed it. I can't imagine owning a 1080Ti, sounds like major overkill.

#18  Edited By ronvalencia
Member since 2008 • 29612 Posts

In terms of IPC, GP104 is very similar to Maxwell GM200.

#19 PCgameruk
Member since 2012 • 2273 Posts

The 8800 GTX was the only thing that could take on Crysis on ultra at a decent framerate. And that was still about 35 fps... I might install Crysis and see how my 1070 does in 4K. Damn good game.

#20 danjammer69
Member since 2004 • 4331 Posts

@endofaugust said:

Absolutely not, that would fall to the 6000 line of cards in 2004, specifically the 6800 Ultra. The jump in performance made at that point in time was massive. I'd say iterations are now getting closer together and more refined, but the days of doubling performance are over.

I remember that (had a BFG 6800 Ultra), and the jump in performance was enormous.

#21 Techhog89
Member since 2015 • 5430 Posts

@ronvalencia said:

In terms of IPC, GP104 is very similar to Maxwell GM200.

And the award for most robotic and irrelevant comment goes to...

#22 HalcyonScarlet
Member since 2011 • 13659 Posts

I think it is. One reason is that some of the cards are around 75% better than the previous gen of cards, for example the GTX 960 versus the GTX 1060.

That's an unusual performance jump for gen-on-gen PC cards.
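
As a rough sanity check (a sketch from commonly quoted reference specs, not from benchmarks), the theoretical FP32 jump from the 960 to the 1060 6GB comes out to roughly 80%, which is in the same ballpark:

```python
# Theoretical FP32 throughput from reference specs (assumed values):
# GTX 960:      1024 CUDA cores, ~1178 MHz boost
# GTX 1060 6GB: 1280 CUDA cores, ~1709 MHz boost
def tflops(cores: int, clock_mhz: float) -> float:
    return 2 * cores * clock_mhz * 1e6 / 1e12

gtx_960 = tflops(1024, 1178)   # ~2.4 TFLOPS
gtx_1060 = tflops(1280, 1709)  # ~4.4 TFLOPS

print(f"gen-on-gen jump: {(gtx_1060 / gtx_960 - 1) * 100:.0f}%")  # ~81%
```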

#23 GarGx1
Member since 2011 • 10934 Posts

@quadknight said:

They are a bit too expensive.

My 1070 cost me about $450. My 970 was quite cheaper.

I'm very happy with the performance though, the 1070 chews through any game I feed it. I can't imagine owning a 1080Ti, sounds like major overkill.

No such thing as overkill.

#24  Edited By QuadKnight
Member since 2015 • 12916 Posts

@GarGx1 said:
@quadknight said:

They are a bit too expensive.

My 1070 cost me about $450. My 970 was quite cheaper.

I'm very happy with the performance though, the 1070 chews through any game I feed it. I can't imagine owning a 1080Ti, sounds like major overkill.

No such thing as overkill.

At the moment it is for me. Unless I decide to start gaming in 4K which I have absolutely no interest in.

1440p at Ultra Settings is where it's at for me. 1080Ti would be overkill for 1440p gaming.

#25 DaVillain  Moderator
Member since 2014 • 56017 Posts

@quadknight said:

They are a bit too expensive.

My 1070 cost me about $450. My 970 was quite cheaper.

I'm very happy with the performance though, the 1070 chews through any game I feed it. I can't imagine owning a 1080Ti, sounds like major overkill.

Your 1070 cost $450? The Gigabyte 1070 I paid for was $420, but then again, the 1070 was supposed to be priced around $300 according to what Nvidia stated, yet at launch we didn't see that, just $400. Don't get me wrong, I really love it, I thought it was well worth the price I paid, and it's a good fit for my 1440p/60fps monitor.

As for a 1080 Ti, those GPUs are best suited to 1440p/144Hz monitors if you ask me.

#26  Edited By QuadKnight
Member since 2015 • 12916 Posts

@davillain- said:
@quadknight said:

They are a bit too expensive.

My 1070 cost me about $450. My 970 was quite cheaper.

I'm very happy with the performance though, the 1070 chews through any game I feed it. I can't imagine owning a 1080Ti, sounds like major overkill.

Your 1070 cost $450? The Gigabyte 1070 I paid for was $420, but then again, the 1070 was supposed to be priced around $300 according to what Nvidia stated, yet at launch we didn't see that, just $400. Don't get me wrong, I really love it, I thought it was well worth the price I paid, and it's a good fit for my 1440p/60fps monitor.

As for a 1080 Ti, those GPUs are best suited to 1440p/144Hz monitors if you ask me.

Yea, I got the EVGA 1070 SC and it cost me about $450 with tax. A bit pricey, but the performance boost over the 970 is well worth it.

I'm pretty much okay with anything 60 fps and above, so 144Hz gaming, while nice, is a bit overkill for me. Maybe in a couple of years' time I'll join the 144Hz-and-above master race.

#27 GarGx1
Member since 2011 • 10934 Posts

@quadknight said:
@GarGx1 said:
@quadknight said:

They are a bit too expensive.

My 1070 cost me about $450. My 970 was quite cheaper.

I'm very happy with the performance though, the 1070 chews through any game I feed it. I can't imagine owning a 1080Ti, sounds like major overkill.

No such thing as overkill.

At the moment it is for me. Unless I decide to start gaming in 4K which I have absolutely no interest in.

1440p at Ultra Settings is where it's at for me. 1080Ti would be overkill for 1440p gaming.

I'm running SLI 1080s on a 1440p monitor. It's not about resolution, it's all about the frame rates, and the closer I can get to a consistent 144 fps with max settings, the better.

#28 ronvalencia
Member since 2008 • 29612 Posts

@techhog89 said:
@ronvalencia said:

In terms of IPC, GP104 is very similar to Maxwell GM200.

And the award for most robotic and irrelevant comment goes to...

Not much different to the 8800 GTX's long-lived design, which ended up as the GTS 250.

Try again fool.

#29 waahahah
Member since 2014 • 2462 Posts

I don't get what the point of this post is supposed to be. Is NVIDIA's most advanced and latest iteration of tech its best? Why yes, their best and latest tech is the best!

#30  Edited By ronvalencia
Member since 2008 • 29612 Posts

@waahahah said:

I don't get what the point of this post is supposed to be. Is NVIDIA's most advanced and latest iteration of tech its best? Why yes, their best and latest tech is the best!

In terms of TFLOPS vs results, Pascal GP104 is similar to Maxwell GM200, which is similar to the G80 to G92 move.

My point: the Maxwell architecture and its 16 nm FinFET Pascal iteration are also long-lived designs.
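
Something like this back-of-the-envelope sketch is what I mean by TFLOPS vs results (the core counts and boost clocks below are commonly cited reference figures, and the measured performance ratio is a placeholder you would fill in from your own benchmark runs):

```python
# Per-FLOP comparison sketch. Reference specs are assumptions:
# GM200 (GTX 980 Ti): 2816 CUDA cores, ~1075 MHz boost
# GP104 (GTX 1080):   2560 CUDA cores, ~1733 MHz boost
def tflops(cores: int, clock_mhz: float) -> float:
    return 2 * cores * clock_mhz * 1e6 / 1e12

gm200 = tflops(2816, 1075)  # ~6.1 TFLOPS
gp104 = tflops(2560, 1733)  # ~8.9 TFLOPS

# Placeholder: average GTX 1080 vs 980 Ti fps ratio from your own benchmarks.
measured_ratio = 1.45

tflops_ratio = gp104 / gm200
print(f"TFLOPS ratio: {tflops_ratio:.2f}")
# Close to 1.0 means both chips deliver roughly the same results per theoretical FLOP.
print(f"results per FLOP, GP104 vs GM200: {measured_ratio / tflops_ratio:.2f}")
```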

#31 waahahah
Member since 2014 • 2462 Posts

@ronvalencia said:
@waahahah said:

I don't get what the point of this post is supposed to be. Is NVIDIA's most advanced and latest iteration of tech its best? Why yes, their best and latest tech is the best!

In terms of TFLOPS vs results, Pascal GP104 is similar to Maxwell GM200, which is similar to the G80 to G92 move.

I don't think this has anything to do with my statement. It's just a pointless thread.

#32  Edited By ronvalencia
Member since 2008 • 29612 Posts

NVIDIA Welcomes AMD’s x86 Comeback and Endorses Ryzen Threadripper

AMD replies to NVIDIA's tweet with a high 5.

#33  Edited By appariti0n
Member since 2009 • 5013 Posts

@techhog89 said:

8800 GTX is still a thing that existed.

Yep, any answer other than the 8800 GTX / G80 architecture is wrong.

http://gpu.userbenchmark.com/Compare/Nvidia-GeForce-8800-GTX-vs-Nvidia-GeForce-7900-GTX/m9271vsm10625

#34 pdogg93
Member since 2015 • 1849 Posts

Pascal is the best thing that's happened for laptops. The GTX 1080 performs nearly identically to its desktop counterpart. An amazing leap for mobile gamers.

#35 appariti0n
Member since 2009 • 5013 Posts

@pdogg93: agreed, the 1060 in my laptop keeps pace with the 980m, and absolutely trashes the 960m, which it was supposed to replace. So happy with that purchase.

#36 Xtasy26
Member since 2008 • 5582 Posts

I keep hearing how great the G80/8800 GTX was, and I agree it was great, probably one of the best in nVidia's history; they spent a freaking four years making it. But I am arguing in terms of the performance/power ratio. Was there any GPU in the 8800 series lineup that had the same exact specs and performance as its laptop version? Oh right, there wasn't.

People also forget that the 8800 GTX launched at $600, which was the same price the GTX 1080 launched at.

@GarGx1 said:

Not so sure about laptops, it may well be, but for desktop PCs the 8800 GTX was and still is the biggest year-on-year leap, and it remained a valid GPU for at least 7 years. 1080 Tis will have disappeared into history long before they reach 7 years old.

@Xtasy26: See, I am not an AMD fanboy. Created an nVidia thread. ;)

:P

Hey my 6th nVidia GPU says Hi! ;)

@scatteh316 said:

They've still not managed to, and probably never will, beat the evolution and sheer performance jump they got from G80.

Well, the 9700 Pro or the original Voodoo 2 says hi! I could argue that the Voodoo 2 was an even bigger jump, as it was the first card where they offered two GPUs running at the same time via SLI. nVidia and ATI didn't have anything that matched the Voodoo 2's performance until nearly a year and a half later.

@GarGx1 said:
@quadknight said:

They are a bit too expensive.

My 1070 cost me about $450. My 970 was quite cheaper.

I'm very happy with the performance though, the 1070 chews through any game I feed it. I can't imagine owning a 1080Ti, sounds like major overkill.

No such thing as overkill.

Yes, there is. It would have been pointless for me to get a GTX 1070 when my GTX 1060 6GB can handle everything maxed out at 1080p. Besides, by the time I need to upgrade I can use the money I saved by not getting a GTX 1070 to buy an Alienware Amplifier plus a new GPU for about the same price, and get better performance 4 years down the road.

@waahahah said:

I don't get what the point of this post is supposed to be. Is NVIDIA's most advanced and latest iteration of tech its best? Why yes, their best and latest tech is the best!

Because of the power efficiency of Pascal, which has never been done before: you could fit an entire desktop GPU into a laptop with the same specs and similar clock speeds as its desktop brethren. That's what makes this special.

#37 Xtasy26
Member since 2008 • 5582 Posts

@Yams1980 said:

Like it or not, it's like a deal with the devil: you know you should support AMD, but how can you buy GPUs that use more power and perform worse? You end up having to buy nVidia almost every time.

That's the problem with AMD in the gaming laptop space. As much as I like AMD, I am not going to sacrifice performance when buying a gaming laptop if I can get better performance elsewhere for a similar price.

#38 waahahah
Member since 2014 • 2462 Posts

@Xtasy26 said:

@waahahah said:

I don't get what the point of this post is supposed to be. Is NVIDIA's most advanced and latest iteration of tech its best? Why yes, their best and latest tech is the best!

Because of the power efficiency of Pascal, which has never been done before: you could fit an entire desktop GPU into a laptop with the same specs and similar clock speeds as its desktop brethren. That's what makes this special.

No shit, it's the latest iteration and it's the best... and I'm not sure this is such a good thing; they could have made the desktop more powerful, hypothetically, since it fits in a mobile chassis. This could just as easily be seen as limiting the variations for parity.

#39 Xtasy26
Member since 2008 • 5582 Posts

@waahahah said:
@Xtasy26 said:
@waahahah said:

I don't get what the point of this post is supposed to be. Is NVIDIA's most advanced and latest iteration of tech its best? Why yes, their best and latest tech is the best!

Because of the power efficiency of Pascal, which has never been done before: you could fit an entire desktop GPU into a laptop with the same specs and similar clock speeds as its desktop brethren. That's what makes this special.

No shit, it's the latest iteration and it's the best... and I'm not sure this is such a good thing; they could have made the desktop more powerful, hypothetically, since it fits in a mobile chassis. This could just as easily be seen as limiting the variations for parity.

The latest iteration being the best is not what the argument is about. It's about the fact that they brought out an entire new generation with better performance than the previous generation and the competition, and they did it within a power envelope where they could put the exact same GPU inside a laptop. THAT has never been done before in nVidia's history.

#40 appariti0n
Member since 2009 • 5013 Posts

@Xtasy26: Or maybe they're sandbagging on the desktop side, since AMD was so far behind anyways. :P

#41  Edited By Xtasy26
Member since 2008 • 5582 Posts

@ronvalencia said:
@techhog89 said:
@ronvalencia said:

In terms of IPC, GP104 is very similar to Maxwell GM200.

And the award for most robotic and irrelevant comment goes to...

Not much different to the 8800 GTX's long-lived design, which ended up as the GTS 250.

Try again fool.

I always knew nVidia hit a gold mine with Maxwell. Maybe I should have asked whether Maxwell is their best GPU generation in history? ;) Although Maxwell-based GPUs on laptops were not the same as their desktop counterparts in terms of performance, they were a lot closer than in previous generations. nVidia was very smart to just take Maxwell, tweak it, do a node shrink and increase the clocks. This feels very similar to RV770: when AMD had a hit architecture, it was smaller than the GTX 200 series but had faster memory and could clock high (the HD 4890 hit up to 1 GHz, the first GPU to do so). They tweaked RV770 and came up with the HD 5800 series on a new node with increased clocks; it was still smaller than Fermi, came out 6 months earlier and consumed 100W less. With AMD now it's the opposite: Vega consumes 100W more and instead came out even later, at 12 months.

I seriously think AMD needs to go back to the "small die strategy", where they designed smaller GPUs that were fast, smaller than nVidia's (hence better margins), clocked well, had great power consumption, and came out at the same time as or sometimes even earlier than nVidia's. The large, power-hungry GPUs AMD is currently making are costing them not only in margins, because they are bigger than nVidia's, but they are also coming late to market, which is hurting sales, and they consume more power. Looks like nVidia learned their lesson after Fermi and started focusing on smaller, more efficient, less power-hungry GPUs.

My only question is whether they will just re-architect Pascal into Volta with the addition of HBM2, and how good their implementation of async compute will be in Volta. I am guessing Volta will be a new architecture with improved async and will probably hit 2.2 GHz+. Let's just hope that AMD isn't freaking 12 months late to market with Navi and offers similar performance to Volta.

#42  Edited By Xtasy26
Member since 2008 • 5582 Posts

@appariti0n said:

@Xtasy26: Or maybe they're sandbagging on the desktop side, since AMD was so far behind anyways. :P

Maybe. Look what they did with Pascal, and Maxwell before it: create a GPU that will be faster than your competitor's, while at the same time having a slightly bigger GPU with more CUDA cores on the side, "just in case", to release when you want to. Unfortunately for AMD, I don't think they have the resources to create two different GPUs of the same generation at the same time. They have always been a "one size fits all" company, with a single high-end GPU and disabled stream processors in the lower tiers.

That's why I think they should go back to their strategy of making smaller dies that are power efficient, cost less to make, and are probably less complex to design.

#43 svaubel
Member since 2005 • 4571 Posts

@quadknight said:

They are a bit too expensive.

My 1070 cost me about $450. My 970 was quite cheaper.

I'm very happy with the performance though, the 1070 chews through any game I feed it. I can't imagine owning a 1080Ti, sounds like major overkill.

Yeah, as much as I would like to upgrade my 980... 1. Bills and debt, and 2. Dunno if I can justify a one-gen upgrade.

Been surprised how well my card has run games at 4k though.

On topic, Pascal is good, but I dunno if anything will top the insanity that was the 8800 GTX.

#44 KMP
Member since 2017 • 380 Posts

It's a huge step up for laptops; Maxwell's 970 and the 8800 series are more notable cards for their times though. I'm on Maxwell, but Pascal is a good step up.

@appariti0n said:

@pdogg93: agreed, the 1060 in my laptop keeps pace with the 980m, and absolutely trashes the 960m, which it was supposed to replace. So happy with that purchase.

Not even close, the 960m has always been a cheaper card that fits into thinner laptops. The 1060 is not meant to replace it at all as it encompasses a separate price range and laptop type.

#45 appariti0n
Member since 2009 • 5013 Posts

@kmp said:

It's a huge step up for laptops; Maxwell's 970 and the 8800 series are more notable cards for their times though. I'm on Maxwell, but Pascal is a good step up.

@appariti0n said:

@pdogg93: agreed, the 1060 in my laptop keeps pace with the 980m, and absolutely trashes the 960m, which it was supposed to replace. So happy with that purchase.

Not even close, the 960m has always been a cheaper card that fits into thinner laptops. The 1060 is not meant to replace it at all as it encompasses a separate price range and laptop type.

No? I'm not the one who came up with the naming scheme.

Both the 960m and the 1060m are the 3rd fastest cards of their respective generations, following a naming convention that has been in place since the 460m.

#46 jun_aka_pekto
Member since 2010 • 25255 Posts

Nope. Just another fine release by Nvidia, like many generations before it, going back to their TNT cards.

#47 waahahah
Member since 2014 • 2462 Posts

@Xtasy26 said:
@waahahah said:
@Xtasy26 said:
@waahahah said:

I don't get what the point of this post is supposed to be. Is NVIDIA's most advanced and latest iteration of tech its best? Why yes, their best and latest tech is the best!

Because of the power efficiency of Pascal, which has never been done before: you could fit an entire desktop GPU into a laptop with the same specs and similar clock speeds as its desktop brethren. That's what makes this special.

No shit, it's the latest iteration and it's the best... and I'm not sure this is such a good thing; they could have made the desktop more powerful, hypothetically, since it fits in a mobile chassis. This could just as easily be seen as limiting the variations for parity.

The latest iteration being the best is not what the argument is about. It's about the fact that they brought out an entire new generation with better performance than the previous generation and the competition, and they did it within a power envelope where they could put the exact same GPU inside a laptop. THAT has never been done before in nVidia's history.

It's the best iteration... it means it's the best... it comes after the previous best... this is a pointless thread.

And you didn't understand my point: if the power design can fit in a laptop, then they held the desktop version back for parity. This was a choice they made, so they left some room to grow for the future.

#48 scatteh316
Member since 2004 • 10273 Posts

G80 was and still is Nvidia at their peak of innovation and raising the bar....

#49 Xtasy26
Member since 2008 • 5582 Posts

@waahahah said:
@Xtasy26 said:
@waahahah said:
@Xtasy26 said:
@waahahah said:

I don't get what the point of this post is supposed to be. Is NVIDIA's most advanced and latest iteration of tech its best? Why yes, their best and latest tech is the best!

Because of the power efficiency of Pascal, which has never been done before: you could fit an entire desktop GPU into a laptop with the same specs and similar clock speeds as its desktop brethren. That's what makes this special.

No shit, it's the latest iteration and it's the best... and I'm not sure this is such a good thing; they could have made the desktop more powerful, hypothetically, since it fits in a mobile chassis. This could just as easily be seen as limiting the variations for parity.

The latest iteration being the best is not what the argument is about. It's about the fact that they brought out an entire new generation with better performance than the previous generation and the competition, and they did it within a power envelope where they could put the exact same GPU inside a laptop. THAT has never been done before in nVidia's history.

It's the best iteration... it means it's the best... it comes after the previous best... this is a pointless thread.

And you didn't understand my point: if the power design can fit in a laptop, then they held the desktop version back for parity. This was a choice they made, so they left some room to grow for the future.

Well, I could make the same argument that they held back the previous generation, i.e. Maxwell, for parity's sake. Which would be untrue, because Maxwell's laptop counterparts didn't have the exact same power consumption and performance as the desktop cards. Whereas with Pascal they do, hence my argument that this was their best generation in terms of power/performance, NOT price.

#50 waahahah
Member since 2014 • 2462 Posts

@Xtasy26 said:

Well, I could make the same argument that they held back the previous generation, i.e. Maxwell, for parity's sake. Which would be untrue, because Maxwell's laptop counterparts didn't have the exact same power consumption and performance as the desktop cards. Whereas with Pascal they do, hence my argument that this was their best generation in terms of power/performance, NOT price.

No, you couldn't. They could have led with the 960M and 970M without the M on the end, as the architecture was still a huge improvement.

The Pascal architecture isn't the thing that allowed them to get the power design down. What allowed them to do that was moving to smaller node sizes. That's not Nvidia's tech or achievement. They have significant room for improvement from the die shrink.