#51  Edited By KHAndAnime
Member since 2009 • 17565 Posts
@Coseniath said:

Lol! xD

I heard the "founders edition" is another word for "reference" anyways. Apparently there is no binning or anything like that happening.

There are not two GTX 1080 models made by nVidia. Only the “Founder's Edition” exists; there is not a cheaper card made by nVidia than the $700 Founder's Edition, which ships first.

Seems like a ton of people are selling their 980 Ti's under the impression they're getting a 1080 for $600...

...this will be funny. Considering the used market is misinformed at the moment, I'm definitely keeping the 980 Ti for awhile. Maybe I'll buy someone else's 980 Ti for $350 or whatever. Then I can have a card that's not much slower than the best card to be released but only half the price...

#52  Edited By SaintSatan
Member since 2003 • 1986 Posts

@KHAndAnime said:
@Coseniath said:

@saintsatan: Wait!

Along with reference cards, we will have non-reference ones as well.

I am 100% sure that you could find a better non-reference GTX 1080...

Reviews and full specs are coming on May 17th.

No point in talking sense to the madman. He likes paying more for less. (That's why he has an Alienware).

Oh you. I shopped through every single high-end laptop maker (Sager, MSI, Origin, etc.) and Alienware beat their price AND gave me free gear. Alienware was the cheapest by far. Only a couple of people even make SLI laptops. If by chance you were comparing laptop to desktop performance, then you're being dumb. Desktop wasn't an option for me.

#53  Edited By KHAndAnime
Member since 2009 • 17565 Posts

@saintsatan said:

No point in talking sense to the madman. He likes paying more for less. (That's why he has an Alienware).

Only a couple people even make SLI laptops.

Why exactly do you need a SLI gaming laptop? Why is a desktop not an option? There are plenty of small, powerful, portable gaming PCs out there.

I mean, unless you're doing all of your gaming on an airplane...(you aren't). I just don't see what you're accomplishing that you wouldn't accomplish if you used a cheap laptop for traveling purposes and playing less intensive games, and had a LAN-type dedicated gaming rig if you needed to bring the rig around. To me it looks like you spent $2k on a laptop that's only a few years old but is already about to be on the lower side of medium end and you can't do anything to upgrade it. If that's not paying more for less, I don't know what is.

#54 SaintSatan
Member since 2003 • 1986 Posts

@KHAndAnime said:
@saintsatan said:

No point in talking sense to the madman. He likes paying more for less. (That's why he has an Alienware).

Only a couple people even make SLI laptops.

Why exactly do you need a SLI gaming laptop? Why is a desktop not an option? There are plenty of small, powerful, portable gaming PCs out there.

I mean, unless you're doing all of your gaming on an airplane...(you aren't). I just don't see what you're accomplishing that you wouldn't accomplish if you used a cheap laptop for traveling purposes and playing less intensive games, and had a LAN-type dedicated gaming rig if you needed to bring the rig around. To me it looks like you spent $2k on a laptop that's only a few years old but is already about to be on the lower side of medium end and you can't do anything to upgrade it. If that's not paying more for less, I don't know what is.

The better question is why you care so much. I already said a desktop was not an option, I don't know why you keep continuing with such a dumb argument. I would have had to move it several times a day on some days. There's nothing more you need to know.

And lower side of medium end lol. It scores higher than 75% of all desktop PC's in 3D Mark and double the average gaming laptop. I can run pretty much every game ever with max settings. And you can upgrade laptops.

#55 Coseniath
Member since 2004 • 3183 Posts
@KHAndAnime said:
@Coseniath said:

Lol! xD

I heard the "founders edition" is another word for "reference" anyways. Apparently there is no binning or anything like that happening.

There are not two GTX 1080 models made by nVidia. Only the “Founder's Edition” exists; there is not a cheaper card made by nVidia than the $700 Founder's Edition, which ships first.

Seems like a ton of people are selling their 980 Ti's under the impression they're getting a 1080 for $600...

...this will be funny. Considering the used market is misinformed at the moment, I'm definitely keeping the 980 Ti for awhile. Maybe I'll buy someone else's 980 Ti for $350 or whatever. Then I can have a card that's not much slower than the best card to be released but only half the price...

+1 about the Founder Edition.

NVIDIA GTX 1080 & GTX 1070 ‘Founders Edition’ explained

So to sum up, MSRP, which stands for Manufacturer's Suggested Retail Price, is still 599 USD for the GTX 1080 and 379 USD for the GTX 1070. Only the special reference edition named 'Founders Edition' will cost more, because it uses a unique cooler shroud, and it should be available sooner than custom cards. That said, samples sent to reviewers will not be any better products than what you will later find in stores.

So @saintsatan there is really no point going for Founders Edition, since the superior non-reference cards will cost less...
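For reference, a minimal sketch of the price gap those two numbers imply, using only figures quoted in this thread (the rounded $700 Founders Edition price mentioned earlier and the $599 MSRP from the summary above):

# Founders Edition premium over partner-card MSRP, using prices quoted in this thread.
gtx_1080_msrp = 599          # USD, MSRP for partner GTX 1080 cards
gtx_1080_founders = 700      # USD, rounded Founders Edition price quoted above

premium = gtx_1080_founders - gtx_1080_msrp
print(f"Founders Edition premium: ~${premium}")  # ~$100 for the reference cooler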

#56 SaintSatan
Member since 2003 • 1986 Posts

@Coseniath: Great. Now I just need to find all of the rest of the parts I'm gonna use.

#57  Edited By BassMan
Member since 2002 • 17918 Posts

I was under the impression that the third party cards were launching at the same time. Now the reference cards are coming first. I can wait for the custom coolers.

#58  Edited By KHAndAnime
Member since 2009 • 17565 Posts
@saintsatan said:

The better question is why you care so much. I already said a desktop was not an option, I don't know why you keep continuing with such a dumb argument. I would have had to move it several times a day on some days. There's nothing more you need to know.

And lower side of medium end lol. It scores higher than 75% of all desktop PC's in 3D Mark and double the average gaming laptop. I can run pretty much every game ever with max settings. And you can upgrade laptops.

The other day I moved my ITX build to 3 different parts of the house. Could've easily done it more if I wanted, without having to waste money on an overpriced laptop. There's a reason why they don't make many SLI laptops: most people know they're a waste of $. It's a tiny niche targeting exclusively people who don't care about blowing money on something that will be worthless pretty soon. If it was so inconvenient to move a gaming LAN PC around, LAN meet-ups would be dominated by laptops (which is rarely the case).

It scores higher than 75% of all desktop PC's in 3D Mark and double the average gaming laptop.

You missed the keyword "about to be". In a month that percentage is going to drop a lot (not that the percentage is actually relevant to anything). The 1070, the next mid-range card, is slated to be significantly faster than your 780m SLI setup. That would put you in the lower bracket of medium end, wouldn't it?

If you're so big on SLI, you'd be better off getting a couple of used 980 Ti's for $350 each. That would crush the 1080 for the same money.

And you can upgrade laptops.

Yeah, but the resale value of your parts sucks by comparison, and your upgrade options are very limited.

@BassMan said:

I was under the impression that the third party cards were launching at the same time. Now the reference cards are coming first. I can wait for the custom coolers.

It doesn't sound out of the realm of possibility that the third party cards are coming almost immediately after reference. We'll see I suppose.

@Coseniath said:

So @saintsatan there is really no point going for Founders Edition, since the superior non-reference cards will cost less...

I'd be surprised if third party manufacturers sold their superior cards for less than reference. Never seen it happen before, at least.

#59  Edited By FelipeInside
Member since 2003 • 28548 Posts

@KHAndAnime said:
@saintsatan said:

The better question is why you care so much. I already said a desktop was not an option, I don't know why you keep continuing with such a dumb argument. I would have had to move it several times a day on some days. There's nothing more you need to know.

And lower side of medium end lol. It scores higher than 75% of all desktop PC's in 3D Mark and double the average gaming laptop. I can run pretty much every game ever with max settings. And you can upgrade laptops.

The other day I moved my ITX build to 3 different parts of the house. Could've easily done it more if I wanted, without having to waste money on an overpriced laptop. There's a reason why they don't make many SLI laptops: most people know they're a waste of $. It's a tiny niche targeting exclusively people who don't care about blowing money on something that will be worthless pretty soon. If it was so inconvenient to move a gaming LAN PC around, LAN meet-ups would be dominated by laptops (which is rarely the case).

Are you seriously questioning why he wants to use a laptop?

Jesusssssss. The guy wants to game on a laptop, end of story. He doesn't need to explain why. If no one used laptops then manufacturers wouldn't sell them. Doesn't matter if your ITX build is small or whatever, a laptop is ALWAYS going to be more portable than a desktop.

#60  Edited By SaintSatan
Member since 2003 • 1986 Posts

@KHAndAnime said:

The other day I moved my ITX build to 3 different parts of the house. Could've easily done it more if I wanted, without having to waste money on an overpriced laptop. There's a reason why they don't make many SLI laptops: most people know they're a waste of $. It's a tiny niche targeting exclusively people who don't care about blowing money on something that will be worthless pretty soon. If it was so inconvenient to move a gaming LAN PC around, LAN meet-ups would be dominated by laptops (which is rarely the case).

It scores higher than 75% of all desktop PC's in 3D Mark and double the average gaming laptop.

If you're so big on SLI, you'd be better off getting a couple of used 980 Ti's for $350 each. That would crush the 1080 for the same money.

I don't like nor want SLI. I don't know where you're coming up with this.

#61 Howmakewood
Member since 2015 • 7718 Posts

I don't have brand loyalty, so I'll wait till the 1080 Ti and the Polaris cards are out, then pick the better option. I still can't be bothered with SLI; DX12 pushing multi-card support to the devs surely isn't going to make SLI/CrossFire support better.

#62 horgen  Moderator
Member since 2006 • 127530 Posts
@howmakewood said:

I don't have brand loyalty, so I'll wait till the 1080 Ti and the Polaris cards are out, then pick the better option. I still can't be bothered with SLI; DX12 pushing multi-card support to the devs surely isn't going to make SLI/CrossFire support better.

Polaris will compete with 1070 and lower. Vega with 1080 and above...

#63  Edited By SaintSatan
Member since 2003 • 1986 Posts

Well this is what I've come up with so far and it's right under my $3k budget. I might break my budget and get SLI. Any suggestions?

#64 Howmakewood
Member since 2015 • 7718 Posts

@horgen said:
@howmakewood said:

I don't have brand loyalty, so I'll wait till the 1080 Ti and the Polaris cards are out, then pick the better option. I still can't be bothered with SLI; DX12 pushing multi-card support to the devs surely isn't going to make SLI/CrossFire support better.

Polaris will compete with 1070 and lower. Vega with 1080 and above...

I remember @ronvalencia saying Polaris 10 would scale high and replace the Fury X? We'll have to see when the 1080 Ti is coming out and when Vega arrives.

#65  Edited By horgen  Moderator
Member since 2006 • 127530 Posts
@howmakewood said:

I remember @ronvalencia saying Polaris 10 would scale high and replace the Fury X? We'll have to see when the 1080 Ti is coming out and when Vega arrives.

Polaris 10 is said to match the performance of the 980Ti, on a die that is less than half the size.

Edit: Source

Another one says it will match 390X for a lower price
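As a rough check on the "less than half the size" part of that claim, here is a minimal sketch; the ~232 mm^2 Polaris 10 figure is the one cited later in this thread, while the ~601 mm^2 GM200 (980 Ti) figure is my own assumption rather than something from the linked sources:

# Rough die-area comparison behind the "less than half the size" claim.
# polaris_10_mm2 uses the ~232 mm^2 figure cited later in this thread;
# gm200_mm2 (GTX 980 Ti) at ~601 mm^2 is an assumption, not from the linked source.
polaris_10_mm2 = 232
gm200_mm2 = 601

ratio = polaris_10_mm2 / gm200_mm2
print(f"Polaris 10 is ~{ratio:.0%} of GM200's area")  # ~39%, well under half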

#66 Howmakewood
Member since 2015 • 7718 Posts

@horgen: okay, so 1080ti or navi it is then

#67 horgen  Moderator
Member since 2006 • 127530 Posts
@howmakewood said:

@horgen: okay, so 1080ti or navi it is then

Depends on how long Nvidia will delay the launch of that card. If Vega is around the corner, I would recommend waiting for both to be released before making a purchase.

Now if only Intel would go on and release Skylake-E I would be happy.

#68 Howmakewood
Member since 2015 • 7718 Posts

@horgen said:
@howmakewood said:

@horgen: okay, so 1080ti or navi it is then

Depends on how long Nvidia will delay the launch of that card. If Vega is around the corner, I would recommend waiting for both to be released before making a purchase.

Now if only Intel would go on and release Skylake-E I would be happy.

Yeah, I don't see the 1080 Ti coming out too soon, so I might as well wait Vega out if it doesn't drag too far into 2017.

#69 horgen  Moderator
Member since 2006 • 127530 Posts

@howmakewood: Time will tell. At least you can get your money's worth from the 980 Ti.

#70  Edited By ronvalencia
Member since 2008 • 29612 Posts

@howmakewood said:

I remember @ronvalencia saying Polaris 10 would scale high and replace the Fury X? We'll have to see when the 1080 Ti is coming out and when Vega arrives.

There's a high-end Polaris 10 SKU replacing the R9 Fury series.

Note that Vega has top-to-bottom market segments just like Polaris.

@horgen said:
@howmakewood said:

I remember @ronvalencia saying Polaris 10 would scale high and replace the Fury X? We'll have to see when the 1080 Ti is coming out and when Vega arrives.

Polaris 10 is said to match the performance of the 980Ti, on a die that is less than half the size.

Edit: Source

Another one says it will match 390X for a lower price

That's just early engineering samples.

#71 horgen  Moderator
Member since 2006 • 127530 Posts

@ronvalencia: Any rumours about clock speed? WCCFTech said something about a 2304 stream processor GPU at 800MHz matching the 390X or so.

#72 Coseniath
Member since 2004 • 3183 Posts
@KHAndAnime said:
@Coseniath said:

So @saintsatan there is really no point going for Founders Edition, since the superior non-reference cards will cost less...

I'd be surprised if third party manufacturers sold their superior cards for less than reference. Never seen it happen before, at least.

I think it was about time. Nvidia was using a magnesium alloy cooler, while third party manufacturers were using full plastic ones. :P

So Nvidia decided to charge $100 for their "elegant" cooler. :P

@ronvalencia: Replacing the R9 Fury series doesn't mean that it will be way faster; it might have the same performance for far less money.

You can see Polaris and Vega have like 6 months between them. This is not the difference between two entirely different architectures, but the difference between mainstream and high-end products.

AMD's Roy Taylor already confirmed this, just two weeks ago:

AMD’s Polaris will be a mainstream GPU, not high-end

In its latest financial report, the company noted that Polaris 11 would target "the notebook market," while Polaris 10 would target "the mainstream desktop and high-end gaming notebook segment."

AMD's Roy Taylor also confirmed that Polaris would target mainstream users, particularly those interested in creating a VR-ready system.

"The reason Polaris is a big deal, is because I believe we will be able to grow that TAM [total addressable market] significantly," said Taylor. "I don't think Nvidia is going to do anything to increase the TAM, because according to everything we've seen around Pascal, it's a high-end part. I don't know what the price is gonna be, but let's say it's as low as £500/$600 and as high as £800/$1000. That price range is not going to expand the TAM for VR. We're going on the record right now to say Polaris will expand the TAM. Full stop."

So AMD clarifies that Polaris is a mainstream GPU. And this is normal for a GPU of just 232mm2. Vega will be the high-end.

Unless by "high end Polaris 10 SKU" you mean the full Polaris 10 chip which will probably perform as fast as FuryX (or even a little faster) for $300.

#73 deactivated-5b69bebd1b0b6
Member since 2009 • 6176 Posts

So how much stronger is the 1070 over my GTX 970? Would it be a worthwhile investment?

#74 deactivated-5f768591970d3
Member since 2004 • 1255 Posts

@saintsatan: I'm having the same thought as you. I'm going Broadwell-E and ordered my 1440p 144Hz G-Sync monitor. Kinda thinking I may want to splurge and SLI 1080s.

#75 Coseniath
Member since 2004 • 3183 Posts
@Crossel777 said:

So how much stronger is the 1070 over my GTX 970? Would it be a worthwhile investment?

If we take Nvidia's statement about Titan X (and a little more) level of performance, then the GTX 1070 will be roughly 45% faster than the GTX 970.

Personally, I have a heavily factory-overclocked (1405MHz boost) GTX 970, and I will pass on this GTX 1070.

Maybe I will wait for Volta, or, if they do the same thing as with Kepler, turning the GTX 1080 (with faster clocks) into a GTX 1170, I might buy it...
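For what it's worth, a minimal sketch of the arithmetic behind that ~45% figure; the index numbers below are illustrative assumptions (not benchmark results), chosen so the Titan X sits roughly 45% above the GTX 970:

# Hypothetical sanity check: if the GTX 1070 lands at roughly Titan X level
# (or slightly above), how much faster is that than a GTX 970?
# These index values are illustrative assumptions, not benchmark results.
gtx_970 = 100        # baseline relative-performance index
titan_x = 145        # assumed ~45% above the GTX 970

gtx_1070 = titan_x   # Nvidia's claim: roughly Titan X level of performance

uplift = (gtx_1070 / gtx_970 - 1) * 100
print(f"Estimated GTX 1070 uplift over GTX 970: ~{uplift:.0f}%")  # ~45%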

#76  Edited By neatfeatguy
Member since 2005 • 4402 Posts

I'm content with my 980Ti. It was a massive jump in performance over 2 570s in SLI. The 570s (if SLI profile worked well) would run about 15-20% behind a single GTX 970.

I don't have any plans to upgrade soon.....

Though, if I come across a deal on another 980Ti, I just might have to get it.

#77 SaintSatan
Member since 2003 • 1986 Posts

@ankor77: Yeah I might go for this. SLI and different monitor. But damn it's expensive...

#78 deactivated-5f768591970d3
Member since 2004 • 1255 Posts

@saintsatan: Yup! Broadwell E/1080 sli and new monitors add up!

#79  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Coseniath said:
@KHAndAnime said:
@Coseniath said:

So @saintsatan there is really no point going for Founders Edition, since the superior non-reference cards will cost less...

I'd be surprised if third party manufacturers sold their superior cards for less than reference. Never seen it happen before, at least.

I think it was about time. Nvidia was using a magnesium alloy cooler, while third party manufacturers were using full plastic ones. :P

So Nvidia decided to charge $100 for their "elegant" cooler. :P

@ronvalencia: Replacing the R9 Fury series doesn't mean that it will be way faster; it might have the same performance for far less money.

You can see Polaris and Vega have like 6 months between them. This is not the difference between two entirely different architectures, but the difference between mainstream and high-end products.

AMD's Roy Taylor already confirmed this, just two weeks ago:

AMD’s Polaris will be a mainstream GPU, not high-end

In its latest financial report, the company noted that Polaris 11 would target "the notebook market," while Polaris 10 would target "the mainstream desktop and high-end gaming notebook segment."

AMD's Roy Taylor also confirmed that Polaris would target mainstream users, particularly those interested in creating a VR-ready system.

"The reason Polaris is a big deal, is because I believe we will be able to grow that TAM [total addressable market] significantly," said Taylor. "I don't think Nvidia is going to do anything to increase the TAM, because according to everything we've seen around Pascal, it's a high-end part. I don't know what the price is gonna be, but let's say it's as low as £500/$600 and as high as £800/$1000. That price range is not going to expand the TAM for VR. We're going on the record right now to say Polaris will expand the TAM. Full stop."

So AMD clarifies that Polaris is a mainstream GPU. And this is normal for a GPU of just 232mm2. Vega will be the high-end.

Unless by "high end Polaris 10 SKU" you mean the full Polaris 10 chip which will probably perform as fast as FuryX (or even a little faster) for $300.

Vega has top-to-bottom SKUs

From http://www.anandtech.com/show/10145/amd-unveils-gpu-architecture-roadmap-after-polaris-comes-vega

It mentions Vega 10 and 11.

From http://www.extremetech.com/extreme/195897-samsung-and-apple-team-up-on-14nm-chips-expected-in-2015

Samsung/GF 14nm FinFET LPE is smaller than TSMC's 16nm FinFET.

TSMC's 16 nm is BS, i.e. it's just a FinFET version of their 20 nm planar process.

http://semimd.com/blog/2014/04/17/globalfoundries-and-samsung-join-forces-on-14nm-finfets/

When compared to Intel's 14 nm FinFET, Samsung/GloFo's is less BS, while TSMC's 16 nm FinFET is the bigger BS.

According to Samsung, Polaris 10 built on TSMC's 16 nm FinFET would be around 267 mm^2.

Polaris 10 at 267 mm^2 would exceed 1st gen PS3's RSX 258 mm^2 size.
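For what it's worth, a quick bit of arithmetic on those two figures shows the density gap they imply; this is just a sketch over the numbers quoted in this thread, not foundry data:

# What area penalty does the 267 mm^2 TSMC estimate imply versus the rumoured
# 232 mm^2 Samsung/GF die? Pure arithmetic on figures quoted in this thread;
# neither number is an official specification.
samsung_14nm_mm2 = 232   # rumoured Polaris 10 size on Samsung/GF 14nm
tsmc_16nm_mm2 = 267      # estimate above for the same chip on TSMC 16nm

penalty = tsmc_16nm_mm2 / samsung_14nm_mm2 - 1
print(f"Implied area penalty on TSMC 16nm: ~{penalty:.0%}")  # ~15%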

#80  Edited By SaintSatan
Member since 2003 • 1986 Posts

Now I'm positive on which build I'm going for. I just called PayPal credit and they more than doubled (almost tripled) my credit line and no interest or payments for 6 months. I'm going 1080 SLI and I just bought this monitor...

#YOLO

#81  Edited By FelipeInside
Member since 2003 • 28548 Posts

@saintsatan said:

Now I'm positive on which build I'm going for. I just called PayPal credit and they more than doubled (almost tripled) my credit line and no interest or payments for 6 months. I'm going 1080 SLI and I just bought this monitor...

#YOLO

#82  Edited By SaintSatan
Member since 2003 • 1986 Posts

@FelipeInside said:
@saintsatan said:

Now I'm positive on which build I'm going for. I just called PayPal credit and they more than doubled (almost tripled) my credit line and no interest or payments for 6 months. I'm going 1080 SLI and I just bought this monitor...

#YOLO

The only thing missing is $1,200 for the two 1080's and a couple minor parts. It comes to about $4,000 like the PC part picker screen I posted above (and edited in below). I have the best laptop. Now it's time to unleash this upon the world. You dared doubt my madness?

#83 FelipeInside
Member since 2003 • 28548 Posts

@saintsatan said:
@FelipeInside said:
@saintsatan said:

Now I'm positive on which build I'm going for. I just called PayPal credit and they more than doubled (almost tripled) my credit line and no interest or payments for 6 months. I'm going 1080 SLI and I just bought this monitor...

#YOLO

The only thing missing is $1,200 for the two 1080's and a couple minor parts. It comes to about $4,000 like the PC part picker screen I posted above (and edited in below). I have the best laptop. Now it's time to unleash this upon the world. You dared doubt my madness?

You crazy bro. You don't even NEED that much power yet, lol...

#84 SaintSatan
Member since 2003 • 1986 Posts

@FelipeInside said:
@saintsatan said:
@FelipeInside said:
@saintsatan said:

Now I'm positive on which build I'm going for. I just called PayPal credit and they more than doubled (almost tripled) my credit line and no interest or payments for 6 months. I'm going 1080 SLI and I just bought this monitor...

#YOLO

The only thing missing is $1,200 for the two 1080's and a couple minor parts. It comes to about $4,000 like the PC part picker screen I posted above (and edited in below). I have the best laptop. Now it's time to unleash this upon the world. You dared doubt my madness?

You crazy bro. You don't even NEED that much power yet, lol...

Probably not lol. I'll just open one 1080 and see how it goes.

#85 Coseniath
Member since 2004 • 3183 Posts
@ronvalencia said:

Vega has top-to-bottom SKUs

From http://www.anandtech.com/show/10145/amd-unveils-gpu-architecture-roadmap-after-polaris-comes-vega

It mentions Vega 10 and 11.

Yeah we know that they will have Vega 10 and Vega 11.

I am glad you mentioned the article you got the info from.

Now, if you read the article you linked, you will see:

The fact that Vega comes this soon after Polaris is interesting; it seems hard to believe that it’s a direct successor to Polaris – I can’t see AMD replacing Polaris parts in less than a year – so this points to Vega being more of a cousin, and is where AMD’s naming system isn’t especially helpful in deciphering anything further.

We know also that Vega 11 will be smaller than Vega 10.

But have you ever thought that Vega 11 might be around 350mm2 to counter GP104 and Vega 10 around 500mm2 to counter GP100? That would still leave Vega 11 smaller than Vega 10.

Also, what you posted doesn't say anything about Polaris being desktop high end.

I already showed you the Corporate Vice President of Alliances at AMD saying, just two weeks ago, that Polaris is a mainstream GPU, not a high-end one.

So unless you are calling AMD's Roy Taylor a liar, there is no reason to continue to disagree...

And about the fabrication process...

I agree with you, and I already said that TSMC and Samsung are BS-ing about 14nm.

TSMC's is just an improved 22nm and Samsung's is just an improved 20nm.

Only Intel has a true 14nm.

#86  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Coseniath:

That's a 2014 source from T. Song.

http://www.extremetech.com/extreme/195897-samsung-and-apple-team-up-on-14nm-chips-expected-in-2015

http://vrworld.com/2014/04/17/samsung-and-globalfoundries-collaborate-on-14nm-finfet/

Samsung's 14 nm FinFET is up to 15 percent smaller than the other foundries'.

Samsung has two 14 nm FinFET nodes: the 2015 14LPP is going to be at least 5% denser than the 2014 14LPE.

Intel doesn't build gaming GPUs that matter, i.e. theirs are weaker than the Xbone's.

#87 RekonMeister
Member since 2016 • 784 Posts

This is the kind of performance leap I was waiting for, Pascal is truly going to be amazing, i'm saving for Skylake and will be buying a GTX 1080 to go with my 1440P Gsync panel.

#88  Edited By Coseniath
Member since 2004 • 3183 Posts

@ronvalencia: I said, I agree with you that Samsung looks better on paper than TSMC.

But how do you explain that, for the same chip (the A9) produced on both TSMC's 16nm and Samsung's 14nm, people found TSMC's chip to be superior?

#Apple_Chipgate


ps: The 15% you wrote before isn't relative to another 16nm FinFET; in your link it's the shrink from 20nm planar:

indicating a 15% smaller chip with the switch from 20nm Planar to 14nm FinFET.

...

#89  Edited By BassMan
Member since 2002 • 17918 Posts

@rekonmeister said:

This is the kind of performance leap I was waiting for, Pascal is truly going to be amazing, i'm saving for Skylake and will be buying a GTX 1080 to go with my 1440P Gsync panel.

Umm.... if you are saving for Skylake and looking to get a GTX 1080, why do you already have these items in your rig signature?

"Intel Core i7 6700k \ ASUS ROG Ranger VIII \ \ 16GB 2666 Corsair LPX \ EVGA GeForce GTX 1080 8GB G5X \ Windows 10 64 Pro \ 2560x1440 144hz G-Sync \ Logitech G303 \ Logitech G710+ Playstation 2 SCPH 70003 + FMCB, Original XBOX 1.6 Crystal Xecuter Flash TSOP + 500GB HDD, N64 NUS-001, Sega Dreamcast HKT-3030"

#90  Edited By RekonMeister
Member since 2016 • 784 Posts

@BassMan said:
@rekonmeister said:

This is the kind of performance leap I was waiting for, Pascal is truly going to be amazing, i'm saving for Skylake and will be buying a GTX 1080 to go with my 1440P Gsync panel.

Umm.... if you are saving for Skylake and looking to get a GTX 1080, why do you already have these items in your rig signature?

"Intel Core i7 6700k \ ASUS ROG Ranger VIII \ \ 16GB 2666 Corsair LPX \ EVGA GeForce GTX 1080 8GB G5X \ Windows 10 64 Pro \ 2560x1440 144hz G-Sync \ Logitech G303 \ Logitech G710+ Playstation 2 SCPH 70003 + FMCB, Original XBOX 1.6 Crystal Xecuter Flash TSOP + 500GB HDD, N64 NUS-001, Sega Dreamcast HKT-3030"

Is that an issue? I already have the CPU and board, I'm waiting for the RAM to arrive Monday, and the 1080 is not out yet; for now I'm running an EVGA GTX 580.

The monitor I have had for ages but could not use it, because Fermi is not compatible.

Oh, I see I put "i'm saving for Skylake" LMAO, my bad man. I meant just Pascal.

#91 BassMan
Member since 2002 • 17918 Posts

@rekonmeister: Just trying to keep things honest around here. ;)

Looks like you are going to have a beastly rig when it is complete.

#92 Bikouchu35
Member since 2009 • 8344 Posts

@BassMan said:
@rekonmeister said:

This is the kind of performance leap I was waiting for, Pascal is truly going to be amazing, i'm saving for Skylake and will be buying a GTX 1080 to go with my 1440P Gsync panel.

Umm.... if you are saving for Skylake and looking to get a GTX 1080, why do you already have these items in your rig signature?

"Intel Core i7 6700k \ ASUS ROG Ranger VIII \ \ 16GB 2666 Corsair LPX \ EVGA GeForce GTX 1080 8GB G5X \ Windows 10 64 Pro \ 2560x1440 144hz G-Sync \ Logitech G303 \ Logitech G710+ Playstation 2 SCPH 70003 + FMCB, Original XBOX 1.6 Crystal Xecuter Flash TSOP + 500GB HDD, N64 NUS-001, Sega Dreamcast HKT-3030"

Time traveller here. Excuse the language.

#93 RekonMeister
Member since 2016 • 784 Posts

@BassMan said:

@rekonmeister: Just trying to keep things honest around here. ;)

Looks like you are going to have a beastly rig when it is complete.

I do hope so, as you can tell I don't make big upgrades for years.

#94  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Coseniath said:

@ronvalencia: I said, I agree with you that Samsung looks better on paper than TSMC.

But how do you explain that, for the same chip (the A9) produced on both TSMC's 16nm and Samsung's 14nm, people found TSMC's chip to be superior?

#Apple_Chipgate

ps: The 15% you wrote before isn't relative to another 16nm FinFET; in your link it's the shrink from 20nm planar:

indicating a 15% smaller chip with the switch from 20nm Planar to 14nm FinFET.

...

Apple's Chipgate has nothing to do with the chip's geometry size.

Samsung's 14nm FinFET LPE version is 8.1 percent smaller than TSMC's 16nm FinFET version.

Source: http://www.tomshardware.com/news/iphone-6s-a9-samsung-vs-tsmc,30306.html

Samsung's Year 2015 14LPP is going to be at least 5% denser than Year 2014 14LPE.

From http://www.techtimes.com/articles/96935/20151019/iphone-6s-chipgate-apple-and-consumer-reports-vs-the-internet.htm

Apple used Samsung's 14 nm LPE FinFET, which is different from AMD Polaris' 14nm LPP FinFET.

Ars Technica published its comparison of the TSMC and Samsung chips. And although there was hardly any difference for tasks that only utilized 50-60 percent of the chips' compute power, TSMC's still edged Samsung's A9 SoC. The bigger discrepancy came when processor-taxing apps like Geekbench 3 were run.

YouTuber Jonathan Morrison also reported a 5-7 percent difference in battery life, which depended on tasks performed, in favor of TSMC. Austin Evans also did his own testing and revealed a significant battery life advantage for the TSMC on heavy phone usage.

"Our testing and customer data show the actual battery life of the iPhone 6s and iPhone 6s Plus, even taking into account variable component differences, vary within just 2-3 percent of each other," said Apple.

http://www.tomshardware.com/news/iphone-6s-a9-samsung-vs-tsmc,30306.html

Samsung's 14nm LPE FinFET has lower temps than TSMC's 16nm FinFET version.

From http://www.techpowerup.com/216361/globalfoundries-14-nm-lpp-finfet-node-taped-out-yields-good

AMD Polaris and AMD Zen are to be fabbed with Samsung-licensed 14 nm LPP FinFET at GloFo.

The AMD Kaveri APU's chip size is 245 mm^2 (28 nm) and the AMD A6-7400K SKU's price tag is $50.99 retail, i.e. the chip price from GloFo is low.

#95 Coseniath
Member since 2004 • 3183 Posts

@ronvalencia: I told you, I know that Samsung's fabrication is smaller; I just said that TSMC's fabrication, even though it is bigger, seems to be better.

And I checked a lot of sites and I saw that only Tom's Hardware tries to prove the opposite.

But as it seems, Apple doesn't agree with Tom's Hardware:

Report says ‘Chipgate’ may have cost Samsung Apple’s iPhone 7 business

Apple’s confirmation of the issue usually sounds reassuring, which suggests the company is working on a fix if it hasn’t already found one, which was certainly true for Antennagate and Bendgate. With Chipgate, Apple might have fixes in the works, including one that it will likely enjoy enacting: Apparently, Samsung might lose the A10 chip business to TSMC, which could be Apple’s way of preventing potential chip-related issues next year.

The fact that TSMC provides better results is true, and Apple has already confirmed it.

ps: I just found some news about Polaris and Vega. Vega is coming sooner to counter GP104.

#96 dethtrain
Member since 2004 • 570 Posts

@saintsatan said:

@BassMan: @ShepardCommandr: Bassman speaks the truth, I have SLI too. SLI support has been getting worse and worse. You're better off with a single 1080. I still might splurge and get two 1080's for the hell of it. #YOLO

Yeah, you're both right. A few AAA titles haven't been utilizing SLI over the past year or year and a half. I have a 690 now, and I don't think I'll go SLI for a while.

#97  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Coseniath said:

@ronvalencia: I told you, I know that Samsung's fabrication is smaller; I just said that TSMC's fabrication, even though it is bigger, seems to be better.

And I checked a lot of sites and I saw that only Tom's Hardware tries to prove the opposite.

But as it seems, Apple doesn't agree with Tom's Hardware:

Report says ‘Chipgate’ may have cost Samsung Apple’s iPhone 7 business

Apple’s confirmation of the issue usually sounds reassuring, which suggests the company is working on a fix if it hasn’t already found one, which was certainly true for Antennagate and Bendgate. With Chipgate, Apple might have fixes in the works, including one that it will likely enjoy enacting: Apparently, Samsung might lose the A10 chip business to TSMC, which could be Apple’s way of preventing potential chip-related issues next year.

The fact that TSMC provides better results is true, and Apple has already confirmed it.

ps: I just found some news about Polaris and Vega. Vega is coming sooner to counter GP104.

Again, Chipgate has nothing to do with the chip's geometry size and temperature, i.e. we are talking about desktop PCs, which don't have mobile phone batteries.

Your article has the word "might", hence it's speculative.

Against the article's "Apple’s confirmation of the issue"

http://www.businessinsider.com.au/apple-responds-to-chipgate-controversy-over-a9-chips-battery-life-2015-10?r=UK&IR=T

Apple angrily shoots back at claims different iPhones are seeing very different battery life

....

The Cupertino company is now shooting back at the tests used, claiming they are “not representative of real-world usage,” and that “the actual battery life of the iPhone 6s and iPhone 6s Plus, even taking into account variable component differences, vary within just 2-3% of each other.”

Geekbench is mostly a CPU benchmark, and it's a crap benchmark.

http://www.realworldtech.com/forum/?threadid=136526&curpostid=136666

From Linus Torvalds.

Wilco, geekbench has apparently replaced dhrystone as your favourite useless benchmark.

Geekbench is SH*T.

It actually seems to have gotten worse with version 3, which you should be aware of. On ARM64, that SHA1 performance is hardware-assisted. I don't know if SHA2 is too, but Aarch64 does apparently do SHA256 in the crypto unit, so it might be fully or partially so.

And on both ARM and x86, the AES numbers are similarly just about the crypto unit.

So basically a quarter to a third of the "integer" workloads are just utter BS. They are not comparable across architectures due to the crypto units, and even within one architecture the numbers just don't mean much of anything.

And quite frankly, it's not even just the crypto ones. Looking at the other GB3 "benchmarks", they are mainly small kernels: not really much different from dhrystone. I suspect most of them have a code footprint that basically fits in a L1I cache.

Again, note that there is variation in chip manufacturing quality, which has nothing to do with the chip's geometry size. An unlucky user may get lower-quality silicon.

#98 Coseniath
Member since 2004 • 3183 Posts
@ronvalencia said:
@Coseniath said:

@ronvalencia: I told you, I know that Samsung's fabrication is smaller; I just said that TSMC's fabrication, even though it is bigger, seems to be better.

And I checked a lot of sites and I saw that only Tom's Hardware tries to prove the opposite.

But as it seems, Apple doesn't agree with Tom's Hardware:

Report says ‘Chipgate’ may have cost Samsung Apple’s iPhone 7 business

Apple’s confirmation of the issue usually sounds reassuring, which suggests the company is working on a fix if it hasn’t already found one, which was certainly true for Antennagate and Bendgate. With Chipgate, Apple might have fixes in the works, including one that it will likely enjoy enacting: Apparently, Samsung might lose the A10 chip business to TSMC, which could be Apple’s way of preventing potential chip-related issues next year.

The fact that TSMC provides better results is true, and Apple has already confirmed it.

ps: I just found some news about Polaris and Vega. Vega is coming sooner to counter GP104.

Again, Chipgate has nothing to do with the chip's geometry size and temperature, i.e. we are talking about desktop PCs, which don't have mobile phone batteries.

We can do an infinite loop here...

@Coseniath said:

@ronvalencia: I told you, I know that Samsung's fabrication is smaller; I just said that TSMC's fabrication, even though it is bigger, seems to be better.

...

@ronvalencia said:

Your article has the word "might", hence it's speculative.

Against the article's "Apple’s confirmation of the issue"

http://www.businessinsider.com.au/apple-responds-to-chipgate-controversy-over-a9-chips-battery-life-2015-10?r=UK&IR=T

Apple angrily shoots back at claims different iPhones are seeing very different battery life

....

The Cupertino company is now shooting back at the tests used, claiming they are “not representative of real-world usage,” and that “the actual battery life of the iPhone 6s and iPhone 6s Plus, even taking into account variable component differences, vary within just 2-3% of each other.”

The "might" has nothing to do with what we are discussing here.

The "might" is about the future. The Chipgate is about the present, which is already happening.

They speculate about the future given the facts they have now. And the fact is the Chipgate, which Apple is obviously not happy about.

And against the article's "Apple's confirmation of the issue", really?

iPhone 6S battery life SCANDAL: Apple confirms chip manufacturer will affect YOUR device

Forget 'bendgate' now Apple's plagued by 'CHIPgate'! Testers claim different microchips inside iPhone 6s affect battery life, but the firm says its only varies by 2-3%

Also from Tom's Hardware: Apple confirmed to CNET Executive Editor Roger Cheng that there is a 2 to 3 percent difference in battery life between the TSMC and Samsung chips. But others are saying the delta is larger.

And I can continue, since search engines can find countless articles about Apple's confirmation...

And obviously Apple would confirm the problem (since it was there and they couldn't deny it) and start the damage control, saying it's only 2-3%.

But what did you expect? For a company like Apple to confirm that the difference between them is big, making the phones they sell look like a lottery? LOL...

#99 KHAndAnime
Member since 2009 • 17565 Posts

Someone picked up my 980 Ti for $525 off eBay. I forgot I even still had it listed. I didn't get as much as I would have before the cards were revealed (~$580), but I'll live. Guess that means I'll really be waiting for the cards to launch...

Those 3rd party cards better come FAST.

#100 Coseniath
Member since 2004 • 3183 Posts
@KHAndAnime said:

Someone picked up my 980 Ti for $525 off eBay. I forgot I even still had it listed. I didn't get as much as I would have before the cards were revealed (~$580), but I'll live. Guess that means I'll really be waiting for the cards to launch...

Those 3rd party cards better come FAST.

Soon™ :)

Also we got more info about GTX1080 performance: