Ryzen 7 3000 series leak + new line of GPUs


#1  Edited By Gatygun
Member since 2010 • 2709 Posts

GPUs:

The good:

Cheaper, and most likely a bit more performance than the current lineup; hopefully it will put pressure on Nvidia's prices. Also a nice budget 1080-class card for people who just need performance and nothing else, for cheap.

The bad:

A refresh: no ray tracing or other Nvidia-like features, low amounts of memory, and only mid- to low-range GPUs.

CPU:

The good:

Cheap, lots of cores, high clocks, low prices; Intel got pushed into a corner here massively. Even their flagship 9900K is comparable to a mid-range chip in this Ryzen refresh at 250 bucks.

The best part is we finally see 16 cores / 32 threads at high GHz, something I have been waiting a decade to upgrade to. The price is still steep, though, so I will probably wait until the next-gen consoles make an appearance and I know what their core counts are going to be. No hurry here.

This also bodes well for next-generation consoles. The CPUs in those boxes could be powerhouses.

The bad:

Gotta wait.

https://www.youtube.com/watch?v=PCdsTBsH-rI


#3 R4gn4r0k
Member since 2004 • 46292 Posts

I'll be buying a Ryzen 3000.

Intel prices just aren't affordable for me anymore.


#4 Techhog89
Member since 2015 • 5430 Posts

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.


#5 R4gn4r0k
Member since 2004 • 46292 Posts

@techhog89 said:

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

What's fake?

Do you think the price will be higher, or the specs will be worse?


#6  Edited By scatteh316
Member since 2004 • 10273 Posts

No ray tracing?

Of course it'll do Ray Tracing...

/smh


#7 scatteh316
Member since 2004 • 10273 Posts

And a 5GHz boost clock? I doubt AMD has managed to fix Ryzen's ~4GHz clock wall that quickly, although I will happily be surprised if they have.


#8  Edited By DaVillain  Moderator
Member since 2014 • 56107 Posts

I just got myself a Ryzen 7 2700X and I totally love it! What a great CPU by AMD. I already knew the Ryzen 7 3000 series would make its way to the PC market soon, but I wasn't in the mood to wait for it, so the 2700X will do for me. I'll see how AMD plays out with the Ryzen 7 4000 series; I want to wait at least until the 4000 series to upgrade, and only to double the core/thread count I have.

As for their new GPUs: another disappointment at the high end, but more competition is needed, so hey, it's a good thing for the PC crowd. If the Navi cards can CrossFire with the on-board graphics of the G-series chips, imagine what kind of performance per dollar that would bring.


#9 Howmakewood
Member since 2015 • 7702 Posts

Yeah, I'll believe the CPU specs when I see 'em.


#10 rmpumper
Member since 2016 • 2134 Posts

You can tell it's fake just by looking at the TDP of the RX 3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) yet is only a bit faster than a 120W 1060 6GB. So are you telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.
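The efficiency gap being argued here can be sketched with quick perf-per-watt arithmetic. The wattages are the real-world figures quoted above; the relative performance numbers (normalized to a 1060 6GB = 1.0) are rough ballpark assumptions of mine, not measurements:

```python
# Rough perf-per-watt comparison. Watts are the "in reality" figures from
# the post; relative performance values are illustrative assumptions.
cards = {
    # name: (measured watts, assumed relative performance vs 1060 6GB)
    "GTX 1060 6GB": (120, 1.00),
    "GTX 1080":     (170, 1.70),
    "RTX 2070":     (200, 1.85),
    "RX 590":       (220, 1.10),
}

for name, (watts, perf) in cards.items():
    # higher is better: performance delivered per 100 W drawn
    print(f"{name:13s} {perf / watts * 100:.2f} perf per 100 W")
```

Under these assumptions the RX 590 delivers the least performance per watt of the four, which is the gap the post says a one-generation jump could not plausibly close.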


#11  Edited By scatteh316
Member since 2004 • 10273 Posts
@rmpumper said:

You can tell it's fake just by looking at the TDP of the RX 3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) yet is only a bit faster than a 120W 1060 6GB. So are you telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

It doesn't work that way, dude... AMD's architecture is massively unbalanced, which they said they're addressing with Navi. That alone should generate a good chunk of extra performance.

You also have to consider that the process shrink will bring power and cost down.


#12  Edited By Techhog89
Member since 2015 • 5430 Posts
@R4gn4r0k said:
@techhog89 said:

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

What's fake ?

You think the price will be higher ? or the specs will be worse ?

The whole report is fake. These won't be the SKUs we see.

@scatteh316 said:

And a 5GHz boost clock? I doubt AMD has managed to fix Ryzen's ~4GHz clock wall that quickly, although I will happily be surprised if they have.

This is a new architecture on a new node. They'll break the barrier. It's just a question of by how much.

@scatteh316 said:

No ray tracing?

Of course it'll do Ray Tracing...

/smh

None of these GPUs, if they did exist, are powerful enough for decent ray tracing. Besides that, the current method is Nvidia exclusive and patented.

@rmpumper said:

You can tell it's fake just by looking at the TDP of the RX 3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) yet is only a bit faster than a 120W 1060 6GB. So are you telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

This is actually possible. The RTX series is on 12nm while this is on 7nm. If RTX were on 7nm the 2070 would be closer to 120W, if not less. It's the RTX series that didn't jump as much as usual.


#13  Edited By rmpumper
Member since 2016 • 2134 Posts

@techhog89 said:

@rmpumper said:

You can tell it's fake just by looking at the TDP of the RX 3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) yet is only a bit faster than a 120W 1060 6GB. So are you telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

This is actually possible. The RTX series is on 12nm while this is on 7nm. If RTX were on 7nm the 2070 would be closer to 120W, if not less. It's the RTX series that didn't jump as much as usual.

GTX 10xx is 14nm and RX 5xx is 14nm (the 590 is on some kind of fake 12nm), but Nvidia is twice as efficient. So nm doesn't mean anything when the whole architecture is shit.


#14 mrbojangles25
Member since 2005 • 58309 Posts

If they can come up with a video card that's 90% as good as an RTX 2080 at 75% of the price, I just might buy it.


#15 R4gn4r0k
Member since 2004 • 46292 Posts

@techhog89: Everything I've heard so far about Zen 2 sounds really great, so I'll wait for the final specs. But if Intel can't compete, then I'll definitely go AMD this time.


#16  Edited By Gatygun
Member since 2010 • 2709 Posts
@rmpumper said:
@techhog89 said:

The whole report is fake. These won't be the SKUs we see.

This is a new architecture on a new node. They'll break the barrier. It's just a question of by how much.

None of these GPUs, if they did exist, are powerful enough for decent ray tracing. Besides that, the current method is Nvidia exclusive and patented.

@rmpumper said:

You can tell it's fake just by looking at the TDP of the RX 3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) yet is only a bit faster than a 120W 1060 6GB. So are you telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

This is actually possible. The RTX series is on 12nm while this is on 7nm. If RTX were on 7nm the 2070 would be closer to 120W, if not less. It's the RTX series that didn't jump as much as usual.

GTX 10xx is 14nm and RX 5xx is 14nm (the 590 is on some kind of fake 12nm), but Nvidia is twice as efficient. So nm doesn't mean anything when the whole architecture is shit.

Why are you looking at Nvidia?

It's most likely a 7nm Vega card with an updated architecture. The original Vega was 14nm (a massive difference).

If you want to see how much more you can do with it, here's something the chief technical officer of a company that makes 7nm chips stated a year ago; he also says 5GHz should be reached relatively easily with time.

While a move from 14 nm to 7 nm was expected to provide, at the very best, a halving in the actual size of a chip manufactured in 7 nm compared to 14 nm, Gary Patton is now saying that the area should actually be reduced by up to 2.7 times the original size. To put that into perspective, AMD's processors on the Zeppelin die and 14 nm process, which come in at 213 mm² for the full 8-core design, could be brought down to just 80 mm² instead. AMD could potentially use up that extra die space to either build in some overprovisioning, should the process still be in its infancy and yields need a small boost; or cram it with double the amount of cores and other architectural improvements, and still have chips that are smaller than the original Zen dies.

So in short, the 7nm process is carrying their entire Ryzen line, and soon the GPU line with it.

It's clear AMD (on the GPU side) is no longer chasing the top end. They are most likely not pushing performance up at all with the die shrink, so power consumption will take a massive nosedive. And yes, if Nvidia went 14nm > 12nm and its next card is on another 12nm-class node rather than 7nm, then AMD is indeed skipping a few gens.

The fact that their GPU architecture is clearly worse, as you mention, is probably also why they don't care much about pushing high-end GPUs on 7nm; or maybe they have other priorities, as dedicated high-end GPUs are hardly their core business right now.

It could also be that a higher TDP means worse efficiency, making it harder for them to reach top-end performance (something they struggled with before), which again makes it pointless to push those 7nm Vega chips' clocks.

I wouldn't be shocked if they actually underclocked some of those chips to reach an even lower TDP.
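The area claim in the quoted passage reduces to simple division (the 213 mm² die size and the 2.7x factor both come from the quote):

```python
# Sanity check of the quoted die-shrink claim: Zeppelin (Zen, 14nm) is
# 213 mm^2; the quoted 7nm scaling factor is up to 2.7x smaller area.
zeppelin_area_mm2 = 213
shrink_factor = 2.7

shrunk_area = zeppelin_area_mm2 / shrink_factor
print(f"{shrunk_area:.0f} mm^2")  # ~79 mm^2, matching the "just 80 mm^2" in the quote
```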


#17 scatteh316
Member since 2004 • 10273 Posts
@techhog89 said:

@scatteh316 said:

No ray tracing?

Of course it'll do Ray Tracing...

/smh

None of these GPUs, if they did exist, are powerful enough for decent ray tracing. Besides that, the current method is Nvidia exclusive and patented.

Wrong, wrong, wrong and wrong.


#18 deactivated-63d2876fd4204
Member since 2016 • 9129 Posts

Are you all being serious? Looking at the very first chip you can tell it's fake. A Ryzen 3, 6c/12t, 4 GHz boost, 50W TDP at $99? If any of you believe this...


#19 DaVillain  Moderator
Member since 2014 • 56107 Posts

For those who keep saying it's fake because it's from Adored: Adored leaked the RTX specs before Nvidia revealed them to the world, and his leaks were mostly right, accurate more often than not. Still, looking at these leaks again, this is too good to be true, especially the Ryzen 7 3700/3700X and Ryzen 9 3800/3050X being the most attractive, budget-friendly consumer parts; there's no way this isn't fake.

As for the GPU side, I don't like the names AMD is going with. 3060/3070/3080 is shady AF right now! AMD can come up with better names than this, and honestly, numbering their cards higher than what Nvidia has in the market right now is just so shady; this looks like it'll blow up in AMD's face with those specs.


#20 Postosuchus
Member since 2005 • 907 Posts

AMD needs a comeback for their GPUs like the one they pulled off for their CPUs; Nvidia is following the same path Intel took during AMD's Bulldozer days (sky-high prices, incremental improvements, etc.).


#21 Basinboy
Member since 2003 • 14495 Posts

I’m committed to Nvidia GPUs due to owning several Gsync displays, but I’d be willing to consider a Ryzen build in the coming years. I’m also not against adopting Freesync displays moving forward, but I want to see how Intel plans to get in the GPU market first before making a decision.


#22 adamosmaki
Member since 2007 • 10718 Posts
@rmpumper said:

You can tell it's fake just by looking at the TDP of the RX 3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) yet is only a bit faster than a 120W 1060 6GB. So are you telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

While I take these rumors with a grain of salt, it's actually quite possible considering AMD is moving to 7nm. Just look at the efficiency jump from the AMD FX series to the Ryzen series: a 65W 6-core Ryzen completely trounces a 120-130W FX CPU in performance and efficiency.


#23 musicalmac  Moderator
Member since 2006 • 25098 Posts

Am I the only one who finds the naming convention genuinely hilarious? lol


#24 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

Zen at 12c/24t with a 5GHz boost and a 105W TDP?... for $330?


#25  Edited By Blackhairedhero
Member since 2018 • 3231 Posts

So the PS5 would rock an RX 3080 and a Ryzen 5 3600, or something along those lines.


#26 mazuiface
Member since 2016 • 1604 Posts

I know that Zen 2 will have higher clocks, but 16c/32t at 5.1GHz? Fake as f.

That would destroy everything, even at $499 MSRP.


#27  Edited By Techhog89
Member since 2015 • 5430 Posts
@rmpumper said:
@techhog89 said:
@rmpumper said:

You can tell it's fake just by looking at the TDP of the RX 3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) yet is only a bit faster than a 120W 1060 6GB. So are you telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

This is actually possible. The RTX series is on 12nm while this is on 7nm. If RTX were on 7nm the 2070 would be closer to 120W, if not less. It's the RTX series that didn't jump as much as usual.

GTX 10xx is 14nm and RX 5xx is 14nm (the 590 is on some kind of fake 12nm), but Nvidia is twice as efficient. So nm doesn't mean anything when the whole architecture is shit.

14nm to 7nm more than doubles efficiency. This indicates a rough doubling of efficiency for AMD. It matches up perfectly.

And Nvidia isn't twice as efficient in the midrange. It's more efficient, but not 2x. https://www.techpowerup.com/reviews/Zotac/GeForce_RTX_2080_AMP_Extreme/32.html


#28 Techhog89
Member since 2015 • 5430 Posts
@scatteh316 said:
@techhog89 said:
@scatteh316 said:

No ray tracing?

Of course it'll do Ray Tracing...

/smh

None of these GPUs, if they did exist, are powerful enough for decent ray tracing. Besides that, the current method is Nvidia exclusive and patented.

Wrong, wrong, wrong and wrong.

Go on.


#29 Pedro
Member since 2002 • 69479 Posts

@techhog89: DXR


#30 GarGx1
Member since 2011 • 10934 Posts

This may or may not be true but it's looking like the expectations for top end Navi cards could be accurate.

@blackhairedhero said:

So PS5 would rock a RX 3080 and a Ryzen 5 3600 or something along those lines.

You can certainly hope so, but $430 for the GPU and CPU alone (I know Sony gets a bulk-buyer discount) may be a bit rich if they still want to make a profit on their consoles. At this point I'd be more inclined to expect a Navi 12 equivalent in the next PlayStation's APU, as it would keep the costs down.

You may be right though, only time will tell.


#31 Techhog89
Member since 2015 • 5430 Posts
@Pedro said:

@techhog89:DXR

That's only part of it. AMD would need to create their own hardware solution and get devs to support it.

Either way, ray tracing is overhyped. It'll be at least 10 years before it becomes mainstream, since the next-gen consoles won't support it. Nvidia is just doing this to make AMD look bad. They know it's not really ready yet.


#32 DaVillain  Moderator
Member since 2014 • 56107 Posts

@musicalmac said:

Am I the only one who finds the naming convention genuinely hilarious? lol

If true, I dunno why AMD is copying the competition's naming scheme; it almost seems like they're trying to get unsuspecting people to buy their stuff (which of course looks to be the case, since AMD is just as sneaky as Nvidia).


#33 Blackhairedhero
Member since 2018 • 3231 Posts

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.


#34 adamosmaki
Member since 2007 • 10718 Posts
@blackhairedhero said:

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.

That wouldn't be a small loss if you factor in the hard drive, Blu-ray drive, power supply, casing, and motherboard. It's highly likely it will have an upper-midrange GPU, but the CPU is quite likely to be a lower-end part. I don't believe they will repeat the mistake of using an outdated CPU design like the one in the PS4, but I doubt it will be anything more than a lower-end Ryzen 3 (which would still be a decent choice).


#35 DaVillain  Moderator
Member since 2014 • 56107 Posts

@adamosmaki said:
@blackhairedhero said:

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.

That wouldn't be a small loss if you factor in the hard drive, Blu-ray drive, power supply, casing, and motherboard. It's highly likely it will have an upper-midrange GPU, but the CPU is quite likely to be a lower-end part. I don't believe they will repeat the mistake of using an outdated CPU design like the one in the PS4, but I doubt it will be anything more than a lower-end Ryzen 3 (which would still be a decent choice).

If that's the case, I don't expect Sony to include PS4 BC if they want to keep costs down as much as they can. The PS3's price at the time was high due to PS2/PS1 BC, which added more to the cost than the Blu-ray drive did.

And let's take into consideration that Sony will most likely borrow some ideas from Nintendo's Switch because, as it stands, sales are down in Japan. It's hard to say what Sony will go for.


#36 GarGx1
Member since 2011 • 10934 Posts
@blackhairedhero said:

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.

To hit $499 they'd need to get the rest of the machine together for ~$100 to $150: motherboard, RAM (prices are high at the moment), custom case, PSU, cooling, optical drive (presumably), OS, design, manufacturing, shipping, and seller markup. That's a big ask; remember AMD will be looking to make some kind of profit from these chips as well.

As far as I'm aware, neither Sony nor MS made a loss on the PS4 or Xbox One at launch. I could be wrong on that.
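That budget argument can be put in numbers. The $499 target and the ~$430 combined retail price for the rumored GPU + CPU come from this thread; the discount arithmetic below is purely illustrative, not real Sony costs:

```python
# Illustrative console BOM arithmetic: how far below retail the rumored
# chips would have to be priced for a $499 console, given a budget for
# the rest of the machine. All figures are the thread's assumptions.
target_price = 499
chips_at_retail = 430  # rumored GPU + CPU combined retail price

for rest_of_machine in (100, 150):  # board, RAM, case, PSU, drive, etc.
    chip_budget = target_price - rest_of_machine
    discount = 1 - chip_budget / chips_at_retail
    print(f"rest ${rest_of_machine}: chips must come in at ${chip_budget} "
          f"({discount:.0%} below retail)")
```

Under these assumptions the chips would need roughly a 7-19% bulk discount off retail, before either side takes any margin.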


#37 04dcarraher
Member since 2004 • 23829 Posts

@davillain- said:
@adamosmaki said:
@blackhairedhero said:

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.

That wouldn't be a small loss if you factor in the hard drive, Blu-ray drive, power supply, casing, and motherboard. It's highly likely it will have an upper-midrange GPU, but the CPU is quite likely to be a lower-end part. I don't believe they will repeat the mistake of using an outdated CPU design like the one in the PS4, but I doubt it will be anything more than a lower-end Ryzen 3 (which would still be a decent choice).

If that's the case, I don't expect Sony to include PS4 BC if they want to keep costs down as much as they can. The PS3's price at the time was high due to PS2/PS1 BC, which added more to the cost than the Blu-ray drive did.

And let's take into consideration that Sony will most likely borrow some ideas from Nintendo's Switch because, as it stands, sales are down in Japan. It's hard to say what Sony will go for.

PS4 BC will be easy, as it's x86-based and won't have to be emulated like previous consoles.


#38 FireEmblem_Man
Member since 2004 • 20248 Posts

@Gatygun: You forgot that Ryzen 3000 will be on 7nm.


#39 X_CAPCOM_X
Member since 2004 • 9552 Posts

@goldenelementxl said:

Are you all being serious? Looking at the very first chip you can tell it's fake. A Ryzen 3, 6c/12t, 4 GHz boost, 50W TDP at $99? If any of you believe this...

Idk man, I'm optimistically believing in it even though most of it is too good to be true. I want this for AMD.

The GPUs are even more insane. 2070-like performance for half the price and a lower TDP? I would be in day one.


#40  Edited By ronvalencia
Member since 2008 • 29612 Posts

@techhog89 said:

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

A 20 percent clock-speed improvement on 4.2 GHz lands at 5.04 GHz (this is for boost speeds with fewer CPU cores active).

https://www.anandtech.com/show/12677/tsmc-kicks-off-volume-production-of-7nm-chips

The 7 nm node is a big deal for the foundry industry in general and TSMC in particular. When compared to the CLN16FF+ technology (TSMC’s most widely used FinFET process technology) the CLN7FF will enable chip designers to shrink their die sizes by 70% (at the same transistor count), drop power consumption by 60%, or increase frequency by 30% (at the same complexity).
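The clock extrapolation above is easy to check: the 4.2 GHz baseline and 20% uplift are the post's figures, while 30% is TSMC's quoted ceiling for the node:

```python
# Boost-clock extrapolation from a 4.2 GHz baseline, using the post's
# 20% uplift and TSMC's quoted "up to 30% higher frequency".
base_boost_ghz = 4.2

for uplift in (0.20, 0.30):
    print(f"+{uplift:.0%} -> {base_boost_ghz * (1 + uplift):.2f} GHz")
```

That gives 5.04 GHz at +20% and 5.46 GHz at the full +30%, so the leaked 5 GHz-class boost clocks sit inside TSMC's own claimed range.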


#41 deactivated-63d2876fd4204
Member since 2016 • 9129 Posts

Well if Ron believes the leak, it’s 100% fake confirmed. Discussion over.


#42  Edited By ronvalencia
Member since 2008 • 29612 Posts

@X_CAPCOM_X said:
@goldenelementxl said:

Are you all being serious? Looking at the very first chip you can tell it's fake. A Ryzen 3, 6c/12t, 4 GHz boost, 50W TDP at $99? If any of you believe this...

Idk man, I'm optimistically believing in it even though most of it is too good to be true. I want this for AMD.

The GPUs are even more insane. 2070-like performance for half the price and a lower TDP? I would be in day one.

It's not strange, i.e.:

Navi 12 with 40 CUs is 62 percent of a 7nm Vega 64 with 64 CUs at 1800MHz and 300 watts, hence my 187-watt estimate.

It has been shown that a Vega 56 at 1710MHz with 12.2 TFLOPS can beat a Strix Vega 64 at 1590MHz with 13.02 TFLOPS. The TFLOPS argument doesn't factor in classic GPU hardware (quad rasterization, 64 ROPs, and L2 cache) performance, i.e. the Vega 56 at 1710MHz has superior classic GPU hardware performance over the Strix Vega 64.

For RTX, Nvidia doubled the L2 cache capacity over the Pascal counterparts. AMD needs to increase L2 cache capacity to follow Nvidia's RTX improvements. Both AMD's and Nvidia's ROPs are connected to the L2 cache.
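The 187 W figure is just a proportional scaling of the quoted 300 W part (the 62% and 300 W numbers are the post's estimates, not official specs):

```python
# Proportional power estimate: 62% of a 300 W 7nm Vega 64.
vega64_7nm_watts = 300
navi12_fraction = 0.62

estimate = vega64_7nm_watts * navi12_fraction
print(f"{estimate:.0f} W")  # 186 W, in line with the 187 W estimate above
```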


#43  Edited By ronvalencia
Member since 2008 • 29612 Posts

@goldenelementxl said:

Well if Ron believes the leak, it’s 100% fake confirmed. Discussion over.

I didn't make the leak, nor was the leak done by wccftech. I was correct about the X1X's ROPs being different from old-school AMD ROPs.


#44 Techhog89
Member since 2015 • 5430 Posts
@ronvalencia said:
@techhog89 said:

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

A 20 percent clock-speed improvement on 4.2 GHz lands at 5.04 GHz (this is for boost speeds with fewer CPU cores active).

https://www.anandtech.com/show/12677/tsmc-kicks-off-volume-production-of-7nm-chips

The 7 nm node is a big deal for the foundry industry in general and TSMC in particular. When compared to the CLN16FF+ technology (TSMC’s most widely used FinFET process technology) the CLN7FF will enable chip designers to shrink their die sizes by 70% (at the same transistor count), drop power consumption by 60%, or increase frequency by 30% (at the same complexity).

Oh wow. I didn't know you were still around.


#45  Edited By Zaryia
Member since 2016 • 21607 Posts

PC wins again.


#46 DrLostRib
Member since 2017 • 5931 Posts

@techhog89 said:
@ronvalencia said:
@techhog89 said:

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

A 20 percent clock-speed improvement on 4.2 GHz lands at 5.04 GHz (this is for boost speeds with fewer CPU cores active).

https://www.anandtech.com/show/12677/tsmc-kicks-off-volume-production-of-7nm-chips

The 7 nm node is a big deal for the foundry industry in general and TSMC in particular. When compared to the CLN16FF+ technology (TSMC’s most widely used FinFET process technology) the CLN7FF will enable chip designers to shrink their die sizes by 70% (at the same transistor count), drop power consumption by 60%, or increase frequency by 30% (at the same complexity).

Oh wow. I didn't know you were still around.

the RonBot is always scanning


#47 DragonfireXZ95
Member since 2005 • 26645 Posts
@GarGx1 said:

This may or may not be true but it's looking like the expectations for top end Navi cards could be accurate.

@blackhairedhero said:

So PS5 would rock a RX 3080 and a Ryzen 5 3600 or something along those lines.

You can certainly hope so but $430 for GPU and CPU alone (I know Sony get a bulk buyer discount) may be a bit rich if they still want to make profit from their consoles. I'd be more inclined to expect a Navi 12 equivalent, at this point, in the next PlayStation's APU as it would keep the costs down.

You may be right though, only time will tell.

If anything, it'll be a severely underclocked Ryzen 3. They can't put 6 cores / 12 threads in a small box and expect it not to overheat unless they do some major downclocking.


#48 deactivated-63d2876fd4204
Member since 2016 • 9129 Posts

@DragonfireXZ95: I was gonna post about this a little later, but figured it would go over people’s heads. Power draw and cooling are huge factors when designing a console. It’s like every console cycle people fall for the same shit. It’s really getting old.

And again, those specs are fake.


#49  Edited By Blackhairedhero
Member since 2018 • 3231 Posts

@GarGx1: Not sure about MS, but the PS4 took around a $60 hit on each console at launch. I'm curious whether Sony helped fund Navi and what kind of deal they will get from AMD. It's hard to tell what's true and what's not, but most rumors say the processor will have 8 cores, although I'm sure it will be stripped down, since consoles don't need all the multitasking features for the OS.

https://www.google.com/amp/s/www.forbes.com/sites/erikkain/2013/09/20/sony-to-take-a-loss-on-playstation-4-sales/amp/

Now, if we take into account that the PS5 will cost $100 more and that Sony is far better off financially than when they launched the PS4, I don't think it seems unrealistic.


#50 ronvalencia
Member since 2008 • 29612 Posts

@blackhairedhero said:

@GarGx1: Not sure about MS, but the PS4 took around a $50 hit on each console at launch. I'm curious whether Sony helped fund Navi and what kind of deal they will get from AMD. It's hard to tell what's true and what's not, but most rumors say the processor will have 8 cores, although I'm sure it will be stripped down, since consoles don't need all the multitasking features for the OS.

https://www.forbes.com/sites/erikkain/2013/09/20/sony-to-take-a-loss-on-playstation-4-sales/#49faeaa26f1d

Xbox chief marketing officer Yusuf Mehdi told GamesIndustry International that Microsoft is "looking to break even or low margin at worst."

https://www.eurogamer.net/articles/2013-09-20-sony-expects-to-recoup-playstation-4-hardware-loss-at-launch

Eurogamer has heard from well-placed sources that Sony expects to make an approximate $60 loss per $399 unit sold. When presented with the figure, Ito denied it.