Ryzen 7 3000 series leak + new line of GPUs

#1 Edited by Gatygun (1297 posts) -

GPUs:

The good:

Cheaper, with likely a bit more performance than the current lineup, and hopefully it will put pressure on Nvidia's prices. It's also a nice budget 1080-class option for people who just need performance and nothing else, for cheap.

The bad:

A refresh with no ray tracing or other Nvidia-like features, low amounts of memory, and only mid- to low-range GPUs.

CPU:

The good:

Cheap, lots of cores, high clocks. Intel got pushed into a corner here massively: even their flagship 9900K chip is comparable to a mid-range chip from this Ryzen refresh at 250 bucks.

The best part is we finally see 16 cores / 32 threads at high GHz, something I have been waiting a decade to upgrade to. The price is still steep, though, so I will probably wait until the next-gen consoles make an appearance and I know what their core counts are going to be. No hurry here.

This also bodes well for next-generation consoles: the CPUs in those boxes could be powerhouses.

The bad:

Gotta wait.

https://www.youtube.com/watch?v=PCdsTBsH-rI

#3 Posted by R4gn4r0k (29521 posts) -

I'll be buying a Ryzen 3000.

Intel prices just aren't affordable for me anymore.

#4 Posted by Techhog89 (3091 posts) -

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

#5 Posted by R4gn4r0k (29521 posts) -

@techhog89 said:

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

What's fake?

You think the price will be higher? Or the specs will be worse?

#6 Edited by scatteh316 (9952 posts) -

No ray tracing?

Of course it'll do Ray Tracing...

/smh

#7 Posted by scatteh316 (9952 posts) -

And a 5GHz boost clock? I doubt AMD has managed to fix Ryzen's ~4GHz clock wall that quickly, although I will happily be surprised if they have.

#8 Edited by DaVillain- (33006 posts) -

I just got myself a Ryzen 7 2700X and I totally love it! What a great CPU by AMD. I already knew the Ryzen 7 3000 series would make its way to the PC market soon, but I wasn't in the mood to wait for the 3000 series anyway, so the 2700X will do for me. Gonna see how AMD plays out the Ryzen 7 4000 series; I wanna wait till the 4000 series at a bare minimum to upgrade, and only to double the core/thread count I have.

And as for their new GPUs, another disappointment at the high end, but still, more competition is needed, so hey, more good things for the PC crowd. If the Navi cards can Crossfire with the on-board graphics of the G-series chips, imagine what kind of performance-per-dollar that would bring.

#9 Posted by Howmakewood (5532 posts) -

Yeah, I'll believe the CPU specs when I see 'em.

#10 Posted by rmpumper (342 posts) -

You can tell it's fake just by looking at the TDP of RX3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) but is only a bit faster than a 120W 1060 6GB. So you're telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.
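To put numbers on that efficiency gap, here's a rough perf-per-watt sketch using the board powers above; the relative performance values (590 roughly 10% faster than a 1060 6GB at 1080p) are ballpark assumptions for illustration, not measurements from this thread:

```python
# Rough perf-per-watt comparison from the board-power figures quoted above.
# Relative performance values are assumed ballpark review-index figures.
cards = {
    "GTX 1060 6GB": {"perf": 1.00, "watts": 120},
    "RX 590":       {"perf": 1.10, "watts": 220},  # ~220W average, per the post
}

for name, c in cards.items():
    print(f"{name}: {100 * c['perf'] / c['watts']:.2f} perf per 100 W")
# GTX 1060 6GB: 0.83 perf per 100 W
# RX 590:       0.50 perf per 100 W  -> Nvidia ~1.7x more efficient here
```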

#11 Edited by scatteh316 (9952 posts) -
@rmpumper said:

You can tell it's fake just by looking at the TDP of RX3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) but is only a bit faster than a 120W 1060 6GB. So you're telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

Doesn't work that way, dude... AMD's architecture is massively unbalanced, which they said they're addressing with Navi. That by itself should generate a good chunk of extra performance.

You also have to consider that the process shrink will bring power and cost down.

#12 Edited by Techhog89 (3091 posts) -
@R4gn4r0k said:
@techhog89 said:

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

What's fake?

You think the price will be higher? Or the specs will be worse?

The whole report is fake. These won't be the SKUs we see.

@scatteh316 said:

And a 5GHz boost clock? I doubt AMD has managed to fix Ryzen's ~4GHz clock wall that quickly, although I will happily be surprised if they have.

This is a new architecture on a new node. They'll break the barrier. It's just a question of by how much.

@scatteh316 said:

No ray tracing?

Of course it'll do Ray Tracing...

/smh

None of these GPUs, even if they did exist, would be powerful enough for decent ray tracing. Besides that, the current method is Nvidia exclusive and patented.

@rmpumper said:

You can tell it's fake just by looking at the TDP of RX3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) but is only a bit faster than a 120W 1060 6GB. So you're telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

This is actually possible. The RTX series is on 12nm while this is on 7nm. If RTX were on 7nm the 2070 would be closer to 120W, if not less. It's the RTX series that didn't jump as much as usual.

#13 Edited by rmpumper (342 posts) -

@techhog89 said:

@rmpumper said:

You can tell it's fake just by looking at the TDP of RX3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) but is only a bit faster than a 120W 1060 6GB. So you're telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

This is actually possible. The RTX series is on 12nm while this is on 7nm. If RTX were on 7nm the 2070 would be closer to 120W, if not less. It's the RTX series that didn't jump as much as usual.

GTX 10xx is 14nm and RX 5xx is 14nm (the 590 is on some kind of fake 12nm), but Nvidia is twice as efficient. So nm doesn't mean anything when the whole architecture is shit.

#14 Posted by mrbojangles25 (42382 posts) -

If they can come up with a video card that's 90% as good as an RTX 2080 but 75% of the price, I just might buy it.

#15 Posted by R4gn4r0k (29521 posts) -

@techhog89: Everything I've heard so far about Zen 2 sounds really great, so I'll wait for final specs. But if Intel can't compete, then I'll definitely go AMD this time.

#16 Edited by Gatygun (1297 posts) -
@rmpumper said:
@techhog89 said:

The whole report is fake. These won't be the SKUs we see.

This is a new architecture on a new node. They'll break the barrier. It's just a question of by how much.

None of these GPUs, even if they did exist, would be powerful enough for decent ray tracing. Besides that, the current method is Nvidia exclusive and patented.

@rmpumper said:

You can tell it's fake just by looking at the TDP of RX3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) but is only a bit faster than a 120W 1060 6GB. So you're telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

This is actually possible. The RTX series is on 12nm while this is on 7nm. If RTX were on 7nm the 2070 would be closer to 120W, if not less. It's the RTX series that didn't jump as much as usual.

GTX 10xx is 14nm and RX 5xx is 14nm (the 590 is on some kind of fake 12nm), but Nvidia is twice as efficient. So nm doesn't mean anything when the whole architecture is shit.

Why are you looking at Nvidia?

It's most likely a 7nm Vega card with an updated architecture. The original Vega was 14nm (massive difference).

Wanna see how much more you can do with it? Here's something from the chief technical officer of a company that makes 7nm chips, stated about a year ago; he also talks about 5GHz being reached relatively easily over time.

While a move from 14 nm to 7 nm was expected to provide, at the very best, a halving in the actual size of a chip manufactured in 7 nm compared to 14 nm, Gary Patton is now saying that the area should actually be reduced by up to 2.7 times the original size. To put that into perspective, AMD's Ryzen series processors on the Zeppelin die and 14 nm process, which come in at 213 mm² for the full, 8-core design, could be brought down to just 80 mm² instead. AMD could potentially use up that extra die space to either build in some overprovisioning, should the process still be in its infancy and yields need a small boost; or cram it with double the amount of cores and other architectural improvements, and still have chips that are smaller than the original Zen dies.
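A quick check of the die-area arithmetic in that quote, using only its own numbers:

```python
# Checking the die-area claim quoted above: a 2.7x density gain would take
# the 213 mm^2 8-core Zeppelin die (Zen, 14nm) down to roughly 80 mm^2.
zeppelin_area_mm2 = 213
density_gain = 2.7
print(zeppelin_area_mm2 / density_gain)  # ~78.9 mm^2, matching the ~80 mm^2 figure
```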

So in short, the 7nm process is carrying their entire Ryzen line, and soon the GPU line with it.

It's clear AMD (on the GPU side) is no longer chasing the top end. They are most likely not pushing performance up at all with the die shrink, so power consumption will take a massive nosedive. And yes, if Nvidia went 14nm > 12nm and its next card is another 12nm-or-larger part rather than a 7nm one, then AMD is indeed skipping a few gens.

The fact that their GPU architecture is clearly worse, as you mention, is also the reason they probably don't care much about pushing high-end GPUs on 7nm; or maybe they have other priorities, as dedicated high-end GPUs are hardly their main business right now.

It could also be that a higher TDP means less efficiency, making it harder for them to reach top-end performance, something they struggled with before; that would also make it pointless to push those 7nm Vega chips' performance.

I wouldn't be shocked if they actually underclocked some of those chips to get an even lower TDP.

#17 Posted by scatteh316 (9952 posts) -
@techhog89 said:

@scatteh316 said:

No ray tracing?

Of course it'll do Ray Tracing...

/smh

None of these GPUs, even if they did exist, would be powerful enough for decent ray tracing. Besides that, the current method is Nvidia exclusive and patented.

Wrong, wrong, wrong and wrong.

#18 Posted by GoldenElementXL (2411 posts) -

Are you all being serious? Looking at the very first chip you can tell it's fake. A Ryzen 3, 6c/12t, 4 GHz boost, 50W TDP at $99? If any of you believe this...

#19 Posted by DaVillain- (33006 posts) -

For those who keep saying it's fake because it's from Adored: Adored leaked the RTX specs before Nvidia revealed them to the world, and Adored's leaks have mostly been right, accurate most of the time to say the least. Looking at those leaks again, this seems too good to be true, especially the Ryzen 7 3700/3700X and Ryzen 9 3800X/3850X being the most attractive, consumer-friendly budget options, but there's no way this is fake.

As for the GPU side, I don't like the names AMD is going with. 3060/3070/3080 is shady AF right now! AMD can come up with better names than this, and honestly, using higher numbers than what Nvidia has on the market right now is just so shady; it looks like this will blow up in AMD's face with those specs.

#20 Posted by Postosuchus (607 posts) -

AMD needs a comeback for their GPUs like they had for their CPUs; Nvidia is following the same path Intel took during AMD's Bulldozer days (sky-high prices, incremental improvements, etc.).

#21 Posted by Basinboy (13673 posts) -

I'm committed to Nvidia GPUs due to owning several G-Sync displays, but I'd be willing to consider a Ryzen build in the coming years. I'm also not against adopting FreeSync displays moving forward, but I want to see how Intel plans to get into the GPU market first before making a decision.

#22 Posted by adamosmaki (10649 posts) -
@rmpumper said:

You can tell it's fake just by looking at the TDP of RX3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) but is only a bit faster than a 120W 1060 6GB. So you're telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

While I take these rumors with a grain of salt, it's actually quite possible considering AMD is moving to 7nm. Just look at the efficiency jump from the AMD FX series to the Ryzen series: a 65W 6-core Ryzen completely trounces a 120-130W FX CPU in performance and efficiency.

#23 Posted by musicalmac (24981 posts) -

Am I the only one who finds the naming convention genuinely hilarious? lol

#24 Posted by Grey_Eyed_Elf (5970 posts) -

Zen at 12c/24t with a 5GHz boost and a 105W TDP?... $330?

#25 Edited by Blackhairedhero (2351 posts) -

So the PS5 would rock an RX 3080 and a Ryzen 5 3600, or something along those lines.

#26 Posted by mazuiface (667 posts) -

I know that Zen 2 will have higher clocks, but 16c/32t at 5.1GHz? Fake as f.

That would destroy everything, even at a $499 MSRP.

#27 Edited by Techhog89 (3091 posts) -
@rmpumper said:
@techhog89 said:
@rmpumper said:

You can tell it's fake just by looking at the TDP of RX3080.

The 1080's TDP is 180W (~170W in reality) and the 2070's is 175W (~200W in reality), while the current AMD 590 is rated at 175W (220W average in reality) but is only a bit faster than a 120W 1060 6GB. So you're telling us that AMD will skip two gens, beat Nvidia in power efficiency, and then sell a 2070 equivalent for half the price? What a fucking joke.

This is actually possible. The RTX series is on 12nm while this is on 7nm. If RTX were on 7nm the 2070 would be closer to 120W, if not less. It's the RTX series that didn't jump as much as usual.

GTX 10xx is 14nm and RX 5xx is 14nm (the 590 is on some kind of fake 12nm), but Nvidia is twice as efficient. So nm doesn't mean anything when the whole architecture is shit.

Going from 14nm to 7nm more than doubles efficiency, and the leak indicates a rough doubling of efficiency for AMD. It matches up perfectly.

And Nvidia isn't twice as efficient in the midrange. It's more efficient, but not 2x. https://www.techpowerup.com/reviews/Zotac/GeForce_RTX_2080_AMP_Extreme/32.html
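For what it's worth, here's where the "more than doubles" figure comes from; a minimal sketch assuming TSMC's headline ~60% power reduction for 7nm (quoted later in this thread):

```python
# If 7nm cuts power ~60% at the same performance (TSMC's headline claim for
# the node), perf-per-watt improves by 1 / (1 - 0.6) = 2.5x at iso-performance.
power_reduction = 0.60
print(1 / (1 - power_reduction))  # 2.5 -> "more than doubles efficiency"
```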

#28 Posted by Techhog89 (3091 posts) -
@scatteh316 said:
@techhog89 said:
@scatteh316 said:

No ray tracing?

Of course it'll do Ray Tracing...

/smh

None of these GPUs, even if they did exist, would be powerful enough for decent ray tracing. Besides that, the current method is Nvidia exclusive and patented.

Wrong, wrong, wrong and wrong.

Go on.

#29 Posted by Pedro (32072 posts) -

@techhog89: DXR

#30 Posted by GarGx1 (10550 posts) -

This may or may not be true but it's looking like the expectations for top end Navi cards could be accurate.

@blackhairedhero said:

So the PS5 would rock an RX 3080 and a Ryzen 5 3600, or something along those lines.

You can certainly hope so, but $430 for the GPU and CPU alone (I know Sony gets a bulk-buyer discount) may be a bit rich if they still want to make a profit on their consoles. I'd be more inclined to expect a Navi 12 equivalent in the next PlayStation's APU at this point, as it would keep the costs down.

You may be right though, only time will tell.

#31 Posted by Techhog89 (3091 posts) -
@Pedro said:

@techhog89: DXR

That's only part of it. AMD would need to create their own hardware solution and get devs to support it.

Either way, ray-tracing is overhyped. It'll be at least 10 years before it becomes mainstream since next-gen consoles won't support it. Nvidia is just doing this to make AMD look bad. They know it's not really ready yet.

#32 Posted by DaVillain- (33006 posts) -

@musicalmac said:

Am I the only one who finds the naming convention genuinely hilarious? lol

Dunno why AMD is copying the competition's naming scheme now, if true; it almost seems like they are trying to get unsuspecting people to buy their stuff (which of course looks to be the case, because AMD is just as sneaky as Nvidia is).

#33 Posted by Blackhairedhero (2351 posts) -

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.

#34 Posted by adamosmaki (10649 posts) -
@blackhairedhero said:

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.

That wouldn't be a small loss if you factor in the hard drive/Blu-ray drive/power supply/casing/motherboard. It's highly likely it will have an upper-midrange GPU, but the CPU is quite likely to be a lower-end part. I don't believe they will make the mistake of using an outdated CPU design like the one in the PS4, but I doubt it will be anything more than a lower-end Ryzen 3 (which would still be a decent choice).

#35 Posted by DaVillain- (33006 posts) -

@adamosmaki said:
@blackhairedhero said:

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.

That wouldn't be a small loss if you factor in the hard drive/Blu-ray drive/power supply/casing/motherboard. It's highly likely it will have an upper-midrange GPU, but the CPU is quite likely to be a lower-end part. I don't believe they will make the mistake of using an outdated CPU design like the one in the PS4, but I doubt it will be anything more than a lower-end Ryzen 3 (which would still be a decent choice).

If that's the case, I don't expect Sony to include PS4 BC if they want to keep the cost down as much as they can. The PS3's price at launch was high because of PS2/PS1 BC, which added more to the cost than the Blu-ray drive did.

And let's take into consideration that Sony will most likely take some ideas from Nintendo's Switch because, as it is, sales are down in Japan. It's hard to say what Sony will go for.

#36 Posted by GarGx1 (10550 posts) -
@blackhairedhero said:

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.

To hit $499 they'd need to get the rest of the machine together for ~$100 to $150: motherboard, RAM (prices are high at the moment), custom case, PSU, cooling, optical drive (presumably), OS, design, manufacturing, shipping, and seller markup. That's a big ask; remember, AMD will be looking to make some kind of profit on these chips as well.
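A back-of-envelope version of that maths, using only the thread's rumored figures (none of these are confirmed costs):

```python
# Back-of-envelope check on the console maths above, all inputs rumored.
console_price   = 499   # rumored PS5 retail price
chips_at_retail = 430   # rumored combined retail price of the leaked CPU + GPU
rest_of_machine = 125   # midpoint of the ~$100-150 estimate for everything else

budget_for_chips = console_price - rest_of_machine
print(budget_for_chips)                    # 374: what's left for the silicon
print(budget_for_chips / chips_at_retail)  # ~0.87 -> Sony would need the chips at
                                           # ~87% of retail just to break even
```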

As far as I'm aware, neither Sony nor MS made a loss on the PS4 or Xbox One at launch. I could be wrong on that.

#37 Posted by 04dcarraher (23076 posts) -

@davillain- said:
@adamosmaki said:
@blackhairedhero said:

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.

That wouldn't be a small loss if you factor in the hard drive/Blu-ray drive/power supply/casing/motherboard. It's highly likely it will have an upper-midrange GPU, but the CPU is quite likely to be a lower-end part. I don't believe they will make the mistake of using an outdated CPU design like the one in the PS4, but I doubt it will be anything more than a lower-end Ryzen 3 (which would still be a decent choice).

If that's the case, I don't expect Sony to include PS4 BC if they want to keep the cost down as much as they can. The PS3's price at launch was high because of PS2/PS1 BC, which added more to the cost than the Blu-ray drive did.

And let's take into consideration that Sony will most likely take some ideas from Nintendo's Switch because, as it is, sales are down in Japan. It's hard to say what Sony will go for.

PS4 BC will be easy since it's x86-based and doesn't have to be emulated like previous consoles.

#38 Posted by FireEmblem_Man (18707 posts) -

@Gatygun: You forgot that Ryzen 3000 will be on 7nm.

#39 Posted by X_CAPCOM_X (8274 posts) -

@goldenelementxl said:

Are you all being serious? Looking at the very first chip you can tell it's fake. A Ryzen 3, 6c/12t, 4 GHz boost, 50W TDP at $99? If any of you believe this...

Idk man, I am optimistically believing in it even though most of it is too good to be true. I want this for AMD.

The GPUs are even more insane. 2070-like performance for half the price and a lower TDP? I would be in day 1.

#40 Edited by ronvalencia (26528 posts) -

@techhog89 said:

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

A 20 percent clock speed improvement on 4.2 GHz lands at 5.04 GHz (this is for boost speeds with fewer CPU cores active).

https://www.anandtech.com/show/12677/tsmc-kicks-off-volume-production-of-7nm-chips

The 7 nm node is a big deal for the foundry industry in general and TSMC in particular. When compared to the CLN16FF+ technology (TSMC’s most widely used FinFET process technology) the CLN7FF will enable chip designers to shrink their die sizes by 70% (at the same transistor count), drop power consumption by 60%, or increase frequency by 30% (at the same complexity).
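A minimal check of the arithmetic above, assuming the quoted +30% frequency headroom is only partially realized (+20%):

```python
# TSMC quotes up to +30% frequency at the same complexity; applying a more
# conservative +20% to Ryzen's current ~4.2 GHz boost gives the leaked figure.
current_boost_ghz = 4.2
uplift = 0.20
print(current_boost_ghz * (1 + uplift))  # 5.04 GHz
```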

#41 Posted by GoldenElementXL (2411 posts) -

Well if Ron believes the leak, it’s 100% fake confirmed. Discussion over.

#42 Edited by ronvalencia (26528 posts) -

@X_CAPCOM_X said:
@goldenelementxl said:

Are you all being serious? Looking at the very first chip you can tell it's fake. A Ryzen 3, 6c/12t, 4 GHz boost, 50W TDP at $99? If any of you believe this...

Idk man, I am optimistically believing in it even though most of it is too good to be true. I want this for AMD.

The GPUs are even more insane. 2070-like performance for half the price and a lower TDP? I would be in day 1.

It's not strange, i.e.:

Navi 12 with 40 CUs is 62 percent of the 7nm Vega 64 with 64 CUs, 1800MHz, and 300 watts, hence my 187-watt estimate.

It has been shown that a Vega 56 at 1710MHz with 12.2 TFLOPS can beat a Strix Vega 64 at 1590MHz with 13.02 TFLOPS. The TFLOPS argument doesn't factor in classic GPU hardware performance (quad rasterization, 64 ROPs, and L2 cache), i.e. a Vega 56 at 1710MHz has superior classic GPU hardware performance to a Strix Vega 64.

For RTX, NVIDIA doubled the L2 cache storage over the Pascal counterparts. AMD needs to increase L2 cache storage to follow NVIDIA's RTX improvements. Both AMD's and NVIDIA's ROPs are connected to the L2 cache.
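For anyone checking the numbers, here's the arithmetic behind those TFLOPS and wattage figures; the shader counts (3584 and 4096) are the cards' standard configurations, assumed here for illustration:

```python
# FP32 TFLOPS = 2 ops per FMA x shader count x clock.
def tflops(shaders, clock_mhz):
    return 2 * shaders * clock_mhz * 1e6 / 1e12

print(tflops(3584, 1710))   # Vega 56 @ 1710 MHz -> ~12.26 TFLOPS
print(tflops(4096, 1590))   # Strix Vega 64      -> ~13.03 TFLOPS

# The 187W estimate: scale 7nm Vega 64's quoted 300W by the CU ratio.
print(300 * 40 / 64)        # Navi 12 with 40 of 64 CUs -> 187.5 W
```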

#43 Edited by ronvalencia (26528 posts) -

@goldenelementxl said:

Well if Ron believes the leak, it’s 100% fake confirmed. Discussion over.

I didn't make the leak, nor was the leak done by wccftech. I was correct about the X1X's ROPs being different from old-school AMD ROPs.

#44 Posted by Techhog89 (3091 posts) -
@ronvalencia said:
@techhog89 said:

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

A 20 percent clock speed improvement on 4.2 GHz lands at 5.04 GHz (this is for boost speeds with fewer CPU cores active).

https://www.anandtech.com/show/12677/tsmc-kicks-off-volume-production-of-7nm-chips

The 7 nm node is a big deal for the foundry industry in general and TSMC in particular. When compared to the CLN16FF+ technology (TSMC’s most widely used FinFET process technology) the CLN7FF will enable chip designers to shrink their die sizes by 70% (at the same transistor count), drop power consumption by 60%, or increase frequency by 30% (at the same complexity).

Oh wow. I didn't know you were still around.

#45 Edited by Zaryia (6703 posts) -

PC wins again.

#46 Posted by DrLostRib (4367 posts) -

@techhog89 said:
@ronvalencia said:
@techhog89 said:

It's fake. Adored believes it because he has an Intel hate boner and wants it to be true, and the original source seems to be WCCF, which has a horrible track record for "leaks." Take it with 16 grains of salt.

A 20 percent clock speed improvement on 4.2 GHz lands at 5.04 GHz (this is for boost speeds with fewer CPU cores active).

https://www.anandtech.com/show/12677/tsmc-kicks-off-volume-production-of-7nm-chips

The 7 nm node is a big deal for the foundry industry in general and TSMC in particular. When compared to the CLN16FF+ technology (TSMC’s most widely used FinFET process technology) the CLN7FF will enable chip designers to shrink their die sizes by 70% (at the same transistor count), drop power consumption by 60%, or increase frequency by 30% (at the same complexity).

Oh wow. I didn't know you were still around.

the RonBot is always scanning

#47 Posted by DragonfireXZ95 (24712 posts) -
@GarGx1 said:

This may or may not be true but it's looking like the expectations for top end Navi cards could be accurate.

@blackhairedhero said:

So the PS5 would rock an RX 3080 and a Ryzen 5 3600, or something along those lines.

You can certainly hope so, but $430 for the GPU and CPU alone (I know Sony gets a bulk-buyer discount) may be a bit rich if they still want to make a profit on their consoles. I'd be more inclined to expect a Navi 12 equivalent in the next PlayStation's APU at this point, as it would keep the costs down.

You may be right though, only time will tell.

If anything, it'll be a Ryzen 3 severely underclocked. They can't put 6 cores/12 threads in a small box and expect it not to overheat unless they do some major downclocking.

#48 Posted by GoldenElementXL (2411 posts) -

@DragonfireXZ95: I was gonna post about this a little later, but figured it would go over people’s heads. Power draw and cooling are huge factors when designing a console. It’s like every console cycle people fall for the same shit. It’s really getting old.

And again, those specs are fake.

#49 Edited by Blackhairedhero (2351 posts) -

@GarGx1: Not sure about MS, but the PS4 took around a $60 hit on each console at launch. I'm curious whether Sony helped with the funding of Navi and what kind of deal they will get from AMD. It's hard to tell what's true and what's not, but most rumors say the processor will have 8 cores, although I'm sure it will be stripped down since consoles don't need all the multitasking features for the OS.

https://www.google.com/amp/s/www.forbes.com/sites/erikkain/2013/09/20/sony-to-take-a-loss-on-playstation-4-sales/amp/

Now, if we take into account that the PS5 will cost $100 more and Sony is far better off financially than when they launched the PS4, I don't think it seems unrealistic.

#50 Posted by ronvalencia (26528 posts) -

@blackhairedhero said:

@GarGx1: Not sure about MS, but the PS4 took around a $60 hit on each console at launch. I'm curious whether Sony helped with the funding of Navi and what kind of deal they will get from AMD. It's hard to tell what's true and what's not, but most rumors say the processor will have 8 cores, although I'm sure it will be stripped down since consoles don't need all the multitasking features for the OS.

https://www.forbes.com/sites/erikkain/2013/09/20/sony-to-take-a-loss-on-playstation-4-sales/#49faeaa26f1d

Xbox chief marketing officer Yusuf Mehdi told GamesIndustry International that Microsoft is "looking to break even or low margin at worst."

https://www.eurogamer.net/articles/2013-09-20-sony-expects-to-recoup-playstation-4-hardware-loss-at-launch

Eurogamer has heard from well-placed sources that Sony expects to make an approximate $60 loss per $399 unit sold. When presented with the figure, Ito denied it.