Ryzen 7 3000 series leak + new line of GPUs

#51 Edited by ronvalencia (26541 posts) -

@04dcarraher said:
@davillain- said:
@adamosmaki said:
@blackhairedhero said:

@GarGx1: Rumors are saying $499. So they would take a small loss but not much. That's common for console makers in the first year.

That wouldn't be a small loss if you factor in the hard drive/Blu-ray/power supply/casing/motherboard. It's highly likely it will have an upper-midrange GPU, but the CPU is quite likely to be a lower-end part. I don't believe they will make the mistake of using an outdated CPU design like the one in the PS4, but I doubt it will be anything more than a lower-end Ryzen 3 (which would still be a decent choice).

If that's the case, I don't expect Sony to include PS4 BC if they want to keep the cost down as much as they can. The PS3's price at the time was high partly due to PS2/PS1 BC, which added more to the cost than the Blu-ray drive did.

And let's take into consideration that Sony will most likely take some ideas from Nintendo's Switch, because as it is, sales are down in Japan. It's hard to say what Sony will go for.

PS4 BC will be easy, as it's x86-based and doesn't have to be emulated the way previous consoles did.

Sony's PS4 BC issues are time-sensitive resource-sync issues, e.g. even simple BC across PS4 and PS4 Pro titles needs correct clock-cycle timings. The PS4 is NOT a PC when it comes to resource management. Sony has had more than five years to figure out a BC virtual machine for a non-symmetrically scaled hardware upgrade.

Typical console programming wasn't designed around PC-style clock-speed scaling such as boost modes or power-saving modes.

MS has a fancy virtual machine (VM) for XBO on X1X which can mimic XBO and XBO S behaviors. DirectX on XBO has fewer resource-sync timing issues, so XBO titles can be boosted with X1X's extra power without major problems. DirectX's resource management handles resource-sync issues, which gives MS more freedom with hardware upgrades.

The X1X dev kit has an extra retail-X1X VM mode, since the dev kit's GPU has all 44 CUs enabled.

Both the Vulkan and DirectX 12 APIs have to deal with PCs' clock-speed and timing differences.

#52 Edited by ronvalencia (26541 posts) -

@blackhairedhero said:

@GarGx1: Not sure about MS, but the PS4 took around a $60 hit on each console at launch. I'm curious whether Sony helped with the funding of Navi and what kind of deal they will get from AMD. It's hard to tell what's true and what's not, but most rumors say the processor will have 8 cores, although I'm sure it will be stripped down since consoles don't need all the multitasking features for the OS.

https://www.google.com/amp/s/www.forbes.com/sites/erikkain/2013/09/20/sony-to-take-a-loss-on-playstation-4-sales/amp/

Now, if we take into account that the PS5 will cost $100 more, and that Sony is far better off financially than when they launched the PS4, I don't think it seems unrealistic.

In terms of task switching, Jaguar wasn't cut down relative to the Core 2 design, i.e. it has comparable integer performance.

#54 Edited by ronvalencia (26541 posts) -

@DragonfireXZ95 said:
@GarGx1 said:

This may or may not be true but it's looking like the expectations for top end Navi cards could be accurate.

@blackhairedhero said:

So the PS5 would rock an RX 3080 and a Ryzen 5 3600, or something along those lines.

You can certainly hope so, but $430 for the GPU and CPU alone (I know Sony get a bulk-buyer discount) may be a bit rich if they still want to make a profit from their consoles. I'd be more inclined to expect a Navi 12 equivalent, at this point, in the next PlayStation's APU as it would keep the costs down.

You may be right though, only time will tell.

If anything, it'll be a Ryzen 3 severely underclocked. They can't put 6 cores/12 threads in a small box and expect it not to overheat unless they do some major downclocking.

That's 15 watts on battery power, with a 25-watt TDP cooling budget on wall power.

There's no Cinebench score degradation after multiple benchmark runs on an HP Envy x360 15z with a Ryzen 5 2500U. Low-end desktop versions have the worst-quality silicon with higher internal resistance.

My estimate is 6 CPU cores with clock speeds that range from 2GHz to 3GHz. 7nm+ has another 10 percent improvement over the 1st-gen 7nm process.

#55 Posted by ronvalencia (26541 posts) -

@goldenelementxl said:

@DragonfireXZ95: I was gonna post about this a little later, but figured it would go over people’s heads. Power draw and cooling are huge factors when designing a console. It’s like every console cycle people fall for the same shit. It’s really getting old.

And again, those specs are fake.

You are arguing that TSMC's 7nm is a "nothing burger" with zero improvements over GloFo 14nm/12nm.

#56 Edited by GoldenElementXL (2415 posts) -

@ronvalencia said:
@goldenelementxl said:

@DragonfireXZ95: I was gonna post about this a little later, but figured it would go over people’s heads. Power draw and cooling are huge factors when designing a console. It’s like every console cycle people fall for the same shit. It’s really getting old.

And again, those specs are fake.

You are arguing TSMC's 7nm is a "nothing burger" with zero improvements from GoFlo 14 nm/12nm.

1 - RAM prices. GDDR6 prices are around $25 per GB at cost. You expect me to believe AMD can put $100 of VRAM in a GPU and charge $129 total????? With no 6 pin power and rival a GTX 1060??????

2 - Those CPU specs. A Ryzen 3, with 6c/12t at a 4GHz boost at $99 vs 2018's top end Ryzen 5 2600X with 6c/12t, a 4.2 GHz boost at $229. Either those Ryzen 3 specs are off or the price is off.

Then I'm quite skeptical that those Ryzen "7" and "9" leaked chips would require new motherboards. The way AM4 is built, you can't have 12 and 16 cores...

3 - Where the hell does that leave Threadripper? You would get faster CPU's, minus the PCIE lanes for 1/2 the price? Yeah right!

4 - Those "G" APU chips. You can't be serious... Going from a 4c/4t and 4c/8t with 8/11 Vega CU's to 6c/12t and 8c/16t with 15/20 CU's is some fairy tale nonsense. Especially at $129 and $199. Again, either the specs are off or the prices are off.

I'm saying there are far too many holes in those numbers and prices. Especially the GPU's. That's easily fact-checkable. 4GB of GDDR6 in a $129 card? 8GB of GDDR6, ($200 at cost) in a $199 card? Wake the hell up...

#57 Edited by ronvalencia (26541 posts) -

@goldenelementxl said:
@ronvalencia said:
@goldenelementxl said:

@DragonfireXZ95: I was gonna post about this a little later, but figured it would go over people’s heads. Power draw and cooling are huge factors when designing a console. It’s like every console cycle people fall for the same shit. It’s really getting old.

And again, those specs are fake.

You are arguing TSMC's 7nm is a "nothing burger" with zero improvements from GoFlo 14 nm/12nm.

1 - RAM prices. GDDR6 prices are around $25 per GB at cost. You expect me to believe AMD can put $100 of VRAM in a GPU and charge $129 total????? With no 6 pin power and rival a GTX 1060??????

2 - Those CPU specs. A Ryzen 3, with 6c/12t at a 4GHz boost at $99 vs 2018's top end Ryzen 5 2600X with 6c/12t, a 4.2 GHz boost at $229. Either those Ryzen 3 specs are off or the price is off.

Then I'm quite skeptical that those Ryzen "7" and "9" leaked chips would require new motherboards. The way AM4 is built, you can't have 12 and 16 cores...

3 - Where the hell does that leave Threadripper? You would get faster CPU's, minus the PCIE lanes for 1/2 the price? Yeah right!

4 - Those "G" APU chips. You can't be serious... Going from a 4c/4t and 4c/8t with 8/11 Vega CU's to 6c/12t and 8c/16t with 15/20 CU's is some fairy tale nonsense. Especially at $129 and $199. Again, either the specs are off or the prices are off.

I'm saying there are far too many holes in those numbers and prices. Especially the GPU's. That's easily fact-checkable. 4GB of GDDR6 in a $129 card? 8GB of GDDR6, ($200 at cost) in a $199 card? Wake the hell up...

1. Those are July 2018 prices, inflated by cryptocurrency hype. Both the PS4 Pro and the RX 570 use GDDR5-7000. The PS5 depends on the RX 570's replacement with a Navi equivalent.

https://pcpartpicker.com/product/3JdFf7/msi-radeon-rx-570-8gb-armor-oc-video-card-rx-570-armor-8g-oc

MSI's RX 570 ARMOR 8G OC has 8 GDDR5-7000 chips and a $144.99 retail price tag. At your German link's price, that's $22.11 x 8 chips = $176.88 just for the memory chips!

There's something wrong with your German link's memory prices; that price list is debunked.

It's you who needs to wake the hell up. Your fact check is garbage.
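For what it's worth, here is that arithmetic as a small back-of-the-envelope sketch in C++. The per-chip figures are just the prices quoted in this thread; actual contract prices are negotiated directly with the memory makers and aren't public.

```cpp
#include <cstdio>

int main() {
    // Per-chip prices as quoted in this thread (assumed figures, not real contract prices).
    const int chips = 8;               // 8 x 8Gbit (1GB) GDDR5 chips = 8GB of VRAM
    const double list_price = 22.11;   // per chip, from the disputed German price list
    const double bulk_price = 3.95;    // per chip, from a bulk distributor listing
    const double card_retail = 144.99; // MSI RX 570 ARMOR 8G OC retail price cited above

    std::printf("Memory at list price: $%.2f (the whole card retails for $%.2f)\n",
                chips * list_price, card_retail);   // $176.88 vs $144.99
    std::printf("Memory at bulk listing price: $%.2f\n",
                chips * bulk_price);                // $31.60
    return 0;
}
```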

2. AM4 already supports 4 CPU cores + an 11-CU RX Vega iGPU, i.e. 15 SIMD processing cores on the existing 128-bit DDR4 interface. A Vega CU has higher floating-point intensity than its CPU-core equivalent, so the 11-CU Vega iGPU could be exchanged for more CPU cores.

Dual-channel DDR4-2400 already supports the Raven Ridge APU, and AM4's DDR4 memory speed reaches roughly DDR4-3200 depending on the motherboard revision.

https://www.anandtech.com/show/12946/gskills-dram-extremes-ddr4-4000-on-amd-ryzen

That Ryzen setup runs 128-bit DDR4-4000 (64 GB/s). For comparison, the XBO without using its 32 MB ESRAM has 68 GB/s of memory bandwidth.
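For reference, all the bandwidth figures being thrown around in this thread come from the same formula: bus width in bytes times data rate. A quick sketch using the configurations mentioned here (nominal chip ratings, not measured throughput):

```cpp
#include <cstdio>

// Peak theoretical bandwidth in GB/s = (bus width in bits / 8) * data rate in GT/s.
static double peak_bw_gbs(int bus_bits, double gt_per_s) {
    return bus_bits / 8.0 * gt_per_s;
}

int main() {
    std::printf("128-bit DDR4-4000            : %6.1f GB/s\n", peak_bw_gbs(128, 4.0));   // ~64
    std::printf("256-bit DDR3-2133 (XBO)      : %6.1f GB/s\n", peak_bw_gbs(256, 2.133)); // ~68
    std::printf("256-bit GDDR5-7000 (PS4 Pro) : %6.1f GB/s\n", peak_bw_gbs(256, 7.0));   // ~224
    std::printf("384-bit GDDR5 @ 6.8 (X1X)    : %6.1f GB/s\n", peak_bw_gbs(384, 6.8));   // ~326
    return 0;
}
```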

3.

7nm Epyc 2 has 64 CPU cores with 8 memory channels, targeting a similar TDP to Epyc 1.

12nm Threadripper 2 has 32 CPU cores with 4 memory channels and needs to be cost-reduced with Epyc 2's chiplet setup. Threadripper 1 is dead, and Threadripper 2 will be replaced by a 7nm version.

A 7nm Ryzen 9 could have 16 CPU cores with 2 memory channels; the proposed Ryzen 9 (with 16 CPU cores) is 1/4 of Epyc 2's setup.

4. Intel Kaby Lake-G already has a 24-CU Vega kitbash! A 15-to-20-CU iGPU in a 3rd-gen Ryzen APU would almost catch up to Kaby Lake-G's 24-CU Vega kitbash.

The Raven Ridge APU already has an 11-CU RX Vega (gfx9) iGPU and four CPU cores. Vega's ROPs have a GPU L2 cache connection and superior DCC compared to the A12 APU's solution.

Intel Kaby Lake-G already has Intel's "super glue" links, LOL.

#59 Posted by GoldenElementXL (2415 posts) -

@ronvalencia: What?

I would love to debate you here, but the majority of your response doesn’t seem like you are addressing anything I said...

The only thing that seems on topic is the RAM price debate. RAM is still expensive and the crypto shit is over. Just because those garbage AMD GPUs are cheap now doesn’t mean anything. They have to be priced that low or they would sell 0 each quarter.

GDDR6 has actually hurt Nvidia to this point because of Micron's production issues. And with AI and machine learning eating into the supply, Samsung will keep the prices up. The GDDR5 examples in the link I provided are high, but the GDDR6 prices are pretty damn close. So you comparing RX 570s that were manufactured over 20 months ago to cards coming out months from now isn't very relevant.

#60 Edited by ronvalencia (26541 posts) -

@goldenelementxl said:

@ronvalencia: What?

I would love to debate you here, but the majority of your response doesn’t seem like you are addressing anything I said...

The only thing that seems on topic is the RAM price debate. RAM is still expensive and the crypto shit is over. Just because those garbage AMD GPUs are cheap now doesn’t mean anything. They have to be priced that low or they would sell 0 each quarter.

GDDR6 has actually hurt Nvidia to this point because of Microns production issues. And with AI and machine learning eating into the supply, Samsung will keep the prices up. The GDDR5 examples in the link I provided are high, but the GDDR6 prices are pretty damn close. So you comparing RX570’s that were manufactured over 20 months ago to cards coming out months from now isn’t very relevant

https://www.made-in-china.com/showroom/emmalau0301/product-detailIXRxybwYAtkv/China-Gddr5-64mx32-2-0GHz-Dram-H5GQ2H24MFR-T2C-.html

US $3.95 / Piece

Digi-Key is a distributor and you can order (relatively) small batches.

GPU and graphics-card vendors are surely getting much better prices than this (they order directly from the memory makers).

#61 Posted by DrLostRib (4374 posts) -

This thread's been Ron-ed

#62 Edited by ronvalencia (26541 posts) -

@goldenelementxl said:

@ronvalencia: What?

I would love to debate you here, but the majority of your response doesn’t seem like you are addressing anything I said...

The only thing that seems on topic is the RAM price debate. RAM is still expensive and the crypto shit is over. Just because those garbage AMD GPUs are cheap now doesn’t mean anything. They have to be priced that low or they would sell 0 each quarter.

GDDR6 has actually hurt Nvidia to this point because of Microns production issues. And with AI and machine learning eating into the supply, Samsung will keep the prices up. The GDDR5 examples in the link I provided are high, but the GDDR6 prices are pretty damn close. So you comparing RX570’s that were manufactured over 20 months ago to cards coming out months from now isn’t very relevant

https://www.samsung.com/semiconductor/dram/gddr5/K4G80325FB-HC28/

Density: 8 Gbits

Speed: 7Gbps GDDR5 aka GDDR5-7000

https://www.alibaba.com/product-detail/GDDR5-256Kx32-28-IC-online-by_60807721800.html?spm=a2700.7724857.normalList.22.5da1719bUwpNMT

Part number: K4G80325FB-HC28

US $1.22 to $8.99 / Pieces

The price per piece depends on the bulk order size. Your German price list is a rip-off.

Note why the RX 570 can exist with a $144.99 retail price tag.

The RX 570 was recently renamed to an RX 580 with a clock-speed increase in China.

The X1X uses GDDR5-7000 chips on a 384-bit bus in a 6.8 Gbps setup (slightly down-clocked for extra reliability and a slightly lower TDP).

The PS4 Pro uses GDDR5-7000 chips on a 256-bit bus, plus 1 GB of DDR3.

RTX-series GPU dies are very large. Building graphics cards starts in China, NOT at some Digi-Key distributor.

#63 Posted by ronvalencia (26541 posts) -

@goldenelementxl said:

@ronvalencia: What?

I would love to debate you here, but the majority of your response doesn’t seem like you are addressing anything I said...

The only thing that seems on topic is the RAM price debate. RAM is still expensive and the crypto shit is over. Just because those garbage AMD GPUs are cheap now doesn’t mean anything. They have to be priced that low or they would sell 0 each quarter.

GDDR6 has actually hurt Nvidia to this point because of Microns production issues. And with AI and machine learning eating into the supply, Samsung will keep the prices up. The GDDR5 examples in the link I provided are high, but the GDDR6 prices are pretty damn close. So you comparing RX570’s that were manufactured over 20 months ago to cards coming out months from now isn’t very relevant

https://www.samsung.com/semiconductor/dram/gddr5/K4G80325FB-HC25/

Part No: K4G80325FB-HC25

Density: 8 Gbits

Speed: 8Gbps GDDR5 aka GDDR5-8000

https://www.alibaba.com/product-detail/GDDR5-256Kx32-28-IC-online-by_60807721800.html?spm=a2700.7724857.normalList.22.5da1719bUwpNMT

Part No: K4G80325FB-HC25

US $1.22-8.99 / Pieces

The price per piece is dependent on the bulk order size.

#64 Edited by Yams1980 (3040 posts) -

Damn yes, 7nm CPUs. And if those numbers are even slightly correct, I'll be buying at least two Ryzen CPUs come next year.

I've got a first-gen Ryzen 1700X in a PC I use for thread-heavy tasks. I'm not entirely sure if I can simply swap it out and toss in something like a 3800X, but I'm hoping I can and keep using the same motherboard.

If those clock speeds are accurate as well, their CPUs are going to be up there with Intel's IPC, maybe even passing it.

I still use an i7 4770K for gaming; maybe it's finally time to replace it with an AMD CPU if gaming performance is strong in the Ryzen 3000s.

#65 Edited by scatteh316 (9956 posts) -
@techhog89 said:
@Pedro said:

@techhog89:DXR

That's only part of it. AMD would need to create their own hardware solution and get devs to support it.

Either way, ray-tracing is overhyped. It'll be at least 10 years before it becomes mainstream since next-gen consoles won't support it. Nvidia is just doing this to make AMD look bad. They know it's not really ready yet.

No they don't..... DXR is built to run via GPU Compute shaders.... so any GPU that has that functionality will be able to run DXR...... i.e.... Any AMD and Nvidia GPU built over the last 10 years as I imagine the legendary 8800GTX should be able to run the code..... it won't be fast enough to use but it'll still be able to functionally run it. There's a video of a guy getting RT working in BFV on a 1080ti..... it's slow as shit and has artefacts but it's running the code.

Besides AMD had their RT orientated card out before Nvidia.

#66 Posted by Techhog89 (3109 posts) -

@scatteh316: It needs hardware to run at anything remotely acceptable. Same thing.

#67 Edited by rzxv04 (285 posts) -

Don't we still have 2-3 years for GDDR6 prices to go down at least by a bit?

Come to think of it...

For unified memory, would we be better off with something like 32 GB of GDDR5 rather than a mix of 16 GB GDDR6 + 8 GB DDR (24 GB total)? It ultimately depends on GDDR5 vs. GDDR6 prices for a more accurate projection of the amount of RAM, though.

I wonder if GDDR6 prices will still be so bad by 2021, tops, that 64 GB of GDDR5 for unified memory would make more sense.

#68 Edited by scatteh316 (9956 posts) -
@techhog89 said:

@scatteh316: It needs hardware to run at anything remotely acceptable. Same thing.

And what do you think compute shaders are? Lmfao.... you said it's Nvidia patented and exclusive.... when it's not..... And to think so is idiotic.

It'll run perfectly fine and fast via compute........... and AMD has also typically had much better compute performance than Nvidia.

AMD's Radeon Rays (Released before RTX) already uses GPU Compute in the professional rendering sector and is performing just fine.

#69 Edited by GoldenElementXL (2415 posts) -

@ronvalencia: You replied to me 3 times describing GDDR5 prices when the GPUs in question are using GDDR6... I said my source's GDDR5 prices were off while giving reasons why GDDR6 will still be pricey. Yet here you are, researching GDDR5 prices.

#70 Posted by DragonfireXZ95 (24740 posts) -
@scatteh316 said:
@techhog89 said:
@Pedro said:

@techhog89:DXR

That's only part of it. AMD would need to create their own hardware solution and get devs to support it.

Either way, ray-tracing is overhyped. It'll be at least 10 years before it becomes mainstream since next-gen consoles won't support it. Nvidia is just doing this to make AMD look bad. They know it's not really ready yet.

No they don't..... DXR is built it run via GPU Compute shaders.... so any GPU that has that functionality will be able to run DXR...... i.e.... Any AMD and Nvidia GPU built over the last 10 years as I imagine the legendary 8800GTX should be able to run the code..... it won't be fast enough to use but it'll still be able to functionally run it. There's a video of a guy getting RT working in BFV on a 1080ti..... it's slow as shit and has artefacts but it's running the code.

Besides AMD had their RT orientated card out before Nvidia.

Where'd you see a video of it working on the 1080 Ti? The only video I found didn't have it working at all. It said it was enabled, but wasn't working, which is basically just a bug in the software. They also had to use a 2080 Ti to even get it to turn on. Your argument is based on nothing unless you link said video.

#71 Posted by jahnee (3863 posts) -

The leaks are true. The RX 3800 has a TDP of 150W; when the RX 580 released, it had a TDP of 185W. It is now safe to assume that next-generation consoles will at the very least have the power of an RX 3800 (GTX 1080) alongside extra features (perhaps HBM2 memory?).

#72 Posted by scatteh316 (9956 posts) -
@DragonfireXZ95 said:
@scatteh316 said:
@techhog89 said:
@Pedro said:

@techhog89:DXR

That's only part of it. AMD would need to create their own hardware solution and get devs to support it.

Either way, ray-tracing is overhyped. It'll be at least 10 years before it becomes mainstream since next-gen consoles won't support it. Nvidia is just doing this to make AMD look bad. They know it's not really ready yet.

No they don't..... DXR is built it run via GPU Compute shaders.... so any GPU that has that functionality will be able to run DXR...... i.e.... Any AMD and Nvidia GPU built over the last 10 years as I imagine the legendary 8800GTX should be able to run the code..... it won't be fast enough to use but it'll still be able to functionally run it. There's a video of a guy getting RT working in BFV on a 1080ti..... it's slow as shit and has artefacts but it's running the code.

Besides AMD had their RT orientated card out before Nvidia.

Where'd you see a video of it working on the 1080 Ti? The only video I found didn't have it working at all. It said it was enabled, but wasn't working, which is basically just a bug in the software. They also had to use a 2080 Ti to even get it to turn on. Your argument is based on nothing unless you link said video.

My argument? There is no fucking argument.... DXR uses GPU compute..... the same compute we've had on GPUs and consoles for years....

It's quite public information that's how DXR was designed to work....... So yea.. my 'argument' is based on public knowledge..

/smh

#73 Edited by ronvalencia (26541 posts) -

@goldenelementxl said:

@ronvalencia: You replied to me 3 times describing GDDR5 prices when the GPUs in question are using GDDR6... I said my sources GDDR5 prices were off while giving reasons why GDDR6 will still be pricey. Yet here you are, researching GDDR5 prices.

You attempted to dismiss my RX 570 argument, i.e. your "So you comparing RX570's that were manufactured over 20 months ago to cards coming out months from now isn't very relevant" statement.

Your German list's price scaling from GDDR5 to GDDR6 can be applied to the Chinese GDDR5 cost to estimate the Chinese GDDR6 cost.

#74 Edited by ronvalencia (26541 posts) -

@scatteh316 said:
@techhog89 said:
@Pedro said:

@techhog89:DXR

That's only part of it. AMD would need to create their own hardware solution and get devs to support it.

Either way, ray-tracing is overhyped. It'll be at least 10 years before it becomes mainstream since next-gen consoles won't support it. Nvidia is just doing this to make AMD look bad. They know it's not really ready yet.

No they don't..... DXR is built to run via GPU Compute shaders.... so any GPU that has that functionality will be able to run DXR...... i.e.... Any AMD and Nvidia GPU built over the last 10 years as I imagine the legendary 8800GTX should be able to run the code..... it won't be fast enough to use but it'll still be able to functionally run it. There's a video of a guy getting RT working in BFV on a 1080ti..... it's slow as shit and has artefacts but it's running the code.

Besides AMD had their RT orientated card out before Nvidia.

The DXR compute path is the fallback mode, i.e. a software acceleration path instead of the fixed-function hardware acceleration path.

RTX provides the fixed hardware acceleration path.

For a fixed hardware acceleration method, AMD needs to implement the shader software code as complex hardware logic (which runs contrary to RISC design ideology, i.e. simple instructions). Like fixed-function hardware rasterization, extra fixed-function graphics hardware is what separates a GPU from a DSP.

GPUs follow an extreme CISC ideology, i.e. instructions backed by complex hardware logic for fixed graphics functions.

The 8800 GTX wouldn't be able to run the fallback method, since it requires Shader Model 6 (which in turn requires certain Shader Model 5.x-class hardware).

AMD can probably prototype a BVH search engine on an FPGA before implementing it in ASIC (application-specific integrated circuit) hardware. AMD is pretty good at cloning somebody else's hardware, and AMD's CPU department can help RTG convert software into an ASIC. It needs powerful branching hardware with decision-making logic.
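To make the hardware-vs-fallback distinction concrete: on PC, an application can ask Direct3D 12 whether a driver/hardware DXR path is exposed at all, and otherwise it has to bring its own compute-shader path (for example Microsoft's open-source fallback layer). Below is a minimal sketch of that capability query; it assumes Windows 10 with a DXR-era SDK and linking against d3d12.lib, and is only an illustration, not anyone's shipping code.

```cpp
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a D3D12 device on the default adapter.
    ComPtr<ID3D12Device5> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No DXR-era D3D12 device available on this system.\n");
        return 1;
    }

    // DXR support is reported through the OPTIONS5 feature structure.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5)))) {
        std::printf("Feature query failed.\n");
        return 1;
    }

    if (opts5.RaytracingTier == D3D12_RAYTRACING_TIER_NOT_SUPPORTED) {
        // No driver/hardware DXR path: an app would need its own compute-based
        // ray tracing (or the fallback layer) to run DXR-style workloads at all.
        std::printf("DXR is not exposed by this device/driver.\n");
    } else {
        std::printf("DXR supported (tier value %d).\n",
                    static_cast<int>(opts5.RaytracingTier));
    }
    return 0;
}
```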

#75 Posted by ronvalencia (26541 posts) -

@scatteh316 said:
@techhog89 said:

@scatteh316: It needs hardware to run at anything remotely acceptable. Same thing.

And what do you think compute shaders are? Lmfao.... you said it's Nvidia patented and exclusive.... when it's not..... And to think so is idiotic.

It'll run perfectly fine and fast via compute........... and AMD has also typically had much better compute performance then Nvidia.

AMD's Radeon Rays (Released before RTX) already uses GPU Compute in the professional rendering sector and is performing just fine.

AMD's Radeon Rays is missing the AI denoising tricks used to speed up real-time ray tracing. AI denoising could be done with Vega's AI instructions.

#76 Posted by Dark_sageX (3261 posts) -

I'm VERY interested in that RX 3080. I couldn't care less about ray tracing; I want higher fps at higher resolution (4K), and I would have turned off ray tracing even if it were available. A GPU that's just shy of a GTX 1080 in performance, but at a $250 price point? All I have to say to that is screw Nvidia! Gonna jump to team red next year!

#77 Posted by DragonfireXZ95 (24740 posts) -

@scatteh316 said:
@DragonfireXZ95 said:
@scatteh316 said:
@techhog89 said:
@Pedro said:

@techhog89:DXR

That's only part of it. AMD would need to create their own hardware solution and get devs to support it.

Either way, ray-tracing is overhyped. It'll be at least 10 years before it becomes mainstream since next-gen consoles won't support it. Nvidia is just doing this to make AMD look bad. They know it's not really ready yet.

No they don't..... DXR is built it run via GPU Compute shaders.... so any GPU that has that functionality will be able to run DXR...... i.e.... Any AMD and Nvidia GPU built over the last 10 years as I imagine the legendary 8800GTX should be able to run the code..... it won't be fast enough to use but it'll still be able to functionally run it. There's a video of a guy getting RT working in BFV on a 1080ti..... it's slow as shit and has artefacts but it's running the code.

Besides AMD had their RT orientated card out before Nvidia.

Where'd you see a video of it working on the 1080 Ti? The only video I found didn't have it working at all. It said it was enabled, but wasn't working, which is basically just a bug in the software. They also had to use a 2080 Ti to even get it to turn on. Your argument is based on nothing unless you link said video.

My argument? There is no fucking argument.... DXR uses GPU compute..... the same GPU we've had on GPU's and consoles for years....

It's quite public information that's how DXR was designed to work....... So yea.. my 'argument' is based on public knowledge..

/smh

Because it never ran successfully on anything besides RTX cards. That's why.

#78 Posted by PC_Rocks (1443 posts) -

@DragonfireXZ95:

Do not feed the trolls. This is someone desperate enough to argue the equivalent of "HW T&L isn't needed because SW T&L also works", just without acceptable performance.

#79 Posted by ronvalencia (26541 posts) -

@Gatygun:

https://www.techpowerup.com/250412/amd-hired-agency-in-south-korea-teases-amd-ryzen-7-3700x-ryzen-5-3600x

The tease in question was posted by an AMD-contracted Sales agency in South Korea, which launched a campaign inviting users to guess Cinebench scores for upcoming AMD processors: namely, the Ryzen 7 3700X and Ryzen 5 3600X - thus confirming the nomenclature for AMD's upcoming CPUs

#80 Posted by DragonfireXZ95 (24740 posts) -

@pc_rocks said:

@DragonfireXZ95:

Do not feed the trolls. When someone is desperate enough to argue equivalent to HW T&L is not needed because SW T&L also works without acceptable performance.

I know, he's always full of shit. I just like to call said trolls out on their BS and see them either never respond or try to deflect.

#81 Posted by ronvalencia (26541 posts) -

@pc_rocks said:

@DragonfireXZ95:

Do not feed the trolls. When someone is desperate enough to argue equivalent to HW T&L is not needed because SW T&L also works without acceptable performance.

It's not the same scenario, since both methods are still accelerated by the GPU rather than run on the CPU. A software path on the GPU is better than a CPU version.

#82 Posted by kweeni (11368 posts) -

Man, please let those GPU leaks be true... I've been wanting to switch to team red for quite some time because of my FreeSync monitor, but there is nothing worth switching to right now.

#83 Posted by gamecubepad (7910 posts) -

@jahnee:

Are you assuming 2019 launch, or 2020? Also, $400 or $500 price point?

I'm assuming 2020 with a $399 launch that gets you an 8-core Ryzen, 16GB RAM, a 2TB HDD, and UHD BD. I don't believe they'd squeeze a $250 GPU into that budget. That 3070 sounds about right, probably somewhere between a 1070 and a 1070 Ti in power.

#84 Posted by HalcyonScarlet (8234 posts) -

It's crazy that I could build a PC with a CPU that more than suits my needs for $129, and it's only a 65-watt TDP. Intel should be ashamed of themselves. For so long they kept the market at 4 cores and 8 threads and made us pay an absolute premium for it. Those entry-level SKUs would start from £250 from Intel, I think.

#85 Posted by GoldenElementXL (2415 posts) -

@HalcyonScarlet: Hold on to your excitement. This leak sounds too good to be true.

#86 Posted by rzxv04 (285 posts) -

@goldenelementxl said:

@HalcyonScarlet: Hold on to your excitement. This leak sounds too good to be true.

I have a feeling it is too good to be true but who knows.

It might not be smart for long-term competition to leapfrog someone by that much, especially since Intel and NV have so much more capital for R&D, unless AMD has some monstrous cards still hidden for the coming decade.

Still, I would love to see the whole ATI HD 4870 situation happen again, where the competition's pricing got destroyed, in either the CPU or GPU market.

#87 Edited by ronvalencia (26541 posts) -

@goldenelementxl said:

@HalcyonScarlet: Hold on to your excitement. This leak sounds too good to be true.

The 14nm to 7nm jump is like the 28nm to 14nm jump.

The leak is consistent with Ryzen 9's 16 CPU cores and dual-channel memory controller being 1/4 of Epyc 2 (7nm, 8 memory channels, 64 CPU cores) and 1/2 of Threadripper 2 (12nm, 4 memory channels, 32 CPU cores).

#88 Posted by Apacheguns515 (53 posts) -

@HalcyonScarlet: Intel kept a stranglehold on the market pricing because they could. It's not really the fault of a business for making the absolute most money they can from consumers, it's the consumers fault. How much something is worth is purely based on how much people are willing to pay for it and if people think something costs too much but buy it anyway then no company on Earth is going to deliberately drop their prices out of the goodness of their hearts.

Same with Nvidia. These leak prices and specs are amazing and if they prove true then this is exactly what the market needs to regulate the pricing of Intel and Nvidia products. AMD needs to be an actual viable alternative as they've become with Ryzen instead of being the "cheap" knockoff brand that you only buy because you can't afford Intel or Nvidia.

I've been an Intel/Nvidia guy for over 20 years, and a few weeks ago, for the first time in my life, I swapped out my Intel CPU for a Ryzen build. Not because I couldn't afford the new Intel chips, but because I believe as a consumer that Ryzen pricing is what chips "should" cost and Intel is simply overpriced. I'll stick with Nvidia GPUs for now though, unless AMD does to their GPUs what they did with their CPUs in regards to Ryzen, which it sounds like they may be doing with Navi.

TLDR: Intel and Nvidia cost so much because people buy them at those prices. Consumers regulate the pricing of products, not the businesses.

#89 Edited by DaVillain- (33063 posts) -

Looking back, that Ryzen 9 3800X for only $450 has now piqued my interest, and if I can get all cores to at least 4.3GHz or 4.5GHz, I'll be happy with that. Since this is all coming from AdoredTV, who have been right about RTX spec leaks, I'm fairly confident this is legit, except for those GPU names, which I don't believe are real. In any case, if all this turns out to be true, it's gonna destroy Intel big time. I'm running a 2700X now, but I'd be down to upgrade to the Ryzen 9 level if this is true; of course, we'll have to wait and see at the CES event in January 2019.

#90 Posted by Apacheguns515 (53 posts) -
@rzxv04 said:
@goldenelementxl said:

@HalcyonScarlet: Hold on to your excitement. This leak sounds too good to be true.

I have a feeling it is too good to be true but who knows.

It might not be smart for long term competition to leapfrog someone so much specially since Intel and NV have so much more capital for R&D unless AMD has some monstrous cards still hidden for the coming decade.

Still, I would love to see the whole ATI HD4870 happening again that destroyed pricing of competition. Either CPU or GPU markets.

Sounds to me like AMD is willing to gamble and take an initial loss to become competitive in the GPU market in the hopes that their more consumer friendly pricing will result in more overall sales at a lower price point.

This is actually a very good thing for the market and, if successful, could greatly influence market pricing in the long run. If these Navi cards are successful then it will set a new precedent for the GPU market to slash prices dramatically. Intel and Nvidia will probably continue on course with marketing their products as "the best" and charging premium prices for them, but their mid-tier products may have to take a hefty price cut to remain competitive.

Intel and Nvidia get away with this so often because they tap into the mentality of so many PC gamers. We like bragging, and we like saying we have the "best" hardware. Intel and Nvidia ARE the "best" hardware manufacturers out there, and whether their products are even actually worth what they charge is irrelevant. They are the best, and many people buy them because they enjoy being able to say they did.

My buddy at work is one of those people. His number 1 game is Roller Coaster Tycoon.....literally. He came into my office last week telling me how excited he was that his new RTX 2080 Ti came in the mail yesterday. The guy seriously plays Roller Coaster Tycoon on a 34" 1440p widescreen with an RTX 2080 Ti.....People like him are who Nvidia and Intel target, and it's people like him who console folks make fun of.

Every business on Earth sells products with a hefty markup obviously to make a profit, can't blame any of them for it. If people want to spend $1300 on a GPU to brag about having top of the line hardware and play Steam indie games all day then that's their business. But if people are tired of paying mortgage prices for graphics cards then we need to support competition within the market.

#91 Edited by HalcyonScarlet (8234 posts) -

@apacheguns515 said:

@HalcyonScarlet: Intel kept a stranglehold on the market pricing because they could. It's not really the fault of a business for making the absolute most money they can from consumers, it's the consumers fault. How much something is worth is purely based on how much people are willing to pay for it and if people think something costs too much but buy it anyway then no company on Earth is going to deliberately drop their prices out of the goodness of their hearts.

Same with Nvidia. These leak prices and specs are amazing and if they prove true then this is exactly what the market needs to regulate the pricing of Intel and Nvidia products. AMD needs to be an actual viable alternative as they've become with Ryzen instead of being the "cheap" knockoff brand that you only buy because you can't afford Intel or Nvidia.

I've been an Intel/Nvidia guy for over 20 years and a few weeks ago for the first time in my life I swapped out my Intel CPU for a Ryzen build. Not because I couldn't afford the new Intel chips, but because I believe as a consumer that Ryzen pricing is what chips "should" cost and Intel is simply overpriced. I'll stick with Nvidia GPU's for now though unless AMD does to their GPU's what they did with their CPU's in regards to Ryzen, which is sounds like they may be doing with Navi.

TLDR: Intel and Nvidia cost so much because people buy them at those prices. Consumers regulate the pricing of products, not the businesses.

The pricing is one thing, but they kept CPUs with more than 4 cores well out of reach and it would really have benefited certain people to have more than 4 cores. For a really long time they kept the i3 2/4 i5 4/4 and the i7 4/8. I believe they'd been doing 6 cores for a while, but it cost a grand.

It also was not about what people were willing to pay, imo. When you have pretty much a monopoly, people don't have a choice about what to pay to get the performance they need. Not just the people who are gaming and have a choice about how they want to game, but people in music or film, for example, who could do with more cores or threads.

I agree with what you said. AMD need to change the way they market their products. Make me want it because of quality, not because it's cheaper than the competition. I've always said Nvidia make people excited to buy their products as a whole. With AMD, most of the time it's because it's the cheaper alternative. I wouldn't mind using AMD parts at all, but it would just be because it's the cheaper alternative rather than because I'd be excited to use them, and that needs to change. It feels like Skoda vs BMW. I remember an advert where Skoda said you could get one for £1 (obviously with subsequent payments). That's what it feels like to me with AMD vs Intel/Nvidia.

#92 Posted by Apacheguns515 (53 posts) -
@HalcyonScarlet said:
@apacheguns515 said:

@HalcyonScarlet: Intel kept a stranglehold on the market pricing because they could. It's not really the fault of a business for making the absolute most money they can from consumers, it's the consumers fault. How much something is worth is purely based on how much people are willing to pay for it and if people think something costs too much but buy it anyway then no company on Earth is going to deliberately drop their prices out of the goodness of their hearts.

Same with Nvidia. These leak prices and specs are amazing and if they prove true then this is exactly what the market needs to regulate the pricing of Intel and Nvidia products. AMD needs to be an actual viable alternative as they've become with Ryzen instead of being the "cheap" knockoff brand that you only buy because you can't afford Intel or Nvidia.

I've been an Intel/Nvidia guy for over 20 years and a few weeks ago for the first time in my life I swapped out my Intel CPU for a Ryzen build. Not because I couldn't afford the new Intel chips, but because I believe as a consumer that Ryzen pricing is what chips "should" cost and Intel is simply overpriced. I'll stick with Nvidia GPU's for now though unless AMD does to their GPU's what they did with their CPU's in regards to Ryzen, which is sounds like they may be doing with Navi.

TLDR: Intel and Nvidia cost so much because people buy them at those prices. Consumers regulate the pricing of products, not the businesses.

The pricing is one thing, but they kept CPUs with more than 4 cores well out of reach and it would really have benefited certain people to have more than 4 cores. For a really long time they kept the i3 2/4 i5 4/4 and the i7 4/8. I believe they'd been doing 6 cores for a while, but it cost a grand.

It also was not about what people were willing to pay imo. When you have pretty much the monopoly, people don't have a choice about what to pay to get the performance they need. Not just the people who are gaming and have the choice about how they want to game, but people for example in music or film that could do with with more cores or threads.

I agree with what you said. AMD need to change they way they market their products. Make me want it, because of quality, not because it's cheaper than the competition. I've always said, Nvidia make people excited to buy their products as a whole. AMD most of the time it's because it's the cheaper alternative. I wouldn't mind using AMD parts at all, but it would just be because it's the cheaper alternative rather than because I'd be excited to use them and that needs to change. It feels like Skoda vs BMW. I remember an advert where Skoda said you could get one for a £1 (obviously subsequent payments). That's what it feels like to me with AMD vs Intel/Nvidia.

And the people who would have benefited with more than 4 cores could either pay a premium for it or just accept not having more than 4 cores. Companies don't care what benefits "you" they care about what makes them money. When more than 4 cores became beneficial Intel simply said well if you really NEED it then it's available, and we are going to charge you an arm and a leg for it because some people HAD to have it. Messed up yes, but a good business decision, supply and demand.

People all over the internet are complaining about the RTX prices and Intels 9th gen prices vs performance. As of right now, we the consumers, have the ability to tell both Intel and Nvidia that their prices need to come down, we now have an actual viable competitor in the CPU market that is priced more fairly. Intel and Nvidia have long since passed the price point of diminishing return. Is a 8700k or 9700k or 9900k a better CPU than the Ryzen 2700x? Yes it is. Is the 8700k/9700k a $100 better CPU than the Ryzen? No. Is the 9900k a $200 better CPU than the Ryzen? No. For the actual real world performance difference the 8700k/9700k should cost around 20 or 30 dollars more than the Ryzen 2700x. Anything more than that is diminishing return. And for the prices Intel charges every single one of their processors should come with a quality stock cooler. The Ryzen 2700x is actually only about $230 USD, they also just include the $70 Wraith Prism with the bundle. Intel charges $350+ for the 8700k or $400+ for the 9700k for the CPU alone.

The performance difference between a 2700x and a 9700k is simply not $170 USD worth of difference, however, Intel is simply not going to ever stop charging those types of prices until it no longer becomes profitable for them to do so. What consumers need to do if they want to see prices decrease is to make it clear what we are going to deem "acceptable" pricing going forward.

In layman's terms, if people want Intel and Nvidia to drop their prices then they need to quit buying overpriced Intel and Nvidia stuff. As a business neither Intel nor Nvidia gives one ounce of a damn that consumers think their products are overpriced as long as they are selling enough of them. It's the same reason why Apple is able to get away with charging hilarious amounts of money for their products. Are Apple products good? Yes they actually are quality products. Are they worth the prices that Apple charges for them? Hell no, not even close.

#93 Posted by GoldenElementXL (2415 posts) -
[Embedded video]

He pokes holes in every aspect of the "leak."

The GPU leak is beyond stupid and I'm not sure why any of you believe it.

#94 Posted by DaVillain- (33063 posts) -

@goldenelementxl: I never stated that I believe they are legit, but AdoredTV was spot on with the RTX leaks before Nvidia's reveal. At this point, it's anyone's guess whether this is legit. Also, AMD will have their CES keynote this January 2019, and we'll find out if everything from AdoredTV is legit. But I will say that I personally don't believe those GPU names are for real; those names are shady and I can't see AMD going for 30XX naming.

#95 Posted by GoldenElementXL (2415 posts) -

@davillain- said:

@goldenelementxl: I never stated I believe they are legit, but AdoredTV was spot on with the RTX leaks before Nvidia reveal. At this point, it's anyone's guess if this is legit. Also, AMD will have their CES this January 2019 and we'll find out if everything from AdoredTV is legit. But i will say that I personally don't believe those GPU names are for real, those names are shady and I can't see AMD going for 30XX naming.

The video explains that the recent AMD reveals were during "Tech Days" that were scheduled by AMD months in advance. No one from the industry has any word or invites regarding any such event this year at CES. The only time AMD didn't do one was well in advance of the Threadripper 2 launch where only vague details were shared during the keynote. So if AMD does announce new hardware at their CES keynote, it's not likely to be this. The video also points out how strange it is to have prices and clock speeds for the hardware when in the past, AMD didn't lock these details down until days to a week before launch. Also, the video goes into detail regarding the launch of the RX 590 and how it kinda proves the GPU leaks to be bogus. AMD launching a GPU with twice the performance at a lower price just months after the 590 launch?

Is it possible that AdoredTV got lucky with the RTX stuff? This AMD leak just seems bonkers.

#96 Posted by X_CAPCOM_X (8279 posts) -

@goldenelementxl: https://www.forbes.com/sites/antonyleather/2018/12/10/amds-disruptive-ryzen-3000-series-more-evidence-it-exists/#56a0c2f44cdb

Check this out. It may be real after all. My next system will be all AMD if this is the case. Actually, I'm in for one of those on launch if these are true.

#97 Posted by GoldenElementXL (2415 posts) -

@X_CAPCOM_X said:

@goldenelementxl: https://www.forbes.com/sites/antonyleather/2018/12/10/amds-disruptive-ryzen-3000-series-more-evidence-it-exists/#56a0c2f44cdb

Check this out. It may be real after all. My next system will be all AMD if this is the case. Actually, I'm in for one of those on launch if these are true.

The only new information from your link is: "A leaked slide from a Korean quiz supposedly asks what readers' estimates are for Cinebench scores for two 3000-series CPUs." Other than that slide (which only mentions 2 of the CPUs by name, with no prices or specs), that article is pure clickbait. It's basically a "Remember that leak? I will keep you updated on all the latest news and offer my speculation regarding the new AMD hardware. Keep it locked on Forbes.com" article.

#98 Posted by rzxv04 (285 posts) -

@apacheguns515 said:
@rzxv04 said:
@goldenelementxl said:

@HalcyonScarlet: Hold on to your excitement. This leak sounds too good to be true.

I have a feeling it is too good to be true but who knows.

It might not be smart for long term competition to leapfrog someone so much specially since Intel and NV have so much more capital for R&D unless AMD has some monstrous cards still hidden for the coming decade.

Still, I would love to see the whole ATI HD4870 happening again that destroyed pricing of competition. Either CPU or GPU markets.

Sounds to me like AMD is willing to gamble and take an initial loss to become competitive in the GPU market in the hopes that their more consumer friendly pricing will result in more overall sales at a lower price point.

This is actually a very good thing for the market and if successful could greatly influence market pricing in the long run. If these Navi cards are successful then it will set a new precedent for the GPU market to slash prices dramatically. Intel and Nvidia will probably continue on course with marketing their products as "the best" and charging premium prices for them, but their mid tier products may have to take a heft price cut to remain competitive.

Intel and Nvidia get away with this so often because they tap into the mentality of so many PC gamers. We like bragging, and we like saying we have the "best" hardware. Intel and Nvidia ARE the "best" hardware manufacturers out there and whether their products are even actually worth what they charge is irrelevant. The are the best, and many people buy them because they enjoy being able to say they did.

My buddy at work is one of those people. His number 1 game is Roller Coaster Tycoon.....literally. He came in to my office last week telling me how excited he was that his new RTX 2080ti came in the mail yesterday. The guy seriously plays Roller Coaster tycoon on a 34' 1440p widescreen with an RTX 2080ti.....People like him are who Nvidia and Intel target and it's people like him who console folks make fun of.

Every business on Earth sells products with a hefty markup obviously to make a profit, can't blame any of them for it. If people want to spend $1300 on a GPU to brag about having top of the line hardware and play Steam indie games all day then that's their business. But if people are tired of paying mortgage prices for graphics cards then we need to support competition within the market.

I agree. Marketing plus a market-leading product is a crazy combination... even if some people just play 2D MMORPGs, LMFAO. It happens!

#99 Edited by ronvalencia (26541 posts) -

@goldenelementxl said:

He pokes holes in every aspect of the "leak."

The GPU leak is beyond stupid and I'm not sure why any of you believe it.

5.1GHz is a boost-mode clock speed with only a few CPU cores active.

Going from 4.2GHz to 5.1GHz is about a 22 percent increase.

https://www.anandtech.com/show/12677/tsmc-kicks-off-volume-production-of-7nm-chips

The 7 nm node is a big deal for the foundry industry in general and TSMC in particular. When compared to the CLN16FF+ technology (TSMC’s most widely used FinFET process technology) the CLN7FF will enable chip designers to shrink their die sizes by 70% (at the same transistor count), drop power consumption by 60%, or increase frequency by 30% (at the same complexity). So far, TSMC has taped out 18 customer products using the CLN7FF technology, more than 50 CLN7FF products will be taped out by the end of 2018.
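As a quick sanity check on those two numbers (using only the figures quoted above, nothing more), the rumoured boost clock implies an uplift that sits inside TSMC's quoted ~30 percent frequency headroom:

```cpp
#include <cstdio>

int main() {
    const double base_ghz   = 4.2;  // current boost clock cited above
    const double leaked_ghz = 5.1;  // rumoured 7nm boost clock from the leak
    const double node_gain  = 0.30; // TSMC's quoted frequency gain for CLN7FF vs CLN16FF+

    const double uplift = leaked_ghz / base_ghz - 1.0;  // ~0.214
    std::printf("Implied clock uplift: %.1f%% (quoted node headroom: %.0f%%)\n",
                uplift * 100.0, node_gain * 100.0);
    return 0;
}
```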

#100 Posted by BlueArchan (58 posts) -

This is the same processor that is rumored to be installed in the upcoming Chinese console, i.e. the Subor. Can't wait to see if all of this turns out to be true.