A Look Into Next Gen: AMD CES Keynote Thread

#1 Posted by GoldenElementXL (2550 posts) -

New CPU and GPU news is expected.
Phil Spencer is in the house as well!

#2 Posted by Pedro (32721 posts) -

Radeon VII has been announced.

#3 Posted by GoldenElementXL (2550 posts) -

#4 Posted by R4gn4r0k (29827 posts) -

Ryzen 3000, Zen 2 please

#5 Edited by Telekill (8241 posts) -

So next gen is 25fps faster...?

Yay?

#6 Posted by ellos (1818 posts) -

What, Uncle Phil is there? Is that a hint that Microsoft will go first and announce the next Xbox at E3?

#7 Posted by GoldenElementXL (2550 posts) -

#8 Posted by DaVillain- (33768 posts) -

@ellos said:

What, Uncle Phil is there? Is that a hint that Microsoft will go first and announce the next Xbox at E3?

MS always attends CES because you know, Windows 10.

#9 Edited by ellos (1818 posts) -

@davillain-: I know MS and Sony attend CES for obvious reasons. I'm just going along with the spirit of the thread and next-gen consoles.

#10 Posted by Pedro (32721 posts) -

Priced equal to the RTX 2080. Bold. :)

#11 Posted by DaVillain- (33768 posts) -

@Pedro said:

Priced equal to the RTX 2080. Bold. :)

Performance between the RTX 2080 and RTX 2080 Ti in games. I find that intriguing.

#12 Posted by Pedro (32721 posts) -

16GB is very interesting. Would be a cheaper alternative for development.

#13 Edited by xantufrog (10362 posts) -

I've pretty much decided to go AMD with my next card anyway. This is encouraging. So sick of Nvidia and their bullshit. The problem is that historically they have genuinely offered excellent hardware, so they've deserved first place on the numbers alone... and so I, like many, go with them. But they're an awful company, and these days you can get something tolerably competitive from AMD in the mid-high range where I like to be. So why give Nvidia my money? Let them earn it back, I say.

#14 Posted by NeutrinoWorks (63 posts) -

Sounds like this thing will still require 285 watts, just like the Vega 64, to get the performance being shown...

This thing is a fail.

Navi is the savior we needed.

#15 Posted by XVision84 (14772 posts) -

Well damn, AMD ain't playing. It's about time!

#16 Posted by GoldenElementXL (2550 posts) -

So the Ryzen leaks were fake... Like I said...

#17 Edited by PC_Rocks (1602 posts) -

@neutrinoworks said:

Sounds like this thing will still require 285 watts, just like the Vega 64, to get the performance being shown...

This thing is a fail.

Navi is the savior we needed.

Don't know about "fail"; it's a potent card for the price range that goes toe to toe with the RTX 2080. But I'm thinking the wattage will be high compared to the RTX, since AMD didn't mention it, while they were quick to note it for Ryzen. It's crazy when you think about it, because this is on 7nm while RTX is essentially 16nm with improvements (I know Nvidia/TSMC call it 12nm).

Still, good to see the card has double the bandwidth and RAM.
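
As a sanity check on that "double the bandwidth" claim, here's a quick back-of-envelope sketch using public HBM2 figures (the per-pin data rates here are approximate):

```python
# Back-of-envelope HBM2 bandwidth: bus width (bits) * per-pin rate (Gb/s) / 8.
def hbm2_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

vega_64 = hbm2_bandwidth_gbs(2048, 1.89)    # 2 stacks: ~484 GB/s
radeon_vii = hbm2_bandwidth_gbs(4096, 2.0)  # 4 stacks: 1024 GB/s (~1 TB/s)
print(round(vega_64), round(radeon_vii))    # -> 484 1024
```

Doubling the stack count doubles the bus width, which is where the roughly 2x bandwidth comes from.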

#18 Posted by osan0 (15286 posts) -

Hmm. So is it basically a Vega 60 on 7nm with the clocks cranked up, or did they do more? On the one hand, I'm thinking of finally biting the bullet on a 4K screen (Asus have made a monitor that's nigh on perfect for me; I swear they're stalking my posting history :P), but I don't think my 4GB RX 580 is going to cut the mustard at 4K. Hmm... still might hold off.

#19 Posted by GarGx1 (10736 posts) -

I've never trusted AMD's pre-launch figures, and I'm not going to start now. I'll wait for actual third-party, unbiased benchmarks.

#20 Edited by GoldenElementXL (2550 posts) -

Wait, could they have benched the Ryzen 5 or an 8-core Ryzen 7 against the 9900K? That makes the 16-core rumor look a little more likely... Hmmmm

#21 Posted by XVision84 (14772 posts) -

@neutrinoworks: Is wattage that much of a problem though? Good quality high wattage PSUs are fairly cheap these days.

#22 Posted by Techhog89 (3351 posts) -
@XVision84 said:

@neutrinoworks: Is wattage that much of a problem though? Good quality high wattage PSUs are fairly cheap these days.

Less power also means less heat and potentially more overclocking headroom.

#23 Posted by xantufrog (10362 posts) -

shamelessly off-topic, but relevant to the people who would be interested in this thread:

https://www.gamespot.com/forums/pc-and-av-hardware-909394553/important-input-needed-merge-pcmaclinux-society-wi-33449818/

#24 Posted by Kali-B1rd (1715 posts) -

Let's just believe the vendor benchmarks... because those are ALWAYS unbiased! /s

#25 Edited by madrocketeer (5988 posts) -

So, another electron-guzzling miniature frying pan. And it's $700.

Meanwhile, I just found an RTX 2070 that's very attractively priced and that I know for sure will fit in my ITX case. I'm only holding off for now because it's from a brand (Zotac) I know nothing about...

Do better, AMD, or I'm gonna buy it.

#26 Posted by NeutrinoWorks (63 posts) -

@techhog89 said:
@XVision84 said:

@neutrinoworks: Is wattage that much of a problem though? Good quality high wattage PSUs are fairly cheap these days.

Less power also means less heat and potentially more overclocking headroom.

And easier to keep noise levels down
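
On the PSU question above: a common rule of thumb is to size the supply with comfortable headroom over peak system draw. A hypothetical sketch (the wattages and the 40% headroom figure are illustrative, not a recommendation):

```python
# Hypothetical PSU sizing: sum estimated peak draw, add headroom.
def recommended_psu_watts(gpu_w, cpu_w, other_w=75, headroom=0.4):
    """Suggested PSU rating with ~40% headroom over peak system draw."""
    peak_draw = gpu_w + cpu_w + other_w
    return round(peak_draw * (1 + headroom))

# A ~295 W card (Vega 64 class) paired with a 95 W CPU:
print(recommended_psu_watts(295, 95))  # -> 651
```

So a ~650 W unit covers a card in this power class; the headroom also keeps the PSU fan quieter under load.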

#27 Posted by xantufrog (10362 posts) -

@neutrinoworks: Although, honestly, my GTX 970 is one of the louder cards, and definitely the worst overclocker, I've owned. So it's not just about the chipset: the board manufacturer and binning are pretty critical to the end-user experience.

#28 Edited by DaVillain- (33768 posts) -

Honestly, this is even more disappointing than the RTX lineup, and the main issue with RTX cards is their prices. This Radeon VII is on the hyped-up 7nm process, yet it uses more wattage than the RTX 2080 and is priced at $699? If AMD had done that at $499, it would have been a real shot across Nvidia's bow. All in all, it's priced at $699 because of the 16GB of VRAM and the fact that it's better for video editing and "content creation"; looked at that way, the price is okay-ish. Everybody should have known AMD wasn't going to match the 2080 Ti in performance, unfortunately. Does it have HDMI 2.1 though? I still can't find that info, but I'm assuming no, which is total BS if it doesn't!

All that said, the Radeon VII is really not a bad compute card; the problem is that the Vega architecture just isn't optimized for gaming. AMD said OpenCL performance is improved by 62%, which is absolutely massive, since that would put it way ahead of the RTX 2080 Ti in a lot of OpenCL benchmarks.

At this point, it's best to wait for the tech YouTubers' benchmarks before making a final decision.

#29 Posted by Horgen (119146 posts) -

@davillain-: If it beats the RTX 2080 in games while being priced roughly the same, that's damn well done by AMD. Nvidia needs competition.

#30 Posted by Pedro (32721 posts) -

@davillain-: The main attraction is the content-creator application of the card. 16GB of RAM comes in handy when rendering. I would like to see the performance of Unity's GPU-accelerated lightmapper, since it runs on Radeon Rays.

#31 Posted by R4gn4r0k (29827 posts) -

Anyone who didn't buy those overpriced RTX cards and waited is being rewarded for their patience.

#32 Posted by Dagubot (1089 posts) -

This is underwhelming. If it would have been released for $499.99 then AMD would have been onto something. This is interesting now as next gen consoles are in the works; I see a bigger impact on the CPU side of things instead of the GPU to keep costs down on the consoles.

#33 Posted by cdragon_88 (1598 posts) -

Same performance as the 2080 and the same price? Welp, another card that doesn't pique my interest. Sticking with my good ol' 1070. But if it's basically the same as the 2080 and priced the same, what's the point? Unless you really, really want to jump ship to team Red. If I were upgrading, I'd rather just get the 2080.

#34 Posted by DaVillain- (33768 posts) -

@R4gn4r0k said:

Anyone who didn't buy those overpriced RTX cards and waited is being rewarded for their patience.

Or they could've waited for a better RTX mid-range card like the latest RTX 2060? I admit, I was really impressed by what I saw from the benchmarks, and for only $350, not too shabby.

@Pedro said:

@davillain-: The main attraction is the content-creator application of the card. 16GB of RAM comes in handy when rendering. I would like to see the performance of Unity's GPU-accelerated lightmapper, since it runs on Radeon Rays.

I'm sure the Radeon VII will do fine on its own. This announcement is a lot to take in, but the 16GB is the most interesting part.

@horgen said:

@davillain-: If it beats the RTX 2080 in games while being priced roughly the same, that's damn well done by AMD. Nvidia needs competition.

Totally. I'm sick of Nvidia having all the fun to themselves without competition. Intel had better come through with their own GPU in the future.

#35 Posted by XVision84 (14772 posts) -
@neutrinoworks said:
@techhog89 said:
@XVision84 said:

@neutrinoworks: Is wattage that much of a problem though? Good quality high wattage PSUs are fairly cheap these days.

Less power also means less heat and potentially more overclocking headroom.

And easier to keep noise levels down

That makes sense. Personally, I don't see those as deal-breakers, but I can see why they would be for others. Overclocking doesn't even get you much of an fps increase, heat is manageable with proper cooling (which isn't that expensive relative to your other PC parts), and I wear headphones, so PC noise doesn't affect me.

#36 Posted by Techhog89 (3351 posts) -
@XVision84 said:
@neutrinoworks said:
@techhog89 said:
@XVision84 said:

@neutrinoworks: Is wattage that much of a problem though? Good quality high wattage PSUs are fairly cheap these days.

Less power also means less heat and potentially more overclocking headroom.

And easier to keep noise levels down

That makes sense. Personally, I don't see those as deal-breakers, but I can see why they would be for others. Overclocking doesn't even get you much of an fps increase, heat is manageable with proper cooling (which isn't that expensive relative to your other PC parts), and I wear headphones, so PC noise doesn't affect me.

That extra headroom is also why a 16-core part might be possible.

#37 Posted by XVision84 (14772 posts) -

@techhog89: That would definitely be interesting to see :). Also looking forward to Navi and how all of this will influence next gen consoles. I'm still on board with ray tracing though and I hope AMD adopts it in the coming years.

#38 Posted by JasonOfA36 (1066 posts) -
@madrocketeer said:

So, another electron-guzzling miniature frying pan. And it's $700.

Meanwhile, I just found an RTX 2070 that's very attractively priced and that I know for sure will fit in my ITX case. I'm only holding off for now because it's from a brand (Zotac) I know nothing about...

Do better, AMD, or I'm gonna buy it.

Zotac's definitely a good brand. Not as good as EVGA, but their cards are good. Plus, they have a 5-year extended warranty, if that helps.

#39 Posted by JasonOfA36 (1066 posts) -
@davillain- said:

Honestly, this is even more disappointing than the RTX lineup, and the main issue with RTX cards is their prices. This Radeon VII is on the hyped-up 7nm process, yet it uses more wattage than the RTX 2080 and is priced at $699? If AMD had done that at $499, it would have been a real shot across Nvidia's bow. All in all, it's priced at $699 because of the 16GB of VRAM and the fact that it's better for video editing and "content creation"; looked at that way, the price is okay-ish. Everybody should have known AMD wasn't going to match the 2080 Ti in performance, unfortunately. Does it have HDMI 2.1 though? I still can't find that info, but I'm assuming no, which is total BS if it doesn't!

All that said, the Radeon VII is really not a bad compute card; the problem is that the Vega architecture just isn't optimized for gaming. AMD said OpenCL performance is improved by 62%, which is absolutely massive, since that would put it way ahead of the RTX 2080 Ti in a lot of OpenCL benchmarks.

At this point, it's best to wait for the tech YouTubers' benchmarks before making a final decision.

Maybe AMD will release a cut-down version with 12GB or 8GB of VRAM? Surely that would help with the price and be more competitive against the 2080, since some 2080s are going for $699 right now on Newegg.

But then again, AMD's drivers are surely better than Nvidia's now because of FineWine, and the Radeon VII will for sure beat the 2080 in the future.

#40 Posted by blaznwiipspman1 (7092 posts) -

Good news, but I'm waiting for news on Intel's new GPU. It's going to crush Nvidia, and I'll support Intel to do it.

#41 Edited by ronvalencia (26679 posts) -

@goldenelementxl: Rivaling RTX 2080 with VII is good enough. NVIDIA needs competition.

@davillain- said:

Honestly this even more somewhat disappointing than the RTX lineup, main issues with RTX cards are the prices. This Radeon VII is on the hyped up 7nm and yet, this uses more wattage than the RTX 2080 and price at $699? If AMD had did that for $499, that would have been a total blow across Nvidia bow. But all in all, it's priced at $699 because of the the 16GB Vram and the fact it's better for video editing and ''content creation'' but when you look at it that way, the price is okayish, everybody should have known AMD weren't going to match 2080Ti in performance unfortunately. Does it have HDMI 2.1 though? I still cannot find out the info but I'm assuming no which is total BS if it doesn't!

All that said and done, AMD Radeon VII is a really not somewhat bad for a compute card, but the problem is, the Vega architecture is just not optimized for gaming. AMD said the OpenCL performance is improved by 62% which is absolutely massive since it will put it way ahead of the RTX 2080Ti in a lot of OpenCL benchmarks.

At this point, it's best to wait for tech youtubers benchmarks for the final decision making.

According to the AnandTech site, the VII comes with 128 ROPs, hence in ROP read/write-bound situations the VII would behave differently from Vega 64.

The game benchmarks AMD showed at CES 2019 are heavy on the compute-shader/TMU read/write path; Battlefield V's 3D engine is well known for its software-based compute tile renderer.

The other bottleneck is rasterization, i.e. the fixed-function hardware.
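
For the ROP point, a back-of-envelope pixel-fillrate comparison (peak fillrate is roughly ROP count times core clock; the clocks here are illustrative, not confirmed specs):

```python
# Peak pixel fillrate in Gpixels/s = ROP count * core clock in GHz.
def peak_fillrate_gpix(rops, clock_ghz):
    return rops * clock_ghz

print(peak_fillrate_gpix(64, 1.55))  # Vega 64-class: ~99 Gpix/s
print(peak_fillrate_gpix(128, 1.8))  # the 128-ROP figure cited: ~230 Gpix/s
```

If the 128-ROP figure holds, ROP-bound workloads would see more than double the throughput of Vega 64, on top of the clock bump.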

#42 Posted by ronvalencia (26679 posts) -

@blaznwiipspman1 said:

Good news, but I'm waiting for news on Intel's new GPU. It's going to crush Nvidia, and I'll support Intel to do it.

It depends on IF Raja "Mr TFLOPS" Koduri learns from past mistakes, i.e. that TFLOPS don't capture classic GPU hardware performance.

#43 Posted by ronvalencia (26679 posts) -

@dagubot said:

This is underwhelming. If it would have been released for $499.99 then AMD would have been onto something. This is interesting now as next gen consoles are in the works; I see a bigger impact on the CPU side of things instead of the GPU to keep costs down on the consoles.

I would like to see a lower-cost VII SKU with 56 CUs at 1800MHz and 8GB of VRAM.

#44 Posted by Shewgenja (20941 posts) -

Does the VII support ray tracing as well?

#45 Posted by loco145 (12085 posts) -

@ronvalencia said:
@dagubot said:

This is underwhelming. If it would have been released for $499.99 then AMD would have been onto something. This is interesting now as next gen consoles are in the works; I see a bigger impact on the CPU side of things instead of the GPU to keep costs down on the consoles.

I would like to see a lower-cost VII SKU with 56 CUs at 1800MHz and 8GB of VRAM.

The Vega 56 already exists.

#46 Posted by organic_machine (9962 posts) -

So much for Navi.

Anyways... $699? Really? Yeesh.

Don't get me wrong, it's probably an amazing card. But am I willing to part with my R9 Fury for this? I dunno. $699 is a steep price for any card.

#47 Edited by ronvalencia (26679 posts) -

@loco145 said:
@ronvalencia said:
@dagubot said:

This is underwhelming. If it would have been released for $499.99 then AMD would have been onto something. This is interesting now as next gen consoles are in the works; I see a bigger impact on the CPU side of things instead of the GPU to keep costs down on the consoles.

I would like to see a lower-cost VII SKU with 56 CUs at 1800MHz and 8GB of VRAM.

The Vega 56 already exists.

Not with an 1800MHz clock speed and 128 ROPs. The higher clock speed improves the existing rasterization hardware.

#48 Posted by loco145 (12085 posts) -

So, the price and performance of a 2080 with significantly higher power consumption and without the tensor and ray tracing cores? Meh.

#49 Posted by ronvalencia (26679 posts) -

@loco145 said:

So, the price and performance of a 2080 with significantly higher power consumption and without the tensor and ray tracing cores? Meh.

The VII's AI instructions are integrated into the main compute units, like the Tegra X1's SMs.

#50 Posted by loco145 (12085 posts) -

@ronvalencia said:
@loco145 said:

So, the price and performance of a 2080 with significantly higher power consumption and without the tensor and ray tracing cores? Meh.

The VII's AI instructions are integrated into the main compute units, like the Tegra X1's SMs.

Which means that if it uses them for AI, they're unavailable for anything else.
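
A toy model of the tradeoff being described here: when AI work (e.g. FP16 dot products) runs on the same compute units as shading, shader throughput drops by whatever fraction the AI work consumes, whereas dedicated tensor cores leave shader throughput untouched. All numbers are illustrative only:

```python
# Toy model: shader FLOPS left over when AI work shares the compute units.
def shader_tflops_left(total_tflops, ai_fraction, dedicated_ai_units):
    """With dedicated units, shading keeps everything; shared units split."""
    if dedicated_ai_units:
        return total_tflops
    return total_tflops * (1 - ai_fraction)

# A ~13.8 TFLOPS part spending 25% of its cycles on AI:
print(round(shader_tflops_left(13.8, 0.25, False), 2))  # shared units -> 10.35
print(round(shader_tflops_left(13.8, 0.25, True), 2))   # dedicated units -> 13.8
```

That is the core of the objection: integrated AI instructions are cheaper in silicon, but any denoising or upscaling workload would compete directly with rendering for the same ALUs.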