NUSNA_Moebius' forum posts

#1  Edited By NUSNA_Moebius

One of the biggest issues with the AMD-ATi buyout was the lack of cohesion between AMD and former ATi staff due to differences in social and office culture. The Fusion project took far longer than it should have, and steadily dropping market share in both CPUs and GPUs shows that AMD has taken too many steps in the wrong direction.

AMD's graphics division has mostly done fine over the past nine years, with some great products like the 4000, 5000, and 7000 series. However, AMD's CPU division is dragging the graphics division down with it, thanks to Derp-dozer and AMD's ineptness at marketing its products. There's also the problem of consumer demand: graphics card sales have fallen quite a bit, likely in large part because the consoles weren't bleeding edge compared to what was available on PC at the time of their launch. No one with a high-end graphics card from the past couple of years needs to upgrade, since most games are multiplatform titles built with console limitations in mind, and even three-year-old GPUs like the 7900s poop all over the PS4 and Xbone in terms of performance (they are GCN based, after all). The consoles' visuals might improve over the generation, but anyone with a Radeon 7970, R9 290, R9 380/390, R9 Fury, GTX 780, GTX 980, etc. is already set for the entire generation unless they want to move to 4K/60 FPS.

What AMD does need to focus more on is reducing the price for performance, especially in the entry-level and mid-range gaming GPU categories, where the bulk of sales really is. Having the trophy card is a nice thing to rave about, but in the end AMD needs to cater to consumers in a logical manner. A 2000-SP GPU at 14 nm with HBM could make a big splash at the right price in the $200 market, and something like Pitcairn at 14 nm with HBM would be great for the $100-150 market.

Falling 4K display prices are likely to move high-end graphics processors, but the high-end GPUs need to be executed in a manner that makes them affordable, efficient, and a good upgrade for their price. The top-end model has no real business being more than $500, and I miss the days when you could get the "cut down" version of the top-dog GPU at $300.

#2 NUSNA_Moebius

@Shewgenja said:

All I can say is, as a native Texan, there is no group on Earth I'd like to **** up more than the cartels in Mexico. They are not only involved in the drug bullshit but also human trafficking, kidnapping, and all kinds of nasty stuff. I find the premise of this game very satisfying.

QFT

This game took me by surprise. Halfway through watching the Ubisoft live stream, I had pretty much figured it had to be a GR title, given the setting and the enemies involved.

#3 NUSNA_Moebius

It's not the first game to be stuck in development hell. While I would like a chance to play it like anyone else, it's one of those things where "it's done when it's done." Clearly Japan Studio got over-ambitious (are all the physically dynamic feathers really necessary?), because honestly this kind of game doesn't require PS4- or even PS3-level CPU and GPU power to look decent and play well, unless dynamic environmental physics are truly crucial to the gameplay itself.

The fact that they showed anything at all is promising, but I would've liked to see a disclaimer saying the footage was captured directly off a PS4 in real time.

#4 NUSNA_Moebius

The weak CPU is a possible culprit: not enough SIMD performance to handle the number of calculations a modern car sim needs, at least compared to the Xbone and PS4. I figured the Wii U version was using the same graphical assets as the cancelled PS3 and 360 versions, and I don't see why those would be a problem. Perhaps it's the memory system. The 32 MB of eDRAM is pretty large, but the main memory may be too slow to sufficiently feed the eDRAM, which I suspect is used as a texture and frame buffer.
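
For a rough sense of the gap, here's a back-of-envelope per-frame bandwidth budget in Python. The bandwidth figures are the commonly cited peak numbers (Wii U DDR3 around 12.8 GB/s, Xbox One DDR3 68 GB/s, PS4 GDDR5 176 GB/s), so treat them as assumptions rather than measurements, and note this ignores the eDRAM/ESRAM pools entirely:

```python
# Back-of-envelope: main-memory bandwidth available per frame at 30/60 FPS.
# Figures are commonly cited peak numbers, not measurements, and the
# embedded eDRAM/ESRAM pools are deliberately left out of the comparison.
MAIN_MEMORY_BANDWIDTH_GB_S = {
    "Wii U (DDR3)":    12.8,
    "Xbox One (DDR3)": 68.0,
    "PS4 (GDDR5)":     176.0,
}

for console, gb_s in MAIN_MEMORY_BANDWIDTH_GB_S.items():
    mb_per_frame_60 = gb_s / 60 * 1024   # MB of traffic the bus allows per 60 FPS frame
    mb_per_frame_30 = gb_s / 30 * 1024   # and per 30 FPS frame
    print(f"{console:17s} {mb_per_frame_60:7.0f} MB/frame @60   "
          f"{mb_per_frame_30:7.0f} MB/frame @30")
```

Whatever texture and geometry data has to stream through the 32 MB eDRAM each frame has to fit inside that Wii U budget, which is an order of magnitude smaller than the PS4's.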

#5  Edited By NUSNA_Moebius

I like DF's budget-system-vs-PS4-and-Xbone videos. They highlight console limitations while giving budget PC gaming more of the spotlight, when most people think PC gaming means spending oodles of cash. I think the biggest limitation for either console is the CPU, and you can visibly see how constrained they are in terms of NPC counts.

#6 NUSNA_Moebius

@Ben-Buja:

Steam says it requires 40 GB of HDD space. I'm not sure whether the download is compressed at all, so expect it to actually be a 40 GB download.

#7 NUSNA_Moebius

I guess you didn't notice the "4096 bit"? It's speculated that the Radeon 300 series might use stacked memory, which would make such a massive memory bus plausible.
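
A quick sanity check on why that width implies stacked memory: first-generation HBM runs each stack over a 1024-bit interface at a modest ~1 Gbps per pin, so four stacks would give exactly 4096 bits. A minimal Python sketch (the 512-bit/5 Gbps GDDR5 figures are the R9 290X's published specs; the HBM pin rate is the speculated one):

```python
def peak_bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s = bus width (bits) * per-pin rate (Gbit/s) / 8."""
    return bus_width_bits * gbps_per_pin / 8

print(peak_bandwidth_gb_s(512, 5.0))   # R9 290X, 512-bit GDDR5:           320.0 GB/s
print(peak_bandwidth_gb_s(4096, 1.0))  # 4 stacks of 1024-bit HBM @ 1 Gbps: 512.0 GB/s
```

You simply can't route a 4096-bit bus out to discrete GDDR5 chips on a PCB, which is why the rumor only holds up if the memory is stacked on an interposer.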

#8  Edited By NUSNA_Moebius

Whatever they do, hopefully it's good. Like Intel's, their new core needs to scale from 2 to 6 or 8 cores across different products and, if possible, be suitable as a replacement for the Jaguar-type APUs. It needs to be modular enough to be paired with GPU CUs for APUs, or even with ARM CPUs for specific workloads. It would be fascinating if AMD could actually get HSA to really work, and perhaps integrate the graphics CUs into the actual SIMD processing stack without having to use exotic/custom ISAs. That's the real ticket, and it would be a big hit with content creators.

AMD might even build a core complex that combines x86 cores and CUs; much like CUs, you could just modularly add blocks together.

I envision the following possible die configs to suit a broad range of products (a rough area sketch follows the list):

2 Core; 4 CU APU - Jaguar successor
3 Core; 12 CU APU - mainstream APU successor to current Kaveri line
8 Core; 4 CU "APU" - High end CPU with minimal CU count for basic rendering. I see no logic in wasting die area and therefore adding cost with large on-die graphics unless HSA takes off
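
To put very rough numbers on the die-area argument, here's a toy comparison of those configs. The per-core, per-CU, and uncore areas below are placeholders I picked purely for illustration (loosely in the ballpark of 28 nm parts), not anything AMD has published:

```python
# Toy die-area comparison of the speculative configs above.
# All mm^2 figures are illustrative guesses, not published numbers.
CORE_MM2   = 15.0   # assumed area per x86 core, including its slice of L2
CU_MM2     = 5.5    # assumed area per GCN compute unit
UNCORE_MM2 = 60.0   # assumed fixed cost: memory controller, I/O, display, etc.

configs = [
    ("2 cores,  4 CUs (Jaguar-class successor)",        2,  4),
    ("3 cores, 12 CUs (mainstream Kaveri successor)",    3, 12),
    ("8 cores,  4 CUs (high-end CPU, minimal graphics)", 8,  4),
]

for name, cores, cus in configs:
    area_mm2 = cores * CORE_MM2 + cus * CU_MM2 + UNCORE_MM2
    print(f"{name:50s} ~{area_mm2:4.0f} mm^2")
```

Even with guesses like these, the point stands: tacking a Kaveri-sized CU array onto an 8-core die adds real area and cost that only pays off if HSA-style workloads actually use it.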

#9 NUSNA_Moebius

As much as I dislike the franchise, the commitment to delivering great performance is nice to see. Pulling the wool over gamers' eyes by claiming the game required a quad-core processor was completely dishonest, though. At least the game has been patched to officially support dual cores. As for GPU performance, the AMD GCN GPUs land right in line with my expectations, with the 7870/R9 270X being the closest-to-PS4 GPU on that graph. The PS4's GPU is the same config as Pitcairn, with 2 CUs disabled for yields. The R9 270X definitely has a clock advantage, and it shows in the graph. I would've loved to see how the Radeon 7850 performs alongside the PS4 APU and R9 270X, considering the 7850 is Pitcairn with 4 CUs disabled.
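
For context on how close those parts really are on paper, GCN peak single-precision throughput is just CUs × 64 lanes × 2 ops per clock (FMA) × clock speed. A quick comparison using the published clocks (the 800 MHz PS4 GPU clock is the widely reported figure rather than an official spec-sheet number):

```python
def gcn_peak_tflops(compute_units, clock_ghz):
    """Peak single-precision TFLOPS for a GCN part:
    CUs * 64 shaders * 2 ops/clock (FMA) * clock (GHz) / 1000."""
    return compute_units * 64 * 2 * clock_ghz / 1000

print(gcn_peak_tflops(18, 0.800))  # PS4:     ~1.84 TFLOPS (Pitcairn-class, 2 CUs disabled)
print(gcn_peak_tflops(16, 0.860))  # HD 7850: ~1.76 TFLOPS (Pitcairn, 4 CUs disabled)
print(gcn_peak_tflops(20, 1.050))  # R9 270X: ~2.69 TFLOPS (full Pitcairn, higher clock)
```

On paper the 7850 sits within about 5% of the PS4's GPU, which is exactly why it would have been the most interesting card to see on that graph.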

#10 NUSNA_Moebius

@emgesp said:

Next-gen can't even do 30fps. I'm tired of all these devs pushing visuals over performance. What good is pretty graphics when the game runs like ass?

I miss the DC and early PS2 days, when most games ran at 60 FPS. Unity was definitely a rush job. Devs' lack of commitment to delivering a smooth framerate is quite annoying, to say the least.