NUSNA_Moebius' forum posts

#1 NUSNA_Moebius

@HaloinventedFPS said:

Hell no. Next-gen consoles won't be out until 2020-ish, and 2016 hardware is going to wipe the floor with the best 2014 hardware. Nvidia Pascal is going to be a massive leap, AMD's next-gen CPU could also be beastly, and Jim Keller is a legend.

Jim Keller may be talented, but AMD lacks the cash to hire as many engineers, or engineers as talented, as Intel possesses. Here's hoping for the best, though...

#2 NUSNA_Moebius

It's an interesting device, and I'm surprised at all the praise it's getting (most of which revolves around the tablet itself). I'd probably refrain from buying one, since it's not an x86 Windows device that could run everything I already have.

#3 NUSNA_Moebius

@mgools said:

If it feeds into the game's story, then fine, but not just for the sake of it.

This. Shoehorning is always bad.

#4 Edited By NUSNA_Moebius

You say that now, but in five years? Still, the mid-range is a good price point to stick to, upgrading every couple of years instead of spending a lot all at once.

There are different strategies for getting good value for your dollar over a set period of time. LOL, I badly need to upgrade myself. I consider a "guaranteed to outlast the console generation" system to be one with roughly twice the consoles' CPU GFLOPS (an FX-8000 series or Sandy Bridge Core i5 and up), roughly twice the GPU with 4 GB of VRAM or more (an R9 280), and 16 GB of RAM.

I'm pretty tired of my Phenom II X4, Radeon 5850, and 4 GB of DDR3, all circa 2009.
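Just to make the rule of thumb concrete, here's a rough sketch in Python; the console-baseline numbers are my own ballpark stand-ins, not exact specs:

```python
# Rough "will it outlast the console generation?" check: roughly
# double the console-class CPU/GPU throughput, 4+ GB VRAM, 16 GB RAM.
# Baseline values below are illustrative ballparks, not exact specs.
BASELINE = {"cpu_gflops": 100, "gpu_gflops": 1400, "vram_gb": 4, "ram_gb": 8}

def outlasts_generation(cpu_gflops, gpu_gflops, vram_gb, ram_gb):
    return (cpu_gflops >= 2 * BASELINE["cpu_gflops"]
            and gpu_gflops >= 2 * BASELINE["gpu_gflops"]
            and vram_gb >= BASELINE["vram_gb"]
            and ram_gb >= 2 * BASELINE["ram_gb"])

# e.g. an R9 280-class build (again, ballpark numbers):
print(outlasts_generation(cpu_gflops=210, gpu_gflops=2960,
                          vram_gb=4, ram_gb=16))  # True
```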

#5 Edited By NUSNA_Moebius

@b4x: I think Sony made the right decision to go x86. Jaguar was a good fit for packing many cores into a small area at a low TDP. The general non-SIMD performance of the eight Jaguar cores is leagues beyond the Cell BE's, which was highly lacking in that regard.

#6 NUSNA_Moebius

The Wii U has horrible SIMD: two single-precision floats per clock on each of its three cores, at only 1.2 GHz. Compare that to:

4 SP floats per clock on each of the Xenon's three cores @ 3.2 GHz
4 SP floats per clock on the Cell's PPE and each of its 7 SPEs @ 3.2 GHz

It's not going to happen without major concessions, and I doubt the Wii U has any real GPGPU capability. The GPU does have some excess GFLOPS it could lend to helping the CPU, but that depends on whether Nintendo and AMD integrated the silicon and instructions needed to make that possible.
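For anyone who wants to check the math, here's a quick back-of-the-envelope sketch in Python using the per-clock float counts and clocks quoted above (one op per float lane per cycle; fused multiply-add, where available, would double each figure):

```python
# Peak single-precision SIMD throughput from lanes x cores x clock.
# Figures are the ones quoted above; FMA would double each result.

def peak_gflops(floats_per_clock, cores, clock_ghz):
    """Peak single-precision operations per second, in GFLOPS."""
    return floats_per_clock * cores * clock_ghz

cpus = {
    "Wii U Espresso": peak_gflops(2, 3, 1.2),      # paired singles
    "Xenon (360)":    peak_gflops(4, 3, 3.2),      # 128-bit VMX
    "Cell (PS3)":     peak_gflops(4, 1 + 7, 3.2),  # PPE + 7 SPEs
}

for name, gf in cpus.items():
    print(f"{name}: {gf:.1f} GFLOPS (SP, no FMA)")
```

Even before FMA enters the picture, the Wii U's 7.2 GFLOPS is a fraction of the other two (38.4 and 102.4, respectively).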

#7 NUSNA_Moebius

Do yourselves a favor and quit worrying so much about the perceived value of the two platforms. You'll be much more satisfied spending $800-$1,000 on a new PC that not only craps all over the consoles in their current state, but could manage 4K gaming and last this whole generation as devs squeeze better optimization out of the PS4 and Xbone.

#8 Edited By NUSNA_Moebius

I would imagine the Cell handled quite a bit of TLOU's rendering, so ND effectively has to reprogram the engine, or build a new one, to move all of that rendering back onto the graphics array. And that has to happen while they port the non-graphics SPE jobs onto the Jaguar cores, which can only hit half the theoretical GFLOPS of the SPEs because they run at half the clock speed (while still handling 4 single-precision floats per clock). The PS4 is a great machine in totality, I think, but it's not really the kind of setup suited to PS3 ports: the emphasis on large-scale vector throughput now sits with the GPU rather than the CPU, even though the CPU cores have plenty of GFLOPS and are quite a step up in IPC.
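A minimal sketch of that per-core comparison, assuming the 4-floats-per-clock figure above and the commonly quoted 3.2 GHz (SPE) and 1.6 GHz (PS4 Jaguar) clocks; FMA would double both numbers without changing the ratio:

```python
# At the same 4-wide single-precision issue rate, peak throughput
# scales with clock, so a 1.6 GHz Jaguar core lands at half a 3.2 GHz SPE.
floats_per_clock = 4
spe_gflops    = floats_per_clock * 3.2  # 12.8 GFLOPS per SPE
jaguar_gflops = floats_per_clock * 1.6  #  6.4 GFLOPS per Jaguar core

print(f"SPE: {spe_gflops} vs Jaguar: {jaguar_gflops} "
      f"(ratio {jaguar_gflops / spe_gflops:.1f})")
```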

#9 Edited By NUSNA_Moebius

I imagine the Cell's only two real problems in rendering were texture mapping and z-buffering, since it has no TMUs or ROPs. It's hard to make up for a lack of specialized hardware, which is why the PS3 needed a real GPU. Vertex generation and culling were best suited to offloading onto the Cell, probably because the RSX didn't have enough vertex shaders to meet developers' future needs, versus the unified shader architecture of Xenos. The Cell has an excess of GFLOPS that can be dedicated to rendering, as long as enough is left over for animation and physics.

#10 NUSNA_Moebius


@chikenfriedrice said:

Man, the fall of Crytek is sad... after two great games, Crysis and Crysis Warhead, they fell off.

Yup. They were blinded by gold in console land. 'Tis a shame and a morality tale for the ages.