NUSNA_Moebius' forum posts

#1 NUSNA_Moebius

The first FEAR was very inspired and stylish, but it lacked visual polish, with some rough textures and models. However, there was a cohesive direction: it was creepy, gave players a mysterious but interesting plot, and delivered great shooting action that to this day hasn't really been replicated.

FEAR 2 had lots of visual polish but no inspiration, and therefore no soul. As someone said earlier, Monolith Productions started chasing the CoD crowd. While the aim-down-sights feature didn't bother me, throwing out the ability to lean sure did. The AI wasn't scripted to flank and work together well either, and the "scares" were your typical stupid B-horror crap.

As for all the other things everyone has listed in favor of the first game, I largely agree. FEAR 2's graphics were better in many ways (better models, mostly better textures, normal maps, SSAO), but it lacked the memorable lighting and shadowing the first game made such heavy use of to build atmosphere. And yes, the HUD was god-awful, and there were no dynamic water surfaces either. Like Crysis 2, the sequel was a downgrade, with too much focus on chasing a new crowd instead of satisfying its own.

#2 Edited By NUSNA_Moebius

@mdk12345: That's true for PC-only games, with rare exceptions like Starcraft, but you do see commercials for multiplatform games carrying the "PC-DVD" symbol, and I think I've seen some with the Steam logo as well.

It's also worth considering that PC users likely spend more time online on forums, websites, etc., where ads for PC games are probably far more effective per dollar spent than a TV network commercial targeted specifically at a PC exclusive. Most PC exclusives these days are A and AA studio games, along with indies.

#3 Edited By NUSNA_Moebius

I've been doing the PC-on-HDTV thing for 5 years. I prefer the desktop experience for FPS games, but a good chair, a stool as a mouse platform, and a keyboard in my lap (all wireless) work quite nicely.

HDMI makes things simple.

Consoles are easy, but PCs can be easy too. It's funny that console games often require lengthy installations that you're stuck waiting on, while on PC a download and install can run in the background while you play a game or do whatever else without much issue.

#4 Edited By NUSNA_Moebius

This gen really is about bigger, better, faster, but it was like that going from the PS2 to the PS3 as well. The PS1-to-PS2 jump was important not just for more polygons and textures, but for adding z-buffering and perspective-correct texture interpolation to fix the PS1's texture warping, while giving developers a huge leap in capabilities.
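
For anyone wondering what "perspective-correct interpolation" actually buys you, here's a minimal sketch (illustrative only, not how either console's rasterizer is actually implemented) of affine vs. perspective-correct texture coordinate interpolation across a span; the affine shortcut is the reason behind the PS1's famous texture warping:

# Affine vs. perspective-correct texture coordinate interpolation.
# Illustrative sketch only -- real rasterizers work per-pixel in fixed point.
def affine_u(u0, u1, t):
    # PS1-style: interpolate u linearly in screen space (warps with depth)
    return u0 + (u1 - u0) * t

def perspective_u(u0, z0, u1, z1, t):
    # Interpolate u/z and 1/z linearly, then divide back out
    inv_z    = (1.0 / z0) + ((1.0 / z1) - (1.0 / z0)) * t
    u_over_z = (u0 / z0) + ((u1 / z1) - (u0 / z0)) * t
    return u_over_z / inv_z

# Span endpoints at different depths: the two results diverge mid-span
u0, z0, u1, z1 = 0.0, 1.0, 1.0, 4.0
for t in (0.0, 0.5, 1.0):
    print(t, affine_u(u0, u1, t), perspective_u(u0, z0, u1, z1, t))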

What could make this gen significant is the use of true GPGPU if it ever takes off.

#5 NUSNA_Moebius

Sony did practically everything better than MS when it came to developing a game console for this new generation. While MS was busy chasing media functions, Sony was busy making a gaming console that was dev-friendly and could host a new generation of games without being overly expensive, and most of all, Sony promoted its function as a game machine.

#6 Edited By NUSNA_Moebius

@super600: I think it was the novelty of the controller that attracted FPS gamers to the Wii. Some games like CoD on the Wii U support the Wiimote, but I think the system and its capabilities being so far behind the times have alienated gamers who'd rather just play on a more powerful system. Nintendo does not design hardware to satisfy third-party devs, and they're paying for it... again.

#7 Edited By NUSNA_Moebius

@Xtasy26: It's entirely likely that AMD never would've needed the PS4 and Xbone contracts had they not bought ATi years earlier. Their CPU division might've been in much better shape to counter the Core 2 Duo, Nehalem, Sandy Bridge, etc. They could've bought one of the countless other graphics engineering companies, like Imagination Technologies, instead. With the resources AMD had then, they could've funded the development of an on-die GPU or general-purpose SIMD accelerator, which is what APUs have failed to become since so few software developers cater to them.

While on-die IGPs have certainly become the norm, I think it was AMD's acquisition of ATi and the announcement that they were going after Fusion that likely spurred Intel to do the same, since Intel knew it could threaten their x86 leadership. So Intel has bested AMD not only in CPU tech but also in IGP tech, simply because their overall package better fits the kinds of uses most consumers care about. That's what revenue, research, and market agility can do for you.

#8 Edited By NUSNA_Moebius

Using a triple-SLI 8800 GTX system as the basis of a comparison? That's ridiculous considering how costly it was in 2006.

I guess anyone with 4x 7970s or GTX 780s already has the power of the PS5 in their hands /sarcasm

I'm a PC person through and through, but some people are just so stupid. A more reasonable thing to say is that the 8800 GTX can still play any game on the market, which is a feat owing to how extreme an advancement in performance it was when it released.

#9 Edited By NUSNA_Moebius

As much as I want to congratulate AMD on winning their lawsuit, it's still AMD's fault they're in the hole they're in now. Buying ATi was not worth the $5 billion they paid, and betting their future on Fusion APUs cost them valuable research dollars, personnel, market share, revenue, and their fabs.

Such irony when AMD's former head declared "real men have fabs."

AMD really had better sink this cash into their new core and get it out on time.

As far as the FX-8350 beating Intel's chips goes, the games being tested are likely SIMD/FPU-dependent, where the higher-clocked processor (generally the AMD FX) can win if the overall SIMD width is the same across both processors (rough numbers sketched below):

FX-8350 = 8x 128-bit SIMD (2x 128-bit FPUs per module)

i7-2xxx/3xxx = 4x 256-bit SIMD
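
As a rough back-of-envelope illustration (ignoring FMA, AVX vs. SSE code paths, turbo, and memory behavior; the clock speeds are just stock base clocks, not a claim about any particular benchmark setup), equal total SIMD width means the higher-clocked chip comes out ahead on paper:

# Rough peak single-precision throughput from SIMD width and clock.
# Assumes 1 op per 32-bit lane per cycle; ignores FMA/AVX differences,
# turbo, and memory behavior -- a sketch of the point above, nothing more.
def peak_sp_gflops(simd_units, bits_per_unit, ghz):
    lanes = simd_units * (bits_per_unit // 32)   # 32-bit float lanes in flight
    return lanes * ghz                           # GFLOP/s

print("FX-8350 @ 4.0 GHz:", peak_sp_gflops(8, 128, 4.0), "GFLOP/s")   # 128.0
print("i7-3770 @ 3.4 GHz:", peak_sp_gflops(4, 256, 3.4), "GFLOP/s")   # 108.8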

Granted, AMD FX chips use much more power, their single-threaded performance sucks, and they run hot. On a budget I would consider one, but I'd rather go with an Intel quad-core when I build my next computer. I'm pretty disappointed that AMD didn't produce a desktop quad-module Steamroller CPU to replace the old FX ones.

#10 Edited By NUSNA_Moebius

Not enough Ninty games from launch

Crappy hardware scared 3rd parties away

I sincerely hope they have the follow-up ready by 2016. They need to get off the backwards-compatibility train (unless they can implement it in a useful manner) and actually release hardware that is at least within firing range of the Xbone. Something like 4x AMD Beema cores running at 2.4 GHz, 512 or more stream processors at 800+ MHz with full HSA support, 32 MB of eDRAM, and 4 GB of GDDR5 on a 128-bit bus.
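
For the GPU side, a quick sanity check (assuming GCN-style stream processors at 2 FLOPs per clock via fused multiply-add; the Xbone figures are the commonly cited 768 SPs at 853 MHz, and the successor numbers are just my hypothetical spec above) shows that such a machine would at least land in the same neighborhood:

# Peak GPU throughput = stream processors * 2 FLOPs/clock (FMA) * clock.
# The "successor" line is my hypothetical spec, not any announced hardware.
def gpu_peak_gflops(stream_processors, ghz):
    return stream_processors * 2 * ghz

print("Hypothetical successor GPU:", gpu_peak_gflops(512, 0.800), "GFLOPS")  # ~819
print("Xbox One GPU:              ", gpu_peak_gflops(768, 0.853), "GFLOPS")  # ~1310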