NUSNA_Moebius' forum posts

Avatar image for nusna_moebius
NUSNA_Moebius

118

Forum Posts

0

Wiki Points

1

Followers

Reviews: 0

User Lists: 5

#1 NUSNA_Moebius
Member since 2014 • 118 Posts

@hrt_rulz01 said:

Forcing people to buy the special editions to play COD4... typical Activision. Pass!

Nobody is forcing you to do anything. It's just a video game and you don't have to buy it. Vote with your wallet.

#2 NUSNA_Moebius
@locopatho said:
@nintendoboy16 said:

Decides to stop using them after seeing that there is little to no interest in them on other systems? I don't see the PlayStation/Xbox/PC audience buying the likes of what Nintendo has to offer when, again, the majority of them have since moved on from Nintendo at this point. I don't see Smash faring anywhere near as well as traditional fighters, I don't see Splatoon faring as well as CoD, and I sure as hell don't see JRPG fans on PlayStation clamoring for Pokemon like they do for Final Fantasy, Tales of, or Shin Megami Tensei/Persona.

I know YOU don't see it. I know YOU think the strongest brand names in gaming history (Mario and Pokemon) will collapse overnight if they *GASP* run on a generic computer rather than a Nintendo branded computer. But you're crazy, to be fair. That's not how reality works.

Gamers like good games, it's really that simple. They just aren't willing to buy shit systems.

Even Nintendo is waking up to how profitable open platforms are. I'm already predicting that software platforms, as opposed to hardware, are the future. The stage is already set for this with two x86-based console platforms and a likely third (Nintendo NX) on the way, on top of the expected mid-gen update to the PS4 platform. Add to that the recent introduction of PS4-to-PC streaming, which I normally would not have expected from a console company, but Sony has deep ties to PC and mobile through the now-defunct VAIO PC, Sony Online Entertainment, and PlayStation Mobile brands. While graphics hardware is not homogeneous across all platforms, the adoption of new and very efficient APIs can make this possible without any real porting necessary between PC and console, or vice versa.

Built-in DRM and excellent customer service are what make Steam so successful, and it's a very good model to emulate. I for one would be more than happy to buy PS4 exclusives for my PC, even if it meant using Sony's proprietary digital download system or operating system. Just make the process fast, easy, and secure for users. Piracy will always be a possibility on either console or PC, so it's best to just deliver a good experience and design DRM to stay in the background, with minimal interference both in-game and toward the customer.

#3 Edited By NUSNA_Moebius

@Bread_or_Decide said:
@nusna_moebius said:

I like the setting, and even like quite a bit of the mechanical/conceptual designs, but no doubt it will still play like CoD and have all the same stupid set-piece mechanics. As someone mentioned earlier, everyone is starting to chase Titanfall. Activision knows it's a threat to their baby.

Titanfall is a threat? That franchise with the one barely successful game? THAT is a threat?

With the exo-suit stuff added to Advanced Warfare, yeah, it's certainly a threat when you consider how much Activision is trying to emulate it. Plus it plays and feels almost exactly like Call of Duty, since it was made by the ex-founders of Infinity Ward. I think very few mainstream CoD players know who made Titanfall or why they should be playing such a well-made game, even if it is lacking in overall content. I would argue that the next iteration does need a basic campaign to reel in people who like that sort of thing. I would also narrow down the number of game modes to concentrate the player base, while offering more customization options for non-ranked play.

Expect Titanfall 2 to receive much more support from EA this time around, likely with a much bigger advertising budget.

#4 NUSNA_Moebius

I like the setting, and even like quite a bit of the mechanical/conceptual designs, but no doubt it will still play like CoD and have all the same stupid set-piece mechanics. As someone mentioned earlier, everyone is starting to chase Titanfall. Activision knows it's a threat to their baby.

#5 Edited By NUSNA_Moebius

So are the high clocks of Maxwell a direct result of self-monitoring, or is the architecture simply designed for them? I ask because it seems possible Nvidia could lose Maxwell's clock headroom in Pascal while trying to design an architecture that can compete with the GCN/Polaris-DX12 juggernaut.

#6 NUSNA_Moebius

@GunSmith1_basic said:
The NX will ape the architecture of the competition to encourage porting, and Nintendo will be bribing third parties for content, something they haven't done in the past but which Sony and MS couldn't stop doing. Nintendo has the most disposable income of all three companies (the gaming sides of them, anyway), considering the massive haul they got from the Wii.

What do you mean by "ape"? Copy, or as in "ape-s**t," meaning it's going to be more powerful?

I think Nintendo learned its lesson this last go-around, but I doubt they saw the PS4 Neo coming. At best, I'm expecting something in between the PS4 and its Neo counterpart. My bet is that the NX's SoC will use Polaris, and I'm also somewhat betting it will use Zen. A March 2017 release coincides well with what should be the period just after the launch of the first PC Zen CPUs, and APUs might be debuting by that time as well. Both Zen and Polaris are 14nm architectures, and suited to the lower-power form factors Nintendo prefers.

#7 NUSNA_Moebius

@LegatoSkyheart said:

@Renegade_Fury: I'd like to argue that point further: the Wii U totally could have had (some of the) games that are on current systems.

Certain games like Far Cry 3, Far Cry 4, Saints Row IV + Gat out of Hell, GTA V, Call of Duty: Advanced Warfare, The Evil Within, Tomb Raider, Dark Souls II, Crysis 3, Metal Gear Solid V (The Phantom Pain and Ground Zeroes), Dead or Alive 5, Ultra Street Fighter IV, etc. could have easily gone to the Wii U, but they didn't, not because the hardware wasn't up to snuff, but because general interest from third parties and consumers just wasn't there. The only proof I have that those games could work on Wii U is the few third-party titles the Wii U actually has, like Deus Ex: Human Revolution and Assassin's Creed IV. Those games are on Wii U, so why didn't the Wii U get the third-party games that the 360 and PS3 share with the Xbox One and PS4?

I don't see DEHR as any kind of real indicator of what the Wii U can truly do. AC4: Black Flag on Wii U is pretty impressive, and what I've seen on YouTube indicates no CPU-based compromises were made, in particular the semi-interactive/reactive water during ship combat and travel. It's unlikely to be GPGPU-based considering the performance impact that could have (though the Wii U GPU has enough GFLOPS to do it while leaving enough for rendering).

I can't see the three PowerPC 750 cores being able to handle it either, with their extremely limited SIMD, unless they are the fabled 750VX cores or were given some other bolt-on VMX/AltiVec capability. The 750CL in the Wii (where it was called Broadway) inherited Gekko's (the GameCube CPU) limited 2x 32-bit / 4x 16-bit "SIMD" capabilities, and by extension the Wii U CPU should have the same problems, even with three cores and a higher 1.25 GHz clock speed. Each core is only capable of 5.0 GFLOPS under the 4x 16-bit model.

I always saw the lack of CPU GFLOPS as the reason the Wii U has seen so few mainstream games ported to it. Then again, even the PS2 got a few instances of very high-quality interactive water simulation out of its 6.0 total GFLOPS, and that was likely with one vector unit dedicated to general game rendering, leaving the other vector unit's 2.4 GFLOPS plus the FPU's 1.2 GFLOPS.
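For anyone who wants to check the arithmetic behind those per-core figures, here's a minimal sketch. It only uses the clock and per-cycle throughput numbers cited above; the "FLOPs per cycle" value is a simplified assumption about peak issue rate, not a measured figure:

```python
# Back-of-envelope peak-FLOPS math, using the figures cited in the post.
# Peak GFLOPS = clock (GHz) x FLOPs issued per cycle x number of cores.

def peak_gflops(clock_ghz: float, flops_per_cycle: int, cores: int = 1) -> float:
    """Theoretical peak only; ignores memory stalls and issue restrictions."""
    return clock_ghz * flops_per_cycle * cores

# Wii U "Espresso" cores at 1.25 GHz under the 4x 16-bit paired-single model:
espresso_per_core = peak_gflops(1.25, 4)    # 5.0 GFLOPS per core
espresso_total = peak_gflops(1.25, 4, 3)    # 15.0 GFLOPS across all 3 cores

print(espresso_per_core, espresso_total)
```

The point of the exercise is that even the theoretical peak is tiny next to the vector throughput developers had on the 360 and PS3.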

#8 NUSNA_Moebius

My bet is on the Neo using Polaris. I don't believe the reported jump from 176 GB/s to ~220 GB/s is enough to deliver 1080p/60 FPS visuals on par with the 1080p/30 FPS games currently on the PS4, and I don't think even GCN 1.3 would be able to deliver that either. Polaris, like Maxwell, will have a big increase in L2 cache to make the architecture more robust in the face of limited bandwidth relative to the GFLOPS being put out. It then just comes down to Polaris being able to run the same instructions as GCN, which it should, since Polaris is the fourth iteration of GCN, just a big enough jump to warrant a new family name.
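A quick sanity check on that bandwidth argument. This assumes bandwidth demand scales roughly linearly with frame rate at fixed per-frame quality, which is a simplification, and treats the ~220 GB/s figure as the rumor cited above:

```python
# The rumored bump is only a 1.25x bandwidth increase, while doubling the
# frame rate at the same per-frame quality roughly doubles bandwidth demand,
# leaving a ~1.6x gap that bigger caches and compression would have to cover.

old_bw_gbps = 176.0   # PS4's GDDR5 bandwidth
new_bw_gbps = 220.0   # the reported Neo figure
bw_gain = new_bw_gbps / old_bw_gbps    # 1.25x
fps_gain = 60 / 30                     # 2.0x
gap = fps_gain / bw_gain               # 1.6x shortfall

print(bw_gain, gap)
```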

#9 Edited By NUSNA_Moebius

Nintendo simply failed to provide developers with a competent machine and online system in the Wii U. They were used to CPUs that gave them the kind of performance they needed for modern games (aka GFLOPS!). The Wii U's more powerful graphics compared to the 360 and PS3 didn't mean much when the games had to be pared back so heavily just to run on primitive proto-SIMD FPU units. Developers were not going to waste their time delivering inferior games on a machine they knew was a dead end.

Ninty had the chance to build a generational gap-bridging system that for a full year would've been the system for multiplatform games, and that, once the PS4 and Xbone came out, would've been close enough in performance to remain a decently modern gaming machine. But they failed to understand this and once again placed their bets on a gimmick instead of designing everything to be excellent. The CPU should've just been a modern PowerPC variant, which would likely have been able to run Wii code without issue via legacy instruction-set support, and Wii graphics could've been emulated just fine with a decent CPU (just look at how Dolphin emulates Flipper in software). Going with archaic Radeon 4000-series-derived graphics was incredibly silly. At a minimum, they should've gone with AMD/ATi Evergreen graphics for the native tessellation capabilities R700 lacked, and employed a decently large GPU with at least 1000 GFLOPS (like AMD's Juniper or Cape Verde) to put it in some kind of ballpark of where the Xbone and PS4 would be. For a whole year, the Wii U would've been enjoying multiplatform games at 1080p/60 FPS while the PS3 and 360 were struggling with 720p/30 FPS.
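To put those ballpark figures side by side, here's a rough comparison using commonly cited shader (ALU) counts and clocks; treat the numbers as approximate, and note the "2 FLOPs per ALU per cycle" factor is the usual fused multiply-add convention for these parts:

```python
# Rough peak shader throughput for the GPUs discussed, from commonly cited
# specs. Peak GFLOPS = ALUs x clock (GHz) x 2 (one multiply-add per cycle).

def gpu_gflops(alus: int, clock_ghz: float) -> float:
    return alus * clock_ghz * 2

gpus = {
    "Wii U 'Latte' (as shipped)": gpu_gflops(320, 0.55),   # ~352
    "AMD Juniper (HD 5770)":      gpu_gflops(800, 0.85),   # ~1360
    "AMD Cape Verde (HD 7770)":   gpu_gflops(640, 1.0),    # ~1280
    "Xbox One":                   gpu_gflops(768, 0.853),  # ~1310
    "PS4":                        gpu_gflops(1152, 0.8),   # ~1843
}
for name, gflops in gpus.items():
    print(f"{name}: ~{gflops:.0f} GFLOPS")
```

Under those estimates, a Juniper- or Cape Verde-class GPU really would have landed the Wii U within striking distance of the Xbone, rather than at roughly a quarter of its throughput.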

#10 Edited By NUSNA_Moebius

@Opus_Rea-333 said:

GameCube was great. Mario Sunshine looked so good on a small analog TV... vibrant colors from such a small cube-shaped console, powered by the great ATi graphics.

Love it, plus the Metroid Prime TV commercial was hot.

The Flipper GPU was originally designed by ArtX, which was acquired by ATi, so calling it an ATi chip is a bit of a misnomer. Honestly, I don't think it was that great a graphics processor in the first place, considering it did not have the same kind of programmable shaders the XGPU did (despite releasing at around the same time). The TEV was fairly limited, did not reflect the direction graphics programming was going, and was therefore highly underutilized. But it was well geared toward 480p rendering with its on-GPU eDRAM texture and framebuffer caches, and a good number of GC developers really showed not only the prowess of the GameCube as a whole, but their technical competence as well. I still think to this day that reusing the GameCube architecture the way Nintendo did for the Wii was a big mistake, especially as the system aged. But the Wii sold like hotcakes and was immensely profitable for Nintendo...

The way IBM was able to develop a triple-core PowerPC 750 variant for the Wii U does make me wonder how suitable a dual-core variant for the original Wii could've been. It could've been coupled with either a two-core Hollywood/Flipper (for split-frame rendering), or a greatly expanded version of Hollywood with more pipes and bolted-on programmable shader capability, or they could have done away with BC altogether for a low-end ATi X1000-series GPU like the X1400 or X1600. The Wii would've at least been in the kind of graphical ballpark to receive proper down-ports of PS3/360 games. Even now, the Wii U feels the absolute lack of any competent SIMD/AltiVec.