imt558's forum posts

imt558 · 976 Forum Posts · 0 Wiki Points · 0 Followers · Reviews: 0 · User Lists: 0

#1  Edited By imt558
Member since 2004 • 976 Posts

@kingtito said:

Nothing on a website upsets me, kid. I think it's hilarious how pathetic you cows are. You've become that meme, "leave my Sony alone", whenever someone mentions them, good or bad. It's just an automatic response from cows. Funny to sit back and watch, and even better to respond to cows just to watch the reactions. Take you for example.

You couldn't actually say anything about my post without making something up, so you resorted to just repeating that I'm upset. It's funny how easily I can get under your skin, and I didn't even have to put Sony down to do it hahaha.

Nobody in the whole universe is saltier or more pathetic than lems, led by MisterXmedia first and foremost. You should join his forum. There you will find serenity.


#3  Edited By imt558
Member since 2004 • 976 Posts

@ronvalencia:

Did you know that the Xbone has a dGPU FPU double-stacked eSRAM with HBM combined with an MMU? And also that the Xbone is a Polaris architecture? No? Now you know.

Keep dreaming, MrX follower. And about that PS4 bandwidth: Oddworld: New ‘n’ Tasty Dev On PS4’s 8GB GDDR5 RAM: “Fact That Memory Operates at 172GB/s is Amazing”

http://gamingbolt.com/oddworld-inhabitants-dev-on-ps4s-8gb-gddr5-ram-fact-that-memory-operates-at-172gbs-is-amazing

So, you add the Xbone's GPU pixel fill-rate to the eSRAM bandwidth. Why not add the PS4's GPU pixel fill-rate to the GDDR5 bandwidth?
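For what it's worth, the peak-bandwidth side of this argument is just bus math. Here is a minimal sketch (my own illustration, not from either poster), using the commonly cited figures of a 256-bit bus at 5500 MT/s for the PS4's GDDR5 and 2133 MT/s for the Xbone's DDR3; Microsoft's eSRAM figures come from their own statements, not this formula:

```python
def peak_bandwidth_gbps(bus_width_bits: int, mega_transfers_per_s: float) -> float:
    """Theoretical peak bandwidth: bytes per transfer * transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * mega_transfers_per_s / 1000  # MB/s -> GB/s

ps4_gddr5 = peak_bandwidth_gbps(256, 5500)    # 176.0 GB/s
xbone_ddr3 = peak_bandwidth_gbps(256, 2133)   # ~68.3 GB/s
print(ps4_gddr5, xbone_ddr3)
```

Note that pixel fill-rate is a ROP throughput figure, not a memory figure, which is why adding it to either machine's memory bandwidth mixes apples and oranges.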


#4 imt558
Member since 2004 • 976 Posts

@ronvalencia said:

So what about the 8th cycle? eSRAM is a full-duplex memory technology, hence it has fewer mode-switch overheads than half-duplex DRAMs.

Sorry, it's just a technology-type difference. My PC GPUs are also affected by QB's memory access patterns, and it's different from Hitman DX12 (an AMD Gaming Evolved title).

That means it doesn't have read/write operations on every cycle, unlike GDDR5!
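The full-duplex vs. half-duplex point can be illustrated with a toy model (my own simplification, not from either poster): a half-duplex bus idles during every read↔write turnaround, while a full-duplex path pays no switching penalty. The switch rate and turnaround cost below are made-up illustrative numbers:

```python
def effective_bandwidth_gbps(peak_gbps: float, switches_per_ms: float,
                             turnaround_ns: float, full_duplex: bool) -> float:
    """Toy model: each read<->write switch idles a half-duplex bus for turnaround_ns."""
    if full_duplex:
        return peak_gbps  # separate read and write paths, no turnaround penalty
    idle_fraction = switches_per_ms * turnaround_ns * 1e-6  # ns lost per ms
    return peak_gbps * (1 - idle_fraction)

print(effective_bandwidth_gbps(176, 1000, 10, full_duplex=False))  # ~174.24
print(effective_bandwidth_gbps(176, 1000, 10, full_duplex=True))   # 176
```

The point of the sketch is only directional: frequent direction switches shave a slice off a half-duplex bus's peak, and a workload's access pattern decides how big that slice is.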


#5 imt558
Member since 2004 • 976 Posts

@ronvalencia said:

My speculation...

If QB uses the XBO's eSRAM strength, with a large amount of TMU/ROPS read/write memory operations for its effects layers (the shader program could be simple, with high memory write operations), it will be gimped on PC GPUs without comparable memory bandwidth. The 980 Ti has enough memory bandwidth to replicate the XBO's eSRAM memory read and write operations. A PC GPU's higher FLOPS power can't do anything about a very large memory read and write operations bottleneck.

Your speculations are shit, man! eSRAM has read/write possibilities, BUT ONLY EVERY 8TH CYCLE!

The XBone has its strengths, and QB has exploited them.

Something under NDA?
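The FLOPS-vs-bandwidth bottleneck ronvalencia describes is essentially a roofline argument: a kernel is bandwidth-bound when its arithmetic intensity (FLOPs per byte moved) falls below the machine's ridge point. A hedged sketch; the 1.31 TFLOPS and 204 GB/s figures are commonly cited Xbox One numbers, used here only for illustration:

```python
def ridge_point(peak_tflops: float, peak_bw_gbps: float) -> float:
    """FLOPs per byte at which a kernel stops being bandwidth-bound."""
    return peak_tflops * 1e12 / (peak_bw_gbps * 1e9)

def bound_by(flops_per_byte: float, peak_tflops: float, peak_bw_gbps: float) -> str:
    """Roofline-style check: below the ridge point, bandwidth is the limiter."""
    if flops_per_byte >= ridge_point(peak_tflops, peak_bw_gbps):
        return "compute"
    return "bandwidth"

# Simple read/write-heavy effects (low FLOPs per byte) hit the bandwidth wall first.
print(round(ridge_point(1.31, 204), 2))  # ~6.42 FLOPs/byte
print(bound_by(2.0, 1.31, 204))          # bandwidth
```

This is why extra FLOPS alone can't rescue a memory-traffic-heavy effect: below the ridge point, only more bandwidth moves the needle.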


#6  Edited By imt558
Member since 2004 • 976 Posts

Guys, one of the lems' posts on Eurogamer's U4 review ( http://www.eurogamer.net/articles/2016-05-05-uncharted-4-review )

XBgamer 23 minutes ago

@MrTomFTW Hi Mrtom, here is the question: why doesn't QB or Rise of the TOMB RAIDER deserve a review score, but UC4 does? Is it just that UC4 is better than those two games put together, or is it something more?

XBgamer 14 minutes ago

@MrTomFTW Hi Mrtom, so you agree UC4 is better than QB & TR put together, considering both of those games couldn't get a review score. And going by Metacritic scores, QB scored 77 and TR scored 86. Please, you may kid yourself, but EG clearly has an agenda to push, don't we agree.

LOL! All media are Sony-biased, confirmed!

And also:

Well done, ND!


#7  Edited By imt558
Member since 2004 • 976 Posts

@dynamitecop said:
@elpresador-911 said:

@dynamitecop: who cares about Metacritic lmao? 10/10 on GameSpot is all that matters here. And even if it fell to an 89/88 on Metacritic, it would still outscore every single Xbox One exclusive.

If it falls into the 80s, even if it's an 89, it will be considered a flop; you will hear it from everyone everywhere until the end of time.

This is not QB, NYaDC. Just stick with your lovely Xbone and don't bother with the PS4.


#8  Edited By imt558
Member since 2004 • 976 Posts

@ronvalencia said:
@notafanboy said:
@ronvalencia said:
@notafanboy said:

You are completely insane if you think the Xbone 1.5 is at 10 TF. You just discredited yourself by posting this bullcrap.

The Xbone 1.5 will be weaker than the base PS4 model because Microsoft is not as smart as Cerny. Get it through your thick skull.

Microsoft already stated they don't want to do a half-gen step, and if there's any upgrade, they want a substantial one. Remember, Microsoft has a larger chip size than the PS4's 348 mm^2, and higher-capability GDDR5 is a known factor at this time. Get it through your thick skull.

Microsoft's testing of different Xbox configurations would be pointless for yet another Xbox One-level 1.x TFLOPS of power.

AMD knows that large chips don't deliver the same "bang per buck" as smaller chips, and a solution must be found for any future consoles and mainstream PC SKUs, i.e. watch AMD's Masterplan Part 2 video. Consoles will NOT have the PC's Vega 10, but Vega has a higher perf/watt than Polaris's 2.5X.

Btw, from http://arstechnica.com/gadgets/2016/03/amd-gpu-vega-navi-revealed/

There's another Vega in the form of "Vega 11".

AMD also confirmed that there will be at least two GPUs released under Vega: Vega 10 and Vega 11.

Polaris has its high-low ASICs: 1st-gen FinFET design.

Vega has its high-low ASICs: 2nd-gen FinFET design.

If there's a new process tech, there's a new wave of PC GPUs.

AMD usually mixes different PC GCN revisions within a series' models, e.g. GCN 1.0, 1.1, and 1.2 within the same marketing Rx-3x0 model series.

Unlike the XBO/PS4's GPU designs, the Xbox 360's GPU (SIMD-based) wasn't attached to the PC Radeon HD's VLIW5/VLIW4-based designs.

What MS says and what they actually do are two different things.

Ex: MS says they have great exclusives, yet their exclusives are trash.

MS will need to include the eSRAM in their Xbone 1.5 in order to enable backwards compatibility with the Xbone, so their SoC size is irrelevant, since 50+% of it will be consumed by the eSRAM.

MS will be lucky to even match the base PS4. The PS4 is an engineering marvel. MS still hasn't fully reverse-engineered it yet, meaning they can't copy 100% of it in time for the Xbone 1.5 launch. It's game over. SONY wins again.

If you recall Oxide's BradW statement on eSRAM, the programmer accesses eSRAM via Xbox One's specific eSRAM APIs. Microsoft updated the eSRAM API to make it easier to use.

Xbox One's eSRAM is treated like local video memory with a 32 MB size.

Do you remember the Tiled Resources API being run on an NVIDIA GPU, consuming 16 MB of VRAM?

While the DX11 layer was thrown in the bin, programmers don't have real access to the hardware on current-generation consoles, i.e. the programmer has to go through the GPU driver, e.g. in the case of the PS4's low-level API.

With driver-level programming, the programmer can create any multi-threading model, just like AMD's driver programmers, who e.g. bolted the Mantle API onto AMD's WDDM 1.3 drivers and bypassed the DX11 layer. But Xbox One programmers don't have AMD's driver access level, hence they are stuck with the DX11.X multi-threading model. It took MS's DX12 update for the XBO to change the multi-threading model.

Real metal programming is the access level needed to create a GPU driver with any multi-threading model.

Furthermore, modern x86 CPUs translate CISC into RISC, and programmers don't have access to x86's internal RISC instructions, i.e. the secret sauce remains with Intel and AMD.

The AMD K5 provided programmer access to its internal RISC core instruction set, i.e. the ability to bypass the x86 decoders to gain additional performance. On the PS4's Jaguar CPU, you don't get that access.

BradW has characterized the PS4's low-level API as a last-gen API with updates.

MisterXmedia, is that you? Or one of his followers? So, we also have MrX followers here on GameSpot. God bless them.


#9  Edited By imt558
Member since 2004 • 976 Posts

@quadknight said:

lol @ anyone who believes this rubbish.

Lems, of course! This thread should go on, just for reading the lems' posts here.


#10 imt558
Member since 2004 • 976 Posts

Xrays is the same kind of idiot as MisterXmedia.