Rumor: Shadow of Mordor 720p 30FPS on XBO?


#151 tormentos
Member since 2003 • 33784 Posts

@StormyJoe said:

"Damage Control" and "Pointing Our Gross Misrepresentations" are not at all the same thing.

If someone says "Racer A beat Racer B by a mile!", is it damage control to point out that Racer A really won by .3 seconds? I don't think so.

All debating aside - where's the definitive answer on this? The game is out, why do we not know the resolution and frame rates?

We do. The Xbox One version was pixel counted at 900p, and the PS4 version is 1080p at 30+ FPS.


#152 tormentos
Member since 2003 • 33784 Posts
@RyviusARC said:

Well, the PS4 version does not run the game at max settings anyway. I know it definitely does not run Ambient Occlusion at ultra (which is fairly demanding) or some of the other settings on ultra, and it probably doesn't even use tessellation.

Also, I was running the benchmark at 2560x1440, which is almost twice the pixels of 1920x1080 (1080p): 3,686,400 vs. 2,073,600, about 1.78x.

Yeah, that is because it has an R260-like GPU, while you are running two GPUs that cost more than the PS4 itself.

Your PC isn't $399, and comparisons with the PS4 are worthless since you spent a ton of cash on your setup, which most people don't do.


#153 delta3074
Member since 2007 • 20003 Posts

@tormentos said:
@RyviusARC said:

Well, the PS4 version does not run the game at max settings anyway. I know it definitely does not run Ambient Occlusion at ultra (which is fairly demanding) or some of the other settings on ultra, and it probably doesn't even use tessellation.

Also, I was running the benchmark at 2560x1440, which is almost twice the pixels of 1920x1080 (1080p): 3,686,400 vs. 2,073,600, about 1.78x.

Yeah, that is because it has an R260-like GPU, while you are running two GPUs that cost more than the PS4 itself.

Your PC isn't $399, and comparisons with the PS4 are worthless since you spent a ton of cash on your setup, which most people don't do.

You cannot do that. You cannot discount the fact that his machine runs it better than the PS4 because of the cost difference, because cost doesn't change the basic fact that his machine runs it better.

I hate to say it, but saying 'that doesn't count because of the price difference' is blatant damage controlling at its finest.

When PS3 exclusives looked better than Xbox 360 exclusives, you didn't see lems jump in and say 'doesn't count because the PS3 costs more' or 'worthless comparison because the PS3 costs more'.

The bottom line is that his machine runs the game better. The PC version runs better regardless of the difference in costs or specs, and you Cows really need to accept what Lems have accepted for years: the basic fact that high-end PC rigs can run these games better than consoles. Cost isn't going to change that fundamental fact.

When it comes down to which machine runs the game better, which is what we are talking about, then cost is irrelevant, dude.


#154  Edited By Sagemode87
Member since 2013 • 3416 Posts

@delta3074: Lmao, your comparison is BS. A PC isn't a console; it can be upgraded. Consoles are closed. The PS3's extra cost over the 360 came from Blu-ray. The PS3 has cost the same as the 360 for years now, but to play your little game, the PS4 and X1 cost the same, Lemming. Continue trying to hide behind PC though, pathetic.


#155 cainetao11
Member since 2006 • 38032 Posts

@tormentos:

Does this thread hurt your fillings? It's not even a flame thread, so you can't take them goggles off even for a second, I see.

And I have a PC, you know, and I will buy this game on PC.

No, my fillings are right where the dentist put them. Just don't see the point of another thread about something we have all come to accept (well, most of us). The PS4 is stronger than the X1 and Wii U, period. Are your FEELINGS so insecure that you need to keep reaffirming this to yourself?


#157 delta3074
Member since 2007 • 20003 Posts

@Sagemode87 said:

@delta3074: Lmao, your comparison is BS. A PC isn't a console; it can be upgraded. Consoles are closed. The PS3's extra cost over the 360 came from Blu-ray. The PS3 has cost the same as the 360 for years now, but to play your little game, the PS4 and X1 cost the same, Lemming. Continue trying to hide behind PC though, pathetic.

Did you forget the memo? It was made quite clear by the GS staff that the PC is a 'system' and that you cannot discount it. It doesn't matter that it isn't a console; this board isn't called 'Console Wars', it's called 'System Wars', you f***ing idiot. And as far as I was aware, this thread is not about how much systems cost, it's about which system runs the game better.

Also, I own neither a PS4 nor an Xbone, so how am I hiding behind the PC?

It's obvious to me that you are a complete moron who didn't even understand the point I was making and has obviously NEVER read any of my previous posts in this very thread.

THE ONLY point I was trying to make is that when it comes to actually running the game, cost is irrelevant. You are the one who tried to complicate what is essentially a simple point with 'wah, wah, wah PC can be upgraded, wah, wah, wah Lem, wah, wah, wah PC isn't a console, wah, wah, wah stop hiding behind the PC'.

Hurts, doesn't it, that high-end PCs can run a game better than your precious little PS4 can?


#158 Kinthalis
Member since 2002 • 5503 Posts

PS4 runs at 30+?? +?? Is that a thing now?

Dat next gen feel of almost 60 FPS, except not really, cause it's more like 35 FPS.

LOL!


#159 StormyJoe
Member since 2011 • 7806 Posts

@tormentos said:

@StormyJoe said:

"Damage Control" and "Pointing Our Gross Misrepresentations" are not at all the same thing.

If someone says "Racer A beat Racer B by a mile!", is it damage control to point out that Racer A really won by .3 seconds? I don't think so.

All debating aside - where's the definitive answer on this? The game is out, why do we not know the resolution and frame rates?

We do. The Xbox One version was pixel counted at 900p, and the PS4 version is 1080p at 30+ FPS.

Is that confirmed, or just someone on a forum trying to count pixels? I am not trying to argue, I do not know the answer.


#160  Edited By darkangel115
Member since 2013 • 4562 Posts

@delta3074 said:
@tormentos said:

@darkangel115 said:

Funny how when I mentioned 4K before you tried calling me a fake hermit; then, when I explained that you didn't read my post and that my TV upscales my PS4 and X1 to 4K regardless, you never replied. So what's the difference if everything I play is upscaled to 4K? I don't notice any difference in any of the games. But I see you still keep up the good fight in being a stupid fanboy. Good job ignoring everything dumb you say ;)

What TV do you have that upscales your games to 4K?

Upscaling is blowing up pixels; it is not the same. It introduces artifacts and blurs the image.

Take 720p, for example: it is 1280x720, which = 921,600 pixels.

1080p is 1920x1080, which = 2,073,600 pixels.

Now, if your game is natively 1080p, you get 2,073,600 pixels.

If your game is natively 720p, you get 921,600 pixels. If you want to display that 720p game in 1080p and you upscale it, what the hardware does is stretch those 921,600 pixels to fit a space that should be filled by 2,073,600 pixels, which is why the image blurs: you are stretching the pixels.

If you upscale a 720p game to 4K it would be a total blur fest, but I do believe that you don't have such a TV, and I don't think a 4K TV upscales your PS4 or Xbox One games on its own; that is a job for your console, which is not doing it now either.

The fact that you claim your TV upscales to 4K is enough to know you are a blind, biased lemming who knows sh** about what you're talking about.

Bang on the money there, tormentos: it's the console that upscales the games, not the TV. The guy's talking utter bollocks.

Remember, we used to 'debate' the difference between the hardware upscaler in the 360 (which coincidentally caused the E74 errors on the 360) versus the software upscaler in the PS3.

Well pointed out; on the ball today, eh?

lol OMG you guys are so dumb. Newsflash: all 4K TVs have a built-in upscaler, and no, upscaling isn't stretching.

FYI this is the TV I have:

http://www.samsung.com/us/video/tvs/UN65HU8550FXZA

[quote]UHD Upscaling delivers the complete UHD picture experience with a proprietary process including signal analysis, noise reduction, UHD upscaling and detail enhancement to seamlessly upconvert SD, HD or full HD content to UHD-level picture quality.[/quote]

So @tormentos and @delta3074: it's sad that, especially tormentos, who already has 0 knowledge and whose every post is copy-and-paste from elsewhere, wouldn't look into it before making those allegations. So now I wonder what lame comeback you guys will have.
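Since the thread keeps trading pixel counts back and forth, here is a minimal sketch of the arithmetic quoted above (illustrative only: the resolution table and helper functions are our own, and a real TV or console scaler interpolates and filters rather than plainly stretching):

[code]
# Illustrative sketch of the pixel arithmetic quoted above (Python).
# The resolution table and helpers are made up for illustration.

RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixel_count(name):
    """Total pixels for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

def upscale_factor(src, dst):
    """How many output pixels each source pixel must cover on average."""
    return pixel_count(dst) / pixel_count(src)

print(pixel_count("720p"))                           # 921600
print(pixel_count("1080p"))                          # 2073600
print(round(upscale_factor("720p", "1080p"), 2))     # 2.25
print(round(upscale_factor("720p", "4K"), 2))        # 9.0  (the "blur fest" case)
print(round(upscale_factor("1080p", "1440p"), 2))    # 1.78 ("almost twice the pixels")
[/code]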


#161 StormyJoe
Member since 2011 • 7806 Posts

@delta3074 said:

@Sagemode87 said:

@delta3074: Lmao, your comparison is BS. A PC isn't a console; it can be upgraded. Consoles are closed. The PS3's extra cost over the 360 came from Blu-ray. The PS3 has cost the same as the 360 for years now, but to play your little game, the PS4 and X1 cost the same, Lemming. Continue trying to hide behind PC though, pathetic.

Did you forget the memo? It was made quite clear by the GS staff that the PC is a 'system' and that you cannot discount it. It doesn't matter that it isn't a console; this board isn't called 'Console Wars', it's called 'System Wars', you f***ing idiot. And as far as I was aware, this thread is not about how much systems cost, it's about which system runs the game better.

Also, I own neither a PS4 nor an Xbone, so how am I hiding behind the PC?

It's obvious to me that you are a complete moron who didn't even understand the point I was making and has obviously NEVER read any of my previous posts in this very thread.

THE ONLY point I was trying to make is that when it comes to actually running the game, cost is irrelevant. You are the one who tried to complicate what is essentially a simple point with 'wah, wah, wah PC can be upgraded, wah, wah, wah Lem, wah, wah, wah PC isn't a console, wah, wah, wah stop hiding behind the PC'.

Hurts, doesn't it, that high-end PCs can run a game better than your precious little PS4 can?

Normally, I don't interrupt an interesting discussion, but I would like to interject that *only* in SW are PCs competition for consoles. Neither MS, Sony, nor Nintendo lists the "Personal Computer" as a competitor for their consoles in their 10-K reports.


#162 mikhail
Member since 2003 • 2697 Posts

@Kinthalis said:

PS4 runs at 30+?? +?? Is that a thing now?

Dat next gen feel of almost 60 FPS, except not really, cause it's more like 35 FPS.

LOL!

In the Giant Bomb Quick Look comments, Brad Shoemaker says that the PS4 version they played is definitely running at 30. Not 60.

Lol, consoles.


#163  Edited By ScarTM
Member since 2014 • 174 Posts

30FPS on Xbox One confirmed:

http://www.therem.org/shadow-of-mordor-for-xbox-one-update-on-resolution-and-frame-rate/1217584/

PS4 will be playing it at 1080p 60FPS.

Still no news about the Xbox One resolution, though.

EDIT: someone really needs to teach me how to confine the link to one word. I get a blank page every time I try it.


#164  Edited By DEadliNE-Zero0
Member since 2014 • 6607 Posts

Digital Foundry article on the 6GB requirements. Mostly pointing out stuff we already knew, but some new stuff:

http://www.eurogamer.net/articles/digitalfoundry-2014-eyes-on-with-pc-shadow-of-mordors-6gb-textures

"PS4 is basically a match for the high quality setting on PC, which requires a graphics card with 3GB of RAM for best performance. Ultra is a small improvement over high based on the quality of the ground texture here, while medium looks a little blurry."

That's interesting. Having tested the medium and high textures, it didn't look that blurry on my TV, even in borderless mode (stupid v-sync). But it's interesting to find out the PS4 is running high textures. Making use of that shared memory pool, I guess.

Still, that's just for textures. The rest is most likely pulling the "1 setting below max on PC" thing. But cool nonetheless.

"One thing we should point out is that it is perfectly possible to run higher-quality artwork on lower-capacity graphics cards. However, you quickly fall foul of the split-memory architecture of the PC. On Xbox One and PS4, the available memory is unified in one address space, meaning instant access to everything. On PC, memory is typically split between system DDR3, and the graphics card's onboard GDDR5. Running high or ultra graphics on a 2GB card sees artwork swapping between the two memory pools, creating stutter. Shadow of Mordor has an optional 30fps cap incorporated into its options, though - with a 2GB GTX 760, we could run the game at ultra settings with high quality textures and frame-rate was pretty much locked at the target 30fps with only very minor stutter. In short, there's a way forward for those using 2GB cards, but it does involve locking frame-rate at the console standard - and the ultra textures didn't play nicely with the card, even at 30fps."

Hm, looks like games might actually start ass-pulling 3GB for textures from now on. Seems stupid to me. So far, Wash Doge and SoM (TF too? Don't remember that one) only use it for texture quality.

So, we have to run those below max but can max others out. Ironically, I have a 2GB 760 too. I was getting around 46fps with those settings (fucking v-sync). I capped it at 30, but will see about 60fps today. Stuttering was mostly absent, especially compared to WD.

So, brute GPU power is still more important, but VRAM requirements might annoyingly go up from now on. Still, just the textures alone is fine with me.
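A rough sketch of the trade-off the DF quote describes, with made-up pool sizes (the engine's real streaming budgets are not public, so every number below is an assumption): once the working set exceeds the card's VRAM, the overflow has to live in system RAM and cross the PCIe bus on demand, which is where the stutter comes from.

[code]
# Hypothetical illustration of VRAM overflow on a 2 GB card (Python).
# All sizes are assumed for illustration, NOT Shadow of Mordor's real
# figures; the point is only that need > VRAM forces PCIe streaming.

VRAM_GB = 2.0                                                  # e.g. a 2 GB GTX 760
TEXTURE_POOL_GB = {"medium": 1.2, "high": 2.7, "ultra": 5.5}   # assumed
OTHER_GPU_DATA_GB = 0.5        # framebuffers, geometry, etc. (assumed)

def overflow_gb(setting):
    """GB that won't fit in VRAM and must stream from system RAM."""
    need = TEXTURE_POOL_GB[setting] + OTHER_GPU_DATA_GB
    return max(0.0, need - VRAM_GB)

for setting in ("medium", "high", "ultra"):
    spill = overflow_gb(setting)
    verdict = "fits in VRAM" if spill == 0 else "spills %.1f GB over PCIe" % spill
    print("%6s: %s" % (setting, verdict))
[/code]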


#165  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@mikhail said:

@Kinthalis said:

PS4 runs at 30+?? +?? Is that a thing now?

Dat next gen feel of almost 60 FPS, except not really, cause it's more like 35 FPS.

LOL!

In the Giant Bomb Quick Look comments, Brad Shoemaker says that the PS4 version they played is definitely running at 30. Not 60.

Lol, consoles.

While my PC is able to play it at ultra with high textures at a 60+ average at 1080p.... pathetic.


#166 B4X
Member since 2014 • 5660 Posts

@tormentos:

Rumor still?

That OP title.....


#167 clone01
Member since 2003 • 29824 Posts

@uninspiredcup said:
@PonchoTaco said:

Do Xboners even care about this anymore?

If I were an Xboner, I wouldn't.

Most of these fellows play Call Of Duty and mountain dew.

How does one play Mountain Dew, Sniper?


#168  Edited By RyviusARC
Member since 2011 • 5708 Posts

@deadline-zero0 said:

Digital Foundry article on the 6GB requirements. Mostly pointing out stuff we already knew, but some new stuff:

http://www.eurogamer.net/articles/digitalfoundry-2014-eyes-on-with-pc-shadow-of-mordors-6gb-textures

"PS4 is basically a match for the high quality setting on PC, which requires a graphics card with 3GB of RAM for best performance. Ultra is a small improvement over high based on the quality of the ground texture here, while medium looks a little blurry."

That's interesting. Having tested the medium and high textures, it didn't look that blurry on my TV, even in borderless mode (stupid v-sync). But it's interesting to find out the PS4 is running high textures. Making use of that shared memory pool, I guess.

Still, that's just for textures. The rest is most likely pulling the "1 setting below max on PC" thing. But cool nonetheless.

"One thing we should point out is that it is perfectly possible to run higher-quality artwork on lower-capacity graphics cards. However, you quickly fall foul of the split-memory architecture of the PC. On Xbox One and PS4, the available memory is unified in one address space, meaning instant access to everything. On PC, memory is typically split between system DDR3, and the graphics card's onboard GDDR5. Running high or ultra graphics on a 2GB card sees artwork swapping between the two memory pools, creating stutter. Shadow of Mordor has an optional 30fps cap incorporated into its options, though - with a 2GB GTX 760, we could run the game at ultra settings with high quality textures and frame-rate was pretty much locked at the target 30fps with only very minor stutter. In short, there's a way forward for those using 2GB cards, but it does involve locking frame-rate at the console standard - and the ultra textures didn't play nicely with the card, even at 30fps."

Hm, looks like games might actually start ass-pulling 3GB for textures from now on. Seems stupid to me. So far, Wash Doge and SoM (TF too? Don't remember that one) only use it for texture quality.

So, we have to run those below max but can max others out. Ironically, I have a 2GB 760 too. I was getting around 46fps with those settings (fucking v-sync). I capped it at 30, but will see about 60fps today. Stuttering was mostly absent, especially compared to WD.

So, brute GPU power is still more important, but VRAM requirements might annoyingly go up from now on. Still, just the textures alone is fine with me.

I find the 6GB VRAM rumor to be BS, though.

To even get the game to max out my 4GB of VRAM I had to go above 4K resolution with max settings and ultra textures.

4K resolution was still using less than 4GB of VRAM, but 5120x2880 actually used all of it.

At 1440p max settings with ultra textures I was getting over 90fps. So I ran the game at higher settings than the PS4 with almost twice the pixel count and almost 3x the frame rate.

I think the console versions don't even use tessellation.

Also here is a screenshot comparison between high and ultra textures.

High is the first screenshot and Ultra is the second.

View them in a separate tab for better quality.


#169 glez13
Member since 2006 • 10310 Posts

Nvidia better not f*ck up and put 3GB on the GTX 960.



#171  Edited By tormentos
Member since 2003 • 33784 Posts
@Kinthalis said:

PS4 runs at 30+?? +?? Is that a thing now?

Dat next gen feel of almost 60 FPS, except not really, cause it's more like 35 FPS.

LOL!

Which is better than what the majority of PC gamers can play it at..

You know, 1080p is good enough, and 30 FPS as well, when your card doesn't have more in the tank. Since not everyone on PC has a setup like yours, your argument is null; not every PC can run it even close to your setup..

@RyviusARC said:

I find the 6GB VRAM rumor to be BS, though.

To even get the game to max out my 4GB of VRAM I had to go above 4K resolution with max settings and ultra textures.

4K resolution was still using less than 4GB of VRAM, but 5120x2880 actually used all of it.

At 1440p max settings with ultra textures I was getting over 90fps. So I ran the game at higher settings than the PS4 with almost twice the pixel count and almost 3x the frame rate.

I think the console versions don't even use tessellation.

Also here is a screenshot comparison between high and ultra textures.

High is the first screenshot and Ultra is the second.

View them in a separate tab for better quality.

Once again, you have an SLI setup, dude.

Indeed, it's actually the compromises made to accommodate 2GB graphics cards that are more concerning. The game still looks good, but in certain areas, console is a cut above - unless you kick in the frame-rate limiter. We saw a similar story with Titanfall: Respawn's debut required a 3GB graphics card to match the texture quality found in the Xbox One version of the game. That being the case, the recent discounts found on the 3GB Radeon R9 280 start to look compelling, especially as its replacement, the R9 285, only has 2GB of RAM in its standard configuration.

I find that funny, because there is a barrage of cards out there more powerful than the PS4 that have 2GB of RAM and are suffering because of the lack of RAM.

RAM will hit 2GB GPUs hard the further the generation goes on; in 2017 those 2GB cards will be screaming..

I called it a long time ago.

@b4x said:

@tormentos:

Rumor still?

That OP title.....

And?

For the 10th time, this isn't an anti-Xbox One thread.


#172 Kinthalis
Member since 2002 • 5503 Posts

According to the devs, the Ultra textures are the original, non-optimized high-quality textures the artists worked on. High and lower are properly optimized textures.

It's a cool thing to have, but it in no way means 6GB cards are needed in the future; just that if you happen to have one, you can probably enjoy super sexy textures. The PS4 doesn't even have 6 gigs of RAM to play with, never mind 6 gigs for just rendering + textures. Since it's all one buffer, it also needs to hold game data in those 5.5 gigs.
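To make the one-buffer argument concrete, here is a tiny budget sketch under assumed numbers (only the ~5.5 GB games-available figure comes from the post above; the split between game data and render targets is a guess):

[code]
# Unified-memory budget sketch for the point above (all GB; every
# number except AVAILABLE is assumed purely for illustration).

AVAILABLE = 5.5        # unified memory a PS4 game can use (per the post)
GAME_DATA = 2.0        # CPU-side world/sim data (guess)
RENDER_TARGETS = 0.8   # framebuffers, shadow maps, etc. (guess)

def texture_budget():
    """What's left for textures after everything else in the one pool."""
    return AVAILABLE - GAME_DATA - RENDER_TARGETS

left = texture_budget()
print("room for textures: %.1f GB" % left)        # 2.7 GB in this sketch
print("6 GB ultra textures fit?", 6.0 <= left)    # False: no ultra on console
[/code]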


#173 tormentos
Member since 2003 • 33784 Posts

@Kinthalis said:

According to the devs, the Ultra textures are the original, non-optimized high-quality textures the artists worked on. High and lower are properly optimized textures.

It's a cool thing to have, but it in no way means 6GB cards are needed in the future; just that if you happen to have one, you can probably enjoy super sexy textures. The PS4 doesn't even have 6 gigs of RAM to play with, never mind 6 gigs for just rendering + textures. Since it's all one buffer, it also needs to hold game data in those 5.5 gigs.

The PS4 >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 2GB GPUs out there memory-wise; the PS4 has 4.5GB of VRAM.

DF itself is telling you that the consoles, not only the PS4, are above in some parts because they have a single unified memory pool big enough for the game. And no, it is not just for the buffer; please link something that backs that up..lol

The PS4 textures are equal to high. Damn, my R270 is 2GB, and the game on high uses more than 2GB, so on PC I may not even get high when the PS4 is getting it? Damn..


#174  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@tormentos said:
@RyviusARC said:

I find the 6GB VRAM rumor to be BS, though.

To even get the game to max out my 4GB of VRAM I had to go above 4K resolution with max settings and ultra textures.

4K resolution was still using less than 4GB of VRAM, but 5120x2880 actually used all of it.

At 1440p max settings with ultra textures I was getting over 90fps. So I ran the game at higher settings than the PS4 with almost twice the pixel count and almost 3x the frame rate.

I think the console versions don't even use tessellation.

Also here is a screenshot comparison between high and ultra textures.

High is the first screenshot and Ultra is the second.

View them in a separate tab for better quality.

Once again, you have an SLI setup, dude.

Indeed, it's actually the compromises made to accommodate 2GB graphics cards that are more concerning. The game still looks good, but in certain areas, console is a cut above - unless you kick in the frame-rate limiter. We saw a similar story with Titanfall: Respawn's debut required a 3GB graphics card to match the texture quality found in the Xbox One version of the game. That being the case, the recent discounts found on the 3GB Radeon R9 280 start to look compelling, especially as its replacement, the R9 285, only has 2GB of RAM in its standard configuration.

I find that funny, because there is a barrage of cards out there more powerful than the PS4 that have 2GB of RAM and are suffering because of the lack of RAM.

RAM will hit 2GB GPUs hard the further the generation goes on; in 2017 those 2GB cards will be screaming..

I called it a long time ago.


lol, such BS.... My poor 2GB video card is pulling 60+ fps at 1080p on ultra settings with high textures in Mordor; with ultra textures it was around 50 fps. And Titanfall's VRAM requirement is also BS, since a GTX 650 Ti 2GB can get nearly a 60 fps average and even a 1GB GTX 750 is able to average 50 fps.... What's the point of having 3GB of VRAM for either console GPU when they don't even have the power to use all the bells and whistles?


#175  Edited By tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

lol, such BS.... My poor 2GB video card is pulling 60+ fps at 1080p on ultra settings with high textures in Mordor; with ultra textures it was around 50 fps. And Titanfall's VRAM requirement is also BS, since a GTX 650 Ti 2GB can get nearly a 60 fps average and even a 1GB GTX 750 is able to average 50 fps.... What's the point of having 3GB of VRAM for either console GPU when they don't even have the power to use all the bells and whistles?

Let's see, who do we believe: DF and the fu**ing maker of the game, or a blind, biased Sony hater who can't admit being wrong...

What poor video card? I hope you are not talking about that sh** 560 Ti...lol

You were owned on this topic already, butthurt hermit...

I know Sony makes you mad because you had to upgrade in 2013 when you had already upgraded in 2011, but it is time to let it go...lol

Not 1 but 2 GPUs on that chart have 2 versions, one with 1GB and one with 2GB, and in both cases memory was a freaking bottleneck in Skyrim, PROVEN, by as much as 31 FPS on the same GPU. That is sad.

Basically, that is a way for GPU makers to hold back performance and force you to upgrade.


#176  Edited By DEadliNE-Zero0
Member since 2014 • 6607 Posts

@tormentos said:

Once again, you have an SLI setup, dude.

Indeed, it's actually the compromises made to accommodate 2GB graphics cards that are more concerning. The game still looks good, but in certain areas, console is a cut above - unless you kick in the frame-rate limiter. We saw a similar story with Titanfall: Respawn's debut required a 3GB graphics card to match the texture quality found in the Xbox One version of the game. That being the case, the recent discounts found on the 3GB Radeon R9 280 start to look compelling, especially as its replacement, the R9 285, only has 2GB of RAM in its standard configuration.

I find that funny, because there is a barrage of cards out there more powerful than the PS4 that have 2GB of RAM and are suffering because of the lack of RAM.

RAM will hit 2GB GPUs hard the further the generation goes on; in 2017 those 2GB cards will be screaming..

I called it a long time ago.

The textures Tormentos. Of course, if you decide to tone down the textures to medium for 2GB, the consoles will be above in that regard.

But the point remains that, if the PS4 is indeed at 30fps, even 2GB cards can play at high textures with better overall settings than the PS4.

So far, only the texture setting is eating up the VRAM for some reason. I agree 100% that in a couple of years 2GB will not be enough. Like Glez13 said, we need to keep cards at 4GB minimum. I've read that there are 8GB models of the 970 and 980 coming.

I did an extra test, and going from medium to high textures dropped me around 6 to 10fps. For a solid 60fps, I put a mixture of medium and high; I think I'll be playing like this.

Lack of AA in this game is massive bullshit.


#177  Edited By tormentos
Member since 2003 • 33784 Posts

@deadline-zero0 said:

The textures Tormentos. Of course, if you decide to tone down the textures to medium for 2GB, the consoles will be above in that regard.

But the point remains that, if the PS4 is indeed at 30fps, even 2GB cards can play at high textures with better overall settings than the PS4.

So far, only the texture setting is eating up the VRAM for some reason. I agree 100% that in a couple of years 2GB will not be enough. Like Glez13 said, we need to keep cards at 4GB minimum. I've read that there are 8GB models of the 970 and 980 coming.

I did an extra test, and going from medium to high textures dropped me around 6 to 10fps. For a solid 60fps, I put a mixture of medium and high; I think I'll be playing like this.

Lack of AA in this game is massive bullshit.

If you run them on high on a 2GB card it will hurt performance-wise, which is why DF says so.

Better overall? Didn't you read DF? The PS4 version was a match for high; considering there are GPUs out there more powerful than the PS4 with 2GB that will run into problems running high, that is great.

We don't know how fast the PS4 build is; 30+.


#178  Edited By CrownKingArthur
Member since 2013 • 5262 Posts

On ultra settings on a GTX 970, the memory usage sits at about 2970 MB. I also have an R9 280X, but I haven't tried the game on it. I'm willing to try if it's helpful. Steam family sharing --> would be easy to bench.


#179  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@tormentos said:

@04dcarraher said:

lol, such BS.... My poor 2GB video card is pulling 60+ fps at 1080p on ultra settings with high textures in Mordor; with ultra textures it was around 50 fps. And Titanfall's VRAM requirement is also BS, since a GTX 650 Ti 2GB can get nearly a 60 fps average and even a 1GB GTX 750 is able to average 50 fps.... What's the point of having 3GB of VRAM for either console GPU when they don't even have the power to use all the bells and whistles?

Let's see, who do we believe: DF and the fu**ing maker of the game, or a blind, biased Sony hater who can't admit being wrong...

What poor video card? I hope you are not talking about that sh** 560 Ti...lol

You were owned on this topic already, butthurt hermit...

I know Sony makes you mad because you had to upgrade in 2013 when you had already upgraded in 2011, but it is time to let it go...lol

Not 1 but 2 GPUs on that chart have 2 versions, one with 1GB and one with 2GB, and in both cases memory was a freaking bottleneck in Skyrim, PROVEN, by as much as 31 FPS on the same GPU. That is sad.

Basically, that is a way for GPU makers to hold back performance and force you to upgrade.

lol, you're in a big state of denial..... You're the one who's butthurt..... It's funny because that 560 handled Skyrim on ultra just fine; so sad for your long-ass loading times, worse graphics, and broken DLC on PS3.... It's funny because you act like upgrading is a bad thing.... And no, 2GB does not hold you back at 1080p, lol, but the piss-poor GPU in the X1 and the mediocre GPU in the PS4 do.... It's funny because a GTX 760 averages 65 fps, all on ultra with high textures; explain that, when your precious PS4, which uses between 2-3GB for VRAM, gets only 30 fps and doesn't even use all the graphical features. How sad; you need help. Here are some real Skyrim benchmarks that are newer than that out-of-date crap you keep spewing as proof.


#180 04dcarraher
Member since 2004 • 23829 Posts
@tormentos said:

@deadline-zero0 said:

The textures Tormentos. Of course, if you decide to tone down the textures to medium for 2GB, the consoles will be above in that regard.

But the point remains that, if the PS4 is indeed at 30fps, even 2GB cards can play at high textures with better overall settings than the PS4.

So far, only the texture setting is eating up the VRAM for some reason. I agree 100% that in a couple of years 2GB will not be enough. Like Glez13 said, we need to keep cards at 4GB minimum. I've read that there are 8GB models of the 970 and 980 coming.

I did an extra test, and going from medium to high textures dropped me around 6 to 10fps. For a solid 60fps, I put a mixture of medium and high; I think I'll be playing like this.

Lack of AA in this game is massive bullshit.

If you run them on high on a 2GB card it will hurt performance-wise, which is why DF says so.

Better overall? Didn't you read DF? The PS4 version was a match for high; considering there are GPUs out there more powerful than the PS4 with 2GB that will run into problems running high, that is great.

We don't know how fast the PS4 build is; 30+.

Oh yes, 2GB hurts performance so much that we can still get 60+ fps while the PS4 is only 30+ fps. Calling the DF review wrong.


#181 clyde46
Member since 2005 • 49061 Posts

@tormentos said:

@04dcarraher said:

lol, such BS.... My poor 2GB video card is pulling 60+ fps at 1080p on ultra settings with high textures in Mordor; with ultra textures it was around 50 fps. And Titanfall's VRAM requirement is also BS, since a GTX 650 Ti 2GB can get nearly a 60 fps average and even a 1GB GTX 750 is able to average 50 fps.... What's the point of having 3GB of VRAM for either console GPU when they don't even have the power to use all the bells and whistles?

Let's see, who do we believe: DF and the fu**ing maker of the game, or a blind, biased Sony hater who can't admit being wrong...

What poor video card? I hope you are not talking about that sh** 560 Ti...lol

You were owned on this topic already, butthurt hermit...

I know Sony makes you mad because you had to upgrade in 2013 when you had already upgraded in 2011, but it is time to let it go...lol

Not 1 but 2 GPUs on that chart have 2 versions, one with 1GB and one with 2GB, and in both cases memory was a freaking bottleneck in Skyrim, PROVEN, by as much as 31 FPS on the same GPU. That is sad.

Basically, that is a way for GPU makers to hold back performance and force you to upgrade.

Show me another game that does that and is not a broken, buggy mess.


#182  Edited By DEadliNE-Zero0
Member since 2014 • 6607 Posts

@tormentos said:

If you run them on high on a 2GB card it will hurt performance-wise, which is why DF says so.

Better overall? Didn't you read DF? The PS4 version was a match for high; considering there are GPUs out there more powerful than the PS4 with 2GB that will run into problems running high, that is great.

We don't know how fast the PS4 build is; 30+.

What? Dude

PS4:

1080p, High settings, High texture quality, 30fps (Maybe, we're still waiting for confirmation)

PC with 2GB VRAM GPU, mid performance model:

1080p, Ultra settings, High texture quality, 30fps (capped). Potential stuttering may occur from time to time

Like I said, so far only textures require higher VRAM. Everything else is just like normal. If the trend continues, just tone down the textures for 2GB, if it's that bad. It's not like one needs to go "PC MASTER RACE I WILL NOT PLAY BELOW CONSOLES!". It sucks, but it's easy to fix.

Also, if the PS4 is indeed 30+, all we need to do is set mid-range PCs to console settings and compare frame rates on both. I bet the PS4 jumps around 30-40 (maybe 45) and PCs go around 50-60fps.

And let's not forget there's 1440p resolution. I don't have that display, but who knows if it's possible to cap it at 30fps with everything on ultra on a 2GB 670, 760, 770, etc. If somebody wants to provide those benchmarks, I'd be thankful.


#183 mikhail
Member since 2003 • 2697 Posts

So, so funny that PS4 owners were laughing at the Xbox One version for potentially being 30 fps when it turns out that is exactly what the PS4 runs at, too. Damage control time!

Man, these "next gen" consoles are so fucking lame.


#184  Edited By 04dcarraher
Member since 2004 • 23829 Posts

GTX 760 with Mordor's benchmark tool: VRAM usage maxed at 1998MB, ultra settings with high textures at 1080p.


#185 Legend002
Member since 2007 • 13405 Posts

My GTX 970 2-way SLI will chew this game up.


#186  Edited By miiiiv
Member since 2013 • 943 Posts
@04dcarraher said:

GTX 760 with Mordor's benchmark tool: VRAM usage maxed at 1998MB, ultra settings with high textures at 1080p.

Digital Foundry has been proven wrong so many times lately that their credibility has long since gone out the window. Though 2GB of VRAM could become a problem for upcoming games on high settings even at 1080p, it's unlikely that the next-gen consoles can run any of those games at high settings (or without other compromises) due to their lack of power.


#187 tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

lol, you're in a big state of denial..... You're the one who's butthurt..... It's funny because that 560 handled Skyrim on ultra just fine; so sad for your long-ass loading times, worse graphics, and broken DLC on PS3.... It's funny because you act like upgrading is a bad thing.... And no, 2GB does not hold you back at 1080p, lol, but the piss-poor GPU in the X1 and the mediocre GPU in the PS4 do.... It's funny because a GTX 760 averages 65 fps, all on ultra with high textures; explain that, when your precious PS4, which uses between 2-3GB for VRAM, gets only 30 fps and doesn't even use all the graphical features. How sad; you need help. Here are some real Skyrim benchmarks that are newer than that out-of-date crap you keep spewing as proof.

Once again, your screen has no 4xMSAA; it has simple 4xAA, which isn't the same.

Second, it is 1080p, not 1200p.

Memory has been a bottleneck pretty much for as long as I can remember; the more memory you have, the bigger the game and the more things at once. Wasn't the Saturn superior to the PS1 at 2D games because it had 1MB of extra memory? How about when it received the 4MB RAM cart, which allowed a freaking perfect arcade port of X-Men vs. Street Fighter?

And not the bastardization the PS1 version was, with missing animation and crappy frame rates.

The same with the N64 and the RAM pack, which allowed higher-resolution textures as well.

Memory is as big a bottleneck as power is, and this game has been confirmed, even by a poster here, to consume more than 2GB of memory on high, meaning 2GB cards will have problems because they have to constantly swap textures in and out, and that gets you stutter and probably pop-in too.


#188  Edited By tormentos
Member since 2003 • 33784 Posts
@04dcarraher said:

GTX 760 with Mordor's benchmark tool: VRAM usage maxed at 1998MB, ultra settings with high textures at 1080p.

You are an idiot. If you have 2GB of RAM, how the fu** will the game use more than 2? The game will not allocate more than your card has; it will simply switch textures in and out constantly, which causes stutter and pop-in, since the textures have to constantly be swapped in and out of memory.

It was already posted here.

2700MB on high, but his card has more RAM than a GTX 670, which has 2GB; it actually went higher than 2700MB.

@miiiiv said:

Digital Foundry has been proven wrong so many times lately that their credibility has long since gone out the window. Though 2GB of VRAM could become a problem for upcoming games on high settings even at 1080p, it's unlikely that the next-gen consoles can run any of those games at high settings (or without other compromises) due to their lack of power.

DF wasn't proven wrong, you sad, butthurt fanboy. If your card is 2GB, the game will not allocate more than 2GB; how the fu** would it? That GTX 670 is 2GB, so the max it will use is 2GB, which is what his test is showing. Instead, what the game will do is swap textures in and out constantly, because it doesn't have the needed amount, which is why DF says it causes stutter. Do you see that screen I posted? That is high on a 4GB GPU; since the GPU has more than 2GB, the game allocates more than 2GB, but as you can see it doesn't reach 3GB, though it stays close.

@deadline-zero0

What? Dude

PS4:

1080p, High settings, High texture quality, 30fps (Maybe, we're still waiting for confirmation)

PC with 2GB VRAM GPU, mid performance model:

1080p, Ultra settings, High texture quality, 30fps (capped). Potential stuttering may occur from time to time

Like I said, so far only textures require higher VRAM. Everything else is just like normal. If the trend continues, just tone down the textures for 2GB, if it's that bad. It's not like one needs to go "PC MASTER RACE I WILL NOT PLAY BELOW CONSOLES!". It sucks, but it's easy to fix.

Also, if the PS4 is indeed 30+, all we need to do is set mid-range PCs to console settings and compare frame rates on both. I bet the PS4 jumps around 30-40 (maybe 45) and PCs go around 50-60fps.

And let's not forget there's 1440p resolution. I don't have that display, but who knows if it's possible to cap it at 30fps with everything on ultra on a 2GB 670, 760, 770, etc. If somebody wants to provide those benchmarks, I'd be thankful.

Yeah, the settings are ultra because the GPU has extra power over the PS4 GPU, but is it ultra when the textures have to be on high?

That is a console-like trade-off, and so is capping the game at 30 and still getting stutter when the PS4 version is uncapped and doesn't show this stutter.

1440p will consume a lot more memory than 1080p, so that would only worsen the stutter problem, considering that you still get stutter even when capped.


#189 darkangel115
Member since 2013 • 4562 Posts

@tormentos

So you replied 6 times after I posted this

@delta3074 said:
@tormentos said:

@darkangel115 said:

Funny how when I mentioned 4K before you tried calling me a fake hermit; then, when I explained that you didn't read my post and that my TV upscales my PS4 and X1 to 4K regardless, you never replied. So what's the difference if everything I play is upscaled to 4K? I don't notice any difference in any of the games. But I see you still keep up the good fight in being a stupid fanboy. Good job ignoring everything dumb you say ;)

What TV do you have that upscales your games to 4K?

Upscaling is blowing up pixels; it is not the same. It introduces artifacts and blurs the image.

Take 720p, for example: it is 1280x720, which = 921,600 pixels.

1080p is 1920x1080, which = 2,073,600 pixels.

Now, if your game is natively 1080p, you get 2,073,600 pixels.

If your game is natively 720p, you get 921,600 pixels. If you want to display that 720p game in 1080p and you upscale it, what the hardware does is stretch those 921,600 pixels to fit a space that should be filled by 2,073,600 pixels, which is why the image blurs: you are stretching the pixels.

If you upscale a 720p game to 4K it would be a total blur fest, but I do believe that you don't have such a TV, and I don't think a 4K TV upscales your PS4 or Xbox One games on its own; that is a job for your console, which is not doing it now either.

The fact that you claim your TV upscales to 4K is enough to know you are a blind, biased lemming who knows sh** about what you're talking about.

Bang on the money there, tormentos: it's the console that upscales the games, not the TV. The guy's talking utter bollocks.

Remember, we used to 'debate' the difference between the hardware upscaler in the 360 (which coincidentally caused the E74 errors on the 360) versus the software upscaler in the PS3.

Well pointed out; on the ball today, eh?

lol OMG you guys are so dumb. Newsflash: all 4K TVs have a built-in upscaler, and no, upscaling isn't stretching.

FYI this is the TV I have:

http://www.samsung.com/us/video/tvs/UN65HU8550FXZA

[quote]UHD Upscaling delivers the complete UHD picture experience with a proprietary process including signal analysis, noise reduction, UHD upscaling and detail enhancement to seamlessly upconvert SD, HD or full HD content to UHD-level picture quality.[/quote]

So @tormentos and @delta3074: it's sad that, especially tormentos, who already has 0 knowledge and whose every post is copy-and-paste from elsewhere, wouldn't look into it before making those allegations. So now I wonder what lame comeback you guys will have.

_____________

and none of them were to me, where I proved you wrong. You just make yourself look so bad every day.


#190 chikenfriedrice
Member since 2006 • 13561 Posts

Guess this thread can die now since the rumor was false, eh, mentos the crap-thread maker.


#191 04dcarraher
Member since 2004 • 23829 Posts

@tormentos said:
@04dcarraher said:

GTX 760 with Mordor's benchmark tool: VRAM usage maxed at 1998MB, ultra settings with high textures at 1080p.

You are an idiot. If you have 2GB of RAM, how the fu** will the game use more than 2? The game will not allocate more than your card has; it will simply switch textures in and out constantly, which causes stutter and pop-in, since the textures have to constantly be swapped in and out of memory.

It was already posted here.

2700MB on high, but his card has more RAM than a GTX 670, which has 2GB; it actually went higher than 2700MB.


You really have no clue how hardware works, do you? Games will automatically adjust data transfer to match the amount of VRAM you have. "Switch textures in and out constantly, which causes stutter and pop-in": that quote is total bullshit right here. There is a difference between memory being reserved and memory actually being used. Swapping data in and out of memory is normal for all GPUs, and it does not cause stutter if you have enough system memory and the hard drive is not being bogged down. Where in the hell are you getting this pop-in idea? Oh wait, I know where: DF's unimpressive review. There are reports all over the world showing that strong 2GB cards are able to run the game without issue. Monolith claimed that 3GB of VRAM is required for high settings; however, that was nothing more than an exaggeration. Get that through your head.


#192 04dcarraher
Member since 2004 • 23829 Posts

@tormentos said:


Once again, your screen has no 4xMSAA; it has simple 4xAA, which isn't the same.

Second, it is 1080p, not 1200p.


Again, you're being a typical "you". 4xAA is 4xMSAA; you wouldn't know that since you're a console fanboy..... Also, you're going to say that about a 10% increase in pixels is going to cut performance in half? God, you're desperate and have no idea what you're posting, making yourself look like even more of a fool....


#193  Edited By tormentos
Member since 2003 • 33784 Posts

@darkangel115 said:

@tormentos

So you replied 6 times after I posted this

lol OMG you guys are so dumb. Newsflash: all 4K TVs have a built-in upscaler, and no, upscaling isn't stretching.

FYI this is the TV I have:

http://www.samsung.com/us/video/tvs/UN65HU8550FXZA

[quote]UHD Upscaling delivers the complete UHD picture experience with a proprietary process including signal analysis, noise reduction, UHD upscaling and detail enhancement to seamlessly upconvert SD, HD or full HD content to UHD-level picture quality.[/quote]

So @tormentos and @delta3074: it's sad that, especially tormentos, who already has 0 knowledge and whose every post is copy-and-paste from elsewhere, wouldn't look into it before making those allegations. So now I wonder what lame comeback you guys will have.

_____________

and none of them were to me, where I proved you wrong. You just make yourself look so bad every day.

Really? Please take some screens of your TV with a note with our names and post them here; that's for starters.

Second, any process that involves upscaling means blowing up pixels, you moron. The fact that you even debate this proves you know sh** about what you're talking about. Hurray, I can play 4K on my TV because the TV upscales the image to 4K..Hermits am cry

The PS4 can run Tomb Raider and Sniper Elite at more than 30 FPS in 4K...lol

Upscaling using a TV scaler is the same as using a console to scale to 4K; in the end you are blowing up pixels, moron. The game wasn't designed for 4K, so yeah, blur fest galore. What you are quoting there is a Samsung ad..lol

I have been an HDTV owner since the early 2000s, and upscaled isn't the same as native, just like an upscaled DVD isn't the same as a native Blu-ray.

Basically, you are a silly lemming who can't admit he has weak hardware.

Show us your 4K TV..lol

@chikenfriedrice said:

Guess this thread can die now since the rumor was false, eh, mentos the crap-thread maker.

My god, you are a butthurt lemming; this thread isn't even anti-Xbox One..hahaha

And it is not even downplaying the Xbox One over the so-called rumor; on the contrary, I was defending it..hahaha

I guess your fanboyism is just too damn big to let you see that..lol


#194 DEadliNE-Zero0
Member since 2014 • 6607 Posts

@tormentos said:

@deadline-zero0

What? Dude

PS4:

1080p, High settings, High texture quality, 30fps (Maybe, we're still waiting for confirmation)

PC with 2GB VRAM GPU, mid performance model:

1080p, Ultra settings, High texture quality, 30fps (capped). Potential stuttering may occur from time to time

Like I said, so far only textures require higher VRAM. Everything else is just like normal. If the trend continues, just tone down the textures for 2GB, if it's that bad. It's not like one needs to go "PC MASTER RACE I WILL NOT PLAY BELOW CONSOLES!". It sucks, but it's easy to fix.

Also, if the PS4 is indeed 30+, all we need to do is set mid-range PCs to console settings and compare frame rates on both. I bet the PS4 jumps around 30-40 (maybe 45) and PCs go around 50-60fps.

And let's not forget there's 1440p resolution. I don't have that display, but who knows if it's possible to cap it at 30fps with everything on ultra on a 2GB 670, 760, 770, etc. If somebody wants to provide those benchmarks, I'd be thankful.

Yeah, the settings are ultra because the GPU has extra power over the PS4 GPU, but is it ultra when the textures have to be on high?

Holy shit, Tormentos. Is it ultra? Yes, everything besides textures is ultra. BTW, not all settings have an ultra option in SoM; I believe shadows only go up to high anyway.

That is a console-like trade-off, and so is capping the game at 30 and still getting stutter when the PS4 version is uncapped and doesn't show this stutter.

You have to be trolling me. I capped it because it was jumping between 40 and 50-55. Most likely still a better frame-rate average than the PS4 (again, awaiting confirmation), even with ultra/high (textures). I'm playing at 60fps just by toning a few settings one slot down. I was playing at 30 yesterday to see how maxing out would perform.

1440p will consume a lot more memory than 1080p, so that would only worsen the stutter problem, considering that you still get stutter even when capped.

And again, that's why I said someone should test it to see how the frame rate and stutter would be at a higher resolution.

Dude, a 2GB mid-range card is more than enough for the next year. Unified memory in the consoles isn't enough to counter the brute power of GPUs.

Hence why WD runs at 900p high 30fps on the PS4, and I can run it at 1080p ultra, including the ultra textures that require 3GB of VRAM. Stuttering was a bit worse in that game, but nothing special.


#195 chikenfriedrice
Member since 2006 • 13561 Posts

@tormentos: I'm sure you had nothing but good intentions lol


#196  Edited By miiiiv
Member since 2013 • 943 Posts
@tormentos said:
@04dcarraher said:

GTX 760 with Mordor's benchmark tool: VRAM usage maxed at 1998MB, ultra settings with high textures at 1080p.

You are an idiot. If you have 2GB of RAM, how the fu** will the game use more than 2? The game will not allocate more than your card has; it will simply switch textures in and out constantly, which causes stutter and pop-in, since the textures have to constantly be swapped in and out of memory.

It was already posted here.

2700MB on high, but his card has more RAM than a GTX 670, which has 2GB; it actually went higher than 2700MB.

@miiiiv said:

Digital Foundry has been proven wrong so many times lately that their credibility has long since gone out the window. Though 2GB of VRAM could become a problem for upcoming games on high settings even at 1080p, it's unlikely that the next-gen consoles can run any of those games at high settings (or without other compromises) due to their lack of power.

DF wasn't proven wrong, you sad, butthurt fanboy. If your card is 2GB, the game will not allocate more than 2GB; how the fu** would it? That GTX 670 is 2GB, so the max it will use is 2GB, which is what his test is showing. Instead, what the game will do is swap textures in and out constantly, because it doesn't have the needed amount, which is why DF says it causes stutter. Do you see that screen I posted? That is high on a 4GB GPU; since the GPU has more than 2GB, the game allocates more than 2GB, but as you can see it doesn't reach 3GB, though it stays close.

Yeah, the settings are ultra because the GPU has extra power over the PS4 GPU, but is it ultra when the textures have to be on high?

That is a console-like trade-off, and so is capping the game at 30 and still getting stutter when the PS4 version is uncapped and doesn't show this stutter.

1440p will consume a lot more memory than 1080p, so that would only worsen the stutter problem, considering that you still get stutter even when capped.

DF's word isn't worth much these days; the most recent example is Ryse, which seems close to low settings on PC, and we both know DF exaggerates the Xbone version a bit.
And if a graphics card runs out of VRAM, I guess the textures stream from system RAM, which causes the frame drops and texture pop-in. But if 04dcarraher is running the game at ultra with high textures at an average of 67 fps, I'd say it runs pretty well. I'm not denying that 2GB cards will suffer in the future. In 2017 they could even be "screaming", as you said, but by then I'm sure every multiplat on the PS4 will be equivalent to low settings on PC or even worse.


#197 darkangel115
Member since 2013 • 4562 Posts

@tormentos said:

@darkangel115 said:

@tormentos

So you replied 6 times after I posted this

lol OMG you guys are so dumb. Newsflash: all 4K TVs have a built-in upscaler, and no, upscaling isn't stretching.

FYI this is the TV I have:

http://www.samsung.com/us/video/tvs/UN65HU8550FXZA

[quote]UHD Upscaling delivers the complete UHD picture experience with a proprietary process including signal analysis, noise reduction, UHD upscaling and detail enhancement to seamlessly upconvert SD, HD or full HD content to UHD-level picture quality.[/quote]

So @tormentos and @delta3074: it's sad that, especially tormentos, who already has 0 knowledge and whose every post is copy-and-paste from elsewhere, wouldn't look into it before making those allegations. So now I wonder what lame comeback you guys will have.

_____________

and none of them were to me, where I proved you wrong. You just make yourself look so bad every day.

Really? Please take some screens of your TV with a note with our names and post them here; that's for starters.

Second, any process that involves upscaling means blowing up pixels, you moron. The fact that you even debate this proves you know sh** about what you're talking about. Hurray, I can play 4K on my TV because the TV upscales the image to 4K..Hermits am cry

The PS4 can run Tomb Raider and Sniper Elite at more than 30 FPS in 4K...lol

Upscaling using a TV scaler is the same as using a console to scale to 4K; in the end you are blowing up pixels, moron. The game wasn't designed for 4K, so yeah, blur fest galore. What you are quoting there is a Samsung ad..lol

I have been an HDTV owner since the early 2000s, and upscaled isn't the same as native, just like an upscaled DVD isn't the same as a native Blu-ray.

Basically, you are a silly lemming who can't admit he has weak hardware.

Show us your 4K TV..lol

@chikenfriedrice said:

Guess this thread can die now since the rumor was false, eh, mentos the crap-thread maker.

My god, you are a butthurt lemming; this thread isn't even anti-Xbox One..hahaha

And it is not even downplaying the Xbox One over the so-called rumor; on the contrary, I was defending it..hahaha

I guess your fanboyism is just too damn big to let you see that..lol

First off, blowing up pixels is stretching; upscaling doesn't do that, it adds in the extra pixels. You still see the same number of pixels on your screen as native; the difference is that some of them come from the source and the rest come from the upscaling software. I also never said "hermits am cry"; only you stupid fanboys do that stuff. I just pointed out that whether I play my PS4 or X1, regardless of the resolution of the game, it still gets upscaled to 4K.

As far as TR goes, who cares about "theoretically"; the point is, it doesn't. Also, I can say that upscaled 4K content is not blurry at all; it actually looks really good, even in fast-motion sports on TV.

Now, normally I don't do this, but I will make a one-time exception and post a pic of the TV just to shut you up; this will be the one and only time I do "picture proof" here. I'll post it later tonight when I'm home.


#198  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@miiiiv said:

DF's word isn't worth much these days; the most recent example is Ryse, which seems close to low settings on PC, and we both know DF exaggerates the Xbone version a bit.

And if a graphics card runs out of VRAM, I guess the textures stream from system RAM, which causes the frame drops and texture pop-in. But if 04dcarraher is running the game at ultra with high textures at an average of 67 fps, I'd say it runs pretty well. I'm not denying that 2GB cards will suffer in the future. In 2017 they could even be "screaming", as you said, but by then I'm sure every multiplat on the PS4 will be equivalent to low settings on PC or even worse.

When a GPU "runs out" of room in its memory pool, it dumps the non-essential data and calls for the incoming needed data to be stored and processed. Frame-rate drops are not necessarily tied directly to the GPU swapping out data; they can be caused by the CPU having to send the extra data, eating into CPU cycles needed for other tasks. Now, if the GPU does not have a wide enough or fast enough memory bus, that can lead to frame-rate drops. However, for 1080p, a 256-bit bus with 6GHz GDDR5 is plenty for it not to be a problem. Texture pop-in is not always related to VRAM usage but to the GPU's processing power: is the GPU fast enough to read and write the data to render what needs to be displayed? Texture pop-in can also be caused by a lack of system memory, which in turn forces streaming data off the hard drive; CPU processing power can also contribute to the issue.
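For reference, the bandwidth behind that "256-bit bus with 6GHz GDDR5" claim works out with the standard back-of-the-envelope formula below (the PS4 comparison line is an addition; its 256-bit, 5.5 GT/s GDDR5 figures are the commonly cited specs):

[code]
# Peak theoretical GDDR5 bandwidth: bus width in bits / 8 gives bytes
# per transfer; multiply by the effective data rate in GT/s.

def gddr5_bandwidth_gb_s(bus_width_bits, effective_rate_gt_s):
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_rate_gt_s

# The case from the post above: 256-bit bus at 6 GT/s effective.
print(gddr5_bandwidth_gb_s(256, 6.0))   # 192.0 GB/s
# PS4 for comparison: 256-bit bus at 5.5 GT/s effective.
print(gddr5_bandwidth_gb_s(256, 5.5))   # 176.0 GB/s
[/code]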


#199  Edited By Tighaman
Member since 2006 • 1038 Posts

900p at 30fps locked, or 1080p at 30-60fps: your choice.


#200 clone01
Member since 2003 • 29824 Posts

I dunno. Maybe my eyes aren't all that good, but I can't tell, personally. Granted, this isn't comparable to looking at it on an actual console hooked to a TV. Methinks people worry about stupid shit too much.

http://www.gamespot.com/videos/middle-earth-shadow-of-mordor-graphics-comparison/2300-6421635/