Ubisoft bashes Monolith and offers 900p explanation.


This topic is locked from further discussion.


#51  Edited By deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

@Jankarcop:

No, the fact that developers take the time and effort to put out the best product they can, given the hardware. Even though I don't care as much for today's game design, what many developers accomplished on ps3 and 360 was very impressive.

On PC it's simply, "Okay ladies, here's your requirements. Oh, those are the minimum; to actually play the game halfway decent you'll need THIS build."


#52 millerlight89
Member since 2007 • 18658 Posts

I'm glad I don't have ridiculous standards. I just like a good game and to have fun. The other shit is whatever.


#53  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@tormentos said:

@04dcarraher said:

You have no idea what the hell you're talking about, so stop it.... When the CPU can not feed the GPU correctly, the GPU sits and waits for data, which gives you lower GPU usage, aka lower performance. Resolutions or settings do not matter when the GPU is starving for data. It's called a CPU bottleneck....

The CPU has all of its workloads plus feeding the frames to the GPU. Once it passes the data to the GPU, the GPU can render the frames. The larger the frame, the more processing is required, which means the longer it takes for the GPU to finish. At lower resolutions the frames are processed much faster, since they are smaller, so the CPU has to feed frames more quickly. That in turn shows where the bottleneck is. So yes, resolution and performance can be determined by how well the CPU can feed the GPU.

Now, the fact that they kept the resolutions the same has no bearing on whether the PS4 could handle 1080p; the 30 fps limit does. If the X1 and PS4 had much stronger CPUs, a 60 fps standard would have been possible, and then you would have seen 900p/60 fps on X1 and 1080p/60 fps on PS4.

The compute tests that were done are not in the engine. The problem is that switching some or most of the intensive jobs onto the GPU would have forced them to recode the game's engine; we may see them do that with newer titles. Even then you would still have to compromise something, since some of the GPU's resources would then be going to the other tasks.

When your CPU can't feed your GPU, what will be affected is frame rate; slowdowns and frame drops will happen, not resolution drops.

Run a CPU-intensive app on PC and switch resolution from 1080p to 720p; the CPU usage will remain the same, because it is your CPU that is doing the work.

The GPU will stall at 900p or 1080p if your CPU can't keep up; it's the same shit, and it's damage control by Ubisoft, which in August made a compute benchmark where the PS4 basically destroyed the Xbox One by doubling it.

By that time this whole debacle wasn't known; Ubisoft still hasn't said where the fu** they spent the PS4's extra GPU power. Now they claim to have 25GB of baked lighting data in the game.

Ubi is running damage control and so are you; if you are CPU bottlenecked at 1080p, lowering the resolution will not lower CPU usage, period. That is stupid.

This is a yearly milked game that could have been 1080p on PS4, and Ubi didn't try because of MS.. lol

1. Not talking about the "resolution parity," just the CPU causing the 30 fps cap... as with many new games.

2. No, actually, CPU usage can go up. Lowering the resolution of a game shifts the load toward the CPU: as the resolution drops, less strain is placed on the graphics card because there are fewer pixels to render. At lower resolutions, the frame rate is limited by the CPU's speed; the GPU can render frames faster, so it becomes about how quickly the CPU can send those frames to the GPU.

3. It's not damage control, you cow; the X1 and PS4 CPUs just do not have enough ass to do everything. They will need to transfer tasks to the GPU if they expect to get more complex worlds.
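The CPU-feeds-GPU argument above can be sketched as a toy frame-time model: a frame takes as long as the slower of the two stages, so once the CPU stage dominates, dropping resolution buys nothing. All timings below are made up purely to illustrate the bottleneck, not measurements from either console.

```python
# Toy model: a frame takes as long as the slower pipeline stage.
# All timings are hypothetical, chosen only to illustrate a CPU bottleneck.
def frame_time_ms(cpu_ms, gpu_ms_per_mpixel, width, height):
    gpu_ms = gpu_ms_per_mpixel * (width * height / 1e6)
    return max(cpu_ms, gpu_ms)

cpu_ms = 33.0  # pretend AI/simulation costs ~33 ms per frame
for w, h in [(1280, 720), (1600, 900), (1920, 1080)]:
    t = frame_time_ms(cpu_ms, 12.0, w, h)
    print(f"{w}x{h}: {t:.1f} ms ({1000 / t:.0f} fps)")
# every resolution prints 33.0 ms (30 fps): the CPU, not the GPU, is the limit
```

Under these assumed numbers the GPU is never the limit, so 720p, 900p, and 1080p all land at 30 fps; only a faster CPU (or moving work onto GPU compute) would raise the frame rate.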


#54 Cloud_imperium
Member since 2013 • 15146 Posts

@speedfreak48t5p said:

PC gaming for life! lol consoles.


#55  Edited By Jankarcop
Member since 2011 • 11058 Posts

@millerlight89 said:

I'm glad I don't have ridiculous standards. I just like a good game and to have fun. The other shit is whatever.

So 1080p is a ridiculous standard now.

Consolites truly have low standards.


#56 lostrib
Member since 2009 • 49999 Posts

@Jankarcop said:

@millerlight89 said:

I'm glad I don't have ridiculous standards. I just like a good game and to have fun. The other shit is whatever.

So 1080p is a ridiculous standard now.

Consolites truly have low standards.

So you have low standards? Or do you just not game at all, since you're not a PC gamer?


#57  Edited By jhonMalcovich
Member since 2010 • 7090 Posts

Peasants' first-world problems. Couldn't care less.


#58  Edited By hrt_rulz01
Member since 2006 • 22372 Posts

No matter what anybody, developer or not, says, you're going to get idiots going on and on about the differences... however small they are. They have nothing better to do.


#59 wolverine4262
Member since 2004 • 20832 Posts

@clyde46 said:

How fucking weak are those consoles man?

I'm starting to wonder the same. The assumption going in was that 1080p would be standard. But the assumption last gen was that 720p would be standard, and that wasn't the case either.

Now, we also gotta remember these dev statements should be taken with a grain of salt. The GRAW team said they were using 80% of the 360's full potential. That was a load of shit.


#60 Animal-Mother
Member since 2003 • 27362 Posts

If this is true and someone from Ubi actually came out and said this, Ubi is gonna have a bad time.

Shadow of Mordor is one of the best games and biggest surprises this year, while they continue to pile-drive Assassin's Creed into the dirt.


#61 Cloud_imperium
Member since 2013 • 15146 Posts

@cantloginnow said:

It's funny: the hermits, lems, and sheep all accept the limitations of console hardware and really don't give a shit for the most part, but the cows are completely obsessed and in complete denial, making fools of themselves every day. You guys need to let it go. It's not something you can win, and no one really gives a shit. If I owned a PS4 I'd be bitching more about how big of a disappointment this console and its crappy library are than about what resolution a game is running at.

Sales and resolution... you guys really have nothing going on on the PS4, do you?

I don't blame them. They have been brainwashed by Sony's clever marketing since the "teh powah of cell processor" days.


#62 MK-Professor
Member since 2009 • 4214 Posts

@04dcarraher said:

@ReadingRainbow4 said:

CPU=/ a reason for a lower resolution.

Hermits you have to try better than that.

If the CPU can't feed the GPU enough data, you get lower GPU performance.

And the solution to that is to increase the res in order to fully utilize the GPU without any fps loss, but Ubisoft for some inexplicable reason didn't do that, which is why people are complaining.


#63 EducatingU_PCMR
Member since 2013 • 1581 Posts

PS4 is weak shit confirmed, lol

Tormentos' dreams crushed


#64 btk2k2
Member since 2003 • 440 Posts

@04dcarraher said:

@ReadingRainbow4 said:

@04dcarraher said:

@ReadingRainbow4 said:

CPU=/ a reason for a lower resolution.

Hermits you have to try better than that.

if cpu cant feed the gpu enough data you get lower gpu performance.

That's assuming there's a bottleneck in effect.

With a game like AC Unity, where it's rendering hundreds upon hundreds of NPCs, a lot of the CPU cycles are going toward that task, which in turn causes the GPU to sit and wait for data, lowering GPU performance. Bottlenecking is dependent on what is being done.

True for adding more NPCs to the scene and running the AI routines for them. However, a resolution increase is entirely a GPU affair and will have no impact on the CPU. If the Xbox One can do it at 900p, the PS4 can do it at 1080p, and that is a fact.
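The scaling claim above can be checked with quick arithmetic on the commonly quoted spec-sheet numbers (raw ratios only; real games scale less cleanly than this, since bandwidth, ROPs, and engine behaviour also matter):

```python
# 1080p has 44% more pixels than 900p; the PS4's GPU has roughly 40% more
# raw compute than the Xbox One's (commonly cited launch spec figures).
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_900p = 1600 * 900     # 1,440,000
print(pixels_1080p / pixels_900p)   # 1.44

ps4_tflops, x1_tflops = 1.84, 1.31  # spec-sheet single-precision TFLOPS
print(ps4_tflops / x1_tflops)       # ~1.40
```

The extra pixel cost of 1080p roughly matches the raw compute gap, which is why the "X1 at 900p implies PS4 at 1080p" argument keeps coming up in this thread.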


#65 nutcrackr
Member since 2004 • 13032 Posts

Why are these idiots still talking? It gets worse whenever they say anything.


#66 mikhail
Member since 2003 • 2697 Posts

@nutcrackr said:

Why are these idiots still talking? It gets worse whenever they say anything.

Watching these guys argue about the technical aspects of frame rate and resolution is like watching a few chimps argue about theoretical physics.


#67  Edited By tdkmillsy
Member since 2003 • 5857 Posts

Two separate things

The PS4 couldn't handle 1080p, probably because they would have had to sacrifice effects or couldn't iron out the problems enough, so they kept it at 900p.

FPS was 1 or 2 higher on PS4, so they kept both at 30 fps. The GPU can take some load off the CPU, hence the 1 or 2 extra frames, but the whole setup isn't strong enough to reach 60 fps, so rather than have a varying frame rate they locked it.

Console owners should be glad it's not running at 1080p, as that should mean better effects and graphics overall. This is what it should be like going forward.

The shit about keeping PC the same is just wrong though.


#68  Edited By tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

1. Not talking about the "resolution parity" just namely the cpu causing the 30 fps cap... as with many new games.

2. No actually cpu usage can go up, Lowering the resolution of a game or program increases the load on a CPU. As the resolution lowers, less strain is placed on the graphics card because there are less pixels to render, but the strain is put on to the CPU. At lower resolutions, the frame rate is limited to the CPU's speed. Simply that at low resolutions, the GPU can render more speeding up the fps. So it becomes about how quickly the CPU can send those frames to the GPU.

3. its not damage control you cow, the X1 and PS4 cpu just does not have enough ass to do everything. They will need to transfer tasks to the gpu if they expect to get more complex worlds.

1. I didn't argue about the frames.

2. Again, lowering your resolution will not free CPU time; if you are CPU bound at 1080p you will be at 900p as well. That is a fact, not my opinion. Being CPU limited has two fixes: increase the CPU power or lower the CPU usage, and you don't lower CPU usage by lowering the resolution, which is something that rests totally on the GPU.

Rendering more pixels isn't the issue here: at 1920x1080 the pixel count stays the same whether you have 1 NPC or 300. If you are CPU bound at 1080p because you have 200 dancers, you still have 200 dancers at 900p, and the one doing the pixel rendering is the GPU, not the CPU, in any way.

3. Yes, it is damage control, and Ubi has a chart which shows the PS4 destroying the Xbox One on the GPU side. They could have done it; they couldn't care less, because this is a yearly milked game. The fact that Ubi admitted it held back the PS4, and the fact that Ubi hid settings on PC so as not to shame consoles, proves my point: the problem wasn't the PS4, it was Ubi.

Hell, an employee was saying yesterday that the frame difference was 1 or 2 frames, so it got locked at 30 fps. Yeah, like an R265 would have a 2-frame advantage over a 250X.. lol

@Cloud_imperium said:

@cantloginnow said:

Its funny,the hermits.lems and sheep all accept the limitations of console hardware and really dont give a shit for the most part but the Cows are completely obsessed and in complete denial making fools of themselves everyday.You guys need to let it go.Its not something you can win and no one really gives a shit.If I owned a ps4 I'd be bitching more about how big of a disappointment this console and its crappy library is than what resolution a game is running at.

Sales and resolution,you guys really have nothing going on on the ps4 do you?

I don't blame them . They have been brainwashed by clever marketing of Sony since "teh powah of cell processor" days .

You are an idiot, and considering that you defend the Xbox One and hide behind PC, even more so. That dude claims lems accept the limitations of hardware; in what fu**ing forum do they do that? Because here they don't. Have you seen the last few threads about these games? All are a war because lemmings refuse to admit the Xbox One is the weaker console. From the magic DX12 API to blackace's hidden-GPU arguments, it is a joke; some of you are just too blind.

@MK-Professor said:

and the solution to that is to increase the rez in order to fully utilize the GPU without any fps loss, but Ubisoft for some inexplicable reason didn't do that, that is why people are complaining.

This and 100 times this...

And you know we basically never agree. Some people are playing blind; Ubi hid PC settings on Watch Dogs so as not to make the console version look even worse after the whole downgrade crap on PS4. This company fu**ed up.

They had a parity clause with MS, which is probably what prevented them from having a 1080p version on PS4.


#69 tormentos
Member since 2003 • 33784 Posts
@tdkmillsy said:

Two separate things

PS4 couldn't handle 1080p resolution, probably because they would have to sacrifice effects or couldn't iron out the problems enough. So they kept it at 900p

FPS 1 or 2 more on PS4 so they kept them both at 30fps. GPU can take some load of CPU hence the increase in 1 or 2 frames. But the whole setup isnt large enough to create 60fps so rather than have varied fps they locked it.

Console owners should be glad they not running at 1080p as it should equal better effects and graphics overall. This is what it should be like going forward.

The shit about keeping PC the same is just wrong though.

1. No, not when the PS4 has 560 GFLOPS of extra power; it has been proven already. In fact, AC Black Flag is 1080p on PS4 and 900p on Xbox One, and has better AA on PS4 too.

Games like Shadow of Mordor have higher resolution, better shadows, and more foliage, while beating the Xbox One version 1080p to 900p. In fact, they could have locked both versions at 30 fps and 900p; since the PS4 has extra power, instead of running mid-low settings it would be running mid-high or even high settings. No matter what, the extra power was there.

That is, if you believe that bullshit; every fu**ing benchmark out there shows the contrary. When the fu** has an R265 been just 2 frames faster than an R250X? Or a 7850 2 frames faster than a 7770?

That is total bullshit and goes against Ubisoft's own fu**ing benchmark, which showed that if they moved NPCs to GPU compute the PS4 would have destroyed the Xbox One version; the performance they were getting on PS4 was double that of the Xbox One.

http://www.anandtech.com/bench/product/1127?vs=1126

2 frames my ass.

Pay attention to the tests at the end as well, the ones about GPU compute: see how the difference there is like 30%, but in Ubi's own test on PS4 the difference was 100%. That is because the PS4 is modified to take advantage of compute, which the Xbox One isn't. The PS4 isn't a mighty powerhouse of a console, but it is not hard to see it is stronger than the Xbox One, and that Ubi fu**ed up and is damage controlling left and right.

That is funny, because that is what Ubi did with the PS4 version, because of a deal with MS.




#71 PAL360
Member since 2007 • 30570 Posts

'The game was 9fps 9 months ago. We only achieved 900p at 30fps weeks ago.'

Hmm... I remember, about 9 months ago, Ubisoft confirming that Unity would run at 1080p/60 on both consoles. Well, either they are lying now or they were back then!


#72  Edited By speedfog
Member since 2009 • 4966 Posts

Alrighty, I just hope that the game is fun, that's all. If you really care about resolution/graphics then get the PC version. Don't expect too much from $400 consoles.


#73 btk2k2
Member since 2003 • 440 Posts
@04dcarraher said:

@ReadingRainbow4 said:

CPU=/ a reason for a lower resolution.

Hermits you have to try better than that.

if cpu cant feed the gpu enough data you get lower gpu performance.

Yes, but increasing resolution will not make a difference to CPU utilisation, so if the Xbox One can do it at 900p, the PS4 can do it at 1080p.


#74 04dcarraher
Member since 2004 • 23829 Posts

@btk2k2 said:
@04dcarraher said:

@ReadingRainbow4 said:

CPU=/ a reason for a lower resolution.

Hermits you have to try better than that.

if cpu cant feed the gpu enough data you get lower gpu performance.

Yes but increasing resolution will not make a difference to CPU utilisation so if the Xbox One can do it at 900p the PS4 can do it at 1080p.

However, the frame rate may be even more unstable with the increased resolution.


#75 btk2k2
Member since 2003 • 440 Posts

@04dcarraher: Provided the Xbox One can handle 900p with a stable frame rate, the PS4 can handle it at 1080p with a similar frame rate.

If the Xbox One is struggling, then you would expect the PS4 at 1080p to struggle also.


#76 04dcarraher
Member since 2004 • 23829 Posts

@btk2k2 said:

@04dcarraher: provided the xbox one can handle 900p with a stable framerate the ps4 can hamdle.it at 1080p with a similar framerate.

If the xbox one is struggling then you would expect the ps4 at 1080p to struggle also.

Not necessarily; if the PS4 is using higher quality effects and assets, raising the resolution without adjusting those assets to compensate can lead to unstable frame rates.


#77 cantloginnow
Member since 2013 • 381 Posts

@tormentos: You're one fucked up guy, holy shit.

It's impossible to even respond to your crap; where does someone begin, and how much time are they willing to waste on you? This post is all the effort and time I'm willing to give you. Keep dancing, monkey.


#78 StormyJoe
Member since 2011 • 7806 Posts

I love how people honestly believe there is some sort of "great divide" in performance between the two. While it is undeniable that there is a difference, cows make it out to be PS2 vs. Xbox, which it most certainly is not, not even close.


#79 StormyJoe
Member since 2011 • 7806 Posts

@Sagemode87 said:

@magicalclick said:

When parity achieved between PS3 and Xbox 360, lems don't scream around. When parity achieved between PS4 and XboxOne, cows scream around.

That's because PS3 had the advantage of being the more capable machine. It was just harder to get results from. X1 has no advantage and is much weaker than PS4 ..... How is that the same to you...

No matter how much you scream "There is a huge difference", that does not make it so.


#80 parkurtommo
Member since 2009 • 28295 Posts

I think this is someone impersonating a dev.


#81  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

@StormyJoe said:

@Sagemode87 said:

@magicalclick said:

When parity achieved between PS3 and Xbox 360, lems don't scream around. When parity achieved between PS4 and XboxOne, cows scream around.

That's because PS3 had the advantage of being the more capable machine. It was just harder to get results from. X1 has no advantage and is much weaker than PS4 ..... How is that the same to you...

No matter how much you scream "There is a huge difference", that does not make it so.

It's definitely not small.


#82 StormyJoe
Member since 2011 • 7806 Posts

@ReadingRainbow4 said:

@StormyJoe said:

@Sagemode87 said:

@magicalclick said:

When parity achieved between PS3 and Xbox 360, lems don't scream around. When parity achieved between PS4 and XboxOne, cows scream around.

That's because PS3 had the advantage of being the more capable machine. It was just harder to get results from. X1 has no advantage and is much weaker than PS4 ..... How is that the same to you...

No matter how much you scream "There is a huge difference", that does not make it so.

It's definitely not small.

Link?

The proof is in the pudding, so to speak: if you are going to sit there and tell me that on a console connected to a TV you can see some big difference between 900p and 1080p, I have one thing to say to you: you're a goddamned liar.

Sorry for being so blunt, but truth is truth, and I am tired of the fallacies being spread by cows on this subject.


#83  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

@StormyJoe said:

@ReadingRainbow4 said:

@StormyJoe said:

@Sagemode87 said:

@magicalclick said:

When parity achieved between PS3 and Xbox 360, lems don't scream around. When parity achieved between PS4 and XboxOne, cows scream around.

That's because PS3 had the advantage of being the more capable machine. It was just harder to get results from. X1 has no advantage and is much weaker than PS4 ..... How is that the same to you...

No matter how much you scream "There is a huge difference", that does not make it so.

It's definitely not small.

Link?

The proof is in the pudding, so to say: If you are going to sit there and tell me that on a console connected to a TV, you can see some big difference between 900p and 1080p, I have one thing to say to you: You're a goddamned liar.

Sorry for being so blunt, but truth is truth, and I am tired of the fallacies being spread by cows on this subject.

http://gdcvault.com/play/1020939/Efficient-Usage-of-Compute-Shaders

Granted, this goes beyond just resolution; having that much of a lead in GPU power is not a small difference.

But just to humor you: as a PC gamer, you bet I can tell the difference between something like 1600x900 and 1080p; it comes down to clarity.

Why you guys feel the need to constantly defend the Bone's meager GPU I'll never understand.


#85 ReadingRainbow4
Member since 2012 • 18733 Posts

@scottpsfan14 said:

Since resolution is handled by the GPU alone, I still don't get why PS4 isn't doing the exact same graphics as XB1 but at 1080p. It has almost 100% more pixel fillrate. And since the CPU feeding the GPU buffer is primarily in draw calls, that has nothing to do with pixel resolution, but actual graphical data. Strange.

Regardless, this game should have been released after Sony and MS released their GPGPU dev kits to 3rd parties. It would have made a lot of difference to a game like this, as that Ubisoft benchmark with the dancers showed.

It's most likely due to the parity clause they have with MS.

Ubisoft already shot themselves in the foot with a shotgun due to releasing several contradictory statements, the first of which was directly referring to avoiding "debates and stuff."
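The "almost 100% more pixel fillrate" figure quoted above follows from the ROP counts and clocks commonly cited for the two consoles (approximate launch specs, assumed here for illustration):

```python
# Pixel fill rate = ROPs x core clock. Commonly cited launch specs, approximate.
def fillrate_gpix(rops, clock_mhz):
    return rops * clock_mhz / 1000.0  # gigapixels per second

ps4 = fillrate_gpix(32, 800)  # 25.6 Gpix/s
x1 = fillrate_gpix(16, 853)   # ~13.6 Gpix/s
print(f"PS4/X1 fill-rate ratio: {ps4 / x1:.2f}")  # ~1.88, i.e. 'almost 100% more'
```

Fill rate caps how fast a GPU can write pixels to the frame buffer, which is why it comes up in resolution arguments specifically, separate from shader compute throughput.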


#86 SolidTy
Member since 2005 • 49991 Posts

@ReadingRainbow4 said:

It's definitely not small.

It should also be pointed out that those numbers in the graph are Ubisoft's own figures from testing the machines...since this is a Ubisoft thread after all.

I'm posting for others who may ask where that graph came from, not to you per se.


#87  Edited By tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

@btk2k2 said:

@04dcarraher: provided the xbox one can handle 900p with a stable framerate the ps4 can hamdle.it at 1080p with a similar framerate.

If the xbox one is struggling then you would expect the ps4 at 1080p to struggle also.

Not necessarily, if the PS4 is using higher quality effects and assets raising the resolution without adjusting the higher quality assets to compensate can lead into unstable framerates.

Now this is something we can agree on; if the Xbox One is running medium and the PS4 is running high, then yeah, both can be 900p/30 fps if the PS4 has higher quality effects.

@cantloginnow said:

@tormentos: You're one fucked up guy,holy shit.

Its imposible to even respond to your crap,where do someone begin and how much time are they willing to waste on you?This post is all the effort and time Im willing to give you.Keep dancing monkey.

So whose alt are you, to come out with such rage and bitterness towards me.. hahaha


#88 btk2k2
Member since 2003 • 440 Posts

@04dcarraher said:

@btk2k2 said:

@04dcarraher: provided the xbox one can handle 900p with a stable framerate the ps4 can hamdle.it at 1080p with a similar framerate.

If the xbox one is struggling then you would expect the ps4 at 1080p to struggle also.

Not necessarily, if the PS4 is using higher quality effects and assets raising the resolution without adjusting the higher quality assets to compensate can lead into unstable framerates.

Why would you presume that I would expect a res bump and an effects bump? I would not expect both as that would be unreasonable.

Besides, if, as Ubi says, it is CPU limited, increasing effects is unlikely to happen, as a fair few of them require more CPU performance. That is why I think a resolution bump, again if they are being truthful about the CPU limit, would be the best way to use the power.


#89  Edited By StormyJoe
Member since 2011 • 7806 Posts

@ReadingRainbow4 said:

@StormyJoe said:

@ReadingRainbow4 said:

@StormyJoe said:

@Sagemode87 said:

@magicalclick said:

When parity achieved between PS3 and Xbox 360, lems don't scream around. When parity achieved between PS4 and XboxOne, cows scream around.

That's because PS3 had the advantage of being the more capable machine. It was just harder to get results from. X1 has no advantage and is much weaker than PS4 ..... How is that the same to you...

No matter how much you scream "There is a huge difference", that does not make it so.

It's definitely not small.

Link?

The proof is in the pudding, so to say: If you are going to sit there and tell me that on a console connected to a TV, you can see some big difference between 900p and 1080p, I have one thing to say to you: You're a goddamned liar.

Sorry for being so blunt, but truth is truth, and I am tired of the fallacies being spread by cows on this subject.

http://gdcvault.com/play/1020939/Efficient-Usage-of-Compute-Shaders

Granted this goes beyond just resolution, having that much a lead for gpu power is not a small difference.

But just to humor you: as a PC gamer, you bet I can tell the difference between something like 1600x900 and 1080p. It comes down to clarity.

Why you guys feel the need to constantly defend the Bone's meager GPU, I'll never understand.

I think you missed the point of that presentation. It was about shifting shader work to the GPU to improve performance in DX11. That final chart shows shader performance on the GPU using Ubisoft's technique; it represents a specific area of performance for a specific task and is not attempting to generalize the performance difference between the two. The article does offer one generalization: the PS4's GPU is 34% more powerful than the XB1's. In the computing world, that's really not that much (slides 16 & 18).

As for "telling the difference", I clearly stated "consoles with a TV", and explained why someone would more easily see the difference on a PC. If you are going to comment on my post, please read all of my posts.

Avatar image for shawty_beatz
Shawty_Beatz

1269

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#90 Shawty_Beatz
Member since 2014 • 1269 Posts

@StormyJoe said:

@ReadingRainbow4 said:

@StormyJoe said:

@Sagemode87 said:

That's because PS3 had the advantage of being the more capable machine. It was just harder to get results from. X1 has no advantage and is much weaker than PS4 ..... How is that the same to you...

No matter how much you scream "There is a huge difference", that does not make it so.

It's definitely not small.

Link?

The proof is in the pudding, so to say: If you are going to sit there and tell me that on a console connected to a TV, you can see some big difference between 900p and 1080p, I have one thing to say to you: You're a goddamned liar.

Sorry for being so blunt, but truth is truth, and I am tired of the fallacies being spread by cows on this subject.

I can easily spot the difference on PC, and I doubt it would be different on consoles. I don't own an X1 though, so I can't test it.

Avatar image for doozie78
Doozie78

1123

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 5

#91  Edited By Doozie78
Member since 2014 • 1123 Posts

"the depth of field and lighting effects are beyond anything you've seen on the market, and even may surpass Infamous and others".

Way to cancel yourself out....

Avatar image for ReadingRainbow4
ReadingRainbow4

18733

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#92  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

@StormyJoe said:

@ReadingRainbow4 said:

@StormyJoe said:

@ReadingRainbow4 said:

@StormyJoe said:

@Sagemode87 said:

@magicalclick said:

When parity achieved between PS3 and Xbox 360, lems don't scream around. When parity achieved between PS4 and XboxOne, cows scream around.

That's because PS3 had the advantage of being the more capable machine. It was just harder to get results from. X1 has no advantage and is much weaker than PS4 ..... How is that the same to you...

No matter how much you scream "There is a huge difference", that does not make it so.

It's definitely not small.

Link?

The proof is in the pudding, so to say: If you are going to sit there and tell me that on a console connected to a TV, you can see some big difference between 900p and 1080p, I have one thing to say to you: You're a goddamned liar.

Sorry for being so blunt, but truth is truth, and I am tired of the fallacies being spread by cows on this subject.

http://gdcvault.com/play/1020939/Efficient-Usage-of-Compute-Shaders

Granted this goes beyond just resolution, having that much a lead for gpu power is not a small difference.

But just to humor you, as a pc gamer you bet I can tell the difference between something like 1600X900 vs 1080p, it comes down to clarity.

Why you guys feel the need to constantly defend the bones meager GPU I'll never understand.

As for "telling the difference", I clearly stated "consoles with a TV", and explained why someone would more easily see the difference on a PC. If you are going to comment on my post, please read all of my posts.

Dude, I play everything on my 1080p TV; what difference would that make compared to a monitor? A higher resolution is a higher resolution.

Avatar image for StormyJoe
StormyJoe

7806

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#93 StormyJoe
Member since 2011 • 7806 Posts

@ReadingRainbow4 said:

@StormyJoe said:

@ReadingRainbow4 said:

@StormyJoe said:

@ReadingRainbow4 said:

@StormyJoe said:

@Sagemode87 said:

@magicalclick said:

When parity achieved between PS3 and Xbox 360, lems don't scream around. When parity achieved between PS4 and XboxOne, cows scream around.

That's because PS3 had the advantage of being the more capable machine. It was just harder to get results from. X1 has no advantage and is much weaker than PS4 ..... How is that the same to you...

No matter how much you scream "There is a huge difference", that does not make it so.

It's definitely not small.

Link?

The proof is in the pudding, so to say: If you are going to sit there and tell me that on a console connected to a TV, you can see some big difference between 900p and 1080p, I have one thing to say to you: You're a goddamned liar.

Sorry for being so blunt, but truth is truth, and I am tired of the fallacies being spread by cows on this subject.

http://gdcvault.com/play/1020939/Efficient-Usage-of-Compute-Shaders

Granted this goes beyond just resolution, having that much a lead for gpu power is not a small difference.

But just to humor you, as a pc gamer you bet I can tell the difference between something like 1600X900 vs 1080p, it comes down to clarity.

Why you guys feel the need to constantly defend the bones meager GPU I'll never understand.

As for "telling the difference", I clearly stated "consoles with a TV", and explained why someone would more easily see the difference on a PC. If you are going to comment on my post, please read all of my posts.

Dude, I play everything on my 1080p tv, what difference would that make from a monitor. A higher resolution is a higher resolution.

Because on a monitor your eyes are 18" from the screen. Seriously, do you guys just look at numbers and say "one is bigger, so it's obviously better"? There are caveats to almost everything.
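The viewing-distance point can be made concrete with a rough pixels-per-degree calculation. A minimal Python sketch, assuming illustrative figures not taken from the thread: a 24" 16:9 monitor at 18 inches, a 50" 16:9 TV at 8 feet, and the common rule of thumb that 20/20 vision resolves roughly 60 pixels per degree:

```python
import math

# Pixels per degree of visual angle delivered by a screen: the higher
# the value, the harder individual pixels (and hence resolution
# differences) are to see.
def pixels_per_degree(horizontal_pixels, screen_width_in, distance_in):
    # Visual angle subtended by the full screen width, in degrees.
    angle = 2 * math.degrees(math.atan((screen_width_in / 2) / distance_in))
    return horizontal_pixels / angle

# A 24" 16:9 monitor is ~20.9" wide; a 50" 16:9 TV is ~43.6" wide.
monitor_1080p = pixels_per_degree(1920, 20.9, 18)  # desk distance
tv_1080p = pixels_per_degree(1920, 43.6, 96)       # 8 ft couch
tv_900p = pixels_per_degree(1600, 43.6, 96)        # 900p on the same TV

print(f"monitor, 1080p @ 18 in: {monitor_1080p:.0f} px/deg")
print(f"TV, 1080p @ 8 ft:       {tv_1080p:.0f} px/deg")
print(f"TV, 900p @ 8 ft:        {tv_900p:.0f} px/deg")
```

On these assumptions the 8 ft TV numbers land near or above the ~60 px/deg acuity rule of thumb, while the 18-inch monitor sits well below it — consistent with the claim that the gap is easier to spot up close on a monitor than from the couch.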

Avatar image for AzatiS
AzatiS

14969

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#94  Edited By AzatiS
Member since 2004 • 14969 Posts

So this guy just said that a 7850 GPU can give only 1-2 frames more vs a 7770 at the same resolution and graphics settings.

Who the hell will buy this shit I just saw before my eyes!! As a PC gamer, I rofled!