GG Says: Killzone:SF MP is native 1080p

This topic is locked from further discussion.

#1 TruthBToldShow
Member since 2011 • 352 Posts

Guerrilla Sauce

So, for some time now, games have used low-resolution assets at various points in the rendering pipeline to ease the workload on the system. Once rendering completes, finished frames are sent via the video signal to our displays.

"Games often employ different resolutions in different parts of their rendering pipeline. Most games render particles and ambient occlusion at a lower resolution, while some games even do all lighting at a lower resolution. This is generally still called native 1080p"

This means that ALL OF US have pointed to a "native" resolution of a game, not knowing/seeing that low-res assets were incorporated in the process of rendering it. Yet we still call it native; we still call it 720p, 1080p, etc.

Now to Killzone Shadow Fall MP........ We first must understand that over the 3-4 months since the game's launch, not a single pixel-counter came out saying Killzone MP was running at 960x1080. Why is that? Why did it take Eurogamer mentioning GG's temporal reprojection technique for anyone to even know this?

Answer: Because all of that is done internally, as part of the rendering process, just like games that "do all lighting at a lower resolution". When counting the pixels, you would see a "native" 1080p resolution of 1920x1080. And if the game were rendered at 960x1080 and upscaled in the conventional sense, pixel-counters would have seen it....... just like sub-HD games last gen or 900p games this gen.

I am no game development guru, and would never claim anything close to that. I know there are some very knowledgeable people here in SW....... some. I can GUARANTEE you that some of the best Xbox 360 and Xbox One games use similar techniques with similar assets. And this will be proven over the next few weeks as people look into how Xbone games are rendered, or how last-gen games were rendered on PS3/360.

PA-LEASE, don't come in with "bu, bu, but it's rendering at 960x1080" or "it's 1080i".

This isn't a video, it's a game. And the complexity of what it's doing is beyond most of you. For example:

  • We keep track of three images of “history pixels” sized 960x1080
    • The current frame
    • The past frame
    • And the past-past frame
  • For each pixel we store its color and its motion vector – i.e. the direction of the pixel on-screen
  • We also store a full 1080p, “previous frame” which we use to improve anti-aliasing

Then we have to reconstruct every odd pixel in the frame:

  • We track every pixel back to the previous frame and two frames ago, by using its motion vectors
  • By looking at how this pixel moved in the past, we determine its “predictability”
  • Most pixels are very predictable, so we use reconstruction from a past frame to serve as the odd pixel
  • If the pixel is not very predictable, we pick the best value from neighbors in the current frame
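The steps above can be sketched in code. This is a toy reconstruction in NumPy, not Guerrilla's actual implementation: the function name, the motion-vector layout, and the "predictability" threshold (comparing the two history samples) are all my own assumptions, chosen only to make the idea concrete.

```python
import numpy as np

def reconstruct(curr_half, prev_half, prev_prev_half, motion, frame_odd):
    """Rebuild a full-width frame from a half-width render plus history.

    curr_half      : (H, W, 3) newly rendered half-width columns
    prev_half      : (H, W, 3) half-width render from the previous frame
    prev_prev_half : (H, W, 3) half-width render from two frames ago
    motion         : (H, W, 2) per-pixel motion vectors as (dy, dx)
    frame_odd      : True if this frame rendered the odd output columns
    """
    H, W, _ = curr_half.shape
    full = np.zeros((H, 2 * W, 3), dtype=curr_half.dtype)
    # Copy the freshly rendered columns straight into the output.
    rendered = slice(1, None, 2) if frame_odd else slice(0, None, 2)
    full[:, rendered] = curr_half
    for y in range(H):
        for x in range(W):
            dy, dx = motion[y, x]
            # Track the pixel back one and two frames via its motion vector.
            y1, x1 = int(y - dy) % H, int(x - dx) % W
            y2, x2 = int(y - 2 * dy) % H, int(x - 2 * dx) % W
            # "Predictability" heuristic (assumed): if the two history
            # samples agree, trust the reprojected history value.
            predictable = np.abs(
                prev_half[y1, x1].astype(float)
                - prev_prev_half[y2, x2].astype(float)
            ).mean() < 16.0
            # Predictable pixels come from history; otherwise fall back
            # to the nearest rendered neighbour in the current frame.
            sample = prev_half[y1, x1] if predictable else curr_half[y, x]
            full[y, 2 * x + (0 if frame_odd else 1)] = sample
    return full
```

On a static scene (zero motion, stable history), every missing column is simply filled from the previous frame's render, which is why a pixel count of the output shows a full-width image.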

If using low-res assets at any point in the "pipeline" means the resolution isn't native........ lems are prepping for some self-ownage.

#2 MonsieurX
Member since 2008 • 39858 Posts

inb4lock

dat DC

#3 Chutebox
Member since 2007 • 50552 Posts

Sounds confusing as hell and I'm not even going to pretend to know what this means. But, and I'm sure you know, be prepared for people plugging their ears.

#4 Nengo_Flow
Member since 2011 • 10644 Posts

Meh.......................

Shit looks and runs good tho.

I don't even have a full 1080p TV. I've been wanting to get one, but a decent 40+ inch is still like $700-$800.

#5 Ghost120x
Member since 2009 • 6058 Posts

I can agree with this since someone should have found out at launch if it wasn't native 1080p. The game still looks good and that's all that matters to me.

#7 TruthBToldShow
Member since 2011 • 352 Posts

@sts106mat said:

Lol turdBtold thread

Oh snap! You took my username and swapped words out! Awesome! I have never, ever, ever seen that before.

With your genius already on display, maybe you could lend some of that genius to the discussion?

#8  Edited By ronvalencia
Member since 2008 • 29612 Posts
@TruthBToldShow said:

*snip*

"On occasion the prediction fails and locally pixels become blurry, or thin vertical lines appear. However, most of the time the prediction works well and the image is identical to a normal 1080p image. We then increase sub-pixel anti-aliasing using our 1080p “previous frame” and motion vectors, further improving the image quality.

The temporal reprojection technique gave subjectively similar results and it makes certain parts of the rendering process faster"

--------------

Notice the following words

1. subjectively vs objectively.

2. identical vs same.

"We also store a full 1080p, 'previous frame'" = an old-style composite frame, built by interleaving pixels from older frames.

1920x1080p = a progressively rendered frame. Interleaving pixels from older frames doesn't qualify as a progressive render.

In general, a 1920x1080p render usually refers to the geometry edges' resolution, i.e. the geometry is rendered at 1920x1080 and the textures are then filled in within the geometry's edges. The textures have their own resolution, e.g. Metro 2033 on Xbox 360 has 1024x1024 textures while the PC version has 2048x2048 textures.

Effects' resolution can be lower than the geometry's resolution, e.g. the different SSAO/HDAO/HBAO settings on PC.

If a gamer wants pure pixel-count superiority, get a gaming PC with a flagship-class GPU ASIC design, e.g. NV GK110 (780, Titan, 780 Ti) or AMD Hawaii (R9-290, R9-290X). Gaming PCs give their gamers the power to trade frame rate against quality, or to have both.

Note that GS's System Wars doesn't equal console wars, i.e. a counter-argument against PS4/cows doesn't automatically make one a lem.

#9  Edited By Wasdie  Moderator
Member since 2003 • 53622 Posts

Technically it is. Just like technically all games on the Xbox One and PS4 are 1080p, since the hardware does upscaling and sends a single image to the TV. Each frame sent to the display is a 1920x1080 image to be displayed in a single scan, thus making it progressive.

However this is misleading by quite a bit. It's not rendered in a single pass but over at least two passes in the software. It's rendered at half resolution.

They aren't wrong to say it's a 1080p image, as the game doesn't rely on any hardware upscaling to achieve 1920x1080 output, but they are misleading people to cover their asses. Kind of a scumbag move. They don't want to fess up that they aren't rendering 1080p in a single pass. This probably has a lot to do with the marketing departments of Guerrilla Games and Sony. They want that 1080p @ 60fps bullet point on the box even if it's misleading.

Their software interlacing does look a bit better than traditional hardware upscaling, but it's still not rendering 1920x1080 in a single pass.
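The "half resolution per pass, full resolution on the wire" distinction can be shown with a toy sketch (my own, not any engine's actual pipeline): two consecutive 960-column renders, column-interleaved, yield one 1920x1080 image for the display, so a pixel count of the output sees full 1080p even though each individual pass drew only half the columns.

```python
import numpy as np

# Two consecutive half-width (960-column) renders: pass A drew the even
# output columns of the scene, pass B the odd ones. Solid colours stand
# in for real rendered content so the interleave is easy to verify.
pass_a = np.zeros((1080, 960, 3), dtype=np.uint8)      # even columns
pass_b = np.full((1080, 960, 3), 255, dtype=np.uint8)  # odd columns

# Column-interleave into one 1920x1080 output image. Every scanline is
# full width, so the frame sent to the TV is progressive 1080p even
# though each render pass produced only 960 columns.
output = np.empty((1080, 1920, 3), dtype=np.uint8)
output[:, 0::2] = pass_a
output[:, 1::2] = pass_b
```

The display (and a pixel counter) only ever sees `output`; nothing in the final image reveals that its columns came from two separate passes.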

#10  Edited By applefan1991  Moderator
Member since 2009 • 3397 Posts

Unfortunately this thread will have to be locked. Users are required to have 500 posts before they can create a thread. For more information, please refer to the System Wars survival guide sticky. Thanks!