DeusGladiorum's comments

DeusGladiorum · 89 Forum Posts · 0 Wiki Points · 2 Followers · Reviews: 0 · User Lists: 0

The most disappointing part of this is that they were so bold as to title the entire entry "WWII" when it follows a single American division across a single theater of operations -- and one they've shown before, no less. I'm still interested, but considerably less so.

@Iemander: Let me just go ahead and recapitulate your own rationale here: GameSpot was paid off to withhold a score so as to give a good impression of the product... yet couldn't be paid enough to simply give the product a good review in the first place?

Edited By DeusGladiorum

Can't complain. Got it for free, with fixes and updates on the way? Sounds good.

Edited By DeusGladiorum

@Zelda99: Oh, it definitely does, thank you. So if I'm getting this right, HDR is basically two things: a type of signal, and a standard that incorporates the processing of that signal, along with some other constraints, into its requirements. The brightness requirements exist to ensure a wide, vibrant color gamut, because from what I can gather, the intensity of the backlight determines how far-reaching the gamut can be (not the depth, but the gamut)?

As for color: it's possible for an HDR signal to be displayed on lower-bit panels such as 8-bit, which can display 256^3 colors (though I hear true 8-bit panels are rare and most are 6-bit with dithering), but to get the most accurate and vibrant colors you'd want a (presumably extremely rare) true 10-bit panel, which can display 1024^3 colors -- especially because the wider your color gamut, the more shades you want between each distinct color? An HDR signal is then merely a higher-bandwidth signal carrying more color information, and decoding it for display doesn't necessarily require a processor with any special micro-architecture -- it just needs to be quick enough to decode the larger signal and display the image before the following frame. I also presume that SDR signals already carry more display information than can be shown on panels with low-coverage sRGB backlights, because an SDR signal sent to most such low-coverage sRGB TN panels looks washed out, while the same SDR signal displayed on an IPS panel almost always looks instantly more vibrant and accurate.

But then, presumably, while an SDR signal can transmit more color data than, say, a 6-bit low-coverage sRGB TN panel can display, it can't carry every color available to, say, an 8-bit high-coverage sRGB IPS panel. That would be why HDR is theoretically capable of making a difference over SDR even on 8-bit displays, but still needs 10-bit panels with a wide enough color gamut to fully display every color in an HDR signal, yes? I hope my thinking is correct here, but I might be way off. If I'm not off, or at least not super off, my question is whether some of my current displays can take advantage of HDR. For example, I have a Dell XPS 15 9550 with a 1080p IPS display that's renowned for its color accuracy and extreme brightness (it covers 98% of sRGB, though I think it's only 8-bit). Would it be possible to view an HDR signal on it and notice a difference? After all, its specs come close to HDR, and I don't see why the CPU or GPU couldn't handle the signal decoding on its own.
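Since I keep throwing around 256^3 and 1024^3, here's a quick sanity check of that per-channel bit-depth math (just an illustration; the function name is my own):

```python
# A panel with b bits per channel can address 2**b levels on each of the
# three R, G, B channels, so (2**b)**3 distinct colors in total.

def colors_for_bit_depth(bits_per_channel: int) -> int:
    """Total addressable colors for an RGB panel at the given bit depth."""
    levels = 2 ** bits_per_channel
    return levels ** 3

for bits in (6, 8, 10):
    print(f"{bits}-bit panel: {2 ** bits} levels/channel, "
          f"{colors_for_bit_depth(bits):,} total colors")
```

That works out to 262,144 colors for 6-bit, 16,777,216 for 8-bit, and 1,073,741,824 for 10-bit -- which is why dithering has to fake so much on a true 6-bit panel.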

I'm glad to have more clarity on this, but it still doesn't answer my questions about HDR entirely. So HDR is... a type of panel and/or lighting solution that forms a display system, like IPS or TN or OLED? That can't be right, because the article alludes that, while unlikely, a television could be both HDR-capable and OLED. And on top of that, why would a special system or special content be necessary to take advantage of the color gamut when things like IPS and OLED need neither? I'm just confused about what HDR is from a technical perspective.

Fallout 4 is easily the most disappointing game I've ever played. So uninspired, so much bastardization of the lore and mechanics of a franchise that could've been so much and has now been so adulterated -- and this news makes it all the worse. Still praying for the day Obsidian is allowed to take the reins again.

As a rule of thumb, Ubisoft tends to look at the mark, close their eyes, and shoot at the dirt.

Edited By DeusGladiorum

A suggestion: it's hard to treat GS graphics card reviews as my go-to source when they're this basic and stripped down. FPS averages have been standard fare forever, obviously, but minimum frame rates are now common too, and the more detailed benchmarks I like to read include 70th- and 95th-percentile lows, so I know just how frequent and just how bad those minimums are. Those figures are every bit as important and telling as the average frame rate -- there's not much point to a 70 fps average if there are frequent-enough-to-be-annoying dips to 45, and in the worst case dips to 35. I know the article said these tests maxed out the settings, but does that include maximum AA in applicable games? A lot of places don't include AA when they say maximum settings, and I don't believe GS specified. It'd be a lot more preferable to list the actual in-game name of the preset, with any extra eye candy listed separately from it. Hoping for more detail next time, Jimmy.
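For what it's worth, here's a rough sketch of how those percentile lows could be computed from per-frame render times (this is my own illustration, not GameSpot's actual pipeline, and the helper name is made up):

```python
# The p-th percentile low is the frame rate that p% of frames ran at or
# above -- so the 95th-percentile low is dragged down by the slowest 5%
# of frames, which is exactly where annoying dips hide.

def percentile_lows(frame_times_ms, percentiles=(70, 95)):
    """Map each requested percentile to its 'low' fps figure.

    frame_times_ms: one render time in milliseconds per captured frame.
    """
    # Convert each frame time to an instantaneous frame rate, sorted
    # ascending so the worst frames come first.
    fps = sorted(1000.0 / t for t in frame_times_ms)
    lows = {}
    for p in percentiles:
        # The p-th percentile low sits (100 - p)% of the way up from the
        # bottom of the sorted list.
        idx = min(int(len(fps) * (100 - p) / 100), len(fps) - 1)
        lows[p] = fps[idx]
    return lows

# A toy capture: mostly ~71 fps frames with a handful of dips.
times_ms = [14.0] * 90 + [25.0] * 7 + [33.0] * 3
print(percentile_lows(times_ms))  # 95th-percentile low lands on the 40 fps dips
```

In that toy capture the average looks healthy, but the 95th-percentile low immediately exposes the 40 fps dips -- which is exactly the information a bare FPS average hides.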

@Rushaoz: Nope, Chrome on Windows 10, on two different PCs. It's harder to notice on my desktop, but it's definitely there, and on my laptop the stuttering is incredibly noticeable. My desktop is definitely up to spec, but it shouldn't need to be this powerful just to avoid stutter.

Can't tell if the micro-stutter is from the gameplay, or from GameSpot's terrible Flash-based video player that refuses to play nice with Google Chrome.