en3sge's forum posts

#1 Posted by en3sge (321 posts) -

Any game that hovers between 30 and 60 fps will stutter - the effect of the stutter on the eyes depends on two things:

- How sensitive the individual's eyes are to the stutter

- How much motion blur or other temporal trickery is used, which can make games appear smoother than they are
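The stutter comes from frame delivery rather than the average rate: on a 60Hz v-synced display, every frame is held for a whole number of refresh intervals, so a game wavering between 30 and 60fps keeps alternating between 16.7ms and 33.3ms frames. A toy sketch of the effect (the per-frame render costs are made up for illustration):

```python
import math

REFRESH_MS = 1000 / 60  # one refresh every ~16.7 ms on a 60 Hz display

# Hypothetical per-frame render costs (ms) for a game wavering
# between 30 and 60 fps.
render_ms = [15, 20, 14, 28, 16, 30]

# With v-sync, each frame is held until the next refresh boundary,
# so its on-screen time is always a whole number of refresh intervals.
display_ms = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_ms]
print([round(d, 1) for d in display_ms])  # -> [16.7, 33.3, 16.7, 33.3, 16.7, 33.3]
```

The average here is close to 45fps, but what reaches the eye is a constant flip between 16.7ms and 33.3ms frames - that uneven cadence is the stutter.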

KZSF was a stutter fest so thank GOODNESS for the recent 30fps lock update.

TR:DE also stutters as it can't quite reach 60fps much of the time. No 30fps lock as yet for that...

This generation could well be ruined by the quest for 60fps by developers. Unfortunately, having 1080p/60fps in a review or on the box just generates more sales, even if the game can't attain 60fps much of the time. If it gets there 'sometimes', that's enough for the marketing team to declare 60fps. And us poor gamers with sensitive vision get games that are near unplayable.

I have said this before and I will say this again. If near constant 60fps can't be reached, devs should focus on 30fps, turn on lots of graphical bells and whistles and add a little bit of motion blur to disguise the rougher motion. Or at least add a 30fps lock just like the KZ devs.
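A 30fps lock is essentially just a frame limiter: after rendering each frame, sleep out the rest of a fixed 33.3ms slot so every frame lands on the same cadence. A minimal sketch of the idea (`render_frame` is a stand-in for the game's real render call, not anything from an actual engine):

```python
import time

TARGET_S = 1 / 30  # one frame every ~33.3 ms for a 30 fps cap

def run_capped(render_frame, n_frames):
    # Track an absolute deadline rather than sleeping a fixed amount,
    # so timing error doesn't accumulate across frames.
    deadline = time.perf_counter()
    for _ in range(n_frames):
        render_frame()
        deadline += TARGET_S
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
```

Real engines pace against the display's vertical blank rather than the wall clock, but the trade is the same: give up peak frame rate in exchange for an even cadence.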

#2 Edited by en3sge (321 posts) -

My Gamespot ID is en3sge and my PSN ID is macaco84.

I own a PS3 and a PS4 (although the latter is unfortunately being packed up tonight as it is my xmas present from the wife and I have "had enough time testing the damn thing now so put it back in its box", so she says...)

I enjoy driving games, FPS, action and FIFA. I will be playing KZ, AC4 and FIFA from Xmas onwards.

At the moment I have zero PSN friends as I used to do all my multiplayer on Xbox 360 (I know, boooo!) but I want to become better networked on Playstation so welcome all friend requests :)

#3 Posted by en3sge (321 posts) -

I think it's more than just a rumour. See the link below:

http://www.eurogamer.net/articles/digitalfoundry-spec-analysis-xbox-one

If we take the GPU in isolation, we can infer (with good accuracy) that the PS4 is 50% more powerful overall. The question mark lies within the 'secret sauce' of the Xbox One - the ESRAM. This could close the gap or it may make no difference at all.
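The 50% figure comes straight from the raw shader arithmetic. A back-of-envelope check, using the unit counts and clocks reported at the time (Digital Foundry's figures) and counting a fused multiply-add as two ops per shader per cycle:

```python
def tflops(shaders, mhz):
    # 2 floating-point ops (fused multiply-add) per shader per cycle
    return shaders * mhz * 2 / 1e6

ps4 = tflops(1152, 800)     # PS4: 1152 shader cores @ 800 MHz -> ~1.84 TFLOPS
xb1 = tflops(768, 800)      # Xbox One: 768 shader cores @ 800 MHz -> ~1.23 TFLOPS
print(round(ps4 / xb1, 2))  # -> 1.5, i.e. ~50% more raw compute
```

This is raw ALU throughput only - it says nothing about memory bandwidth, which is exactly where the ESRAM question comes in.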
#4 Posted by en3sge (321 posts) -

Thanks all, I will give it a check :)

#5 Posted by en3sge (321 posts) -

It depends on whether you have the relevant settings on or off in the BIOS

How would I check? Thx
#6 Posted by en3sge (321 posts) -

I have an i7 2600K clocked to 4.7GHz. I can switch between different clock speeds from the BIOS but normally stick to this one to eke out every frame in the more demanding games (Borderlands 2 in particular loves CPU speed).

My question is this (and yep, it is a bit of an amateur question) - does the processor also run at 4.7GHz when browsing around Windows/the internet etc, or does it clock itself down? I want to reduce power consumption as much as possible, and generally in one session on my PC I play a game or two and also browse the net. I would rather not reboot onto another overclock setting after I finish gaming each time.
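For what it's worth, if SpeedStep/EIST and the C-states are left enabled in the BIOS, the chip clocks itself down at idle even with the 4.7GHz multiplier set, and you can verify the live clock without rebooting. On Windows, CPU-Z or Task Manager shows it; on Linux, a quick sketch that just parses the kernel's reported per-core clocks:

```python
# Read the kernel's live per-core clocks from /proc/cpuinfo (Linux only).
# With SpeedStep/EIST enabled, idle cores report well below the overclock.
with open("/proc/cpuinfo") as f:
    mhz = [float(line.split(":")[1]) for line in f if line.startswith("cpu MHz")]
print(mhz)  # one entry per logical core
```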

Thanks all.

#7 Posted by en3sge (321 posts) -
[QUOTE="NamelessPlayer"]I don't know if I can wait another year. I've been waiting five years already to move on from the Q6600 and 8800 GT, and with games like PlanetSide 2 being more demanding than the likes of Crysis, it's only a matter of time before I can no longer put up with such low framerates due to CPU limitations. Here's hoping that Haswell can manhandle PS2 at 60 FPS even during the busiest battles.

And there was me thinking my GTX 580 needed an upgrade. You, sir, are well overdue!
#8 Posted by en3sge (321 posts) -

I'm going to go against the grain here and say that playing something like COD (or more realistically, Battlefield) would be beneficial in the army. It wouldn't make you a good soldier but your honed reflexes and being able to spot tiny movements from long distances etc could be helpful and give you an edge.

I have to say, if I went paint-balling (yes I know, nothing like the army but as close as I am going to get) with the guy at the front of the BF3 leaderboards I would be pretty damn scared and expect to be ripped a new one.

#9 Posted by en3sge (321 posts) -

This happens with CPUs also. Intel released almost too powerful a CPU with the i3s, i5s and i7s. I bought my i7 2600K almost 2 years ago now and it still destroys anything AMD has... I think if Intel was to do it again they maybe would have released a slightly weaker CPU back then and later come out with the stronger ones when Bulldozer arrived.

It wouldn't surprise me if AMD and Nvidia exchange tech specs for their new chips just to keep things balanced... it's surprising how similar, almost exactly the same, the performance is between the two video card makers... but we pretty much know Nvidia could release a super rigged video card if they wanted; instead they'll milk things along and be happy with the competition with AMD.

Yeah exactly. I guess the two companies decide on some ballpark figures for card throughput, agree not to exceed them, and then it's up to each of them how they achieve this and what power levels the cards need to reach those figures.

I think (as someone in this thread alluded to earlier) we have seen the results when this 'agreement' doesn't happen or doesn't come off well. The 8800 series blew the ATI competition out of the water at the time. If ATI had gone bust at that point, Nvidia would have inherited the ATI customers in the next generation but would then have come under close scrutiny for having an absolute monopoly on the graphics card industry - thus potentially opening the doors for another competitor. And you know, as they say, better the devil you know!

And yep, I can see this happening in CPUs too. Heck, the more I think about it, the more I can see it happening everywhere.
#10 Posted by en3sge (321 posts) -

I agree with the posts here that mention the monopoly politics that would occur if one of the main players (Nvidia or AMD) put the other out of business.

Let's be honest here: unless you are a completely loyal customer of either brand, you will always buy the card that performs best (this used to mean overall performance, although now I guess we have to factor in performance per watt, drivers and SLI capability). Therefore, if (say) Nvidia released the GTX 780 with a 100% improvement on the GTX 680 but AMD found only a 15% improvement on their 7970, then it's quite likely AMD would be absolutely mauled in the market.

So there must be some background gentlemen's handshakes to ensure the two companies stay at least in the same ballpark as each other.

I do agree that competition drives performance, but I also feel that competition can throttle performance gains somewhat when you look at the politics of the situation.

I don't want to bring consoles into the debate too much as this is a PC forum, but I always find it interesting that competing consoles seem to follow this pattern too. The SNES was roughly in the same ballpark as the Genesis/Mega Drive, and the same goes for the PS1/N64 (although perhaps that was a more substantial gap), and then the Xbox/PS2 and Xbox 360/PS3. No console completely blows the other out of the water in terms of technology.