Titanfall having fps issues?

This topic is locked from further discussion.

#1 Posted by REforever101 (11156 posts) -

I have a GTX 780 with the most recent driver (which was apparently made for Titanfall) and a Core i7 with 16GB of RAM, and I still get occasional random FPS drops during the training missions. Also, I can't believe they don't offer triple-buffered vsync. What is this, the early 2000s?

I should try it with D3DOverrider.

#2 Edited by PredatorRules (7380 posts) -

@REforever101 said:

I have a GTX 780 with the most recent driver (which was apparently made for Titanfall) and a Core i7 with 16GB of RAM, and I still get occasional random FPS drops during the training missions. Also, I can't believe they don't offer triple-buffered vsync. What is this, the early 2000s?

I should try it with D3DOverrider.

They'll patch it sooner or later, it's either bad coding or bad optimization - whatever you wish to call it.

#3 Posted by SerOlmy (1649 posts) -

@REforever101 said:

I have a GTX 780 with the most recent driver (which was apparently made for Titanfall) and a Core i7 with 16GB of RAM, and I still get occasional random FPS drops during the training missions. Also, I can't believe they don't offer triple-buffered vsync. What is this, the early 2000s?

I should try it with D3DOverrider.

It is called shitty coding. They are using the Source engine for this game, one of the best-optimized engines available (if not THE best), and they still can't make it run well. Source has had triple-buffered v-sync for ages, so I have no idea why they left that out; the rest is just lazy/inept coding from trying to bring this PoS to market in less than 2 years. The results in terms of performance (and IMO enjoyment as well) are predictable.

Just look at L4D2 for a comparison: same engine, large maps, a high number of enemies on screen, better animations, and it runs 1000x better with a smoother framerate.

#4 Edited by Jimmy_Russell (543 posts) -

Nobody uses vsync in first-person shooters; that's just dumb. And the guy above me comparing Left 4 Dead 2 with Titanfall must be trolling... Regardless, I can't tell you about the game because it's not on Steam. If it ever shows up on Steam, I will give it a run.

#5 Edited by SerOlmy (1649 posts) -

Uh, a lot of people use v-sync in shooters. Nothing breaks immersion worse than screen tearing, and if your monitor's refresh rate doesn't match your FPS you get tearing. V-sync adds less than 15ms of lag if it is done properly (and it is done properly in Source), which IMO is not noticeable. And before you start trolling: I play A LOT of TF2 and am almost always in the top 3. V-sync, or the lack thereof, is a big deal unless you can ignore the screen tearing, which a lot of people cannot.

Also, my comparison is valid, so go troll somewhere else. If Valve can have 50+ zombies on screen, and modded L4D2 servers can run 16+ player counts with zero FPS issues on reasonably modern hardware (an i5 and a GTX 570), then the only reason Titanfail is having issues is piss-poor coding.

#6 Posted by JangoWuzHere (16116 posts) -

Most games these days don't offer triple buffering for vsync, so I don't know what your point is there.

@SerOlmy said:

@REforever101 said:

I have a GTX 780 with the most recent driver (which was apparently made for Titanfall) and a Core i7 with 16GB of RAM, and I still get occasional random FPS drops during the training missions. Also, I can't believe they don't offer triple-buffered vsync. What is this, the early 2000s?

I should try it with D3DOverrider.

It is called shitty coding. They are using the Source engine for this game, one of, if not THE best optimized engine available and they still can't make it run well. Source has had triple buffered v-synch for ages so no idea why they left that out, the rest is just lazy/inept coding trying to bring this PoS to market in less than 2 years. The results in terms of performance (and IMO enjoyment as well) are predictable.

Just look at L4D2 for a comparison; same engine, large maps, high number of enemies on screen, better animations, and it runs 1000x better and has smoother framerate.

I heard this game's engine was modified completely, to the point where Source is hardly in it. You can easily tell because no other Source game looks like it. The L4D2 comparison is kind of dumb; it's no competition, Titanfall looks much better. It has far greater texture detail, advanced lighting, dynamic shadows, and better effects. Left 4 Dead 2 was not a good-looking game when it came out, so I would hope it ran well for the majority. That game showed its age the day it released, while Titanfall looks like a next-gen game.

#7 Edited by SerOlmy (1649 posts) -

@JangoWuzHere: News flash, bud: Titanfail is also nothing special in the visuals department. It looks just like any other Source engine game, with (arguably) better textures. Better textures are not a big performance hit unless you have low GPU memory. And I would strongly argue against your whole "modified completely" thing; if they wanted to rip it up and rebuild it, they would have built their own engine from scratch. The whole point of using Source was that it was well optimized and easy to work with. The fact that they managed to f*** it up and get such bad performance speaks volumes.

And most games I have played recently do have triple buffered v-sync, including every other Source engine game.

#8 Posted by BSC14 (3657 posts) -

@SerOlmy said:

@JangoWuzHere: News flash bud, Titanfail is also nothing special in the visuals department. It looks just like any other Source engine game, with (arguably) better textures. Better textures is not a big performance hit unless you have low GPU memory. And I would strongly argue against your whole "modified completely" thing, if they wanted to rip it up and rebuilt it they would have built their own engine from scratch. The whole point of using Source was that it was well optimized and easy to work with. The fact that the managed to f*** it up and have such bad performance speaks volumes.

And most games I have played recently do have triple buffered v-sync, including every other Source engine game.

tHE ART DIRECTION AND ATTENTION TO DETAIL IS WHAT MAKES IT LOOK GOOD.....crap caps, my bad

#9 Posted by SerOlmy (1649 posts) -

@BSC14: Yeah, you can say that and get away with it on indie games like Bastion (which I love), but not AAA titles. It is a Source engine game that only looks marginally better than any other Source engine game. It has nothing to justify the performance hit other than bad coding. TF2 and DoD: Source have 32-player servers, DoD: Source has similar-sized maps, and L4D2 has way more on-screen models (zombies) at once. The half-assed rush job to squeeze out this turd in less than 2 years is the reason it performs poorly.

#10 Posted by BSC14 (3657 posts) -

@SerOlmy said:

@BSC14: Yeah, you can say that and get away with it on indi games like Bastion (which I love), but not AAA titles. It is a Source engine game that only looks marginally better than any other Source engine game. It has nothing to justify the performance hit other than bad coding. TF2 and DoD Source have 32 player servers, DoD Source has similar sized maps, L4D2 has way more on screen models (zombies) at once. The half-assed rush job to squeeze out this turd in less than 2 years is the reason it performs poorly.

I can't say why it's having issues...I have not gotten home yet to try it. My point is only that it's a nice looking game and it's not the textures.

#11 Edited by REforever101 (11156 posts) -

It's really starting to piss me off a bit. Other people with the same video card aren't reporting any issues maxed out. I wonder if it's the new driver; it was made specifically for Titanfall. Maybe if I roll it back...

#12 Posted by dethtrain (384 posts) -

I get nasty input lag in Portal 2 if I have triple buffering enabled. I'm not sure what I'm using in Team Fortress 2, but I get no input lag there.

#13 Edited by REforever101 (11156 posts) -

No, rolling back the drivers didn't help.

#14 Edited by amafi (10 posts) -

Haven't played the retail game yet, but I've got what sounds like a fairly similar system (i7 4770k, 32gb ram, single gtx 780) and I didn't have any issues with stuttering or frame drops or dips during the open beta, during the training missions or during multiplayer matches, and I had everything pretty much cranked.

#15 Posted by REforever101 (11156 posts) -

It's odd; regardless of my settings, from time to time the frame rate just seems to take a dip.

#16 Posted by Cwagmire21 (5887 posts) -

Certain maps seem to take a bigger system hit than others. On some I get 50-60, on some 30-40. Still playing with the settings. I only have a 6870, so I wasn't expecting to max it out or anything.

#17 Posted by REforever101 (11156 posts) -

Hmm, actually, according to Fraps my FPS almost never falls below 60, even when the framerate does seem to drop. So perhaps that's actually server lag? Though my ping is never above 25... though I suppose lag spikes do happen.

#18 Posted by SEANMCAD (5464 posts) -

@jimmy_russell said:

Nobody uses vsync in first-person shooters; that's just dumb. And the guy above me comparing Left 4 Dead 2 with Titanfall must be trolling... Regardless, I can't tell you about the game because it's not on Steam. If it ever shows up on Steam, I will give it a run.

what?

what the fuck would you use vsync for then? The Sims? Of course you would use it in an FPS game

#19 Posted by FelipeInside (25310 posts) -

@SEANMCAD said:

@jimmy_russell said:

Nobody uses vsync in first person shooters, that's just dumb. And the guy above me comparing Left for Dead 2 with Titanfall, must be trolling ... Regardless I can't tell you about the game because it's not on Steam. If it ever shows up on Steam, I will give it a run.

what?

what the fuck would you use vsync for then? the sims? of course you would use it in a FPS game

I think what he meant was for competitive online MP FPS games. For online you need all the performance you can get. Hell, people even lower graphics.

#20 Edited by SEANMCAD (5464 posts) -

@FelipeInside said:

@SEANMCAD said:

@jimmy_russell said:

Nobody uses vsync in first person shooters, that's just dumb. And the guy above me comparing Left for Dead 2 with Titanfall, must be trolling ... Regardless I can't tell you about the game because it's not on Steam. If it ever shows up on Steam, I will give it a run.

what?

what the fuck would you use vsync for then? the sims? of course you would use it in a FPS game

I think what he meant was for competitive online MP FPS games. For online you need all the performance you can get. Hell, people even lower graphics.

your inverting

#21 Posted by FelipeInside (25310 posts) -

@SEANMCAD said:

@FelipeInside said:

@SEANMCAD said:

@jimmy_russell said:

Nobody uses vsync in first person shooters, that's just dumb. And the guy above me comparing Left for Dead 2 with Titanfall, must be trolling ... Regardless I can't tell you about the game because it's not on Steam. If it ever shows up on Steam, I will give it a run.

what?

what the fuck would you use vsync for then? the sims? of course you would use it in a FPS game

I think what he meant was for competitive online MP FPS games. For online you need all the performance you can get. Hell, people even lower graphics.

your inverting

My inverting?

Maybe you meant "you're inventing"?

In any case I'm guessing that is what he meant, but what I said about performance is true.

#22 Posted by Jimmy_Russell (543 posts) -

Vsync causes lag; why would you want that? I get it, screen tearing, but c'mon, this is an online first-person shooter. We need max framerates here, not pretty-looking graphics.

#23 Posted by kraken2109 (13007 posts) -

@SEANMCAD said:

@jimmy_russell said:

Nobody uses vsync in first person shooters, that's just dumb. And the guy above me comparing Left for Dead 2 with Titanfall, must be trolling ... Regardless I can't tell you about the game because it's not on Steam. If it ever shows up on Steam, I will give it a run.

what?

what the fuck would you use vsync for then? the sims? of course you would use it in a FPS game

Why would you want tons of input lag in what's supposed to be a competitive fps?

#24 Edited by 8-Bitterness (3707 posts) -

Lol, people really use vsync? Wow.

No performance issues here.

#25 Posted by SerOlmy (1649 posts) -

@kraken2109 said:

Why would you want tons of input lag in what's supposed to be a competitive fps?

This is a very big exaggeration in most cases. When v-sync is done properly, the equivalent latency for buffering a frame is only about 17ms, which IMO, if you are playing on a decent server with sub-50ms ping, is basically not noticeable. I will grant you that if it isn't coded properly you can get the syrupy input lag you see in some games, but the majority of the time you wouldn't notice it. I play TF2 with v-sync on and have ZERO issues topping the scoreboards on a regular basis as Sniper.
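The ~17ms figure here is just one refresh interval at 60Hz: with v-sync on, a buffered frame waits roughly one refresh before it is shown. A quick back-of-envelope sketch (a hypothetical helper, not from any engine):

```python
# One v-sync'd buffered frame delays presentation by roughly one
# refresh interval, which is where the 15-17ms figures come from.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per refresh")
# 60 Hz -> 16.7 ms per refresh
```

Note that "less than 15ms" only holds above roughly 67Hz; at 120Hz or more the same buffering costs well under 10ms, which is partly why high-refresh monitors mask v-sync lag.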

#26 Posted by kraken2109 (13007 posts) -

@SerOlmy said:
@kraken2109 said:

Why would you want tons of input lag in what's supposed to be a competitive fps?

This is a very big exaggeration in most cases. When v-sync is done properly the equivalent latency for buffering a frame is only 17ms. Which IMO if you are playing on a decent server with sub-50ms ping is basically no noticeable. I will grant you, that if it isn't coded properly you can get the syrupy input lag that you see in some games. But the majority of the time you wouldn't notice it. I play TF2 with v-sync on and have ZERO issues topping the scoreboards on a regular basis as sniper.

Weird, I couldn't even play Portal 2 after I reinstalled it and it set vsync to on by default.

#27 Edited by SerOlmy (1649 posts) -

Portal 2 had notoriously bad input lag related to v-sync on the opening menu, but it didn't occur once you got into the game. I believe it stemmed from the fact that they were actively rendering the background for the startup screen.

#28 Posted by Postmortem123 (7640 posts) -

@Jimmy_Russell said:

Vsync causes lag; why would you want that? I get it, screen tearing, but c'mon, this is an online first-person shooter. We need max framerates here, not pretty-looking graphics.

Yeah it gives disgusting input lag. Unplayable.

#29 Edited by JigglyWiggly_ (23459 posts) -

Using vsync in an MP FPS... lol. Why would you use settings that make you play significantly worse?

Also, triple buffering adds another frame of input lag on top of vsync.

With that said, Titanfall has some major issues. The game is locked to 60fps with vsync off... and 60fps is yucky, so you can enable vsync to get it to run at 144fps (VG248QE), which is better than 60Hz with no vsync, but not by much lol.

#30 Edited by JigglyWiggly_ (23459 posts) -

@SerOlmy said:
@kraken2109 said:

Why would you want tons of input lag in what's supposed to be a competitive fps?

This is a very big exaggeration in most cases. When v-sync is done properly the equivalent latency for buffering a frame is only 17ms. Which IMO if you are playing on a decent server with sub-50ms ping is basically no noticeable. I will grant you, that if it isn't coded properly you can get the syrupy input lag that you see in some games. But the majority of the time you wouldn't notice it. I play TF2 with v-sync on and have ZERO issues topping the scoreboards on a regular basis as sniper.

It's sad when people post like they know what they are talking about when they actually are clueless.

First off, 17ms of input lag and 50ms ping are completely unrelated.

When you have 50ms ping, your input is not behind; your movement and controls are not delayed. Different games compensate for ping delay in different ways. Some favor fully client-side hit detection, which registers your shots on your screen without referencing the server's values. Some are fully server-side (which is worse for aimers), checking the enemy player's hitbox at the server's location rather than where their player model appears to you. Most server-side games use interpolation: hitbox prediction, lag compensation, etc.

One method is backwards reconciliation with unlagged-style netcode.

When you shoot someone, the server rewinds the person you shot to where you saw them when you fired. It then checks whether that player was actually there, comparing the timestamp of your shot against their position history. This is a very good tradeoff in netcode; it's a shame so few games do it. CPMA does, with cg_laghax and cg_xerpclients.
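The rewind-and-check idea described here can be sketched in a few lines of Python (hypothetical names, and a point target instead of a real hitbox; actual engines snapshot full hitboxes per tick):

```python
import bisect

# Server-side history of one player's position, sampled each tick:
# (timestamp_ms, (x, y)) pairs kept sorted by timestamp.
history = [(0, (0.0, 0.0)), (50, (1.0, 0.0)), (100, (2.0, 0.0))]

def position_at(history, t_ms):
    """Rewind: the last recorded position at or before t_ms."""
    i = bisect.bisect_right([t for t, _ in history], t_ms) - 1
    return history[max(i, 0)][1]

def check_hit(history, shot_pos, shooter_latency_ms, now_ms, radius=0.5):
    """Validate a shot against where the target WAS when the shooter fired."""
    t_fired = now_ms - shooter_latency_ms
    tx, ty = position_at(history, t_fired)
    sx, sy = shot_pos
    return (sx - tx) ** 2 + (sy - ty) ** 2 <= radius ** 2

# A shooter with 50ms of latency aimed where the target was at t=50:
print(check_hit(history, (1.0, 0.0), 50, 100))  # True
```

The tradeoff is visible here: the shooter's aim is honored, but the target can be hit "behind cover" because the server judged an older position.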

Some newer FPS games use combinations of all of these, e.g. Loadout. Its netcode is extremely good; you can play fine on 200 ping and there isn't much dying behind cover. It feels like client-side netcode with server reconciliation, since I have not noticed any desync.

Whichever method is used, your shots will still register later on their screen.

Back to what you were saying: 17ms of input lag feels horrendous. All your movement and all your aiming is delayed by 17ms. Any decent gamer would find that unacceptable, and the fact that you reference TF2 pubs is hilarious.

#31 Posted by SerOlmy (1649 posts) -

@JigglyWiggly_: I had a whole post lined out and then realized I had spent over half an hour pulling the requisite references on human visual perception and reaction time just to respond to a thread on the internet. So instead of posting a long and thorough rebuttal on why you are wrong that 17 ms (.017 seconds) is noticeable in human visual perception, I'm just going to go ahead and summarize.

You are wrong in stating that 17 ms of input lag is perceivable by anyone but the absolute lowest outliers in human visual perception and reaction time. That amount of time is well within the variance between individuals' reaction times (between .1 and .25 seconds, depending on the individual and the complexity of the information). So unless you are an MLB slugger, a pro tennis player, an NHL goalie, or on a top-tier MLG team, 17 ms is not a significant loss of reaction time.

#32 Edited by JigglyWiggly_ (23459 posts) -

@SerOlmy said:

@JigglyWiggly_: I had a whole post lined out and then realized I had spent over half an hour pulling the requisite references on human visual perception and reaction time just to respond to a thread on the internet. So instead of posting a long and thorough rebuttal on why you are wrong that 17 ms (.017 seconds) is noticeable in human visual perception, I'm just going to go ahead and summarize.

You are wrong in stating that 17 ms of input lag is perceivable by anyone but the absolute lowest outliers in human visual perception and reaction time. That amount of time is well within the variance between individuals' reaction times (between .1 and .25 seconds, depending on the individual and the complexity of the information). So unless you are an MLB slugger, a pro tennis player, an NHL goalie, or on a top-tier MLG team, 17 ms is not a significant loss of reaction time.

There is a difference between continuous reaction time and click reaction time; your input for tracking is faster than actuating your finger to click. The 17ms of input lag is also continuous: all of your actions are delayed. And you can see that your cursor is delayed even if you cannot react within that window. You can see a flash of light that only appeared for a few milliseconds; you just react to it afterwards. With input lag, not only does the light shine later, you react to it later as well.

Also, I can average 135ms on Human Benchmark when I'm in good shape (which is rare tbh lol); 17ms extra would put me at 152, which, while still very fast, is not as good.

http://www.humanbenchmark.com/tests/reactiontime/

17ms of input lag + 60Hz lag + display lag + game lag adds up to a very large number. It becomes impractical to play at those settings.
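Summing a latency stack the way this post describes, with purely illustrative numbers (none of these are measured values; display and engine lag vary a lot per setup):

```python
# Rough input-latency budget with v-sync at 60Hz (illustrative only).
latency_ms = {
    "v-sync buffered frame": 17,   # ~one 60Hz refresh interval
    "scan-out at 60Hz (avg)": 8,   # half a refresh on average
    "display processing": 10,      # typical non-gaming LCD
    "game/input pipeline": 20,     # engine + polling; varies widely
}
total = sum(latency_ms.values())
print(f"total ~= {total} ms before human reaction time")  # total ~= 55 ms
```

Against a ~135ms reaction time, a stack like this is a meaningful fraction, which is the argument being made here; the exact split between the entries is an assumption.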

Also, lol, MLG... although Quake Live was in MLG for a day before 2GD got it kicked off. I am the number-one-ranked lightning gun dueler in Quake Live North America, though.

#33 Posted by SerOlmy (1649 posts) -

@JigglyWiggly_: You miss my point. You are an outlier if what you claim is true. For the other 99.5% of gamers it is not going to be significant, hence why the VAST majority of gamers cannot tell the difference between v-sync and no v-sync other than the screen tearing. And it is also a major tradeoff: screen tearing is distracting, which lowers your reaction time, which negates any benefit from shaving 17 ms of input latency.

#34 Edited by JigglyWiggly_ (23459 posts) -

No, anyone can notice these things. Linus, who is a super casual, even complained about v-sync input lag and how G-Sync fixed it. It is so obvious; it is as big a difference as going from 60Hz to 120Hz.

The fact that other people in the thread are mentioning it as well shows how big of an issue it is.

#35 Posted by Cwagmire21 (5887 posts) -

@SerOlmy said:

@JigglyWiggly_: You miss my point. You are an outlier if what you claim is true. For the other 99.5% of gamers it is not going to be significant, hence why the VAST majority of gamers cannot tell the difference between v-sync and no v-sync other than the screen tearing. And it is also a major tradeoff: screen tearing is distracting, which lowers your reaction time, which negates any benefit from shaving 17 ms of input latency.

We could discuss whether Candy Crush players would detect a 17 ms input delay or not, but we're all on this game forum because we are game enthusiasts, so I'd think most of us count among those "outliers".

We build PCs to experience games at better resolutions, framerates, and stability than consoles. I'd think that makes us "outliers" too.

I wouldn't consider this a casual gamer board.

#36 Posted by SerOlmy (1649 posts) -

@Cwagmire21: I have zero issues running v-sync in nearly every game I play, from WoW to TF2 to flight sims and RTS games. I can acquit myself very well in games like CSS and TF2 (in pub matches on well-known servers; I am no pro gamer as Wiggly claims to be), so I would consider my reflexes above average (although they are getting worse since I passed 30), and my associative memory and so forth are well above average. I can play games without v-sync, but it is HIGHLY undesirable due to the constant tearing, which is very distracting. What I would gain in input lag would be canceled out by the distraction of the screen tearing. I can easily tell the difference between levels of AA, between refresh rates and frame rates, and could probably tell you what post-processing effects a game is running just by watching, but I cannot see a noticeable impact on my gaming from running v-sync. Granted, in some games where it is badly done you can feel it, but in most games it isn't noticeable.

#37 Posted by kraken2109 (13007 posts) -

Portal 2 at 60fps with vsync versus 200+fps without is painfully obvious in terms of responsiveness. CS is too.

#38 Posted by wis3boi (31110 posts) -

@BSC14 said:

@SerOlmy said:

@JangoWuzHere: News flash bud, Titanfail is also nothing special in the visuals department. It looks just like any other Source engine game, with (arguably) better textures. Better textures is not a big performance hit unless you have low GPU memory. And I would strongly argue against your whole "modified completely" thing, if they wanted to rip it up and rebuilt it they would have built their own engine from scratch. The whole point of using Source was that it was well optimized and easy to work with. The fact that the managed to f*** it up and have such bad performance speaks volumes.

And most games I have played recently do have triple buffered v-sync, including every other Source engine game.

tHE ART DIRECTION AND ATTENTION TO DETAIL IS WHAT MAKES IT LOOK GOOD.....crap caps, my bad

titanfall has art direction? Looks like mud to me

#39 Posted by BSC14 (3657 posts) -

@wis3boi said:

@BSC14 said:

@SerOlmy said:

@JangoWuzHere: News flash bud, Titanfail is also nothing special in the visuals department. It looks just like any other Source engine game, with (arguably) better textures. Better textures is not a big performance hit unless you have low GPU memory. And I would strongly argue against your whole "modified completely" thing, if they wanted to rip it up and rebuilt it they would have built their own engine from scratch. The whole point of using Source was that it was well optimized and easy to work with. The fact that the managed to f*** it up and have such bad performance speaks volumes.

And most games I have played recently do have triple buffered v-sync, including every other Source engine game.

tHE ART DIRECTION AND ATTENTION TO DETAIL IS WHAT MAKES IT LOOK GOOD.....crap caps, my bad

titanfall has art direction? Looks like mud to me

Ok...I'm sure it looking like mud is exactly why the game has gotten so much hype.

#40 Posted by FelipeInside (25310 posts) -

@BSC14 said:

@wis3boi said:

@BSC14 said:

@SerOlmy said:

@JangoWuzHere: News flash bud, Titanfail is also nothing special in the visuals department. It looks just like any other Source engine game, with (arguably) better textures. Better textures is not a big performance hit unless you have low GPU memory. And I would strongly argue against your whole "modified completely" thing, if they wanted to rip it up and rebuilt it they would have built their own engine from scratch. The whole point of using Source was that it was well optimized and easy to work with. The fact that the managed to f*** it up and have such bad performance speaks volumes.

And most games I have played recently do have triple buffered v-sync, including every other Source engine game.

tHE ART DIRECTION AND ATTENTION TO DETAIL IS WHAT MAKES IT LOOK GOOD.....crap caps, my bad

titanfall has art direction? Looks like mud to me

Ok...I'm sure it looking like mud is exactly why the game has gotten so much hype.

Definitely has art direction. The two beta levels were very similar, but now that I have seen other levels, the detail put into them is really good (and that's not counting the detail on the soldiers and titans).

#41 Edited by wis3boi (31110 posts) -

@BSC14 said:

@wis3boi said:

@BSC14 said:

@SerOlmy said:

@JangoWuzHere: News flash bud, Titanfail is also nothing special in the visuals department. It looks just like any other Source engine game, with (arguably) better textures. Better textures is not a big performance hit unless you have low GPU memory. And I would strongly argue against your whole "modified completely" thing, if they wanted to rip it up and rebuilt it they would have built their own engine from scratch. The whole point of using Source was that it was well optimized and easy to work with. The fact that the managed to f*** it up and have such bad performance speaks volumes.

And most games I have played recently do have triple buffered v-sync, including every other Source engine game.

tHE ART DIRECTION AND ATTENTION TO DETAIL IS WHAT MAKES IT LOOK GOOD.....crap caps, my bad

titanfall has art direction? Looks like mud to me

Ok...I'm sure it looking like mud is exactly why the game has gotten so much hype.

I never mentioned hype or how good/bad the game is, but feel free to put words in people's mouths

#42 Posted by BSC14 (3657 posts) -

@wis3boi said:

@BSC14 said:

@wis3boi said:

@BSC14 said:

@SerOlmy said:

@JangoWuzHere: News flash bud, Titanfail is also nothing special in the visuals department. It looks just like any other Source engine game, with (arguably) better textures. Better textures is not a big performance hit unless you have low GPU memory. And I would strongly argue against your whole "modified completely" thing, if they wanted to rip it up and rebuilt it they would have built their own engine from scratch. The whole point of using Source was that it was well optimized and easy to work with. The fact that the managed to f*** it up and have such bad performance speaks volumes.

And most games I have played recently do have triple buffered v-sync, including every other Source engine game.

tHE ART DIRECTION AND ATTENTION TO DETAIL IS WHAT MAKES IT LOOK GOOD.....crap caps, my bad

titanfall has art direction? Looks like mud to me

Ok...I'm sure it looking like mud is exactly why the game has gotten so much hype.

I never mentioned hype or how good/bad the game is, but feel free to put words in people's mouths

Point is that if it looked like mud there would not be as much hype around it.

#43 Edited by JigglyWiggly_ (23459 posts) -

I don't think it has very good art/technical graphics for the budget they had considering how it runs lol

#44 Edited by FelipeInside (25310 posts) -

@JigglyWiggly_ said:

I don't think it has very good art/technical graphics for the budget they had considering how it runs lol

Game runs fine for 99% of gamers. There's always going to be some people having initial issues till they sort out a few bugs.

#45 Posted by BattleSpectre (5963 posts) -

@FelipeInside said:

@JigglyWiggly_ said:

I don't think it has very good art/technical graphics for the budget they had considering how it runs lol

Game runs fine for 99% of gamers. There's always going to be some people having initial issues till they sort out a few bugs.

I've heard numerous people complaining about frame rate issues, so here's hoping.

#46 Posted by Arthas045 (5100 posts) -

@BattleSpectre said:

@FelipeInside said:

@JigglyWiggly_ said:

I don't think it has very good art/technical graphics for the budget they had considering how it runs lol

Game runs fine for 99% of gamers. There's always going to be some people having initial issues till they sort out a few bugs.

I've heard numerous amount of people complaining about frame rate issues, so here's hoping.

Yeah, I have heard quite a few people in game mention FPS issues after playing for a certain amount of time. Maybe it's something like a memory leak, but I don't know. I have had zero issues in the roughly 6 hours I have played so far.

#47 Posted by FelipeInside (25310 posts) -

@Arthas045 said:

@BattleSpectre said:

@FelipeInside said:

@JigglyWiggly_ said:

I don't think it has very good art/technical graphics for the budget they had considering how it runs lol

Game runs fine for 99% of gamers. There's always going to be some people having initial issues till they sort out a few bugs.

I've heard numerous amount of people complaining about frame rate issues, so here's hoping.

Yeah I have heard quite a few people in game saying FPS issues after playing for a certain amount of time. Maybe its something like a memory leak, but I don't know. I have 0 issues out of the game I have played for about 6 hours so far.

Yeah, the people that don't have issues aren't going to say anything; this is natural for a new launch. A few patches and they will fix it up. The same thing happened with the BF4 launch. All my friends report no framerate issues at all, on both PC and Xbox.

#48 Edited by JigglyWiggly_ (23459 posts) -

It has more to do with what you consider 'fine'.

#49 Posted by Cwagmire21 (5887 posts) -

@FelipeInside said:

@Arthas045 said:

@BattleSpectre said:

@FelipeInside said:

@JigglyWiggly_ said:

I don't think it has very good art/technical graphics for the budget they had considering how it runs lol

Game runs fine for 99% of gamers. There's always going to be some people having initial issues till they sort out a few bugs.

I've heard numerous amount of people complaining about frame rate issues, so here's hoping.

Yeah I have heard quite a few people in game saying FPS issues after playing for a certain amount of time. Maybe its something like a memory leak, but I don't know. I have 0 issues out of the game I have played for about 6 hours so far.

Yeah, the people that don't have issues aren't going to be saying anything, but it's natural since it's a new launch. A few patches and they will fix it up. Same thing happened with BF4 launch. All my friends are reporting no issues at all in framerate both on PC and XBOX.

I had few issues with the beta. I'm thinking certain maps are causing the trouble, IMO.

#50 Posted by FelipeInside (25310 posts) -

@Cwagmire21 said:

@FelipeInside said:

@Arthas045 said:

@BattleSpectre said:

@FelipeInside said:

@JigglyWiggly_ said:

I don't think it has very good art/technical graphics for the budget they had considering how it runs lol

Game runs fine for 99% of gamers. There's always going to be some people having initial issues till they sort out a few bugs.

I've heard numerous amount of people complaining about frame rate issues, so here's hoping.

Yeah I have heard quite a few people in game saying FPS issues after playing for a certain amount of time. Maybe its something like a memory leak, but I don't know. I have 0 issues out of the game I have played for about 6 hours so far.

Yeah, the people that don't have issues aren't going to be saying anything, but it's natural since it's a new launch. A few patches and they will fix it up. Same thing happened with BF4 launch. All my friends are reporting no issues at all in framerate both on PC and XBOX.

I had few issues with beta. I'm thinking certain maps are causing trouble IMO.

Yep, it could be that certain maps need tweaking.

All good, let's hope Respawn bring out a patch soon.