
A discussion on Framerates

    The argument that will not die!

    West suggests that a 166ms response is where gamers notice controller lag
    Which, if true, backs up my point from the very beginning of this discussion that even though 60fps may be better for input lag, the benefit is imperceptible to the average player, and therefore the argument that 60fps equals better for gameplay is flawed.

    30FPS games have a minimum potential lag of 100ms, but many exceed this
    Which agrees with my point that 30fps games do not have to have double the input lag of 60fps games (and disagrees with the original articles that were posted way back when as skater's FACT). Although it says many games exceed this, if the point above is true then it doesn't matter if they don't hit this target anyway.



      Yes, I know better than to respond to you, Brats, but I'm in a good mood, so why not.

      Originally posted by Brats View Post
      Which, if true, backs up my point from the very beginning of this discussion that even though 60fps may be better for input lag, the benefit is imperceptible to the average player, and therefore the argument that 60fps equals better for gameplay is flawed.
      The input delay isn't imperceptible. It's a matter of tolerance. PC gamers used to a significantly lower delay can really notice the difference going to a 30fps console title. Console only gamers are simply used to the delay; many people don't even notice it and argue it doesn't exist! They're familiar with it. That's not to say they wouldn't notice a sudden halving of the delay.

      Do some tests on a game where you can alter the frame rate and see how you get on with it. If you notice no difference, then good for you.

      One person suggesting 166ms is tolerable doesn't make it a fact. If so, why are the Criterion guys working so hard to get the delay as low as possible in Burnout, for example? They could take some pressure off their renderer by triple buffering, but they realise that adding just one extra frame of delay can impact their title. Why did IW get the input tester built? It's because it matters. The devs know what they're talking about.

      Which agrees with my point that 30fps games do not have to have double the input lag of 60fps games (and disagrees with the original articles that were posted way back when as skater's FACT). Although it says many games exceed this, if the point above is true then it doesn't matter if they don't hit this target anyway.
      Re-read the article. The minimum a 60fps title could hit, in theory, is 50ms; likewise with 30fps it would be 100ms. That's reading the input at the perfect time in the game logic code, rendering it next frame, and flipping the buffer the frame after. 3 frames. That's the best-case scenario if the devs have spent some time really working at it (didn't someone post that it was a waste of devs' time doing that at some point?).

      Add in triple buffering and it's an extra frame: 16ms or 33ms depending on your framerate. Can you see how just one more frame can make a big difference?

      In the test figures given, most 60fps titles show a 67ms delay, sometimes hitting 84ms (so an additional frame). The best a 30fps title hits is 100ms; often it's 133ms, and sometimes it hits 200ms!!
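
      To make the arithmetic concrete, here's a quick sketch (Python; the stage counts are just the 3-frame pipeline described above, with triple buffering adding one more):

      ```python
      # Best-case input latency = pipeline stages * frame time.
      # Stages: read input in game logic, render next frame, flip the buffer = 3.
      def input_latency_ms(fps, stages=3):
          return stages * (1000.0 / fps)

      for fps in (60, 30):
          best = input_latency_ms(fps)                # double buffered
          buffered = input_latency_ms(fps, stages=4)  # triple buffering adds a frame
          print(f"{fps}fps: best case {best:.0f}ms, triple buffered {buffered:.0f}ms")

      # 60fps: best case 50ms, triple buffered 67ms
      # 30fps: best case 100ms, triple buffered 133ms
      ```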

      Input lag matters. Those who feel it doesn't are in for a prosperous future with cloud gaming.

      The fact DF are doing regular articles about it, and that devs are devising ways to accurately measure it, show it's becoming an increasing priority. As they say, it may be what gives a dev that cutting edge.
      Last edited by Matt; 19-09-2009, 22:12.



        Originally posted by Chain View Post
        The input delay isn't imperceptible. It's a matter of tolerance. PC gamers used to a significantly lower delay can really notice the difference going to a 30fps console title. Console only gamers are simply used to the delay; many people don't even notice it and argue it doesn't exist! They're familiar with it. That's not to say they wouldn't notice a sudden halving of the delay.

        Do some tests on a game where you can alter the frame rate and see how you get on with it. If you notice no difference, then good for you.
        There's a difference though between noticing a change in delay and perceiving the delay where there is no change. If I'm a passenger in a car driving at 70mph on an open road, I'll notice immediately if it speeds up to 80mph. But if the car stays constantly at 70mph, I wouldn't be able to tell I was stuck at the speed limit. That's an important distinction.

        One person suggesting 166ms is tolerable doesn't make it a fact.
        No, which is why I said 'If true'. But it is one guy who seems to know a lot about this stuff and is in charge of a game series where latency is key.

        If so, why are the Criterion guys working so hard to get the delay as low as possible in Burnout, for example? They could take some pressure off their renderer by triple buffering, but they realise that adding just one extra frame of delay can impact their title. Why did IW get the input tester built? It's because it matters. The devs know what they're talking about.
        It could be because they are utter perfectionists, which leads them to become involved in areas that don't affect the real world. Ever read the articles on the Bugatti Veyron? Seriously, the amount of pain and resources VW went through on things that no-one would ever notice on that car. They became utterly obsessed (known as groupthink in business).

        We all know 60fps looks gorgeous and smooth and both IW and Criterion's tech is great, but Criterion themselves admitted that the biggest factor in lag is someone's TV. If lag mattered, why did we all upgrade from CRTs to LCDs?

        Re-read the article. The minimum a 60fps title could hit, in theory, is 50ms; likewise with 30fps it would be 100ms. That's reading the input at the perfect time in the game logic code, rendering it next frame, and flipping the buffer the frame after. 3 frames. That's the best-case scenario if the devs have spent some time really working at it (didn't someone post that it was a waste of devs' time doing that at some point?).
        I did read it. In theory yes you're right, but according to the article, in reality at least one 30fps game (Halo 3) has achieved 100ms lag, whereas the article states that for 60fps titles 'tests show that 67ms lag is the best we can hope for'.

        Whilst not proof, it suggests that whilst the theoretical minimum is achievable with 30fps titles, it cannot be achieved with 60fps titles. Which, if true, means that the difference between the best 60fps game and the best 30fps game is only 33ms. That doesn't seem like very much at all. Obviously you are going to get some titles where the developer doesn't give a crap about lag, but where the developer cares, that difference is very small.

        And this is the point. Yes, input lag is important (as players of Heavenly Sword will attest - 300ms = ouch!)... to a degree.

        However, the differences between the best 60fps and 30fps titles can be small enough not to matter at all, bar the most particular of players. So back to the question (posted as fact) where this all began, is 60fps inherently better than 30fps for gameplay alone? The evidence is waning.

        I still say that if input lag was such a big issue, there would have been gamers rioting in the streets when we bought our shiny new LCD TVs and found that all our games had suddenly turned into laggy pieces of ****. But aside from a couple of tiny murmurs, nothing happened. The only conclusion I can draw from that is that unless the game reaches unnecessarily high levels of lag, it's not such a big deal. IW and Criterion employing someone to reduce lag isn't enough of a reason to believe it's important.

        You can try and dress it up as something else, but the odd internet article doesn't explain why we didn't all send our LCD TVs back. If you can give me an explanation for that, whilst giving a rock-solid argument as to why I should care about these small amounts of lag, I'd love to hear it.
        Last edited by Brats; 20-09-2009, 00:13.



          If you don't care about lag, then ultimately it's you as a gamer losing out. Growing accustomed and accepting a limiting factor of game control doesn't mean we should ignore the problem and not try to improve it.

          Accepting LCD TV latency isn't ideal, and once OLEDs come out we'll all say, "Wow, it makes SUCH a difference!". There's no reason the same can't be said for getting the input lag inherent in a game [which, remember, some in this thread said didn't exist when I first posted about it] down to the lowest number possible.

          As I say, if the top devs are working on this, it DOES matter. Believe me, going triple buffer gives you some great framerate advantages [stability], yet they're working with a double buffer instead [a quick mention of the small amount of memory saved by going double instead of triple, just in case anyone puts that forward as the reason].

          Plus, of course, just because your LCD adds a considerable delay (try Gaming mode if you can; mine made a nice difference) doesn't mean it's not worthwhile for devs to try and minimise things at their end. If not, where does "Oh, it's only 33ms / 66ms / whatever" end? "Oh, we're already on 330ms, another 10% on top won't matter". It all matters; it's the accumulation of factors.
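
          As a rough sketch of what I mean by accumulation (the individual figures below are illustrative guesses for the example, not measurements):

          ```python
          # Illustrative latency budget: every source stacks on top of the others.
          # These numbers are assumptions for the example, not measured values.
          budget_ms = {
              "game pipeline (60fps, double buffered)": 67,
              "LCD picture processing": 33,
              "wireless controller": 8,
          }
          for source, ms in budget_ms.items():
              print(f"{source}: {ms}ms")
          print(f"total: {sum(budget_ms.values())}ms")  # 108ms from button press to screen
          ```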

          Originally posted by Brats View Post
          So back to the question (posted as fact) where this all began, is 60fps inherently better than 30fps for gameplay alone? The evidence is waning.
          How? My original statement was that the same game running at 60fps would be better to play than the same game at 30fps. I see nothing but evidence to back that statement up.

          As well as the more responsive input, how about more control over your character? You're thinking, right now, purely in terms of say shooting a gun - you're thinking "What's the difference if I fire with a 66ms delay compared to a 100/133ms delay?". It's not just when you PRESS a button, the same factors come into it when you RELEASE it.

          I know it's the internet so it's very easy to dismiss things and be, "Oh it doesn't matter!" but as presumably dedicated gamers (at a guess, most people on this forum), I can't see how you can readily dismiss the importance of game control. The argument that keeps coming up when discussing games is, "It doesn't matter how good it looks if it plays like ****". This discussion is about the game control, one of the most important factors in any game.

          I see people refusing to believe it and living quite happily in 30fps land, and more power to them. If it's not of concern to you, then great. As dedicated gamers, I'd have thought this would be high up on people's agenda. Dismissing something which is of obvious concern to the developers we all praise (in this example, IW and Criterion) is bizarre.

          I'm going to pull an old quote from you, back when I first posted about input lag:

          Originally posted by Brats
          An input lag of 1/10th of a second is massive. I'm very sensitive to lag and I have to adjust my amp to the nearest millisecond to get the picture and sound in sync. I don't believe all 30fps games have an input delay of 1/10th of a second. They'd be unplayable.
          You've said 100ms is massive, and are now dismissing it? Maybe you aren't as sensitive to it as you believe. We're all accustomed to it.

          As you can see, all 30fps games do have a delay of 1/10th of a second. Unplayable? No. But could it be better? Yes, of course. And that, after all, was my original point.



            Originally posted by Chain View Post
            If you don't care about lag, then ultimately it's you as a gamer losing out. Growing accustomed and accepting a limiting factor of game control doesn't mean we should ignore the problem and not try to improve it.
            I don't get your logic here. How am I 'losing out' to a factor that I can't perceive? I agree I'd be losing out if all games were like Heavenly Sword, but that game is the exception.

            Accepting LCD TV latency isn't ideal, and once OLEDs come out we'll all say, "Wow, it makes SUCH a difference!".
            Really? If so, why didn't we all say "Wow, I can see SUCH a difference" in a negative way when we moved from non-laggy displays (CRTs) to laggy ones (LCDs)? We all know that something getting worse is more noticeable than something getting better (like going back to SD after being used to HD).

            The fact that we can still buy CRTs renders this assumption somewhat moot. In fact, I had a 24" CRT monitor until a couple of years ago and I replaced it with an LCD. Can't say I (like millions of other people) noticed any difference in lag. I'm not denying it was there, only that the difference was imperceptible.

            As I say, if the top devs are working on this, it DOES matter. Believe me, going triple buffer gives you some great framerate advantages [stability], yet they're working with a double buffer instead [a quick mention of the small amount of memory saved by going double instead of triple, just in case anyone puts that forward as the reason].
            Just because they're working on it doesn't prove your point. I'm sure you're familiar with the LOTR films. Have you seen the part in the making-of documentaries where the armourer shows that they engraved the inside of Theoden's armour? That attention to detail has absolutely zero effect on the quality of the final film, but it is a symptom of what happens when a team is utterly committed to its cause.

            I love the fact that IW and Criterion take this thing to the ultimate level. I don't think it's necessary for all developers, though. I think it's more a symptom of both these developers' attention to detail than because it's such an important factor.

            How? My original statement was that the same game running at 60fps would be better to play than the same game at 30fps. I see nothing but evidence to back that statement up.
            Because the difference is so small between the best titles and because there is evidence that until it reaches a much larger number, lag is imperceptible. There's no hard proof of that, but it is a strong possibility.

            I know it's the internet so it's very easy to dismiss things and be, "Oh it doesn't matter!" but as presumably dedicated gamers (at a guess, most people on this forum), I can't see how you can readily dismiss the importance of game control. The argument that keeps coming up when discussing games is, "It doesn't matter how good it looks if it plays like ****". This discussion is about the game control, one of the most important factors in any game.
            Let's be perfectly clear here - I've never dismissed the importance of game control. Reading the reviews of Scribblenauts breaks my heart because it seems that a potentially fantastic game is let down by poor control. But game control is a very wide subject, of which lag is but a small part.

            As I said before, lag is important to a degree. Like a lot of things, though, it reaches a point of diminishing returns. I'm glad developers are taking it seriously so that we don't end up with more 300ms delays in games, but whether they are able to shave an extra whisker off a game that already displays very little lag... sorry, but there are much more important areas in game design imo.

            I see people refusing to believe it and living quite happily in 30fps land, and more power to them. If it's not of concern to you, then great. As dedicated gamers, I'd have thought this would be high up on people's agenda. Dismissing something which is of obvious concern to the developers we all praise (in this example, IW and Criterion) is bizarre.
            Dismissing anything that doesn't directly affect you is not bizarre at all. For most people, this is just talk on paper. In reality they see no difference. If I never played games online, I wouldn't give a stuff about developers trying to improve net code either. If you think about it, there's nothing remotely bizarre about that at all.

            You've said 100ms is massive, and are now dismissing it? Maybe you aren't as sensitive to it as you believe. We're all accustomed to it.

            As you can see, all 30fps games do have a delay of 1/10th of a second. Unplayable? No. But could it be better? Yes, of course. And that, after all, was my original point.
            You've got me there, I admit I was originally wrong. All games have lag, however the difference in lag between the best 60fps and 30fps titles is much less than first discussed.

            I am very susceptible to lag between audio and video. I have to have that set up just right. I don't know why controller lag doesn't affect me. Maybe it's because the Neversoft guy is right that anything under 166ms is imperceptible? Given the lack of furore when we all added lag to our displays, this argument holds some value. If it is true, then once lag reaches 100ms, any efforts to get it down even lower are not going to affect how good the game is to play in the real world, but more power to the guys who want to do it anyway.



              The difference isn't much less than first discussed. In many instances, it is double. Which was my original point, that double the frame rate = half the input delay. Plus more frames = more control.

              I know many console gamers are on the whole quite happy with 30fps. Next gen, when we start seeing more 60fps titles, maybe they'll see the difference it can make. Even if you don't see the difference, that's not to say it isn't better.

              Some people in this thread have dismissed mathematical statistics. When I posted the first article link, it was dismissed as being either too inaccurate, or in many cases being flat out wrong. The author was called an amateur. Now there are proper testing conditions, the argument has moved from, "It doesn't exist you're making it up!" to, "I don't notice it so it's not important!" The argument is actually, "I don't notice it / refuse to acknowledge it, therefore it is not important to me!" Which is entirely fine. If it's not important to you, then great.

              I think unless you've done the PC test I proposed many moons ago though, you may not be in the best place to say it doesn't matter. If you play the same game at 30fps and then at 60fps, then back to 30fps, and you genuinely, 100% cannot tell the difference, then fine. Many of us can feel and see the difference, and appreciate the devs that do strive for 60fps.
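
              For anyone who does want to try that test, here's a bare-bones sketch (Python with pygame; the window size, toggle key and movement speed are my own arbitrary choices for illustration, not anything from the article):

              ```python
              # Rough A/B test: toggle the framerate cap between 30 and 60 with SPACE
              # and steer a square with the arrow keys. Purely illustrative.
              import pygame

              pygame.init()
              screen = pygame.display.set_mode((640, 480))
              clock = pygame.time.Clock()
              fps = 60

              x = 300.0
              running = True
              while running:
                  for event in pygame.event.get():
                      if event.type == pygame.QUIT:
                          running = False
                      elif event.type == pygame.KEYDOWN and event.key == pygame.K_SPACE:
                          fps = 30 if fps == 60 else 60  # flip the cap

                  keys = pygame.key.get_pressed()
                  # 300 pixels/second regardless of cap, so only responsiveness differs
                  x += (keys[pygame.K_RIGHT] - keys[pygame.K_LEFT]) * 300.0 / fps

                  screen.fill((0, 0, 0))
                  pygame.draw.rect(screen, (255, 255, 255), (int(x), 220, 40, 40))
                  pygame.display.flip()
                  clock.tick(fps)  # cap the framerate

              pygame.quit()
              ```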

              I'm a stats kinda guy, I appreciate analytical data and what the DF guys are looking to do - further gaming knowledge. Their framerate vids are great, and they break things down nice and simply for the non-tech gamer.



                Originally posted by Chain View Post
                The difference isn't much less than first discussed. In many instances, it is double. Which was my original point, that double the frame rate = half the input delay. Plus more frames = more control.
                But in some cases it's not double - Halo 3 vs COD4, for example. Of course there will be games where lag is less important (like GTA, for instance) and for comparison purposes these games should be disregarded. As far as we stand today, though, and based on the information available, the obtainable minimum lag for 30fps is not double that of 60fps.

                Some people in this thread have dismissed mathematical statistics. When I posted the first article link, it was dismissed as being either too inaccurate, or in many cases being flat out wrong. The author was called an amateur. Now there are proper testing conditions, the argument has moved from, "It doesn't exist you're making it up!" to, "I don't notice it so it's not important!" The argument is actually, "I don't notice it / refuse to acknowledge it, therefore it is not important to me!" Which is entirely fine. If it's not important to you, then great.
                To be fair, the statistics in the first article were questionable, as the testing conditions were ropey to say the least. Now we have a more valid set of data, the conclusion isn't quite as clear cut as you first made out. Yes, lag is less at 60fps, but whether that difference makes a real change to gameplay is open to debate (and not fact, as you previously stated).

                I agree the argument has moved, but that doesn't make your point any more or less valid.

                I think unless you've done the PC test I proposed many moons ago though, you may not be in the best place to say it doesn't matter. If you play the same game at 30fps and then at 60fps, then back to 30fps, and you genuinely, 100% cannot tell the difference, then fine. Many of us can feel and see the difference, and appreciate the devs that do strive for 60fps.
                I'm not a PC gamer, but a very similar test exists on consoles too. As that article states, if you turn v-sync off in Bioshock, the input lag halves. I have tested this (not specifically for input lag, but as a general test), and I put v-sync back on immediately. I didn't perceive any benefit (in control or otherwise) and the tearing was a negative, so back on it went.

                I'm not massively bothered by tearing, so you'd think that if the lower input lag was such a benefit, I'd have stuck with v-sync off.

                I still don't get the TV thing. If lag is a problem, why wasn't there more outcry when we bought LCDs?



                  So you didn't appreciate the reduction in input delay. Then fine, that's your opinion and your experience. Funny how we've gone from "I'd notice 100ms immediately!" to "I didn't perceive any benefit."

                  I know little about LCDs. From what I understand, there is some delay from picture processing, which can be turned off on some TVs. I'm guessing there is still a little delay even with it off, but I do not know. The DF article implies the TV is adding 1 frame (33ms) to the latency figure, which I'm guessing has then been subtracted for the test. I do not know. OLEDs have "fast response time" but, thinking about it, that's almost certainly a quick screen response time, not input->output (i.e. less smearing).

                  If it was that much of an issue, either DF are ignoring it (unlikely) or they will have an article about it in the future. They're taking the tech side of things very seriously and aren't missing anything, so we'll have to see where they go with it.

                  If you read the article, it is actually saying that beyond 166ms is where players notice input latency; lowering it gives players a better "feel" even if they can't define it.

                  Anyway, until someone puts forward an article about LCD latency, there's nothing more I can add to this current line of discussion. People can believe what they want; I know the numbers, and that's enough for me.



                    The best LCDs you're going to get will add around 33ms, regardless of whether or not you've got a gaming mode, as far as I know. I spent ages looking at TVs and researching until I found one I was happy with. That was a year ago, so it could be that there's one with no delay like a CRT, but I doubt it. Most "decent" (branded?) LCDs are/were in the 33-66ms range. I don't know if PC monitors are different; I'd imagine they might be, as they'll do less/no processing.



                      Let's revive this one.



                      "It means that framerate is still important to us here at Insomniac, but it?s not on the same pedestal it was before. And that Ratchet and Clank Future: A Crack in Time will probably be Insomniac?s last 60fps game."



                        Framerate is important to me. I don't play the GTA games because the framerate isn't good enough, although I did play Vice City because I loved the '80s style, even though I was bothered by the framerate throughout.

                        For me, games released today should be 60fps with minimal to no slowdown. If the framerate starts to drop while a game is in development, the developers should either reduce the graphics quality or the size of the game.

                        I wouldn't put up with a racing game or a footy game with a framerate lower than 60fps.

                        Although when it comes to RPGs, if the story, characters, graphics and gameplay are good, then I would accept a smooth 30fps.

                        The black borders that us PAL gamers used to get on games used to infuriate me, but thankfully those have pretty much gone from games.



                          Interesting comments from Insomniac. I like this one -

                          Framerate should be as consistent as possible and should never interfere with the game. However, a drop in framerate is interestingly seen by some players as a reward for creating or forcing a complex setup in which a lot of things must happen on the screen at once. As in, “Damn! Did you see that? That was crazy!”

                          That's really fascinating, and quite believable. If the game provides some kind of visual evidence that the hardware is being pushed (and thus, the framerate being compromised), people feel like there has been real effort made with the game. Would they feel differently if it was a rock-solid 60fps?



                            @Shakey. Yes, I think this is even forced in some games. Like Street Fighter finishers? I do wonder if some shmups really need to slow down during boss battles or if the effect is added in!

                            Personally I detest a framerate drop or screen tearing. I'd rather the whole game ran at 30 than 90% of it at 60. Skate on PS3 used to drop frames at the worst imaginable times. Game-breakingly bad!

                            This discussion is quite apt right now, with the recent introduction of 3D gaming.



                              Earth Defense Force certainly gets my approval for misappropriation of framerate. It is so over the top ridiculous that you can't help but laugh at it while you try and bring it to its knees.

                              Good times!



                                What a weird thing to say (Insomniac Games). Some games work great at 60, others work great at 30.

                                I look forward to seeing their next game, which will be so mind-meltingly jizz-worthy in the graphics department that no one ever bothers with 60 again. Yeah, right.

