Resolutiongate (resolution discussion)

    Funnily enough, motion blur tends to be more common with 30fps games because it is used to disguise the lower (or inconsistent) framerates. I hate it as well, but it's rarely used with 60fps games.

    A good demonstration of the difference between 30fps and 60fps is comparing WipEout HD with WipEout 2097. We remember 2097 as being extremely smooth, but play the game after playing WipEout HD and the difference is plain to see. Or on the Saturn, comparing Virtua Fighter 2 (60fps) with Fighters Megamix (30fps).

    And perhaps most importantly, 30fps carries double the input lag of 60fps - roughly 33ms per frame vs. 17ms.
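    As a rough illustration of where those figures come from (a throwaway Python sketch, assuming input is sampled once per rendered frame and ignoring everything else in the pipeline):

    ```python
    # Minimum per-frame delay implied by the frame rate alone; real pipelines
    # add engine, driver, and display latency on top of this.
    def frame_time_ms(fps: float) -> float:
        return 1000.0 / fps

    for fps in (30, 60):
        print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")

    # 30 fps -> 33.3 ms per frame
    # 60 fps -> 16.7 ms per frame
    ```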



      This was fun to try:
      Blur Busters UFO Motion Tests with ghosting test, 30fps vs 60fps vs 120hz vs 144hz vs 240hz, PWM test, motion blur test, judder test, benchmarks, and more.


      If I stare at the screen I can't see the difference between 30 vs 60. But if I stare into the distance, as in trying to focus my eyes on a point beyond my monitor, then I can tell the difference. You know how you can see things out of the corner of your eye without looking directly? That's easier to do with the 60fps one.

      But only with the stars. Disable those and there's no difference at all for me.
      Last edited by Sketcz; 02-03-2014, 14:25.



        That's probably because your eyes are better at dealing with motion in your peripheral vision, but also because the resolution of your screen is so low that they can only move 25% of the screen with each frame...


        I suspect there is actually something wrong with you, or with the way you have everything set up, if you cannot see the difference in this:

        Blur Busters UFO Motion Tests with ghosting test, 30fps vs 60fps vs 120hz vs 144hz vs 240hz, PWM test, motion blur test, judder test, benchmarks, and more.



          Never seen that site before; it's a simple but effective test, and as EB says the difference between all the comparisons is pretty evident. To me, anyway.



            So 60 FPS vs 30 FPS will actually help the clarity of the image?



              On moving images, yes. Which is why rapidly panning shots in the cinema make me feel a bit sick (I think).
              Last edited by charlesr; 02-03-2014, 16:48.



                Originally posted by GluedOnBeard View Post
                So 60 FPS vs 30 FPS will actually help the clarity of the image?
                Yes, but only because of the way an LCD holds each frame on screen until the next one arrives. One of those tests lets you insert a black frame between each 30fps frame, and that makes it much better.

                In cinema it's because each frame is exposed for longer, so more blur is captured within each frame. This is why the high-frame-rate Hobbit films look much clearer and crisper everywhere on screen in the cinema.
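                A loose sketch of what that black-frame-insertion option is doing, assuming a 60Hz display showing 30fps content (illustrative Python, not any real display API): the display either holds each frame for two refreshes, which smears motion, or shows it once and blanks the second refresh.

                ```python
                # Illustrative only: two ways a 60 Hz display can present 30 fps content.
                # Holding each frame for two refreshes is sample-and-hold (smeary);
                # blanking the second refresh is black frame insertion (clearer, but dimmer).
                def refresh_sequence(frames, bfi):
                    out = []
                    for frame in frames:
                        out.append(frame)                      # first 60 Hz refresh
                        out.append("BLACK" if bfi else frame)  # second refresh
                    return out

                frames_30fps = ["A", "B", "C"]
                print(refresh_sequence(frames_30fps, bfi=False))  # ['A', 'A', 'B', 'B', 'C', 'C']
                print(refresh_sequence(frames_30fps, bfi=True))   # ['A', 'BLACK', 'B', 'BLACK', 'C', 'BLACK']
                ```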



                  Try Sonic on Megadrive at 30fps. Then you will see!!



                    When devs have the option of 720 @ 60 or 1080 @ 30 I don't understand why they ever choose 30fps. The detail gained from 1080 is at least partially lost once the image starts moving (which in most games is all the time). I still think that in many ways COD4 on last gen was one of the best looking games despite it having (I believe?) less than 720p resolution simply because the 60fps meant that I got to appreciate the detail that was there all the time.
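                    For what it's worth, the raw pixel counts are in the same ballpark either way; a quick back-of-the-envelope sum (illustrative Python, counting nothing but resolution times frame rate) suggests neither option is obviously cheaper on fill rate alone:

                    ```python
                    # Pixels output per second for the two options; real cost also depends on
                    # shading, geometry, and CPU work, so this is only a rough comparison.
                    options = {
                        "720p @ 60fps": (1280, 720, 60),
                        "1080p @ 30fps": (1920, 1080, 30),
                    }
                    for name, (w, h, fps) in options.items():
                        print(f"{name}: {w * h * fps / 1e6:.1f} million pixels per second")

                    # 720p @ 60fps: 55.3 million pixels per second
                    # 1080p @ 30fps: 62.2 million pixels per second
                    ```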



                      Originally posted by Brad View Post
                      When devs have the option of 720 @ 60 or 1080 @ 30 I don't understand why they ever choose 30fps.
                      Probably to avoid stressing available resources too much? With many games still spanning PS3, X360, PS4, and XB1, the middleware involved might not leave enough headroom, leading developers to choose what is probably the less resource-intensive option (higher resolution, but half as many frames for the GPU to render each second).



                        Oh sure, I meant if it were a case of simply choosing one or the other, not taking effort into account. I know things are different on PC, but basically if you have a lower-end GPU you can choose to drop the resolution in order to hit 60fps, which I would do every single time if I had to.



                          The more a game demands quick reflexes and hand-eye coordination, the closer to 60fps I want it to be.



                            How do the CPUs in these systems compare to the last gen?

                            From reading about them it seems like they were intended for low-end laptops and tablets and are not well suited to games. Could this impact things like AI and physics? Maybe they're not the same or comparable, though.



                              ^It's easy to misinterpret that fact, but it's not as damning as it might seem. Laptop CPUs are not necessarily low end; they tend to be focused on low power consumption and low heat output. That means high-end desktop CPUs will always be more powerful than the best laptop processor, but you're not going to get consoles using such high-end components anyway. For consoles, laptop-class CPUs are a good fit because they remain powerful while respecting the power and heat constraints that matter in a console.

                              The processors in the next generation consoles leave the previous gen consoles for dead, and are much more suitable for gaming than the Cell processor was.

                              Originally posted by Brad View Post
                              When devs have the option of 720 @ 60 or 1080 @ 30 I don't understand why they ever choose 30fps. The detail gained from 1080 is at least partially lost once the image starts moving (which in most games is all the time). I still think that in many ways COD4 on last gen was one of the best looking games despite it having (I believe?) less than 720p resolution simply because the 60fps meant that I got to appreciate the detail that was there all the time.
                              The Digital Foundry article explains this quite well, but dropping the resolution isn't necessarily going to result in a significant increase in framerate because fill rate might not be where the bottleneck is. I agree that framerate should take priority though (within reason).
                              Last edited by sj33; 03-03-2014, 04:20.



                                Originally posted by Sketcz View Post
                                This was fun to try:
                                Blur Busters UFO Motion Tests with ghosting test, 30fps vs 60fps vs 120hz vs 144hz vs 240hz, PWM test, motion blur test, judder test, benchmarks, and more.


                                If I stare at the screen I can't see the difference between 30 vs 60. But if I stare into the distance, as in trying to focus my eyes on a point beyond my monitor, then I can tell the difference. You know how you can see things out of the corner of your eye without looking directly? That's easier to do with the 60fps one.

                                But only with the stars. Disable those and there's no difference at all for me.
                                What about if you reduce the speed to 120 pixels per second? Isn't the difference between the UFO judder and the stars blurring really obvious?
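                                For reference, the per-frame step at a given scroll speed is just speed divided by frame rate; a quick sketch (plain Python, using the 120 pixels per second figure above) shows why the lower rate judders more:

                                ```python
                                # Bigger jumps per frame are what read as judder at lower frame rates.
                                speed_px_per_s = 120  # the speed suggested above
                                for fps in (30, 60, 120):
                                    print(f"{fps} fps -> {speed_px_per_s / fps:.1f} px per frame")

                                # 30 fps -> 4.0 px per frame
                                # 60 fps -> 2.0 px per frame
                                # 120 fps -> 1.0 px per frame
                                ```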
