The inherent motion blur of chemically photographed film is supposed to cover up things like that, but some people can see it.
A discussion on Framerates
-
My problem is I was shown it by my mate during a film. I think it was that Iron Man desk shot.
I doubt I would've noticed it as much (or at least not paid so much attention to it) if he hadn't, as it's always been there, and I've been going to the cinema quite regularly for years. 2008 has not been kind to my eyes
Ok, it's not that bad, but still
-
Dazzyman
In the cinema I can be really affected by 24p: I threw up during the opening fight scene when I went to see Gladiator because of it. At home, though, I can hardly tell 24p Blu-ray from pulldown. I got the chance to swap my TV with my parents' one, as theirs has 24p built in, but I was unimpressed as they both judder anyway, so I stuck with the extra 6".
The judder in Rambo IV (aka the new one) is a deliberately added effect, meant to suggest seeing through Rambo's eyes.
-
Probably worth posting this article here:
Deals with some of what we were discussing last year - namely input latency. Obviously this relates directly to framerate.
No, I'm not looking to get into the same discussion as last year, when evidence was ignored or dismissed - the same evidence used by Digital Foundry and discussed by Criterion in this article:
Every Saturday, Digital Foundry takes over the Eurogamer homepage, offering up a mixture of technical retrospectives, p…
Feel it's worth linking these here so when the inevitable happens and this thread springs back to life, there's some more technical knowledge rather than opinions.
-
Based purely on the argument of input lag, I'm still not convinced by the argument that 60fps makes much (if any) difference to how the game plays (as opposed to how nice it looks). Both those articles state that there are greater factors that have an impact on lag, such as the display type.
If lag is such a big issue, we should all be campaigning for displays with less lag before worrying about frame rates (or refusing to upgrade from CRTs), but there's nary a whisper on that subject. That seems to suggest that most people can't detect the lag we already get, so I can't see how a small improvement to that from improving the frame rate is going to be noticed.
Maybe there's something else that 60fps makes better, but the small amount of improvement that a better frame rate gives to input lag seems like something only someone playing at very competitive level would worry about.
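To put some rough numbers on that argument: as a back-of-envelope sketch (all figures here are illustrative assumptions, not measurements - in particular the ~4 frames of game pipeline latency and the 50ms of TV processing are made up for the comparison), you can see how display processing can dwarf the saving from a higher frame rate:

```python
# Back-of-envelope: how much of total input lag comes from the display
# versus the game's frame rate. All figures are illustrative assumptions.

def total_lag_ms(fps, pipeline_frames, display_lag_ms):
    """End-to-end lag: game pipeline (frames * frame time) + display processing."""
    frame_time_ms = 1000.0 / fps
    return pipeline_frames * frame_time_ms + display_lag_ms

# Assume ~4 frames of internal game latency and 50ms of TV post-processing.
print(round(total_lag_ms(30, 4, 50)))  # 30fps total
print(round(total_lag_ms(60, 4, 50)))  # 60fps total
```

On those made-up assumptions the 60fps game saves about 66ms of pipeline lag, while the TV's processing alone adds a fixed 50ms on top of both - which is the point being made: display lag is a comparably large factor.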
-
Interesting articles. My conclusion is:
For 60fps games e.g. SFIV, COD, Burnout - set your TV to game mode or switch off all the fancy processing in order to get the best reaction time.
For 30fps you may as well let your TV do as much fancy post processing as you like because the frame lag is such a major contributing factor.
I wonder if I totally missed the point?
-
Not sure about that; I'd always turn off as much TV processing as possible.
But yeah, at 30fps, depending on the game, as long as your TV isn't adding much lag itself, you may not notice any difference leaving it on.
I understand OLEDs are going to have amazing refresh rates, but then with the image processing it'll **** it all up anyway. My Samsung has a Game Mode, which makes a bit of a difference. Of course, if we were all competitive gamers, we'd be on CRTs still
-
Originally posted by anephric:
"A lot of gamers moan about input lag and LCDs, actually, and the fact that most reviewers don't test it. Part of the reason I went with plasma. You can certainly 'feel' input lag in games like Rock Band (and then you get people moaning that its input lag calibration is off too)."
I think some gamers moan about input lag, but mainly it seems to be a fairly small subject.
-
Originally posted by anephric:
"I remember reading a feature a while ago on framerates and input lag in Heavenly Sword (and how terrible that game actually is for it) but I can't remember where I saw it now."
The majority of Japanese action games are 60fps because the designers want their games to feel responsive and smooth. Most western devs just don't have the same respect for the 'feel' of a game. They aim their games at the mass market, who don't appreciate finesse.
Sadly, a lot of serious gamers have forsaken their standards, to a point where they will defend 30fps, judder, clunkiness etc...
30fps shouldn't exist in the professional games market. It's ridiculous.
-
DF / EG have another article up about input issues. It sums up what I wrote a while back, and confirms that around 66ms is about the best a 60fps title can get, versus 133ms for 30fps (though a 30fps game can hit 100ms). The numbers very clearly show the difference 60fps makes to control, and thus that 60fps would enhance the feel of a game compared to the identical game running at 30fps.
The game is unresponsive. It's laggy. The joypad acts in a merely advisory manner. The control is rubbish. Game reviewe…
It's very telling that the top devs are trying to minimise it.
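Those figures fall out of simple frame-time arithmetic. As a sketch (the assumption here, not stated in the post, is roughly four frames of internal pipeline latency for the best case):

```python
# Sketch of where the quoted best-case figures come from:
# latency ~= frames_of_pipeline_latency * frame_time.

def best_case_latency_ms(fps, pipeline_frames=4):
    """Estimated input latency given frame rate and frames of pipeline delay."""
    return pipeline_frames * (1000.0 / fps)

print(round(best_case_latency_ms(60)))     # 67 - close to the 66ms quoted
print(round(best_case_latency_ms(30)))     # 133
print(round(best_case_latency_ms(30, 3)))  # 100 - the "30fps can hit 100ms" case
```

The 100ms figure for 30fps corresponds to trimming the pipeline to about three frames of latency, which is presumably what the better-optimised 30fps titles manage.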
This bit made me laugh...
In-game latency, or the level of response in our controls, is one of the most crucial elements in game-making, not just in the here and now, but for the future too. It's fair to say that players today have become conditioned to what the truly hardcore PC gamers would consider to be almost unacceptably high levels of latency to the point where cloud gaming services such as OnLive and Gaikai rely heavily upon it.