Originally posted by andrewfee
Some sort of "interlaced" mode. This is just an idea I had, so I don't know whether, or how well, it would work. As things stand, broadcast television runs at 50/60Hz. On a CRT, even though each update is only a half-frame (a field), you see 60 updates per second on the screen. On an LCD, the image is de-interlaced and shown at 30fps. While that is technically the same number of full frames, they are delivered differently, and as a result your eye perceives the interlaced version as having smoother motion, because there are more updates per second.
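For reference, here is a minimal sketch of what "de-interlaced and shown at 30fps" means in the simplest (weave) case. This is my own illustration in Python/NumPy, not any particular TV's implementation: two consecutive half-height fields are interleaved row by row into one full frame, so 60 fields per second collapse into 30 frames per second.

```python
import numpy as np

def weave_fields(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Weave two half-height fields into one full-height frame.

    The top field supplies rows 0, 2, 4, ... and the bottom field
    supplies rows 1, 3, 5, ... Two fields (1/60 s apart) become one
    frame, which is why the LCD ends up showing 30 frames per second.
    """
    height = top_field.shape[0] * 2
    frame = np.empty((height,) + top_field.shape[1:], dtype=top_field.dtype)
    frame[0::2] = top_field     # even rows from the top field
    frame[1::2] = bottom_field  # odd rows from the bottom field
    return frame
```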
I have two ideas on how this might be solved, although perhaps they have already been tried and tested and don't work. The first is simply to show interlaced images as they are and black out the "gaps." Unfortunately, since the image is then upscaled, I can see this producing rather large scanlines that would be visible from quite a distance. My second idea is to first properly de-interlace the image, scale it up to the panel's native resolution, and then split the scaled image again, but this time each "scanline" would be a single row of panel pixels, fine enough to be invisible. That way you get 60 distinct updates per second and should perceive motion as much smoother than with the current 30 updates per second, which results in "blurring" (although it is the way you perceive the image that makes it blur; the image itself isn't actually blurred).
Unfortunately, if the image is not properly de-interlaced in the first place, this could cause some artefacting, but the advantages may outweigh any disadvantages. Of course, there would be an on/off option if you don't like it (or an option to choose which type of "interlacing" is used).
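To make the second idea concrete, here's a minimal sketch (again my own illustration in Python/NumPy; the presentation helpers in the comments are hypothetical, not a real API): a de-interlaced, panel-resolution frame is split into two sub-frames of alternating pixel rows, which would then be shown on consecutive 60Hz refreshes, so the black "scanlines" are only one panel pixel tall.

```python
import numpy as np

def split_into_field_frames(frame: np.ndarray):
    """Split one panel-resolution frame into two alternating-row sub-frames.

    `even` keeps rows 0, 2, 4, ... and blacks out the rest; `odd` keeps
    rows 1, 3, 5, ... Showing them on successive refreshes yields 60
    visible updates per second from 30 de-interlaced frames per second.
    """
    even = np.zeros_like(frame)
    odd = np.zeros_like(frame)
    even[0::2] = frame[0::2]  # visible rows 0, 2, 4, ...; others stay black
    odd[1::2] = frame[1::2]   # visible rows 1, 3, 5, ...; others stay black
    return even, odd

# Hypothetical presentation loop (scale_to_native and present are
# assumed helpers, not anything a real scaler exposes):
# for frame in deinterlaced_frames:           # 30 full frames/second
#     scaled = scale_to_native(frame)
#     even, odd = split_into_field_frames(scaled)
#     present(even)                           # refresh n   (1/60 s)
#     present(odd)                            # refresh n+1 (1/60 s)
```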
I'd love this, in the same way that I always use emulators with scanlines on.