Random LCD question.


    Do all current 1366 x 768 HD TVs support a 1080i mode?

    Are they all capable of scaling to said resolution?

    In particular, looking at the LE32A457

    Thanks

    #2
    Yes, it's part of the HD Ready standard. Is there a particular reason you need it? 720p is the better resolution to use.

    Comment


      #3
      I have a Sammy LE32 with that resolution and yes, it does do 1080i. But like Boris says, you'll get much better picture quality by using 720p, especially in fast / 60fps games.

      Better yet, hook your machinery up by VGA for lovely 1:1 resolution quality.

      Comment


        #4
        Originally posted by MattyD View Post
        I have a Sammy LE32 with that resolution and yes, it does do 1080i. But like Boris says, you'll get much better picture quality by using 720p, especially in fast / 60fps games.

        Better yet, hook your machinery up by VGA for lovely 1:1 resolution quality
        Oh I see. So how would that work? Wouldn't that depend on the native res of the TV? Wouldn't it have to be a 720p set (which are nearly non-existent) rather than a 768p one...?

        Comment


          #5
          If you use a 720p signal it will get upscaled very slightly by the TV. It doesn't introduce any lag that I've noticed, but you do get a very slight reduction in image quality, mostly in the form of some aliasing on sharp angles and small text. You can reduce this to a bare minimum if you use HDMI, because Sammys have a Just Scan option, which reduces the size disparity between the displayed and actual image.

          If you use VGA on the other hand, you can (on 360 at least) set the machine to output at the TV's native resolution. Any scaling work gets done by the console (which is always better than the TV) and since the output matches the physical resolution of the TV, it maps 1 image pixel to 1 physical pixel so you get the best possible image quality.
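          The stretch the two posts above describe is easy to put numbers on. This is my own toy sketch (not from the thread), just to show why a 720p signal on a 1366x768 panel gets a slight non-integer upscale while a native-resolution VGA feed maps 1:1:

          ```python
          def scale_factor(src, dst):
              """Per-axis stretch ratio from source to destination resolution."""
              return (dst[0] / src[0], dst[1] / src[1])

          hd_720p = (1280, 720)
          panel = (1366, 768)  # typical "HD Ready" LCD panel

          sx, sy = scale_factor(hd_720p, panel)
          print(f"720p -> panel: {sx:.3f}x horizontal, {sy:.3f}x vertical")
          # Roughly a 6-7% non-integer stretch on each axis; interpolating
          # pixels by that awkward amount is what softens edges and small text.

          print(scale_factor(panel, panel))  # (1.0, 1.0)
          # A native-res feed needs no scaling: 1 image pixel per physical pixel.
          ```

          The same arithmetic shows why Just Scan helps: with overscan disabled, the only scaling left is that small 720-to-768 stretch rather than a crop-and-zoom on top of it.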

          As for the 1080i thing, this is a controversial topic which I think comes down to personal preference more than anything else, but I prefer a progressive image over interlaced any day of the week.

          Comment


            #6
            Originally posted by MattyD View Post
            As for the 1080i thing, this is a controversial topic which I think comes down to personal preference more than anything else, but I prefer a progressive image over interlaced any day of the week.
            Isn't 1080i input useful for receiving HD broadcasting channels?

            Comment


              #7
              360 can do 1360x768 over HDMI, that's what I had mine set to (changed it to 720p due to tearing on Capcom games though).

              Comment


                #8
                Originally posted by honeymustard View Post
                360 can do 1360x768 over HDMI, that's what I had mine set to (changed it to 720p due to tearing on Capcom games though).
                Are you sure? Do you need some super-duper cable for that? I'm pretty sure mine only gives me the option for 720p / 1080i when I use HDMI. Or maybe my TV just can't accept any other resolution over HDMI, I'm not sure.

                Originally posted by Cornflakes View Post
                Isn't 1080i input useful for receiving HD broadcasting channels?
                I've never used HD television so I'm not sure, but don't they always transmit in 720p / 1080p anyway? I think the only 1080i broadcasts were on some US / Japanese cable channels when HD was in its infancy and we didn't even have HDTV sets in this country.

                Comment


                  #9
                  Originally posted by MattyD View Post
                  If you use a 720p signal it will get upscaled very slightly by the TV. It doesn't introduce any lag that I've noticed but you do get a very slight reduction in image quality, mostly in the form of some aliasing on sharp angles and small text. You can reduce this to a bare minimum if you use HDMI because Sammy's have a Just Scan option, which reduces the size disparity between the displayed and actual image.

                  If you use VGA on the other hand, you can (on 360 at least) set the machine to output at the TV's native resolution. Any scaling work gets done by the console (which is always better than the TV) and since the output matches the physical resolution of the TV, it maps 1 image pixel to 1 physical pixel so you get the best possible image quality.

                  As for the 1080i thing, this is a controversial topic which I think comes down to personal preference more than anything else, but I prefer a progressive image over interlaced any day of the week.
                  I see... I should've known that, 'cos I've got my 360 hooked up to my three-year-old 17" monitor at 1024 x 768 to match the monitor's native res, and it looks lovely and sharp!

                  Once I get an HD TV, I'll probably get an HDMI cable as well and compare the two.

                  Would VGA also support native 1920x1080 (1080p)?

                  Also, regarding the 1080i issue, it's not actually displayed interlaced, is it? I'm sure I've seen this on wiki before; there are no LCDs or plasmas capable of directly scanning an interlaced image... In order for an interlaced image to be displayed, the set needs to de-interlace it, combining the two fields into full frames (often causing artifacts like combing or tearing).

                  Whereas, 720p is purely progressive with minimal artifacts and more clarity.

                  So why the feck do they call it 1080i? They should rename it 1080d!
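                  The de-interlacing described above can be sketched in a few lines. This is my own illustration (not from the thread) of "weave", the simplest method a fixed-pixel display can use to turn two interlaced fields into one progressive frame:

                  ```python
                  def weave(top_field, bottom_field):
                      """Interleave two fields (lists of scanlines) into one frame.

                      Fine for static images; anything moving produces the combing
                      artifacts mentioned above, because the two fields were
                      captured at different moments in time.
                      """
                      frame = []
                      for top_line, bottom_line in zip(top_field, bottom_field):
                          frame.append(top_line)     # odd scanline, from field 1
                          frame.append(bottom_line)  # even scanline, from field 2
                      return frame

                  # A 1080i signal carries 540-line fields; weaving yields 1080 lines.
                  top = [f"line {2 * i}" for i in range(540)]
                  bottom = [f"line {2 * i + 1}" for i in range(540)]
                  print(len(weave(top, bottom)))  # 1080
                  ```

                  Real TVs use fancier motion-adaptive methods, but the point stands: the panel itself only ever draws progressive frames.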

                  Comment


                    #10
                    You're exactly right on that, which is why I never use 1080i.

                    I believe you can output 1080p over VGA, but bear in mind that hardly any games support it natively. I think it's five or six, and they're mostly sports games. Having said that, a picture that's been upscaled to 1080p by the Xbox will always look and perform better, if that's your TV's native res, than using a lower output res and letting the TV handle it.

                    1080i is basically obsolete now. I think the only reason it's included is for legacy compatibility, as back in the days before LCD and plasma tech was adequate and affordable for consumers, most HD sets were CRT or front / rear projection. CRTs handle interlaced formats quite well, of course, and a tiny handful of broadcasts were made using it on US cable channels.

                    Also, the certification standards for HD-DVD stated that you weren't allowed to let component users output 1080p, so 1080i was the next best available resolution. This is obviously a moot point now that HD-DVD is dead.

                    Comment


                      #11
                      So why the feck do they call it 1080i? They should rename it 1080d!
                      They specified all of this HD stuff sometime in the 90s (or even the 1980s, not sure), when interlaced CRT was very much the norm. 1080i signals were really imagined to be shown on CRTs, but Europe was so slow to adopt this stuff that by the time people here were first able to buy HD displays, they were being sold this tech in a flat-panel TV.

                      Edit: yes, it was in the 80s.
                      Last edited by Lyris; 30-04-2009, 16:28.

                      Comment
