Announcement

Collapse
No announcement yet.

Blu-ray player that will play multi-region DVDs

Collapse
X
 
  • Filter
  • Time
  • Show
Clear All
new posts

    #46
    Originally posted by Spatial101 View Post
    My PS3 picture is very, very dark and looses a hell of a load of detail but with this it's all a lot more vibrant and with a greater depth of colour reproduction. I put The Dark Knight in as a test when I got it and ended up watching over an hour with my jaw on the floor - I was amazed at how much detail I'd been missing (and means I have to revisit most of my Blu-ray collection as a result).
    On the PS3 have you got RGB Full Range (HDMI) set to Full or Limited?

    I'm quite tempted to get one of these now, as I also thought that the PS3 was meant to be a quality player.

    Do you know if this player decodes dts-hd and the various other HD sountracks? As my amp doesn't, I might have to stick with the PS3.
    Last edited by Goldenballs; 02-05-2010, 16:58.

    Comment


      #47
      I think a blu ray player have better output as the picture looked nicer in dixons shop display. But again I don't use anything apart from PS3 for blu ray. But I will upgrade to buy a Blu Ray player perhaps next year.

      Comment


        #48
        I'd not judge anything based on the display in Dixons

        I'm wondering if it's not as Goldenballs says, the colour range is setup wrong. Or different TV settings applied to different HDMI inputs.

        From what I know, the PS3 offers exceptionally accurate colour reproduction on still images. How that equates to BR playback I cannot say.

        Simply, I've only ever read that the PS3 was a great BR player, and many reviewers use (or maybe used?) it as their reference player. So hearing here that a £58 player is better has me genuinely intrigued.

        Comment


          #49
          Hey all, was asked to come into this thread. I've not read it in a while so stick with me if I've missed anything.

          Originally posted by Spatial101
          Can be made multi region for DVD and Blu-ray with a code input and only £58. Works fine and the picture quality is stunning - makes my PS3 playback look terrible in comparison.
          If you're talking about the picture quality of DVD, that's a possibility.

          The PS3's video output for BD is basically perfect. It reproduces what's on the disc without any mangling (although I once had a compatibility issue when I paired it up with some Panasonic Plasmas, for some reason it'd produce inaccurate colour in that situation, but most people wouldn't notice the difference).

          1080p/24 (23.976fps actually) BD content is stored on the disc as 1080p/23.976 with 4:2:0 sampling. This is upgraded to 4:2:2 (or sometimes 4:4:4) sampling for output over HDMI, since HDMI does not support 4:2:0. So, unless the BD player has extra video processing inside that is mangling the picture (the PS3 does not) then the only differences will be in chroma resolution, which is a very small difference. You would likely not notice differences in HD chroma upsampling on television-sized displays.

          Originally posted by Matt
          Wow. I had no idea there'd be any difference in BR players, given it's all digital.
          Digital doesn't always equal perfection, but it has been a huge leveller. In the days of analogue formats (or analogue outputs, like the first DVD players), then you usually did get slightly better performance by spending megabucks (although most DVDs are low-pass filtered, which limited their performance anyway, so in many cases there was no point in buying an uber-expensive DVD player that could reproduce high frequencies because these simply weren't present on the disc!)

          I digress. Digital technology does not mean that everything is going to be the same, but in the case of BD players, the reason for differences is usually bad design rather than cost-cutting. In other words, BD player performance can vary, but it rarely does, and when it does, it's because someone f'd something up.

          Spatial101, what display device are you using to get such different results from the players?

          Originally posted by NekoFever
          I don't know where this misconception that all BD players are identical in terms of AV quality has come from because it seems really common but the fundamental technology of Blu-ray is exactly the same as DVD.
          Not really. They are the same in that they are digital and use MPEG (or MPEG-like) technologies, but here is the biggest difference: DVD was designed for backwards compatibility with interlaced, Standard def displays. The performance of DVD players has always varied because in the old days, we were converting the signal to analogue and outputting that to an analogue TV - room for things to go wrong. And now we have digital flat panel displays, we need DVD players with good deinterlacing, film cadence detection, scaling etc.

          Outputting a digital 1080p/24 stream to a digital 1080p/24 display, as is the case with BD, needs none of that. The performance of BD players still varies with interlaced content, but there is hardly any of this on the format.

          Back on the all-digital thing, it's true that there are fewer analogue steps for the video at least, but the player still does the decoding and takes it from a video file encoded on the disc in MPEG-4 or whatever to an uncompressed stream that your TV can understand. Different hardware will do this differently and any processing done to the video is going to change the output.
          There is actually no difference in the quality or output of MPEG-4 AVC or VC-1 decoders (and I believe the situation is the same for MPEG-2). Any differences come afterwards in the post-processing section, but these are usually very slight. Player manufacturers (especially in cheaper products) have taken a very "hands-off" approach so far.

          Originally posted by Matt
          Does anyone know what stages in the pipeline are analogue?
          None (unless you're using analogue audio or video output from the player). The last analogue stage in modern Hollywood films is the CCD imager in the datacine/telecine during film scanning. That is a LONG way up the chain!

          Originally posted by Escape-to-88
          As, while the JP one's play on my 360, they look horrible due to the NTSC signal which results in a lot of colour bleed and motion judder.
          NTSC does not result in a lot of colour bleed inherently, although poor handling of it might. Although your DVD is actually not NTSC at all unless you're outputting it over Composite video. I assume you're using HDMI, so the signal is Y/Cb/Cr (Digital Component video).

          FWIW, I've encoded a few NTSC DVDs which are on the market and I can assure you there is no colour bleed beyond the slight smudge that DVD imposes

          Originally posted by Escape-to-88
          I think the PS3 is an excellent upscaler, but as ever it depends on the tranfer. Case in point my Criterion 'Fear and Loathing...' looks easily as good as the 360's HD-DVD version.
          The PS3 uses edge-adaptive upscaling which results in a very "smooth" look; it can do this thanks to the processing power behind it. The result looks very different to most standalone players, which are still using linear scalers. I can post images showing the difference if you like.

          In 10 minutes or so I can post some scientific backup for what I'm saying, stay with me...
          Last edited by Lyris; 02-05-2010, 18:28.

          Comment


            #50
            Originally posted by Goldenballs View Post
            On the PS3 have you got RGB Full Range (HDMI) set to Full or Limited?
            Tried both quite a bit over the last three years and neither of them makes any difference (lost track of how many times I've fiddled with the PS3 because the dark image bothered me, along with it's little foibles when outputting sound).

            I've tested various scenes (the opening of The Dark Kight, Wall-E and the desert scene in Iron Man) numerous times when fiddling with settings.

            Take Iron Man - on the PS3 when Stark is giving his presentation in the desert there's a distinct loss of detail in his black suit jacket. It's like the colours are all meshing into one on the PS3, yet on the F&H player you can actually see the slight stripe running through it.

            I wish I could grab some shots to show you all how much of a difference it is, but photographing a screen would be next to useless. It is instantly noticeable when watching a scene though.

            What the explanation is I don't know, perhaps the player is overdoing the saturation of the images as someone suggested earlier on, but if it is then it's far superior to me because this way I'm actually getting to see more detail rather than less via the PS3.


            Originally posted by Lyris View Post
            Spatial101, what display device are you using to get such different results from the players?
            It's an LCD - Toshiba Regza something (not sure of the model number without hunting for manual but it's something like 32WB68?!).

            It's being fed through and Onkyo amp and all the settings are the same, bar the actual device that's reading the Blu-ray.

            Comment


              #51
              Absolutely brilliant, thanks for jumping in and letting us all know

              So no need to buy another player unless I want multi-region. Which I actually do, and could use a "simple" player for the missus to use for CD playback, so maybe I'll pick it up next month.

              Comment


                #52
                So, this post will show you measurements taken from 10-step Greyscale patterns and 75% Colour patterns from a few different BD players.

                Keep in mind that the tiny fractions of differences you see here, especially at the left of the RGB Tracking chart, are down to the measuring probe itself (an i1Pro). Each time you take a measurement from the TV screen with one of these meters, the measurement is very, very slightly different. It's only big changes that would be worrying. All players were connected to a calibrated LCD (some LG I was reviewing at the time) using the same video input, same settings, and (not that it would make any difference) the same HDMI cable.

                If you haven't had your TV calibrated, measurements from your screen will be nowhere near as straight as these, because they aren't set up properly or individually at the factory (for understandable reasons). The biggest difference people can make to their viewing is not replacing their BD player, but having their TV calibrated in line with mastering standards so it isn't distorting the picture.

                [b]Playstation 3 Greyscale and Gamma:


                Oppo BDP-831 (unreleased prototype of EU version, identical to the US BDP-83:


                Sony BDP-S760 standalone:


                Playstation 3 Colour:


                Oppo colour:


                Sony BDP-S760 colour:


                As you can see, there are no differences in the picture from the players I measured. It's likely though that something else in the chain is causing people to see different results from players. The CE industry does a fantastic job of confusing the living crap out of end users in this way.

                --

                Also, here are some images of high frequency test patterns for both Luma and Chroma detail on the PS3 compared to the Oppo player I have here. These are 1-pixel thin lines - the smallest 'unit' of detail you can have on a BD. Basically, if one player was giving less detail, you would see it in these charts. For example, if one player really was less detailed, the tiny thin lines in this chart would be blurry. As you can see, they are pin-sharp and present on the Playstation 3. (Some Samsung players actually do blur the Chroma pattern; I don't know if they've fixed this yet - but it's very difficult to see in real world viewing).

                Luminance detail between players:



                Chroma detail between players:



                One things these pictures don't show you is differences in Gamma. Gamma is the distribution of lightness from the blackest to the whitest parts of the image. Some players mess around with this too, for example, elevate the brightness of shadows to make them appear more detailed (at the expensive of making the picture look less "rich"). Usually it's very subtle and can be turned off, though. And in any case, your TV or projector will be doing its own things to the Gamma unless it's been calibrated. (You can see Gamma tracking on the screen shots from the Calman software I posted above though, and it's the same on the players I compared).

                The issue of BD players has been a bit of a touchy one, usually with the people who have w....spent thousands on a "dream player". A while ago some people from a very well known AV forum (some of whom were saying I must be blind to not see the differences between the child's toy Playstation and the super-slow, super-expensive audiophile players) did some shootouts to see if they could pick which player was which, when the identities were concealed. Most of these people have never touched a video encoder in their life, but were content to tell me that I can't tell when my own encoding work was being manipulated by the player. Anyway, the results of the shootout were as random as you could hope for - EXCEPT on the shootout where the Playstation 3's on-screen display accidentally appeared, in which case I seem to recall they ranked it down. When the test was repeated without that little accident, the results were totally random.

                Keith Jack from Sigma Designs (decoder manufacturer) on the (non-)issue:
                I currently have a BD-30 running through a Dennon 3808 to a 56 Samsung 1080p DLP. My question is would a player with a better chip set provide a noticable diffrence when watching a Blu-Ray or does the chip make a diffrence when not upconverting? I have been getting spoiled watching movies on...

                "The outputs of the video decoder blocks on the SoCs are bit-accurate for H.264 and VC-1, meaning the decoded video quality at that point is exactly the same for all players.

                So, the differences in players is in the post-processing of the video, such as scaling, deinterlacing, edge enhancement, noise reduction, color correction, etc. This is where the "art" comes in."
                Last edited by Lyris; 02-05-2010, 19:32.

                Comment


                  #53
                  It's an LCD - Toshiba Regza something (not sure of the model number without hunting for manual but it's something like 32WB68?!).

                  It's being fed through and Onkyo amp and all the settings are the same, bar the actual device that's reading the Blu-ray.
                  That's strange. What is the video output mode on the PS3? Y/Cb/Cr or RGB?
                  Also, is "Super White" turned on or off?

                  Comment


                    #54
                    Lyris - as someone considering spending £350 on a new player, just to get the multi-region aspect, I'm most interested in your comments about calibrating your TV. I guess the simple question is - how do you do this?

                    Comment


                      #55
                      Thanks very much for taking the time to post that.

                      Comment


                        #56
                        @TonyDA: You can do basic calibration yourself; that's things like setting Brightness and Contrast so that no details are being cut out of the picture. My recommendation for that is the AVSHD test pattern disc which has pretty clear instructions on how to set the controls: http://www.avsforum.com/avs-vb/showthread.php?t=948496. Things like the THX Optimizer are pretty good for this too.

                        Unfortunately, the biggest improvements need a measuring device to read lightness and colour data from the TV screen. The good ones cost about a grand, not including the software they interface with. This allows things like Greyscale (making sure the colour of grey is correct so that no colour bias is added to the picture), Gamma, and Colour saturation, hue, and brightness to be measured, and so you can see what you're doing while you adjust the controls.

                        The quickest and cheapest way is usually to call a professional in. That'll cost about £300 or so, which is much, much better money spent than a new BD player. They'll spend about 3-5 hours adjusting everything video-related in the system to make sure it's in-spec, or at least as in-spec as possible. The improvement you see depends on what controls your TV has; it's possible to adjust Greyscale with all TVs and this makes the biggest difference. Some of the better ones have Colour Management controls to really take things up a notch.

                        If you're interested, I think these guys are based near you:


                        I think Iain at Illuminant AV also comes near London:
                        Last edited by Lyris; 02-05-2010, 18:43.

                        Comment


                          #57
                          Thanks again Lyris

                          Comment


                            #58
                            Seconded, thanks for all of that Lyris. Very helpful indeed.

                            Comment


                              #59
                              Originally posted by Lyris View Post
                              Not really. They are the same in that they are digital and use MPEG (or MPEG-like) technologies, but here is the biggest difference: DVD was designed for backwards compatibility with interlaced, Standard def displays. The performance of DVD players has always varied because in the old days, we were converting the signal to analogue and outputting that to an analogue TV - room for things to go wrong. And now we have digital flat panel displays, we need DVD players with good deinterlacing, film cadence detection, scaling etc.

                              Outputting a digital 1080p/24 stream to a digital 1080p/24 display, as is the case with BD, needs none of that. The performance of BD players still varies with interlaced content, but there is hardly any of this on the format.
                              Just to be clear, I meant that it's the same in that it's a digital video file being played from an optical disc, just like DVD. My point was that there's not some voodoo happening that should make the relative quality of players any different to how it's always been with DVD and CD. There's less to go wrong, assuming you're on a 1080p24 TV, but I doubt that any of them do no manipulation to the picture, unless they have a source direct feature.

                              But I don't know where the posts against the PS3's picture quality are coming from. That was never my issue with it as a BD player and it always impressed me.

                              Comment


                                #60
                                Yeah, it's kinda funny actually - if the belief that "processing power = better picture" actually was true, the PS3 would be at the top!

                                Comment

                                Working...
                                X