• Is 720p (or i) _broadcast_?

    From J. P. Gilliver@21:1/5 to All on Wed Apr 10 14:09:14 2024
    I was discussing quiz questions (the lack of science/tech ones in
    particular on some prog.s), and it occurred to me that a fair one
    might be how many lines HD has; you'd have to qualify it, something
    like "On terrestrial Freeview television as broadcast in the UK, how
    many picture lines does a broadcast described as HD have?"

    It then occurred to me to wonder: is 720 used _for broadcast_ at all?
    (Is there even much source material around in 720?)

    Also, while formulating this, I wondered: is interlacing used much, or at
    all, these days? Presumably it is, at least for SD, at least for
    material originally made in i. (Again, I'm talking about what's
    _broadcast_, not what assorted sets _display_.) The original _reason_
    for it - keep the flicker down while keeping the [vertical] resolution
    up - has more or less gone with the separation of the light source from
    the image display process (presumably progressive at 25 frames per
    second needs about the same _bitrate_ as interlaced at 50 fields per
    second).
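
    As a back-of-envelope check of that last hunch (a sketch in Python,
    counting raw uncompressed pixels only; real broadcast bitrates depend
    heavily on the codec):

        # Raw pixel rates for a 576-line SD picture (codec effects ignored).
        WIDTH, LINES = 720, 576

        progressive_25p = WIDTH * LINES * 25        # 25 full frames/s
        interlaced_50i = WIDTH * (LINES // 2) * 50  # 50 half-height fields/s

        print(progressive_25p)  # 10368000 pixels/s
        print(interlaced_50i)   # 10368000 pixels/s - identical raw rate

    So the hunch holds, at least before compression.
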
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    "I'm a paranoid agnostic. I doubt the existence of God, but I'm sure there is some force, somewhere, working against me." - Marc Maron

  • From NY@21:1/5 to J. P. Gilliver on Wed Apr 10 14:29:55 2024
    "J. P. Gilliver" <G6JPG@255soft.uk> wrote in message news:lrV370C69oFmFwsw@255soft.uk...
    []
    It then occurred to me to wonder: is 720 used _for broadcast_ at all?
    (Is there even much source material around in 720?)

    Also, while formulating this, I wondered: is interlacing used much, or
    at all, these days? Presumably it is, at least for SD, at least for
    material originally made in i. (Again, I'm talking about what's
    _broadcast_, not what assorted sets _display_.)
    []

    Since (in "PAL land") an SD picture is 704 or 720 x 576 and an HD picture is 1920 x 1080, material with 720 rows must either be down-scaled for SD or upscaled for HD. I suppose upscaled 720 on HD is better than upscaled 576
    from archive material before the master was HD.
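
    The scale factors involved, as a quick sketch (pixel heights only; a
    real scaler also has aspect ratio and filtering to worry about):

        # Vertical scale factors between the common "PAL land" line counts.
        sd, hd720, hd1080 = 576, 720, 1080

        print(hd1080 / hd720)  # 1.5   - 720 material upscaled for an HD channel
        print(hd720 / sd)      # 1.25  - 576 archive upscaled to a 720 display
        print(hd1080 / sd)     # 1.875 - 576 archive upscaled to 1080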

    Is digital TV actually sent in interlaced format (i.e. odd-numbered
    rows, pixel by pixel, then even-numbered rows) or is everything sent
    progressive, with any interlacing generated at each TV? Indeed, are
    the pixels of a TV lit in sequence, like the phosphor glow from a CRT,
    or do they all change simultaneously, lit by a continuous backlight?

    A quick test on my PC (which uses a 100 Hz refresh, not 120) when
    viewed through the camera on my phone (which records at 30 fps)
    doesn't show the very obvious 10 Hz beating that you used to get
    between a CRT and a camera if a "PAL" camera was pointed at an "NTSC"
    TV. Having said that, the converse situation (120 Hz PC monitor and
    25 fps camcorder) did generate beating, which is why I changed the
    PC's refresh rate from 120 to 100 for when I needed to show my PC
    screen in a video I was shooting.
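
    That beating can be estimated as the distance from the refresh rate to
    the nearest whole multiple of the camera's frame rate - a simplified
    model, as a sketch (it ignores shutter angle and rolling shutter):

        def beat_hz(refresh_hz, camera_fps):
            """Approximate beat between a flickering display and a camera:
            distance from the refresh rate to the nearest whole multiple
            of the camera's frame rate (a simplification)."""
            k = round(refresh_hz / camera_fps)
            return abs(refresh_hz - k * camera_fps)

        print(beat_hz(60, 25))   # 10.0 - "PAL" camera on an "NTSC" CRT
        print(beat_hz(120, 25))  # 5.0  - the 120 Hz monitor that beat on camera
        print(beat_hz(100, 25))  # 0.0  - why dropping to 100 Hz cured it
        print(beat_hz(100, 30))  # 10.0 - predicted for a *flickering* 100 Hz
                                 #        display; that the phone sees none
                                 #        points to a continuous backlight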

  • From J. P. Gilliver@21:1/5 to me@privacy.invalid on Thu Apr 11 21:53:58 2024
    In message <uv648t$vsko$1@dont-email.me> at Wed, 10 Apr 2024 14:29:55,
    NY <me@privacy.invalid> writes
    "J. P. Gilliver" <G6JPG@255soft.uk> wrote in message >news:lrV370C69oFmFwsw@255soft.uk...
    []
    It then occurred to me to wonder: is 720 used _for broadcast_ at all?
    (Is there even much source material around in 720?)

    Also, while formulating this, I wondered: is interlacing used much, or
    at all, these days? Presumably it is, at least for SD, at least for
    []
    Since (in "PAL land") an SD picture is 704 or 720 x 576 and an HD
    picture is 1920 x 1080, material with 720 rows must either be
    down-scaled for SD or upscaled for HD. I suppose upscaled 720 on HD is
    better than upscaled 576 from archive material before the master was HD.

    If the original is 576, I'm dubious about upscaling, but I suppose if
    your display _is_ 720, you've got to do something to make it fill the
    screen (vertically).

    Is digital TV actually sent in interlaced format (i.e. odd-numbered rows,
    pixel by pixel, then even-numbered rows) or is everything sent
    progressive, with any interlacing generated at each TV? Indeed, are the

    Interesting question.

    pixels of a TV lit in sequence, like the phosphor glow from a CRT, or
    do they all change simultaneously, lit by a continuous backlight? A

    I'm pretty sure modern displays use a continuous _backlight_, no
    question. (If you swing your eyeballs over a modern display, you don't
    get the stream of images you used to get with a CRT display.) I've
    always assumed that the actual pixels - now just variable-transmission
    things - did still change sequentially.

    quick test on my PC (which uses a 100 Hz refresh, not 120) when viewed
    through the camera on my phone (which records at 30 fps) doesn't show
    the very obvious 10 Hz beating that you used to get between a CRT and
    a camera if a "PAL" camera was pointed at an "NTSC" TV. Having said
    that, the converse situation (120 Hz PC monitor and 25 fps camcorder)
    did generate beating, which is why I changed the PC's refresh rate from
    120 to 100, for when I needed to show my PC screen in a video I was
    shooting.
    Was that a CRT PC monitor? Is the one you're now trying with your 'phone
    a non-CRT one?
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    "Look, if it'll help you to do what I tell you, baby, imagine that I've got a blaster ray in my hand." "Uh - you _have_ got a blaster ray in your hand." "So you shouldn't have to tax your imagination too hard." (Link episode)

  • From Mark Carver@21:1/5 to J. P. Gilliver on Fri Apr 12 16:27:27 2024
    On 10/04/2024 14:09, J. P. Gilliver wrote:
    I was discussing quiz questions (the lack of science/tech ones in
    particular on some prog.s), and it occurred to me that a fair one
    might be how many lines HD has; you'd have to qualify it, something
    like "On terrestrial Freeview television as broadcast in the UK, how
    many picture lines does a broadcast described as HD have?"

    It then occurred to me to wonder: is 720 used _for broadcast_ at all?
    (Is there even much source material around in 720?)

    It's all to do with the Kell factor. Read up about it.

    Basically, 1080i50 gives similar subjective quality to 720p50.

    The Holy Grail is 1080p50. 1080p50 is not transmitted in the UK,
    because it takes (almost) double the bandwidth of 1080i50; however,
    1080p50 is becoming standard within the studio environment now.

    It's really easy to 'downscale' 1080p50 to 720p50 (because no temporal
    conversion is required) and also really easy to upscale it to UHD
    (where there is no interlaced mode (thank god)).
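
    The raw numbers behind both claims, as a sketch (uncompressed pixel
    rates; coded bitrates differ, but scale in roughly the same way):

        # Raw pixel rates of the three HD modes discussed.
        modes = {
            "720p50": 1280 * 720 * 50,           # 50 full frames
            "1080i50": 1920 * (1080 // 2) * 50,  # 50 half-height fields
            "1080p50": 1920 * 1080 * 50,         # 50 full frames
        }
        for name, rate in modes.items():
            print(f"{name}: {rate / 1e6:.2f} Mpixel/s")

        # 720p50:  46.08  - slightly *below* 1080i50, hence similar quality
        # 1080i50: 51.84
        # 1080p50: 103.68 - double 1080i50 raw; "(almost) double" once coded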

  • From J. P. Gilliver@21:1/5 to Mark Carver on Fri Apr 12 19:48:26 2024
    In message <l7t26vFnkupU1@mid.individual.net> at Fri, 12 Apr 2024
    16:27:27, Mark Carver <mark@invalid.com> writes
    []
    Basically, 1080i50 gives similar subjective quality to 720p50.

    Interesting.

    I was just wondering if anyone actually _broadcasts_ 720 (p or i). I
    suspect not - SD will be broadcast at 576i since there's nothing to gain
    from doing otherwise, and I presume HD is broadcast as 1080. I suspect
    there is little source material in 720 (probably none, other than from
    amateur sources).

    The Holy Grail is 1080p50. 1080p50 is not transmitted in the UK,
    because it takes (almost) double the bandwidth of 1080i50; however,

    So HD is broadcast i.

    1080p50 is becoming standard within the studio environment now.

    That makes sense - if you've got the storage, and 1080 sensors, you
    might as well store at that for the future - even the conversion to i
    doesn't require much (less than one field of storage, which is
    negligible these days).
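
    As a sketch of why so little storage is needed (frames modelled here
    as plain lists of lines; a real chain does this on video, and the
    field order is a broadcast-standard detail I'm glossing over):

        # 1080p50 -> 1080i50: one field taken from each successive
        # progressive frame, so nothing temporal has to be invented.
        def p50_to_i50(frames):
            fields = []
            for first, second in zip(frames[0::2], frames[1::2]):
                fields.append(first[0::2])   # lines 0, 2, 4... of frame N
                fields.append(second[1::2])  # lines 1, 3, 5... of frame N+1
            return fields

        # Two 4-line "frames" in, two 2-line fields out.
        print(p50_to_i50([["a0", "a1", "a2", "a3"], ["b0", "b1", "b2", "b3"]]))
        # [['a0', 'a2'], ['b1', 'b3']]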

    It's really easy to 'downscale' 1080p50 to 720p50 (because no temporal

    But who is doing it? The broadcasters are presumably storing at 1080p50
    as you say, but broadcasting at 1080i50. So viewers with a 720 set will
    receive 1080i50 or 576i50, and down- or up-scale from that (presumably
    to 720i, though maybe 720p) - they won't have a source of 1080p.

    conversion is required) and also really easy to upscale it to UHD
    (where there is no interlaced mode (thank god))

    Indeed. I don't think CRTs - the main reason for interlacing - were made
    in UHD.

    Interesting that you say the Holy Grail is 1080p50; presumably that's
    only temporary, and something larger will come along eventually, with
    the increasing size (resolution, really) of displays. Though IMO 50 -
    once flicker is no longer relevant - is actually more than enough for
    the majority of subjects, indeed overkill for most; for many types of
    material 10 or 12 frames per second is more than adequate. (Though I
    doubt _less_
    than 50 will ever become common for general purpose studio production.)
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    Worst programme ever made? I was in hospital once having a knee operation and I watched a whole episode of "EastEnders". Ugh! I suppose it's true to life. But so is diarrhoea - and I don't want to see that on television. - Patrick Moore, in Radio Times 12-18 May 2007.

  • From Mark Carver@21:1/5 to J. P. Gilliver on Sat Apr 13 17:26:32 2024
    On 12/04/2024 19:48, J. P. Gilliver wrote:
    In message <l7t26vFnkupU1@mid.individual.net> at Fri, 12 Apr 2024
    16:27:27, Mark Carver <mark@invalid.com> writes
    []
    Basically, 1080i50 gives similar subjective quality to 720p50.

    Interesting.

    I was just wondering if anyone actually _broadcasts_ 720 (p or i).

    720i is not a supported mode. Some European and some US broadcasters
    broadcast in 720p.

    It's really easy to 'downscale' 1080p50 to 720p50 (because no temporal

    But who is doing it?

    The broadcasters mentioned above. Production/studio standards are 1080;
    720 is only really used for emission.

    conversion is required) and also really easy to upscale it to UHD
    (where there is no interlaced mode (thank god))

    Indeed. I don't think CRTs - the main reason for interlacing - were made
    in UHD.

    No. The primary reason for interlacing was to save bandwidth. 50% of it.
    CRTs are (were) good for it, because the phosphor lag helped 'fill in
    the gaps' on opposing fields.

  • From J. P. Gilliver@21:1/5 to Mark Carver on Sun Apr 14 01:30:23 2024
    In message <l7vq1qF5sd0U1@mid.individual.net> at Sat, 13 Apr 2024
    17:26:32, Mark Carver <mark@invalid.com> writes
    On 12/04/2024 19:48, J. P. Gilliver wrote:
    In message <l7t26vFnkupU1@mid.individual.net> at Fri, 12 Apr 2024
    16:27:27, Mark Carver <mark@invalid.com> writes
    []
    conversion is required) and also really easy to upscale it to UHD
    (where there is no interlaced mode (thank god))

    Indeed. I don't think CRTs - the main reason for interlacing - were
    made in UHD.

    No. The primary reason for interlacing was to save bandwidth. 50% of it.

    Well, to save bandwidth _while preserving vertical resolution_
    (otherwise they could have just done n/2 at 50) _and_ reducing flicker
    (otherwise they could have just done n at 25, i.e. "p").
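
    Laying the three options out, as a sketch ("flicker" meaning the
    large-area refresh rate the eye sees, "detail" the static vertical
    resolution, Kell factor aside):

        # The trade-off, for a nominal n-line picture (n = 576 for SD).
        n = 576
        options = {
            #              lines/image, images/s, flicker Hz, static detail
            "n at 25p":   (n,           25,       25,         n),
            "n/2 at 50p": (n // 2,      50,       50,         n // 2),
            "n at 50i":   (n // 2,      50,       50,         n),
        }
        for name, (lines, rate, flicker, detail) in options.items():
            print(f"{name}: {lines * rate} lines/s, "
                  f"{flicker} Hz flicker, {detail}-line detail")

        # All three cost 14400 lines/s of bandwidth; only interlace keeps
        # both the 50 Hz refresh and the full n-line static resolution.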

    CRTs are (were) good for it, because the phosphor lag helped 'fill in
    the gaps' on opposing fields

    Yes.

    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    "Victory does not bring with it a sense of triumph - rather the dull numbness of relief..." - Cecil Beaton quoted by Anthony Horowitz, RT 2015/1/3-9
