I was discussing quiz questions (the lack of science/tech ones in
particular on some prog.s), and it occurred to me that a fair one might
be how many lines HD is; you'd have to qualify it, something like
"On terrestrial Freeview television as broadcast in the UK, how many
picture lines does a broadcast described as HD have?"
It then occurred to me to wonder: is 720 used _for broadcast_ at all? (Is there even much source material around in 720?)
Also, in formulating this, I wondered: is interlacing used much, or at
all, these days? Presumably it is, at least for SD, at least for material
originally made in i. (Again, I'm talking about what's _broadcast_, not
what assorted sets _display_.) The original _reason_ for it - keep the flicker down while keeping the [vertical] resolution up - has more or less gone with the separation of the light source from the image display
process (presumably progressive at 25 frames per second needs about the
same _bitrate_ as interlaced at 50 fields per second).
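A rough back-of-envelope check of that bitrate hunch, as a Python
sketch (raw pixel rates only, before compression; real codec bitrates
for interlaced and progressive material will differ somewhat):

def pixel_rate(width, lines_per_picture, pictures_per_second):
    """Pixels that have to be carried per second."""
    return width * lines_per_picture * pictures_per_second

# 1080p25: 25 full 1920 x 1080 frames per second.
p25 = pixel_rate(1920, 1080, 25)

# 1080i50: 50 fields per second, each field carrying half the lines (540).
i50 = pixel_rate(1920, 540, 50)

print(f"1080p25: {p25 / 1e6:.2f} Mpixel/s")  # 51.84 Mpixel/s
print(f"1080i50: {i50 / 1e6:.2f} Mpixel/s")  # 51.84 Mpixel/s

So at the raw level the two carry the same amount of picture per second;
any difference in broadcast bitrate comes from how well the codec copes
with each.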
"J. P. Gilliver" <G6JPG@255soft.uk> wrote in message >news:lrV370C69oFmFwsw@255soft.uk...[]
[]
> It then occurred to me to wonder: is 720 used _for broadcast_ at all?
> (Is there even much source material around in 720?)
> Also, in formulating this, I wondered: is interlacing used much, or
> at all, these days? Presumably it is, at least for SD, at least for
Since (in "PAL land") an SD picture is 704 or 720 x 576 and an HD
picture is 1920 x 1080, material with 720 rows must either be
down-scaled for SD or upscaled for HD. I suppose upscaled 720 on HD is
better than upscaled 576 from archive material before the master was HD.
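As a hypothetical worked example (Python; the 704-pixel active-width
variant and non-square SD pixels are ignored here), the rescaling
factors for 1280 x 720 material are:

# Illustrative arithmetic only: how far 1280 x 720 material has to be
# rescaled to fit the SD and HD rasters mentioned above.

SD = (720, 576)       # "PAL land" SD raster (704 or 720 wide; 720 used here)
HD = (1920, 1080)
SOURCE = (1280, 720)  # 720-line source material

for name, (w, h) in (("SD", SD), ("HD", HD)):
    print(f"720p -> {name}: x{w / SOURCE[0]:.3f} horizontally, "
          f"x{h / SOURCE[1]:.3f} vertically")

# 720p -> SD: x0.562 horizontally, x0.800 vertically (a downscale)
# 720p -> HD: x1.500 horizontally, x1.500 vertically (a clean 3:2 upscale)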
Is digital TV actually sent in interlaced format (ie odd-numbered rows,
pixel by pixel, then even-numbered rows) or is everything sent
progressive, with any interlacing generated at each TV? Indeed, are the
pixels of a TV lit in sequence, like the phosphor glow from a CRT, or
do they all change simultaneously, lit by a continuous backlight? A
quick test on my PC (which uses a 100 Hz refresh, not 120) when viewed
through the camera on my phone (which records at 30 fps) doesn't show
the very obviously 10 Hz beating that you used to get between a CRT and
a camera if a "PAL" camera was pointed at an "NTSC" TV. Having said
that, the converse situation (120 Hz PC monitor and 25 fps camcorder)
did generate beating, which is why I changed the PC's refresh rate from
120 to 100, for when I needed to show my PC screen in a video I was
shooting.

Was that a CRT PC monitor? Is the one you're now trying with your 'phone
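Going back to the question of how the two fields get put back together
at the set, here is a toy "weave" deinterlace (my own illustration in
Python/numpy, assuming an SD raster; real deinterlacers are much
cleverer, e.g. motion-adaptive):

import numpy as np

HEIGHT, WIDTH = 576, 720  # SD raster used purely as an example

frame = np.random.randint(0, 256, (HEIGHT, WIDTH), dtype=np.uint8)

# An interlaced stream carries alternate fields: odd lines, then even lines.
top_field = frame[0::2, :]     # lines 0, 2, 4, ... (288 lines)
bottom_field = frame[1::2, :]  # lines 1, 3, 5, ... (288 lines)

# "Weave": slot the two fields back into one progressive frame.
rebuilt = np.empty_like(frame)
rebuilt[0::2, :] = top_field
rebuilt[1::2, :] = bottom_field

# Perfect reconstruction only because nothing moved between the fields;
# with motion, weaving produces the familiar "combing" artefacts.
assert np.array_equal(rebuilt, frame)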
> I was discussing quiz questions (the lack of science/tech ones in
> particular on some prog.s), and it occurred to me that a fair one
> might be how many lines HD is; you'd have to qualify it, something
> like "On terrestrial Freeview television as broadcast in the UK, how
> many picture lines does a broadcast described as HD have?"
> It then occurred to me to wonder: is 720 used _for broadcast_ at all?
> (Is there even much source material around in 720?)
Basically, 1080i50 gives similar subjective quality to 720p50.
The Holy Grail is 1080p50. 1080p50 is not transmitted in the UK,
because it takes (almost) double the bandwidth of 1080i50; however,
1080p50 is becoming standard within the studio environment now.
It's really easy to 'downscale' 1080p50 to 720p50 (because no temporal
conversion is required) and also really easy to upscale it to UHD
(where there is no interlaced mode (thank god))
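Rough raw pixel-rate figures behind those statements (a back-of-envelope
Python sketch; compression efficiency differs between formats, so treat
the numbers as indicative only):

formats = {
    # name: (width, lines_per_picture, pictures_per_second)
    "720p50":  (1280, 720, 50),   # 50 full frames/s
    "1080i50": (1920, 540, 50),   # 50 fields/s, each field = 540 lines
    "1080p50": (1920, 1080, 50),  # 50 full frames/s
}

for name, (w, lines, rate) in formats.items():
    print(f"{name}: {w * lines * rate / 1e6:.2f} Mpixel/s")

# 720p50:  46.08 Mpixel/s  (same ballpark as 1080i50)
# 1080i50: 51.84 Mpixel/s
# 1080p50: 103.68 Mpixel/s (double 1080i50, hence the bandwidth cost)

And because 1080p50 and 720p50 both carry 50 complete pictures per
second, a 1080p50 -> 720p50 downscale is purely spatial; no temporal
(frame-rate or field-rate) conversion is needed.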
In message <l7t26vFnkupU1@mid.individual.net> at Fri, 12 Apr 2024
16:27:27, Mark Carver <mark@invalid.com> writes
[]
> Basically, 1080i50 gives similar subjective quality to 720p50.
Interesting.
I was just wondering if anyone actually _broadcasts_ 720 (p or i).
> It's really easy to 'downscale' 1080p50 to 720p50 (because no temporal
But who is doing it?
> conversion is required) and also really easy to upscale it to UHD
> (where there is no interlaced mode (thank god))
Indeed. I don't think CRTs - the main reason for interlacing - were made
in UHD.
On 12/04/2024 19:48, J. P. Gilliver wrote:
[]
> In message <l7t26vFnkupU1@mid.individual.net> at Fri, 12 Apr 2024
> 16:27:27, Mark Carver <mark@invalid.com> writes
>> conversion is required) and also really easy to upscale it to UHD
>> (where there is no interlaced mode (thank god))
> Indeed. I don't think CRTs - the main reason for interlacing - were
> made in UHD.
No. The primary reason for interlacing was to save bandwidth. 50% of it.
CRTs are (were) good for it, because the phosphor lag helped 'fill in
the gaps' on opposing fields.
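As a toy model of that phosphor-lag effect (my own illustration in
Python/numpy; the persistence figure is made up, and real phosphor decay
is of course continuous, not a single fraction):

import numpy as np

HEIGHT, WIDTH = 576, 720
frame = np.random.rand(HEIGHT, WIDTH)

top_field = np.zeros_like(frame)
bottom_field = np.zeros_like(frame)
top_field[0::2, :] = frame[0::2, :]
bottom_field[1::2, :] = frame[1::2, :]

PERSISTENCE = 0.5  # fraction of the previous field still glowing (made up)

# What the screen shows while the bottom field is being written: its own
# lines at full brightness plus the top field's lines still decaying.
perceived = bottom_field + PERSISTENCE * top_field

# Without any lag half the lines would be dark; with lag every line
# carries at least some of the picture.
print("dark lines without lag:", int((bottom_field.sum(axis=1) == 0).sum()))
print("dark lines with lag:   ", int((perceived.sum(axis=1) == 0).sum()))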