• Who remembers how bad analogue television was?

    From Sylvia Else@21:1/5 to All on Thu Feb 27 12:52:40 2025
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my experience was that instead it was based on the average amplitude of the demodulated signal. A black image containing large white text, such as a
    title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I suspect this same poor AGC was responsible for a shift in the
    detection of the sync pulse such that the text would be moved to the
    right of its proper position, which could result in distortion of the
    letters as the average brightness varied line by line.
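    A toy numerical sketch (mine, with made-up levels, not anything known
    about the actual set) of the two AGC strategies: with negative
    modulation the sync tip is always 100% of carrier amplitude, so a gain
    keyed to it is independent of picture content, while a gain keyed to
    the average envelope pumps as the picture changes.

    ```python
    # Toy model of average-envelope AGC vs sync-tip AGC.
    # Negative modulation: sync tip = 1.0 (100% carrier), black = 0.7, white = 0.2.

    def make_envelope(white_fraction, n=100):
        """RF envelope of one scanline: 8 samples of sync, then picture."""
        sync = [1.0] * 8
        n_white = int((n - 8) * white_fraction)
        picture = [0.2] * n_white + [0.7] * (n - 8 - n_white)
        return sync + picture

    def gain_average_agc(env, target=0.6):
        """Naive AGC: normalise the mean envelope, which tracks picture content."""
        return target / (sum(env) / len(env))

    def gain_sync_agc(env, target=1.0):
        """Sync-tip AGC: normalise the envelope peak, which is always the sync tip."""
        return target / max(env)

    dark = make_envelope(0.1)    # black title card with a little white text
    bright = make_envelope(0.8)  # mostly white picture

    # Sync-tip gain is identical for both lines; average-based gain is not,
    # so brightness pumps as the average picture level changes.
    assert gain_sync_agc(dark) == gain_sync_agc(bright)
    assert gain_average_agc(bright) > gain_average_agc(dark)
    ```

    Here the average-based gain for the bright line comes out nearly twice
    that for the dark one, which is the sort of content-dependent error
    described above.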

    In the early days of television, using thermionic valves, it was
    probably a miracle that these things worked at all, but surely in the transistor age, something better could have been provided.

    Were studio monitors any better, anyone know?

    Sylvia.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Edward Rawde@21:1/5 to Sylvia Else on Thu Feb 27 00:22:44 2025
    "Sylvia Else" <sylvia@email.invalid> wrote in message news:m2a9coFaisuU1@mid.individual.net...
    Leave aside the ghosting, which could largely be addressed by having a decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish automatic gain control (AGC),

    Philips' design quality was usually reasonable in my experience.

    and odd interactions between brightness and picture position.

    EHT breathing?


    The AGC should have been based on the amplitude of the sync pulses, which was 30% of the total. I'm sure this could have been
    done, but my experience was that instead it was based on the average amplitude of the demodulated signal. A black image containing
    large white text, such as a title screen, would show a clear darkening to the sides of the text, while being decidedly grey over
    the rest of the screen.

    I don't recall specific issues like that.
    I do recall how bad a picture some people would tolerate before having the set serviced.
    I also recall knowing that the quality of the content produced by the TV stations and the technical quality of the TV system were
    two completely different things.
    You might ask how bad digital television is if you're talking about the content.
    I can't remember when I last watched a broadcast TV show at the time it was broadcast.


    I suspect this same poor AGC was responsible for a shift in the detection of the sync pulse such that the text would be moved to
    the right of its proper position, which could result in distortion of the letters as the average brightness varied line by line.

    In the early days of television, using thermionic valves, it was probably a miracle that these things worked at all, but surely in
    the transistor age, something better could have been provided.

    Although they have a finite life, valves are good for high voltage and fairly high powers.
    And it was easier to stabilize a valve scan and EHT generator just by varying the pentode's control grid voltage.


    Were studio monitors any better, anyone know?

    Probably. They would have to be.


    Sylvia.

  • From Jeff Layman@21:1/5 to Sylvia Else on Thu Feb 27 08:10:08 2025
    On 27/02/2025 04:52, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my experience was that instead it was based on the average amplitude of the demodulated signal. A black image containing large white text, such as a title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I suspect this same poor AGC was responsible for a shift in the
    detection of the sync pulse such that the text would be moved to the
    right of its proper position, which could result in distortion of the
    letters as the average brightness varied line by line.

    In the early days of television, using thermionic valves, it was
    probably a miracle that these things worked at all, but surely in the transistor age, something better could have been provided.

    Were studio monitors any better, anyone know?

    I suppose that's one of the problems with being "first past the post" -
    you get stuck with the system. The US had NTSC, valves, and 525 lines.
    In the UK colour broadcasting didn't start until over 12 years later -
    July 1967 - using PAL, transistors, and 625 lines. Most of western
    Europe was the same as the UK, but France and eastern Europe used
    SECAM, on 625 lines (France's earlier 819-line service stayed
    monochrome). From what I understand, PAL was better than NTSC,
    and from what I remember, analogue colour in the UK wasn't bad at all.

    --
    Jeff

  • From Don Y@21:1/5 to Sylvia Else on Wed Feb 26 23:04:30 2025
    On 2/26/2025 9:52 PM, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a decent antenna.

    Analog television had two distinct issues: one was multipath problems (bummer), the other was that you COULD eke a signal out of the ether,
    even a bad one (contrast to digital which is essentially "all or nothing")

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish automatic gain control (AGC), and odd interactions between brightness and picture position.

    My last set was a JVC AV2600US (IIRC):
    <https://i.ebayimg.com/images/g/cj4AAOSwMKdksA8u/s-l1600.webp>
    It was big and heavy, had funny "dumbo ears" -- but just kept on
    running, year after year (I think I purchased it in the
    early 80's and it lasted well into this millennium -- not bad for
    a $1K investment!)

    The AGC should have been based on the amplitude of the sync pulses, which was 30% of the total. I'm sure this could have been done, but my experience was that instead it was based on the average amplitude of the demodulated signal. A
    black image containing large white text, such as a title screen, would show a clear darkening to the sides of the text, while being decidedly grey over the rest of the screen.

    I suspect a lot may have to do with the distribution and proximity of
    "signal". Most places in the US have a high degree of coverage;
    when I lived in New England, I would pick up stations in Boston and
    NYC (separated by ~200 miles).

    Here (SoAZ), I have a tough time picking up the broadcasts from the
    other side of town (the transmitters on THIS side of town are
    within a dozen miles with a clear sight line -- as long as it isn't
    raining)

    OTOH, with each digital station carrying several "channels" of content,
    there are 40 or 50 to choose from (all shit, of course).

    I suspect this same poor AGC was responsible for a shift in the detection of the sync pulse such that the text would be moved to the right of its proper position, which could result in distortion of the letters as the average brightness varied line by line.

    In the early days of television, using thermionic valves, it was probably a miracle that these things worked at all, but surely in the transistor age, something better could have been provided.

    Were studio monitors any better, anyone know?

  • From Liz Tuddenham@21:1/5 to Sylvia Else on Thu Feb 27 09:47:22 2025
    Sylvia Else <sylvia@email.invalid> wrote:

    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my experience was that instead it was based on the average amplitude of the demodulated signal. A black image containing large white text, such as a title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I think you may be mis-remembering; something similar to the fault you
    describe was prevalent on B&W televisions which were built down to a
    price. Some of it was caused by average AGC and some was due to lack of
    DC coupling, or skimped DC restoration, in the video amplifier. Some of
    the better sets used back-porch AGC and, for the enthusiast, add-on
    circuits were published in Wireless World (designed by Mothersole, I
    think).

    From the beginnings of colour television the designers recognised that
    all three video amplifiers had to be DC coupled but the AGC was much
    simpler because they used inverted modulation, so a sync pulse
    corresponding to 100% modulation was always available. I can't imagine
    Philips would have produced a model with such gross errors as you
    describe. Was your own set faulty or was this a common insurmountable
    problem caused by NTSC and positive modulation on the system in use in
    the U.S. at the time?

    In Europe, Philips and Mullard (their UK valve-making subsidiary)
    published large quantities of material to aid set designers and help
    them get the best out of their range of valves. I read it several years
    before the colour television service started in England and it included
    details on DC coupling and AGC. (The BBC did a lot of their preliminary experimental work using NTSC - but eventually decided to use PAL for the
    public broadcast system).


    --
    ~ Liz Tuddenham ~
    (Remove the ".invalid"s and add ".co.uk" to reply)
    www.poppyrecords.co.uk

  • From Martin Brown@21:1/5 to Jeff Layman on Thu Feb 27 09:49:30 2025
    On 27/02/2025 08:10, Jeff Layman wrote:
    On 27/02/2025 04:52, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish
    automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my
    experience was that instead it was based on the average amplitude of the
    demodulated signal. A black image containing large white text, such as a
    title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I suspect this same poor AGC was responsible for a shift in the
    detection of the sync pulse such that the text would be moved to the
    right of its proper position, which could result in distortion of the
    letters as the average brightness varied line by line.

    In the early days of television, using thermionic valves, it was
    probably a miracle that these things worked at all, but surely in the
    transistor age, something better could have been provided.

    Were studio monitors any better, anyone know?

    I suppose that's one of the problems with being "first past the post" -
    you get stuck with the system. The US had NTSC, valves, and 525 lines.

    I always believed that made it impossible for them not to have
    newscasters with flesh that slowly shifted between ghastly green and
    purple tones (or was clamped to unnatural pale orange like the
    Donald's). NTSC was called Never Twice the Same Colour in the UK for
    good reason. PAL was self correcting. My Japanese sets could do both.

    However, when I was in Japan I saw US style NTSC TV implemented
    correctly. It seems there was no reason that it could not be made to
    work well, only that US makers couldn't be bothered to do it right.
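    The "self correcting" property can be shown with a few lines of
    illustrative complex arithmetic (my own sketch, not from the thread): a
    differential phase error rotates the chroma vector, which an NTSC
    decoder reads directly as a hue shift; PAL's alternating V axis lets
    the delay line average two lines so the phase error cancels, leaving
    only a small saturation loss.

    ```python
    import cmath, math

    def received_chroma(u, v, phi_deg, line_odd):
        """Chroma vector after a transmission phase error of phi degrees."""
        v_tx = -v if line_odd else v             # PAL inverts V on alternate lines
        return complex(u, v_tx) * cmath.exp(1j * math.radians(phi_deg))

    u, v, phi = 0.3, 0.4, 20.0                   # true colour, and a 20 degree error
    true_hue = math.degrees(cmath.phase(complex(u, v)))

    # NTSC-style decode: one line taken at face value, so hue shifts by phi.
    ntsc = received_chroma(u, v, phi, line_odd=False)

    # PAL decode: re-invert V on the odd line (conjugation flips the imaginary
    # part), then average the pair -- which is what the delay line does.
    even = received_chroma(u, v, phi, line_odd=False)
    odd = received_chroma(u, v, phi, line_odd=True).conjugate()
    pal = (even + odd) / 2

    print(math.degrees(cmath.phase(ntsc)) - true_hue)  # ~20 deg hue error
    print(math.degrees(cmath.phase(pal)) - true_hue)   # ~0: error cancelled
    print(abs(pal) / abs(complex(u, v)))               # ~cos(20 deg): mild desaturation
    ```

    A hue error becomes a slight loss of saturation, which the eye is far
    more tolerant of; hence the stable flesh tones on PAL.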

    In the UK colour broadcasting didn't start until over 12 years later -
    July 1967 - using PAL, transistors, and 625 lines. Most of western
    Europe was the same as the UK, but France and eastern Europe used
    SECAM, on 625 lines (France's earlier 819-line service stayed
    monochrome). From what I understand, PAL was better than NTSC,
    and from what I remember, analogue colour in the UK wasn't bad at all.

    On very early sets the colour was in pastel shades, until they started
    doping the front screen with neodymium to take out the unwanted yellow
    component in the early phosphors (especially the blue). It barely gets
    a mention now, but it made a big difference to colour saturation and
    purity. Put sets with and without Nd doping together and the difference
    was huge; either one on its own, though, would satisfy a punter, with a
    premium price for the one with better, more saturated colours.

    https://en.wikipedia.org/wiki/Cathode-ray_tube#Constructions

    The main problem, though, was reliability; I recall the service engineer
    spending lots of time fiddling with valve swaps. ISTR the original valve
    EHT stack rectifier gave off X-rays and had a lead shield around it.

    --
    Martin Brown

  • From john larkin@21:1/5 to Liz Tuddenham on Thu Feb 27 08:37:32 2025
    On Thu, 27 Feb 2025 09:47:22 +0000, liz@poppyrecords.invalid.invalid
    (Liz Tuddenham) wrote:

    Sylvia Else <sylvia@email.invalid> wrote:

    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish
    automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my
    experience was that instead it was based on the average amplitude of the
    demodulated signal. A black image containing large white text, such as a
    title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I think you may be mis-remembering; something similar to the fault you
    describe was prevalent on B&W televisions which were built down to a
    price. Some of it was caused by average AGC and some was due to lack of
    DC coupling, or skimped DC restoration, in the video amplifier. Some of
    the better sets used back-porch AGC and, for the enthusiast, add-on
    circuits were published in Wireless World (designed by Mothersole, I
    think).

    From the beginnings of colour television the designers recognised that
    all three video amplifiers had to be DC coupled but the AGC was much
    simpler because they used inverted modulation, so a sync pulse
    corresponding to 100% modulation was always available. I can't imagine
    Philips would have produced a model with such gross errors as you
    describe. Was your own set faulty or was this a common insurmountable
    problem caused by NTSC and positive modulation on the system in use in
    the U.S. at the time?

    In Europe, Philips and Mullard (their UK valve-making subsidiary)
    published large quantities of material to aid set designers and help
    them get the best out of their range of valves. I read it several years
    before the colour television service started in England and it included
    details on DC coupling and AGC. (The BBC did a lot of their preliminary
    experimental work using NTSC - but eventually decided to use PAL for the
    public broadcast system).

    When I was a kid we had an RCA 12" round-tube B+W TV, in a giant piece
    of furniture cabinet with a 12" speaker. It was all AC coupled, so the
    screen always averaged grey.

    It made a pretty good liquor cabinet, after I scrounged it for parts.

  • From bitrex@21:1/5 to Don Y on Thu Feb 27 11:54:07 2025
    On 2/27/2025 1:04 AM, Don Y wrote:
    On 2/26/2025 9:52 PM, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    Analog television had two distinct issues: one was multipath problems (bummer), the other was that you COULD eke a signal out of the ether,
    even a bad one (contrast to digital which is essentially "all or nothing")

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish
    automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    My last set was a JVC AV2600US (IIRC):
       <https://i.ebayimg.com/images/g/cj4AAOSwMKdksA8u/s-l1600.webp>
    It was big and heavy, had funny "dumbo ears" -- but just kept on
    running, year after year (I think I purchased it in the
    early 80's and it lasted well into this millennium -- not bad for
    a $1K investment!)

    The first TV I have memory of was an early 80s Motorola/Quasar 19" that
    my parents probably bought not long after I was born and looked a lot
    like this one:

    <https://www.intervideo.co/quasar-wt5957-19-crt-television/amp/>

    Reception wasn't usually a problem; we were 10 miles south of the main
    Boston-area antenna fields in Newton, so just rabbit ears sufficed and
    it worked fine hooked up to a Nintendo. That served well into the 90s,
    when we got a larger Mitsubishi tube-type TV that served well into the
    2000s, and then that was about it for the tubes.


  • From john larkin@21:1/5 to john larkin on Thu Feb 27 08:40:05 2025
    On Thu, 27 Feb 2025 08:37:32 -0800, john larkin <jl@650pot.com> wrote:

    On Thu, 27 Feb 2025 09:47:22 +0000, liz@poppyrecords.invalid.invalid
    (Liz Tuddenham) wrote:

    Sylvia Else <sylvia@email.invalid> wrote:

    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish
    automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my
    experience was that instead it was based on the average amplitude of the
    demodulated signal. A black image containing large white text, such as a
    title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I think you may be mis-remembering; something similar to the fault you
    describe was prevalent on B&W televisions which were built down to a
    price. Some of it was caused by average AGC and some was due to lack of
    DC coupling, or skimped DC restoration, in the video amplifier. Some of
    the better sets used back-porch AGC and, for the enthusiast, add-on
    circuits were published in Wireless World (designed by Mothersole, I
    think).

    From the beginnings of colour television the designers recognised that
    all three video amplifiers had to be DC coupled but the AGC was much
    simpler because they used inverted modulation, so a sync pulse
    corresponding to 100% modulation was always available. I can't imagine
    Philips would have produced a model with such gross errors as you
    describe. Was your own set faulty or was this a common insurmountable
    problem caused by NTSC and positive modulation on the system in use in
    the U.S. at the time?

    In Europe, Philips and Mullard (their UK valve-making subsidiary)
    published large quantities of material to aid set designers and help
    them get the best out of their range of valves. I read it several years
    before the colour television service started in England and it included
    details on DC coupling and AGC. (The BBC did a lot of their preliminary
    experimental work using NTSC - but eventually decided to use PAL for the
    public broadcast system).

    When I was a kid we had an RCA 12" round-tube B+W TV, in a giant piece
    of furniture cabinet with a 12" speaker. It was all AC coupled, so the
    screen always averaged grey.

    It made a pretty good liquor cabinet, after I scrounged it for parts.

    We only had three channels, but the programming was better than what
    we have now.

  • From KevinJ93@21:1/5 to Sylvia Else on Thu Feb 27 11:58:53 2025
    On 2/26/25 8:52 PM, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my experience was that instead it was based on the average amplitude of the demodulated signal. A black image containing large white text, such as a title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I suspect this same poor AGC was responsible for a shift in the
    detection of the sync pulse such that the text would be moved to the
    right of its proper position, which could result in distortion of the
    letters as the average brightness varied line by line.

    In the early days of television, using thermionic valves, it was
    probably a miracle that these things worked at all, but surely in the transistor age, something better could have been provided.

    Were studio monitors any better, anyone know?

    Sylvia.

    Are you sure that 1984 date is correct? By 1970 in the UK colour TVs
    used transistor signal processing stages and many had already changed to transistors for the power stages such as line and frame output as well
    as using chopper stabilised power supplies.

    Typically they used some form of gated AGC to not be affected by the
    video modulation.

    kw

  • From KevinJ93@21:1/5 to Martin Brown on Thu Feb 27 12:01:39 2025
    On 2/27/25 1:49 AM, Martin Brown wrote:
    On 27/02/2025 08:10, Jeff Layman wrote:
    On 27/02/2025 04:52, Sylvia Else wrote:
    <...>
    I always believed that made it impossible for them not to have
    newscasters with flesh that slowly shifted between ghastly green and
    purple tones (or was clamped to unnatural pale orange like the
    Donald's). NTSC was called Never Twice the Same Colour in the UK for
    good reason. PAL was self correcting. My Japanese sets could do both.

    However, when I was in Japan I saw US style NTSC TV implemented
    correctly. It seems there was no reason that it could not be made to
    work well, only that US makers couldn't be bothered to do it right.
    <...>

    Sony was notorious for not implementing PAL decoding fully by omitting
    the delay line and effectively processing PAL as if it was NTSC.
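    For illustration (my own sketch with assumed numbers, nothing documented
    about those sets): decoding PAL line-by-line without the delay line still
    recovers the colour, but leaves alternate lines with equal and opposite
    hue errors, the "Hanover bars" the averaging delay line exists to
    suppress, relying instead on the eye to blend adjacent lines.

    ```python
    import cmath, math

    def line_hue_error(phi_deg, line_odd, u=0.3, v=0.4):
        """Hue error (degrees) of one line decoded without the PAL delay line."""
        v_tx = -v if line_odd else v                  # PAL V-axis switch
        rx = complex(u, v_tx) * cmath.exp(1j * math.radians(phi_deg))
        decoded = rx.conjugate() if line_odd else rx  # undo the switch only
        return math.degrees(cmath.phase(decoded) - cmath.phase(complex(u, v)))

    # A 15 degree path error gives +15 on even lines and -15 on odd lines,
    # i.e. alternating hue stripes rather than a single corrected colour.
    errors = [line_hue_error(15.0, odd) for odd in (False, True)]
    assert round(errors[0], 6) == 15.0 and round(errors[1], 6) == -15.0
    ```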

    kw

  • From Lasse Langwadt@21:1/5 to john larkin on Thu Feb 27 20:12:00 2025
    On 2/27/25 17:40, john larkin wrote:
    On Thu, 27 Feb 2025 08:37:32 -0800, john larkin <jl@650pot.com> wrote:

    On Thu, 27 Feb 2025 09:47:22 +0000, liz@poppyrecords.invalid.invalid
    (Liz Tuddenham) wrote:

    Sylvia Else <sylvia@email.invalid> wrote:

    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish
    automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my
    experience was that instead it was based on the average amplitude of the
    demodulated signal. A black image containing large white text, such as a
    title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I think you may be mis-remembering; something similar to the fault you
    describe was prevalent on B&W televisions which were built down to a
    price. Some of it was caused by average AGC and some was due to lack of
    DC coupling, or skimped DC restoration, in the video amplifier. Some of
    the better sets used back-porch AGC and, for the enthusiast, add-on
    circuits were published in Wireless World (designed by Mothersole, I
    think).

    From the beginnings of colour television the designers recognised that
    all three video amplifiers had to be DC coupled but the AGC was much
    simpler because they used inverted modulation, so a sync pulse
    corresponding to 100% modulation was always available. I can't imagine
    Philips would have produced a model with such gross errors as you
    describe. Was your own set faulty or was this a common insurmountable
    problem caused by NTSC and positive modulation on the system in use in
    the U.S. at the time?

    In Europe, Philips and Mullard (their UK valve-making subsidiary)
    published large quantities of material to aid set designers and help
    them get the best out of their range of valves. I read it several years
    before the colour television service started in England and it included
    details on DC coupling and AGC. (The BBC did a lot of their preliminary
    experimental work using NTSC - but eventually decided to use PAL for the
    public broadcast system).

    When I was a kid we had an RCA 12" round-tube B+W TV, in a giant piece
    of furniture cabinet with a 12" speaker. It was all AC coupled, so the
    screen always averaged grey.

    It made a pretty good liquor cabinet, after I scrounged it for parts.

    We only had three channels, but the programming was better than what
    we have now.


    https://en.wikipedia.org/wiki/Rosy_retrospection ;)

  • From Don Y@21:1/5 to bitrex on Thu Feb 27 13:16:02 2025
    On 2/27/2025 9:54 AM, bitrex wrote:
    On 2/27/2025 1:04 AM, Don Y wrote:
    My last set was a JVC AV2600US (IIRC):
        <https://i.ebayimg.com/images/g/cj4AAOSwMKdksA8u/s-l1600.webp>
    It was big and heavy, had funny "dumbo ears" -- but just kept on
    running, year after year (I think I purchased it in the
    early 80's and it lasted well into this millennium -- not bad for
    a $1K investment!)

    The first TV I have memory of was an early 80s Motorola/Quasar 19" that my parents probably bought not long after I was born and looked a lot like this one:

    <https://www.intervideo.co/quasar-wt5957-19-crt-television/amp/>

    Reception wasn't usually a problem; we were 10 miles south of the main Boston-area antenna fields in Newton, so just rabbit ears sufficed and it worked fine hooked up to a Nintendo. That served well into the 90s, when we got a larger Mitsubishi tube-type TV that served well into the 2000s, and then that was about it for the tubes.

    Our first set (B&W) was a Philco (?) with a roundish tube.
    But, it had a MOTORIZED channel selector; a button on the
    top of the cabinet would advance the selector for as long as
    you held it depressed.

    I can still recall the distinctive "rattling" sound that
    followed the selector's settling into each new position.

    It's the sort of memory like that of a score-motor in a pinball
    machine or the lifting mechanism of a shuffle-bowling machine --
    distinctive and memorable.

  • From john larkin@21:1/5 to All on Thu Feb 27 12:27:41 2025
    On Thu, 27 Feb 2025 20:12:00 +0100, Lasse Langwadt <llc@fonz.dk>
    wrote:

    On 2/27/25 17:40, john larkin wrote:
    On Thu, 27 Feb 2025 08:37:32 -0800, john larkin <jl@650pot.com> wrote:

    On Thu, 27 Feb 2025 09:47:22 +0000, liz@poppyrecords.invalid.invalid
    (Liz Tuddenham) wrote:

    Sylvia Else <sylvia@email.invalid> wrote:

    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish
    automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my
    experience was that instead it was based on the average amplitude of the
    demodulated signal. A black image containing large white text, such as a
    title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I think you may be mis-remembering; something similar to the fault you
    describe was prevalent on B&W televisions which were built down to a
    price. Some of it was caused by average AGC and some was due to lack of
    DC coupling, or skimped DC restoration, in the video amplifier. Some of
    the better sets used back-porch AGC and, for the enthusiast, add-on
    circuits were published in Wireless World (designed by Mothersole, I
    think).

    From the beginnings of colour television the designers recognised that
    all three video amplifiers had to be DC coupled but the AGC was much
    simpler because they used inverted modulation, so a sync pulse
    corresponding to 100% modulation was always available. I can't imagine
    Philips would have produced a model with such gross errors as you
    describe. Was your own set faulty or was this a common insurmountable
    problem caused by NTSC and positive modulation on the system in use in
    the U.S. at the time?

    In Europe, Philips and Mullard (their UK valve-making subsidiary)
    published large quantities of material to aid set designers and help
    them get the best out of their range of valves. I read it several years
    before the colour television service started in England and it included
    details on DC coupling and AGC. (The BBC did a lot of their preliminary
    experimental work using NTSC - but eventually decided to use PAL for the
    public broadcast system).

    When I was a kid we had an RCA 12" round-tube B+W TV, in a giant piece
    of furniture cabinet with a 12" speaker. It was all AC coupled, so the
    screen always averaged grey.

    It made a pretty good liquor cabinet, after I scrounged it for parts.

    We only had three channels, but the programming was better than what
    we have now.


    https://en.wikipedia.org/wiki/Rosy_retrospection ;)

    Early TV had documentaries, science, concerts, live Play Of The Week,
    Kukla, Fran and Ollie, westerns, and variety shows.

    Now we have superheroes, ultraviolence, and nonstop swearing.

    We do get about an hour or so per week of decent stuff on PBS. Miss
    Marple or All Creatures sorts of things. Not much.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin Brown@21:1/5 to All on Thu Feb 27 20:45:07 2025
    On 27/02/2025 19:58, KevinJ93 wrote:
    On 2/26/25 8:52 PM, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish
    automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my
    experience was that instead it was based on the average amplitude of
    the demodulated signal. A black image containing large white text,
    such as a title screen, would show a clear darkening to the sides of
    the text, while being decidedly grey over the rest of the screen.

    I suspect this same poor AGC was responsible for a shift in the
    detection of the sync pulse such that the text would be moved to the
    right of its proper position, which could result in distortion of the
    letters as the average brightness varied line by line.

    In the early days of television, using thermionic valves, it was
    probably a miracle that these things worked at all, but surely in the
    transistor age, something better could have been provided.

    Were studio monitors any better, anyone know?

    Sylvia.

    Are you sure that 1984 date is correct? By 1970 in the UK colour TVs
    used transistor signal processing stages and many had already changed to transistors for the power stages such as line and frame output as well
    as using chopper stabilised power supplies.

    The first two colour TVs I recall owned by friends or family were about
    the time of Apollo 8 in 1968. Memorable for the Earth rise shot. Both
    were entirely valves and my uncle's caught fire leaving a nasty brown
    burn mark on their wool carpet and smoke damage on the ceiling.

    The earliest was at a school friend's house and was in pastel shades,
    pre-Nd glass. It was in colour but only just... Joe 90 launch was the first programme I can recall watching there in colour. Test cards in shops
    don't count.

    I'd believe 1974 as a date for hybrid colour TVs that almost worked
    correctly and didn't need a service engineer visiting them every other
    week. By 1980 I'm pretty sure they were almost entirely semiconductor based.

    Typically they used some form of gated AGC to not be affected by the
    video modulation.
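    The difference between average-responding and gated AGC can be sketched
    numerically. This is a toy model with a made-up gain setpoint, not a
    description of any particular chassis: it uses the usual 1 V p-p levels
    (sync tip 0.0, blanking 0.3, peak white 1.0), so sync is the 30% of the
    total mentioned earlier in the thread.

```python
# Sketch: mean-level AGC derives gain from the average demodulated video,
# so picture content leaks into the gain; sync-gated AGC samples only the
# sync-tip depth, which the transmitter holds constant.

SYNC_TIP, BLANKING, WHITE = 0.0, 0.3, 1.0
TARGET_SYNC_DEPTH = BLANKING - SYNC_TIP   # 0.3 V: what AGC should hold

def make_line(picture_level, n=100):
    """A toy scan line: ten sync samples, then flat picture content."""
    return [SYNC_TIP] * 10 + [picture_level] * (n - 10)

def mean_level_gain(line):
    # Gain from average video amplitude (arbitrary 0.5 V setpoint):
    # brightness of the scene disturbs it.
    return 0.5 / (sum(line) / len(line))

def gated_gain(line):
    # Gain from the gated sync-tip depth only: scene-independent.
    return TARGET_SYNC_DEPTH / (BLANKING - min(line[:10]))

dark = make_line(0.35)    # mostly-black title card
bright = make_line(0.95)  # mostly-white scene

print(mean_level_gain(dark), mean_level_gain(bright))  # very different gains
print(gated_gain(dark), gated_gain(bright))            # identical gains
```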

    I recall sound on vision being a bit of a problem too with certain types
    of check or dogtooth suits or other high contrast periodic fabrics.

    --
    Martin Brown

  • From Don Y@21:1/5 to Lasse Langwadt on Thu Feb 27 13:55:56 2025
    On 2/27/2025 12:12 PM, Lasse Langwadt wrote:
    https://en.wikipedia.org/wiki/Rosy_retrospection  ;)

    I found it particularly amusing when the "thought police" opted
    to do away with all of the "violent" scenes in cartoons and
    movies.

    Really? Was seeing Wile E. Coyote get his clock cleaned 10
    times each minute really harmful? Or, Daffy Duck's head being
    blasted (by Elmer or a clever Bugs)? Or, Hector/Spike/Butch
    tearing a new *sshole in Sylvester?

    Or, Kato and The Inspector "going at it" without restraint?
    Or, Moe slapping Curly/Shemp/Joe/Larry silly?

    Now, that violence (and more) is treated as normal (though the
    cartoons have gone down the shitter; they're just not as
    "clever")

  • From Chris Jones@21:1/5 to Sylvia Else on Fri Feb 28 11:46:14 2025
    On 27/02/2025 3:52 pm, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my experience was that instead it was based on the average amplitude of the demodulated signal. A black image containing large white text, such as a title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I suspect this same poor AGC was responsible for a shift in the
    detection of the sync pulse such that the text would be moved to the
    right of its proper position, which could result in distortion of the
    letters as the average brightness varied line by line.

    In the early days of television, using thermionic valves, it was
    probably a miracle that these things worked at all, but surely in the transistor age, something better could have been provided.

    Were studio monitors any better, anyone know?

    Sylvia.

    It sounds like you just had a crap telly.

    I find that the only thing worse about it was the resolution.

    If reception conditions were poor, yes the picture could degrade a bit,
    but that is far preferable to the behaviour of digital systems that
    completely drop out halfway through the movie if the rain gets too
    heavy. If you're watching an off-air recording, that is particularly frustrating, because moving the aerial could only have been done in the
    past. A good outdoor aerial is much more necessary than it used to be.

  • From Don Y@21:1/5 to Chris Jones on Thu Feb 27 18:29:27 2025
    On 2/27/2025 5:46 PM, Chris Jones wrote:
    If reception conditions were poor, yes the picture could degrade a bit, but that is far preferable to the behaviour of digital systems that completely drop
    out halfway through the movie if the rain gets too heavy.

    Is it the presence of the water in the air that is the problem?
    I've noted the correlation but assumed it was because of the
    effects of wind and rain on the *tree* that is in my line-of-sight
    to the broadcast towers.

    I.e., if the transmitter had a COMPLETELY unobstructed path to my
    antenna, would rain *still* be a problem?

    If you're watching an
    off-air recording that is particularly frustrating, because moving the aerial could only have been done in the past. A good outdoor aerial is much more necessary than it used to be.



  • From Chris Jones@21:1/5 to Don Y on Fri Feb 28 12:49:49 2025
    On 28/02/2025 12:29 pm, Don Y wrote:
    On 2/27/2025 5:46 PM, Chris Jones wrote:
    If reception conditions were poor, yes the picture could degrade a
    bit, but that is far preferable to the behaviour of digital systems
    that completely drop out halfway through the movie if the rain gets
    too heavy.

    Is it the presence of the water in the air that is the problem?
    I've noted the correlation but assumed it was because of the
    effects of wind and rain on the *tree* that is in my line-of-sight
    to the broadcast towers.


    I don't know, and I expect it depends on many things.

    I.e., if the transmitter had a COMPLETELY unobstructed path to my
    antenna, would rain *still* be a problem?

    For satellite (12 GHz), certainly yes, unless you have an extra-big dish.
    Not sure about terrestrial at UHF.
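    A rough back-of-envelope model shows why. Specific rain attenuation is often
    modelled as gamma = k * R^alpha in dB/km (the power-law form used in
    ITU-R P.838), where R is the rain rate in mm/h. The coefficients below are
    illustrative round numbers in the right ballpark that I've assumed for the
    sketch, not the official table values.

```python
# Why rain matters at Ku-band but barely at UHF: attenuation per km follows
# roughly k * R**alpha, and k collapses by orders of magnitude below ~10 GHz.

def rain_loss_db(rain_mm_per_h, path_km, k, alpha):
    """Total extra path loss from rain, in dB, over a uniform rain path."""
    return k * rain_mm_per_h ** alpha * path_km

HEAVY_RAIN = 50.0   # mm/h, a strong downpour
PATH_KM = 5.0       # assumed slant path through the rain cell

ku_band = rain_loss_db(HEAVY_RAIN, PATH_KM, k=0.02, alpha=1.2)  # ~12 GHz
uhf = rain_loss_db(HEAVY_RAIN, PATH_KM, k=1e-4, alpha=1.0)      # ~600 MHz

print(f"Ku-band: ~{ku_band:.1f} dB extra loss")  # easily eats the link margin
print(f"UHF:     ~{uhf:.3f} dB extra loss")      # negligible
```

    So an unobstructed UHF path should survive rain that would take out a
    12 GHz satellite feed, which points at the wet, blown tree as the likelier
    culprit for terrestrial dropouts.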

  • From Sylvia Else@21:1/5 to Liz Tuddenham on Fri Feb 28 10:44:21 2025
    On 27-Feb-25 5:47 pm, Liz Tuddenham wrote:
    Sylvia Else <sylvia@email.invalid> wrote:

    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish
    automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    The AGC should have been based on the amplitude of the sync pulses,
    which was 30% of the total. I'm sure this could have been done, but my
    experience was that instead it was based on the average amplitude of the
    demodulated signal. A black image containing large white text, such as a
    title screen, would show a clear darkening to the sides of the text,
    while being decidedly grey over the rest of the screen.

    I think you may be mis-remembering; something similar to the fault you describe was prevalent on B&W televisions which were built down to a
    price. Some of it was caused by average AGC and some was due to lack of
    DC coupling, or skimped DC restoration, in the video amplifier. Some of
    the better sets used back-porch AGC and, for the enthusiast, add-on
    circuits were published in Wireless World (designed by Mothersole, I
    think).

    From the beginnings of colour television the designers recognised that
    all three video amplifiers had to be DC coupled but the AGC was much
    simpler because they used inverted modulation, so a sync pulse
    corresponding to 100% modulation was always available. I can't imagine Philips would have produced a model with such gross errors as you
    describe. Was your own set faulty, or was this a common insurmountable problem caused by NTSC and positive modulation on the system in use in
    the U.S. at the time?

    In Europe, Philips and Mullard (their UK valve-making subsidiary)
    published large quantities of material to aid set designers and help
    them get the best out of their range of valves. I read it several years before the colour television service started in England and it included details on DC coupling and AGC. (The BBC did a lot of their preliminary experimental work using NTSC - but eventually decided to use PAL for the public broadcast system).



    This is a repost, because the original seems to have disappeared into a
    Usenet black-hole. At least, I can't see it.

    Also, it was a Philips TV.


    I was living in France, and there was a legal requirement for new
    televisions to have a SCART socket. There was a video output pin that
    was specified as being 1 V peak-to-peak. So the sync pulses should have
    been 0.3 V. In practice they varied between that and perhaps twice that, depending on the image brightness.
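    For reference, the arithmetic behind those levels; the 2x gain-error case
    matches the "perhaps twice that" observation above (a quick check, not a
    SCART specification quote):

```python
# Composite video in the standard 1 V p-p convention: sync occupies 0.3 V
# and picture 0.7 V, so sync is 30% of the total amplitude - which is why a
# correctly working AGC can normalise on it.

TOTAL_PP = 1.0   # volts, sync tip to peak white
SYNC_PP = 0.3    # volts, sync tip to blanking

sync_fraction = SYNC_PP / TOTAL_PP
print(f"sync is {sync_fraction:.0%} of the composite signal")

# If the AGC instead tracks average picture level, its gain error shows up
# directly as a scaled sync amplitude at the video output:
for agc_gain_error in (1.0, 2.0):
    print(f"gain x{agc_gain_error}: sync = {SYNC_PP * agc_gain_error:.1f} V")
```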

    Sylvia.

  • From Sylvia Else@21:1/5 to Lasse Langwadt on Fri Feb 28 10:55:48 2025
    On 28-Feb-25 3:12 am, Lasse Langwadt wrote:


    https://en.wikipedia.org/wiki/Rosy_retrospection  ;)


    Either I'm immune to that, or my past was truly awful. Given that I'm
    still alive, I suspect the former [*].

    As a slight aside, I tried watching episodes of the first series of Star
    Trek a few years back. It was rather obvious that the display panels on
    the Enterprise's bridge were actually back-lit paper. Good enough for
    TVs of the time, apparently, but not for modern digital displays.

    Sylvia.

    [*] I don't want to overstate this - no need to set up a go-fund-me page
    in my name.

  • From KevinJ93@21:1/5 to Martin Brown on Thu Feb 27 19:03:22 2025
    On 2/27/25 12:45 PM, Martin Brown wrote:
    On 27/02/2025 19:58, KevinJ93 wrote:
    On 2/26/25 8:52 PM, Sylvia Else wrote:
    <...>

    Are you sure that 1984 date is correct? By 1970 in the UK colour TVs
    used transistor signal processing stages and many had already changed
    to transistors for the power stages such as line and frame output as
    well as using chopper stabilised power supplies.

    The first two colour TVs I recall owned by friends or family were about
    the time of Apollo 8 in 1968. Memorable for the Earth rise shot. Both
    were entirely valves and my uncle's caught fire leaving a nasty brown
    burn mark on their wool carpet and smoke damage on the ceiling.

    The earliest was at a school friends house and was in pastel shades pre
    Nd glass. It was in colour but only just... Joe 90 launch was the first programme I can recall watching there in colour. Test cards in shops
    don't count.

    I'd believe 1974 as a date for hybrid colour TVs that almost worked
    correctly and didn't need a service engineer visiting them every other
    week. By 1980 I'm pretty sure they were almost entirely semiconductor
    based.

    My father bought a Ferguson 19" colour TV at the end of 1970 that was
    fully semiconductor (it was my first term at university and he got it
    just before I came back for Christmas). It seemed to work fairly well -
    he would tinker with it but I don't remember it needing any significant
    repair. I gather it was one of the first such sets.

    The set in the common room at university was an older valve based one
    with no blue channel but it was surprisingly watchable and it didn't
    stop all the students filling the room to bursting point when Top of the
    Pops was on on Thursday night.

    https://oldtechnology.net/colour.html#ferguson3703

  • From Don Y@21:1/5 to Sylvia Else on Thu Feb 27 20:28:57 2025
    On 2/27/2025 7:55 PM, Sylvia Else wrote:
    As a slight aside, I tried watching episodes of the first series of Star Trek a
    few years back. It was rather obvious that the display panels on the Enterprise's bridge were actually back-lit paper. Good enough for TVs of the time, apparently, but not for modern digital displays.

    Filming "monitors" leads to visual artifacts as the frame rate
    need not agree (i.e., with REAL film) or the synchronization
    can wander. So, it was always best to fudge.

    Of course, nowadays, these issues are easily overcome.

    ObTrivia: "The Guide" (in the first series/movie) was
    actually projected from below. I guess hand-held tablets
    were technically impractical, for emulation, at the time.

  • From Jeff Layman@21:1/5 to All on Fri Feb 28 08:15:42 2025
    On 28/02/2025 03:03, KevinJ93 wrote:
    On 2/27/25 12:45 PM, Martin Brown wrote:
    On 27/02/2025 19:58, KevinJ93 wrote:
    On 2/26/25 8:52 PM, Sylvia Else wrote:
    <...>

    Are you sure that 1984 date is correct? By 1970 in the UK colour TVs
    used transistor signal processing stages and many had already changed
    to transistors for the power stages such as line and frame output as
    well as using chopper stabilised power supplies.

    The first two colour TVs I recall owned by friends or family were about
    the time of Apollo 8 in 1968. Memorable for the Earth rise shot. Both
    were entirely valves and my uncle's caught fire leaving a nasty brown
    burn mark on their wool carpet and smoke damage on the ceiling.

    The earliest was at a school friends house and was in pastel shades pre
    Nd glass. It was in colour but only just... Joe 90 launch was the first
    programme I can recall watching there in colour. Test cards in shops
    don't count.

    I'd believe 1974 as a date for hybrid colour TVs that almost worked
    correctly and didn't need a service engineer visiting them every other
    week. By 1980 I'm pretty sure they were almost entirely semiconductor
    based.

    My father bought a Ferguson 19" colour TV at the end of 1970 that was
    fully semiconductor (it was my first term at university and he got it
    just before I came back for Christmas). It seemed to work fairly well -
    he would tinker with it but I don't remember it needing any significant repair. I gather it was one of the first such sets.

    The first domestic UK colour sets were valve-based. However, it wasn't
    long before transistor sets came in. See page 22 at <https://americanradiohistory.com/UK/Practical-Television/60s/Practical-Television-1968-06.pdf#search=%22practical%20television%22>.
    This was the June 1968 edition of Practical Television, and it refers to
    the new 19" Marconiphone Model 4701 as being "fully transistorised".
    More details can be found in Practical TV July and September 1967.
    What's amazing to me is the price - "284 guineas". So just short of £300
    in 1968; equivalent to £4500 today!!!
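    The guinea arithmetic checks out (a guinea being 21 shillings, i.e. £1.05 in
    decimal money):

```python
# 284 guineas in decimal pounds, and the inflation multiplier implied by
# the "GBP 4500 today" figure quoted above.

GUINEA = 1.05            # pounds: 21 shillings / 20 shillings per pound
price_1968 = 284 * GUINEA
print(f"GBP {price_1968:.2f}")                    # just short of GBP 300

print(f"multiplier ~ {4500 / price_1968:.0f}x")   # roughly 15x since 1968
```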

    --
    Jeff

  • From Martin Brown@21:1/5 to Don Y on Fri Feb 28 12:08:59 2025
    On 28/02/2025 01:29, Don Y wrote:
    On 2/27/2025 5:46 PM, Chris Jones wrote:
    If reception conditions were poor, yes the picture could degrade a
    bit, but that is far preferable to the behaviour of digital systems
    that completely drop out halfway through the movie if the rain gets
    too heavy.

    Is it the presence of the water in the air that is the problem?
    I've noted the correlation but assumed it was because of the
    effects of wind and rain on the *tree* that is in my line-of-sight
    to the broadcast towers.

    Quite likely to be a factor. Wet tree leaves can be pretty destructive
    to signals, hence our local microwave internet requires clear line of
    sight or the microwave link is unreliable.

    Some VHF/UHF bands are less susceptible to water in their path.

    I.e., if the transmitter had a COMPLETELY unobstructed path to my
    antenna, would rain *still* be a problem?

    Obviously it depends a lot on the frequency, but my satellite feed drops
    out when there are tall cumulonimbus thunderclouds overhead irrespective
    of whether it is actually raining. TDTV holds good under most conditions
    except when the local mast burned down spectacularly in a freak accident.

    --
    Martin Brown

  • From Bill Sloman@21:1/5 to Sylvia Else on Sat Mar 1 02:32:50 2025
    On 27/02/2025 3:52 pm, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    It always struck me as fine. We were late getting it in Northern
    Tasmania - the first transmissions that made it were from the 1956
    Melbourne Olympic Games, and the only people who bought sets were mad
    optimists because the signal very rarely made it across Bass Strait.

    I didn't get to see it much until I started my university studies in
    Melbourne in 1960 when there was a TV in my residential college.

    It was a PAL system - pretty much the UK-German system

    But my memory of a Philips Colour TV (1984ish) was that it had rubbish automatic gain control (AGC), and odd interactions between brightness
    and picture position.

    By 1976 I was working at EMI Central Research (where Alan Dower Blumlein
    had pretty much invented television in the late 1930s, and my boss had
    been involved in the legal dispute between RCA and EMI about whose
    patents had priority - RCA won mainly because the US court didn't want
    to see that quadrature decoding was was the same thing as sine-cosine decoding). I never owned a home TV - but when I got married in 1979 my
    wife insisted that we buy one, and it worked fine.

    By then I knew exactly what it was doing. At Cambridge Instrument we
    used standard consumer TV tubes as displays for our scanning electron microscope - we'd buy a couple of hundred out of a commercial batch, and
    bodge the cabinets to accommodate that batch. The next batch was always
    a slightly different shape, and the sheet metal shop could always adjust
    the cabinet to accommodate the new batch.

    For a while we generated a 624-line display - a timing PROM had gotten corrupted. One of the other engineers eventually worked out that the
    timing was wrong and I worked out how the PROM had been intended to be programmed.

    Were studio monitors any better, anyone know?

    TV based visual displays could be remarkably good. Cambridge Instruments offered a photo-monitor which presented a slow 4000 line display -
    intended to show slow high-resolution scans to be recorded on
    photographic film for publication. They used very special tubes with the
    inner surface metalised to provide electrostatic screening. A batch came
    in where the metal layer didn't look shiny and they almost worked ...

    --
    Bill Sloman, Sydney

  • From Chris Jones@21:1/5 to Martin Brown on Fri Feb 28 11:39:58 2025
    On 27/02/2025 8:49 pm, Martin Brown wrote:
    ISTR the original valve EHT stack rectifier gave off X-rays and had a
    lead shield around it.

    I think that was mostly from the shunt regulator tube that some early
    colour TVs used to regulate the EHT. The diode mostly didn't experience
    large voltages and large currents simultaneously, but the shunt
    regulator did.

  • From Don Y@21:1/5 to Martin Brown on Fri Feb 28 10:37:12 2025
    On 2/28/2025 5:08 AM, Martin Brown wrote:
    On 28/02/2025 01:29, Don Y wrote:
    On 2/27/2025 5:46 PM, Chris Jones wrote:
    If reception conditions were poor, yes the picture could degrade a bit, but that is far preferable to the behaviour of digital systems that completely drop out halfway through the movie if the rain gets too heavy.

    Is it the presence of the water in the air that is the problem?
    I've noted the correlation but assumed it was because of the
    effects of wind and rain on the *tree* that is in my line-of-sight
    to the broadcast towers.

    Quite likely to be a factor. Wet tree leaves can be pretty destructive to signals, hence our local microwave internet requires clear line of sight or the microwave link is unreliable.

    Some VHF/UHF bands are less susceptible to water in their path.

    *Most* of the local broadcasters transmit from the same general
    location "on the mountain". We're pretty close to the *base* of
    said mountain so the line-of-sight to the transmitters is ~12 miles
    "over" and ~2 miles *up*. For most positionings of the (outdoor)
    antenna, this passes directly through the SOLE tall tree in the
    area. :< Of course, said tree will continue to grow taller and wider.

    We've noticed that reception suffers during Monsoon storms -- high
    winds and intense rains. But, I've not been able to determine
    definitively if it was the rain or the "wet, BLOWN tree" that was
    the problem.

    [This is important as it determines how much effort I should expend
    to re-site the receiving antenna]

    I.e., if the transmitter had a COMPLETELY unobstructed path to my
    antenna, would rain *still* be a problem?

    Obviously it depends a lot on the frequency, but my satellite feed drops out when there are tall cumulonimbus thunderclouds overhead irrespective of whether
    it is actually raining. TDTV holds good under most conditions except when the local mast burned down spectacularly in a freak accident.


  • From Martin Brown@21:1/5 to All on Fri Feb 28 17:39:46 2025
    On 28/02/2025 03:03, KevinJ93 wrote:
    On 2/27/25 12:45 PM, Martin Brown wrote:

    The first two colour TVs I recall owned by friends or family were
    about the time of Apollo 8 in 1968. Memorable for the Earth rise shot.
    Both were entirely valves and my uncle's caught fire leaving a nasty
    brown burn mark on their wool carpet and smoke damage on the ceiling.

    The earliest was at a school friends house and was in pastel shades
    pre Nd glass. It was in colour but only just... Joe 90 launch was the
    first programme I can recall watching there in colour. Test cards in
    shops don't count.

    I'd believe 1974 as a date for hybrid colour TVs that almost worked
    correctly and didn't need a service engineer visiting them every other
    week. By 1980 I'm pretty sure they were almost entirely semiconductor
    based.

    My father bought a Ferguson 19" colour TV at the end of 1970 that was
    fully semiconductor (it was my first term at university and he got it
    just before I came back for Christmas). It seemed to work fairly well -
    he would tinker with it but I don't remember it needing any significant repair. I gather it was one of the first such sets.

    It was the valve ones from the mid to late 60's that were both seriously expensive and unreliable. So much so that most people at the time rented
    them, since routine service and repair was then included. Enough money
    was made on the TV rental business to fund a new Cambridge college.

    https://www.robinson.cam.ac.uk/college-life/library/college-archive-and-history/sir-david-robinson

    There were others in the same white goods and TV rental business and
    several of the founders were big charity donors and philanthropists.

    The set in the common room at university was an older valve based one
    with no blue channel but it was surprisingly watchable and it didn't

    Valve sets ran hot and somehow valves would gradually work themselves
    out of sockets or just plain and simple fail at switch on. I saw one or
    two very old all valve sets survive into the 80's that kept the valve
    filaments warm continuously even when the set was nominally off.

    --
    Martin Brown

  • From Jeff Layman@21:1/5 to Martin Brown on Fri Feb 28 18:28:45 2025
    On 28/02/2025 17:39, Martin Brown wrote:

    Valve sets ran hot and somehow valves would gradually work themselves
    out of sockets or just plain and simple fail at switch on. I saw one or
    two very old all valve sets survive into the 80's that kept the valve filaments warm continuously even when the set was nominally off.

    That's how Colossus was run reliably at Bletchley Park. The valve
    filaments were always on.

    --
    Jeff

  • From Joe Gwinn@21:1/5 to '''newspam'''@nonad.co.uk on Fri Feb 28 13:48:02 2025
    On Fri, 28 Feb 2025 17:39:46 +0000, Martin Brown
    <'''newspam'''@nonad.co.uk> wrote:

    On 28/02/2025 03:03, KevinJ93 wrote:
    On 2/27/25 12:45 PM, Martin Brown wrote:

    The first two colour TVs I recall owned by friends or family were
    about the time of Apollo 8 in 1968. Memorable for the Earth rise shot.
    Both were entirely valves and my uncle's caught fire leaving a nasty
    brown burn mark on their wool carpet and smoke damage on the ceiling.

    The earliest was at a school friends house and was in pastel shades
    pre Nd glass. It was in colour but only just... Joe 90 launch was the
    first programme I can recall watching there in colour. Test cards in
    shops don't count.

    I'd believe 1974 as a date for hybrid colour TVs that almost worked
    correctly and didn't need a service engineer visiting them every other
    week. By 1980 I'm pretty sure they were almost entirely semiconductor
    based.

    My father bought a Ferguson 19" colour TV at the end of 1970 that was
    fully semiconductor (it was my first term at university and he got it
    just before I came back for Christmas). It seemed to work fairly well -
    he would tinker with it but I don't remember it needing any significant
    repair. I gather it was one of the first such sets.

    It was the valve ones from the mid to late 60's that were both seriously expensive and unreliable. So much so that most people at the time rented
    them since then routine service and repair was included. Enough money
    was made on the TV rental business to fund a new Cambridge college.

    https://www.robinson.cam.ac.uk/college-life/library/college-archive-and-history/sir-david-robinson

    There were others in the same white goods and TV rental business and
    several of the founders were big charity donors and philanthropists.

    The set in the common room at university was an older valve based one
    with no blue channel but it was surprisingly watchable and it didn't

    Valve sets ran hot and somehow valves would gradually work themselves
    out of sockets or just plain and simple fail at switch on. I saw one or
    two very old all valve sets survive into the 80's that kept the valve filaments warm continuously even when the set was nominally off.

    I remember those days. When in the 1970s the 1N4000-series rectifier
    diodes became strong enough and cheap enough, we would bridge the
    power switch with one such diode such that the filaments stayed warm
    (not hot) even when the TV was off.
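    The trick works because half-wave rectifying a sine into a resistive heater
    halves the delivered power. A quick check (the 120 V figure is my
    assumption, matching US mains in the anecdote):

```python
import math

# A single series diode passes only half of each mains cycle. For a sine,
# full-wave RMS is Vpk/sqrt(2); half-wave rectified RMS is Vpk/2. Into a
# resistive heater (P = V_rms^2 / R) that means exactly half power: warm
# filaments, not hot ones.

V_RMS_MAINS = 120.0                 # assumed US mains
v_pk = V_RMS_MAINS * math.sqrt(2)

v_rms_half = v_pk / 2               # heater RMS after the bridging diode
power_ratio = (v_rms_half / V_RMS_MAINS) ** 2

print(f"heater RMS drops to {v_rms_half:.1f} V")   # ~84.9 V
print(f"heater power falls to {power_ratio:.0%}")  # 50% of normal
```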

    Joe

  • From Tom Del Rosso@21:1/5 to Lasse Langwadt on Fri Feb 28 11:59:43 2025
    Lasse Langwadt wrote:
    On 2/27/25 17:40, john larkin wrote:

    We only had three channels, but the programming was better than what
    we have now.


    https://en.wikipedia.org/wiki/Rosy_retrospection ;)

    Commercial networks sometimes had operas, then even PBS stopped showing
    them, then the Bravo cable channel was created for them, then it became
    the Real Housewives channel.

    --

  • From john larkin@21:1/5 to fizzbintuesday@that-google-mail-dom on Fri Feb 28 18:40:58 2025
    On Fri, 28 Feb 2025 11:59:43 -0500, "Tom Del Rosso" <fizzbintuesday@that-google-mail-domain.com> wrote:

    Lasse Langwadt wrote:
    On 2/27/25 17:40, john larkin wrote:

    We only had three channels, but the programming was better than what
    we have now.


    https://en.wikipedia.org/wiki/Rosy_retrospection ;)

    Commercial networks sometimes had operas, then even PBS stopped showing
    them, then the Bravo cable channel was created for them, then it became
    the Real Housewives channel.

    Television shows used to come from New York.

    Now they come from Hollywood.

  • From Edward Rawde@21:1/5 to Jeff Layman on Sat Mar 1 20:52:46 2025
    "Jeff Layman" <Jeff@invalid.invalid> wrote in message news:vprrbe$3in6g$1@dont-email.me...
    On 28/02/2025 03:03, KevinJ93 wrote:
    On 2/27/25 12:45 PM, Martin Brown wrote:
    On 27/02/2025 19:58, KevinJ93 wrote:
    On 2/26/25 8:52 PM, Sylvia Else wrote:
    <...>

    Are you sure that 1984 date is correct? By 1970 in the UK colour TVs
    used transistor signal processing stages and many had already changed
    to transistors for the power stages such as line and frame output as
    well as using chopper stabilised power supplies.

    The first two colour TVs I recall owned by friends or family were about
    the time of Apollo 8 in 1968. Memorable for the Earth rise shot. Both
    were entirely valves and my uncle's caught fire leaving a nasty brown
    burn mark on their wool carpet and smoke damage on the ceiling.

    The earliest was at a school friends house and was in pastel shades pre
    Nd glass. It was in colour but only just... Joe 90 launch was the first
    programme I can recall watching there in colour. Test cards in shops
    don't count.

    I'd believe 1974 as a date for hybrid colour TVs that almost worked
    correctly and didn't need a service engineer visiting them every other
    week. By 1980 I'm pretty sure they were almost entirely semiconductor
    based.

    My father bought a Ferguson 19" colour TV at the end of 1970 that was
    fully semiconductor (it was my first term at university and he got it
    just before I came back for Christmas). It seemed to work fairly well -
    he would tinker with it but I don't remember it needing any significant
    repair. I gather it was one of the first such sets.

    The first domestic UK colour sets were valve-based. However, it wasn't long before transistor sets came in. See page 22 at
    <https://americanradiohistory.com/UK/Practical-Television/60s/Practical-Television-1968-06.pdf#search=%22practical%20television%22>.
    This was the June 1968 edition of Practical Television, and it refers to the new 19" Marconiphone Model 4701 as being "fully
    transistorised".

    I think they worked with a Texas Instruments facility in the UK, where
    the necessary transistor was produced to make it possible to do the
    line scan and EHT without valves.
    R2008B, I think. It doesn't seem possible to find any data on it now: https://www.google.com/search?q=R2008B+npn+transistor

    More details can be found in Practical TV July and September 1967. What's amazing to me is the price - "284 guineas". So just
    short of £300 in 1968; equivalent to £4500 today!!!
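    For anyone unfamiliar with pre-decimal money, the conversion checks out: a guinea was 21 shillings, and 20 shillings made a pound, so 284 guineas is indeed "just short of £300". The implied inflation multiplier below is a back-of-envelope figure derived from the quoted £4500, not an official index value.

    ```python
    # A guinea was 21 shillings and 20 shillings made a pound,
    # so 1 guinea = 21/20 = 1.05 pounds (decimal).
    GUINEA_IN_POUNDS = 21 / 20

    price_guineas = 284
    price_pounds = price_guineas * GUINEA_IN_POUNDS
    print(f"{price_guineas} guineas = £{price_pounds:.2f}")  # £298.20

    # "Equivalent to £4500 today" implies a cumulative inflation
    # multiplier of roughly 15x since 1968 (a rough figure inferred
    # from the post, not an official index).
    implied_multiplier = 4500 / price_pounds
    print(f"implied inflation multiplier: {implied_multiplier:.1f}x")
    ```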

    --
    Jeff

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Edward Rawde@21:1/5 to kevin_es@whitedigs.com on Sat Mar 1 20:15:49 2025
    "KevinJ93" <kevin_es@whitedigs.com> wrote in message news:vpqgb3$389ar$2@dont-email.me...
    On 2/27/25 1:49 AM, Martin Brown wrote:
    On 27/02/2025 08:10, Jeff Layman wrote:
    On 27/02/2025 04:52, Sylvia Else wrote:
    <...>
    I always believed that made it impossible for them not to have newscasters with flesh tones that slowly shifted between ghastly
    green and purple (or were clamped to an unnatural pale orange like the Donald's). NTSC was called Never Twice the Same Colour in
    the UK for good reason. PAL was self-correcting. My Japanese sets could do both.

    However, when I was in Japan I saw US-style NTSC TV implemented correctly. It seems there was no reason it could not be made to
    work well, only that US makers couldn't be bothered to do it right.
    <...>

    Sony was notorious for not implementing PAL decoding fully: they omitted the delay line and effectively processed PAL as if it
    were NTSC.

    They did that so that they didn't have to pay PAL license fees.
    But no-one cared because the picture on their first set sold in the UK, although small screen, was superior to any other set on the
    market.
    https://www.google.com/search?q=kv1320-ub
    They stopped doing it when all you needed for a PAL decoder was a chip, a crystal and a 64 µs piece of glass (which had got much
    smaller by then due to bouncing the signal around inside the glass).
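    The self-correction being discussed here can be sketched numerically. This is a toy model, not a real decoder: chroma is treated at baseband as a complex number U + jV (real decoders work on the modulated subcarrier), PAL's V-axis switch is modelled as conjugation on alternate lines, and the channel adds a static differential phase error. Averaging each line with the one-line (64 µs) delayed, un-switched previous line cancels the hue error, leaving only a small saturation loss.

    ```python
    import cmath
    import math

    def transmit_line(chroma, line, phase_error_deg):
        """PAL-encode one line's chroma (V sign alternates, modelled as
        conjugation) and apply the channel's static phase error."""
        sent = chroma if line % 2 == 0 else chroma.conjugate()
        return sent * cmath.exp(1j * math.radians(phase_error_deg))

    def pal_decode(even_rx, odd_rx):
        """Undo the V switch on the delayed line, then average: the phase
        error rotates the two lines in opposite directions and cancels."""
        return (even_rx + odd_rx.conjugate()) / 2

    chroma = complex(0.3, 0.4)   # arbitrary U, V values for illustration
    err_deg = 20                 # static differential phase error

    rx_even = transmit_line(chroma, 0, err_deg)
    rx_odd = transmit_line(chroma, 1, err_deg)
    decoded = pal_decode(rx_even, rx_odd)

    hue_in = math.degrees(cmath.phase(chroma))
    hue_out = math.degrees(cmath.phase(decoded))
    sat_ratio = abs(decoded) / abs(chroma)
    print(f"hue in {hue_in:.1f} deg, hue out {hue_out:.1f} deg")
    print(f"saturation ratio {sat_ratio:.3f}")  # cos(20 deg) ~ 0.940
    ```

    Without the delay line, as in those early Sony sets, the same phase error lands directly on each line's decoded hue; relying on the eye to average adjacent lines hides it only partially, which is (as I understand it) where the visible Hanover-bar effect on saturated colours comes from.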

    The history of Sony colour is interesting.
    Trinitron came about because this had to be abandoned: https://en.wikipedia.org/wiki/Chromatron


    kw

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Joerg@21:1/5 to Don Y on Tue Mar 11 13:03:05 2025
    On 2/26/25 10:04 PM, Don Y wrote:
    On 2/26/2025 9:52 PM, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    Analog television had two distinct issues:  one was multipath problems (bummer), the other was that you COULD eke a signal out of the ether,
    even a bad one (contrast to digital which is essentially "all or nothing")
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

    That's exactly the problem. I told one station manager that it would be
    a mistake to go full digital (well, the government made them) and even
    more of a mistake to give up their VHF channel without a fight. That it
    would erode viewership in the fringe areas, hence among people with
    more disposable income, who would migrate to the Internet, causing
    advertising income to drop and leading to painful staff cuts.

    He didn't believe me. And then pretty much all that happened.

    [...]

    --
    Regards, Joerg

    http://www.analogconsultants.com/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Joerg on Tue Mar 11 17:11:50 2025
    On 3/11/2025 1:03 PM, Joerg wrote:
    On 2/26/25 10:04 PM, Don Y wrote:
    On 2/26/2025 9:52 PM, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    Analog television had two distinct issues:  one was multipath problems
    (bummer), the other was that you COULD eke a signal out of the ether,
    even a bad one (contrast to digital which is essentially "all or nothing")
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

    That's exactly the problem. I told one station manager that it would be a mistake to go full digital (well, the government made them) and even more of a
    mistake to give up their VHF channel without a fight. That it would erode viewership in the fringe areas, hence among people with more disposable income, who would migrate to the Internet, causing advertising income to drop and leading to painful staff cuts.

    He didn't believe me. And then pretty much all that happened.

    I don't have much sympathy for broadcasters. The quality of content
    has fallen as the number of *available* broadcast channels has
    risen. I think we have *50* channels -- and there's STILL "nothing
    on". (we make extensive use of the public library's DVD collection)

    "Local news" is the one thing where you would think a broadcaster
    could offer value. Yet they waste the bandwidth showing talking heads
    instead of (visual) material that could enhance the issues they are
    discussing. E.g., why do I need to see a talking head instead of
    something -- ANYTHING -- that might better explain the issue
    being discussed?

    "The wildfire in east bumphuck has now grown to 30,000 acres..."
    Would it KILL you to show a MAP on the screen so we know WHERE
    east bumphuck is located? How does looking at a reporter's
    face add value? How does it justify the use of the public
    airwaves when the *content* could easily fit into the bandwidth
    of an AM radio station??

    Do we need to get the weather forecast in 30 second pieces, spread
    over a 30 minute broadcast?

    I can't understand why no one has replaced The Newsroom with
    AI-generated content and a modern version of Max Headroom.
    Think of all the salaries to be saved (and less reliance on
    commercial interruptions). "Hair and wardrobe provided by
    Joe Bloe..."

    It's just so much easier -- and takes less time -- to visit a few
    web sites for news and weather. And any other service for
    entertainment content.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From john larkin @21:1/5 to All on Tue Mar 11 19:44:48 2025
    On Tue, 11 Mar 2025 13:03:05 -0700, Joerg <news@analogconsultants.com>
    wrote:

    On 2/26/25 10:04 PM, Don Y wrote:
    On 2/26/2025 9:52 PM, Sylvia Else wrote:
    Leave aside the ghosting, which could largely be addressed by having a
    decent antenna.

    Analog television had two distinct issues:  one was multipath problems
    (bummer), the other was that you COULD eke a signal out of the ether,
    even a bad one (contrast to digital which is essentially "all or nothing")
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

    That's exactly the problem. I told one station manager that it would be
    a mistake to go full digital (well, the government made them) and even
    more of a mistake to give up their VHF channel without a fight. That it
    would erode viewership in the fringe areas, hence among people with
    more disposable income, who would migrate to the Internet, causing
    advertising income to drop and leading to painful staff cuts.

    He didn't believe me. And then pretty much all that happened.

    [...]

    It doesn't make much sense to blast megawatts of RF out into space,
    for a modest amount of one-way bandwidth.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)