Leave aside the ghosting, which could largely be addressed by having a decent antenna.
But my memory of a Philips Colour TV (1984ish) was that it had rubbish automatic gain control (AGC),
and odd interactions between brightness and picture position.
The AGC should have been based on the amplitude of the sync pulses, which was 30% of the total. I'm sure this could have been
done, but my experience was that instead it was based on the average amplitude of the demodulated signal. A black image containing
large white text, such as a title screen, would show a clear darkening to the sides of the text, while being decidedly grey over
the rest of the screen.
I suspect this same poor AGC was responsible for a shift in the detection of the sync pulse such that the text would be moved to
the right of its proper position, which could result in distortion of the letters as the average brightness varied line by line.
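As a rough sketch of the difference (Python, with made-up levels: 0.0 = sync
tip, 0.3 = blanking/black, 1.0 = peak white, and an arbitrary 100-sample line),
a gain derived from the sync amplitude is independent of picture content,
while a gain derived from the average of the demodulated video swings with it:

# Illustrative only: compares sync-keyed AGC with average-keyed AGC on a
# mostly-black "title screen" line versus a half-white line.
def make_line(white_fraction):
    sync = [0.0] * 5                          # sync tip
    n_pic = 95
    n_white = int(n_pic * white_fraction)
    return sync + [1.0] * n_white + [0.3] * (n_pic - n_white)

def gain_from_sync(line, target=0.3):
    sync_amplitude = 0.3 - min(line)          # blanking minus sync tip; content-independent
    return target / sync_amplitude

def gain_from_mean(line, target=0.45):        # the cheap approach; target is arbitrary
    return target / (sum(line) / len(line))

for frac in (0.05, 0.5):                      # title screen vs. a bright scene
    line = make_line(frac)
    print(frac, round(gain_from_sync(line), 2), round(gain_from_mean(line), 2))

The sync-keyed gain prints 1.0 for both lines; the mean-keyed gain drops by
roughly half when the line is mostly white, which is the kind of
content-dependent gain shift described above.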
In the early days of television, using thermionic valves, it was probably a miracle that these things worked at all, but surely in
the transistor age, something better could have been provided.
Were studio monitors any better, anyone know?
Sylvia.
On 27/02/2025 04:52, Sylvia Else wrote:<...>
I suppose that's one of the problems with being "first past the post" -
you get stuck with the system. The US had NTSC, valves, and 525 lines.
In the UK colour broadcasting didn't start until over 12 years later -
July 1967 - using PAL, transistors, and 625 lines. Most of western
Europe was the same as the UK, but France and eastern Europe used SECAM
(on 625 lines; France's older 819-line service was monochrome only). From
what I understand, PAL was better than NTSC,
and from what I remember, analogue colour in the UK wasn't bad at all.
Sylvia Else <sylvia@email.invalid> wrote:<...>
I think you may be mis-remembering; something similar to the fault you
describe was prevalent on B&W televisions which were built down to a
price. Some of it was caused by average AGC and some was due to lack of
DC coupling, or skimped DC restoration, in the video amplifier. Some of
the better sets used back-porch AGC and, for the enthusiast, add-on
circuits were published in Wireless World (designed by Mothersole, I
think).
From the beginnings of colour television the designers recognised that
all three video amplifiers had to be DC coupled, but the AGC was much
simpler because they used inverted modulation, so a sync pulse
corresponding to 100% modulation was always available. I can't imagine
Philips would have produced a model with such gross errors as you
describe. Was your own set faulty, or was this a common insurmountable
problem caused by NTSC and positive modulation on the system in use in
the U.S. at the time?
In Europe, Philips and Mullard (their UK valve-making subsidiary)
published large quantities of material to aid set designers and help
them get the best out of their range of valves. I read it several years
before the colour television service started in England and it included
details on DC coupling and AGC. (The BBC did a lot of their preliminary
experimental work using NTSC - but eventually decided to use PAL for the
public broadcast system).
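For what it's worth, a minimal sketch of the gated / back-porch idea
(made-up numbers, and it assumes line timing has already been recovered):
the loop only ever sees a short window on the back porch, so picture
content cannot pull the gain around.

# Illustrative only: a first-order AGC loop fed from a back-porch sample.
def back_porch_agc(lines, gate=slice(6, 9), target=0.3, loop_gain=0.2):
    gain = 1.0
    for line in lines:
        sample = sum(line[gate]) / len(line[gate])    # back-porch (blanking) level
        gain += loop_gain * (target - gain * sample)  # slow correction towards target
    return gain

# A signal arriving at half the expected amplitude; picture content is irrelevant.
weak_line = [0.0] * 5 + [0.15] * 10 + [0.5] * 85      # sync, back porch, then white text
print(back_porch_agc([weak_line] * 200))              # settles close to 2.0

An average-reading loop fed the same lines would instead settle on a gain
that depends on how much white happens to be in the picture.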
On 2/26/2025 9:52 PM, Sylvia Else wrote:
Leave aside the ghosting, which could largely be addressed by having a
decent antenna.
Analog television had two distinct issues: one was multipath problems
(bummer), the other was that you COULD eke a signal out of the ether,
even a bad one (contrast to digital, which is essentially "all or nothing").
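A toy illustration of that "all or nothing" behaviour (Python; the
modulation and FEC threshold are invented for the sake of the example,
not taken from any broadcast standard):

import math

def raw_ber(snr_db):                       # QPSK-like raw bit error rate
    ebn0 = 10 ** (snr_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

FEC_LIMIT = 2e-2                           # assumed correctable raw BER

for snr_db in range(0, 21, 4):
    digital = "perfect picture" if raw_ber(snr_db) < FEC_LIMIT else "no picture"
    print(f"{snr_db:2d} dB: analogue degrades smoothly, digital = {digital}")

Analogue just gets snowier as the SNR falls; the digital result flips
from perfect to nothing within a couple of dB around the threshold.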
But my memory of a Philips Colour TV (1984ish) was that it had rubbish
automatic gain control (AGC), and odd interactions between brightness
and picture position.
My last set was a JVC AV2600US (IIRC):
  <https://i.ebayimg.com/images/g/cj4AAOSwMKdksA8u/s-l1600.webp>
It was big and heavy, had funny "dumbo ears" -- but just kept on
running, year after year (I think I purchased it in the
early 80's and it lasted well into this millennium -- not bad for
a $1K investment!)
On Thu, 27 Feb 2025 09:47:22 +0000, liz@poppyrecords.invalid.invalid (Liz Tuddenham) wrote:<...>
When I was a kid we had an RCA 12" round-tube B+W TV, in a giant piece
of furniture cabinet with a 12" speaker. It was all AC coupled, so the
screen always averaged grey.
It made a pretty good liquor cabinet, after I scrounged it for parts.
On 27/02/2025 08:10, Jeff Layman wrote:<...>
I always believed that made it impossible for them not to have
newscasters with flesh that slowly shifted between ghastly green and
purple tones (or was clamped to unnatural pale orange like the
Donald's). NTSC was called Never Twice the Same Colour in the UK for
good reason. PAL was self-correcting. My Japanese sets could do both.
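A back-of-envelope illustration of the self-correction (treating the
chroma on a line as a phasor U + jV, with an exaggerated 20 degree
demodulator phase error; the numbers are arbitrary):

import cmath, math

U, V = 0.3, 0.2                         # some arbitrary colour
rot = cmath.exp(1j * math.radians(20))  # differential phase error

line_a = complex(U,  V) * rot           # line n   (V positive)
line_b = complex(U, -V) * rot           # line n+1 (V inverted by the PAL switch)

# NTSC-style decode of a single line: the whole vector is rotated, i.e. a hue shift
print(round(line_a.real, 3), round(line_a.imag, 3))

# PAL delay-line decode: average the two lines, re-inverting V on the second
print(round((line_a + line_b).real / 2, 3), round((line_a - line_b).imag / 2, 3))

The single-line decode gives roughly (0.21, 0.29), a visible hue change;
the delay-line average gives roughly (0.28, 0.19), the original hue with
only a slight loss of saturation. Leave the delay line out and you are
back to NTSC-style hue errors.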
However, when I was in Japan I saw US style NTSC TV implemented
correctly. It seems there was no reason that it could not be made to
work well, only that US makers couldn't be bothered to do it right.
On Thu, 27 Feb 2025 08:37:32 -0800, john larkin <jl@650pot.com> wrote:<...>
When I was a kid we had an RCA 12" round-tube B+W TV, in a giant piece
of furniture cabinet with a 12" speaker. It was all AC coupled, so the
screen always averaged grey.
It made a pretty good liquor cabinet, after I scrounged it for parts.
We only had three channels, but the programming was better than what
we have now.
On 2/27/2025 1:04 AM, Don Y wrote:<...>
The first TV I have memory of was an early-80s Motorola/Quasar 19" that my
parents probably bought not long after I was born; it looked a lot like this one:
<https://www.intervideo.co/quasar-wt5957-19-crt-television/amp/>
Reception wasn't usually a problem: we were 10 miles south of the main
Boston-area antenna fields in Newton, so just rabbit ears sufficed and it
worked fine hooked up to a Nintendo. That set served well into the 90s, when
we got a larger Mitsubishi tube-type TV that lasted into the 2000s, and then
that was about it for the tubes.
On 2/27/25 17:40, john larkin wrote:<...>
We only had three channels, but the programming was better than what
we have now.
https://en.wikipedia.org/wiki/Rosy_retrospection ;)
On 2/26/25 8:52 PM, Sylvia Else wrote:<...>
Are you sure that 1984 date is correct? By 1970, colour TVs in the UK
used transistor signal-processing stages, and many had already changed to
transistors for the power stages, such as line and frame output, as well
as using chopper-stabilised power supplies.
Typically they used some form of gated AGC so that it was not affected by
the video modulation.
Leave aside the ghosting, which could largely be addressed by having a
decent antenna.
If reception conditions were poor, yes, the picture could degrade a bit, but
that is far preferable to the behaviour of digital systems that completely
drop out halfway through the movie if the rain gets too heavy.
If you're watching an off-air recording, that is particularly frustrating,
because moving the aerial would only have helped at the time the recording
was made. A good outdoor aerial is much more necessary than it used to be.
On 2/27/2025 5:46 PM, Chris Jones wrote:
If reception conditions were poor, yes the picture could degrade a
bit, but that is far preferable to the behaviour of digital systems
that completely drop out halfway through the movie if the rain gets
too heavy.
Is it the presence of the water in the air that is the problem?
I've noted the correlation but assumed it was because of the
effects of wind and rain on the *tree* that is in my line-of-sight
to the broadcast towers.
I.e., if the transmitter had a COMPLETELY unobstructed path to my
antenna, would rain *still* be a problem?
On 27/02/2025 19:58, KevinJ93 wrote:<...>
Are you sure that 1984 date is correct? By 1970 in the UK colour TVs
used transistor signal processing stages and many had already changed
to transistors for the power stages such as line and frame output as
well as using chopper stabilised power supplies.
The first two colour TVs I recall owned by friends or family were about
the time of Apollo 8 in 1968. Memorable for the Earth rise shot. Both
were entirely valves, and my uncle's caught fire, leaving a nasty brown
burn mark on their wool carpet and smoke damage on the ceiling.
The earliest was at a school friend's house and was in pastel shades,
pre-Nd glass. It was in colour, but only just... The Joe 90 launch was the
first programme I can recall watching there in colour. Test cards in shops
don't count.
I'd believe 1974 as a date for hybrid colour TVs that almost worked
correctly and didn't need a service engineer visiting them every other
week. By 1980 I'm pretty sure they were almost entirely semiconductor
based.
As a slight aside, I tried watching episodes of the first series of Star Trek a
few years back. It was rather obvious that the display panels on the Enterprise's bridge were actually back-lit paper. Good enough for TVs of the time, apparently, but not for modern digital displays.
On 2/27/25 12:45 PM, Martin Brown wrote:<...>
I'd believe 1974 as a date for hybrid colour TVs that almost worked
correctly and didn't need a service engineer visiting them every other
week. By 1980 I'm pretty sure they were almost entirely semiconductor
based.
My father bought a Ferguson 19" colour TV at the end of 1970 that was
fully semiconductor (it was my first term at university and he got it
just before I came back for Christmas). It seemed to work fairly well -
he would tinker with it but I don't remember it needing any significant repair. I gather it was one of the first such sets.
Were studio monitors any better, anyone know?
ISTR the original valve EHT stack rectifier gave off X-rays and had a
lead shield around it.
On 28/02/2025 01:29, Don Y wrote:
Is it the presence of the water in the air that is the problem?
I've noted the correlation but assumed it was because of the
effects of wind and rain on the *tree* that is in my line-of-sight
to the broadcast towers.
Quite likely to be a factor. Wet tree leaves can be pretty destructive to
signals too; hence our local microwave internet requires a clear line of
sight, or the link is unreliable.
Some VHF/UHF bands are less susceptible to water in their path.
I.e., if the transmitter had a COMPLETELY unobstructed path to my
antenna, would rain *still* be a problem?
Obviously it depends a lot on the frequency, but my satellite feed drops out
when there are tall cumulonimbus thunderclouds overhead, irrespective of
whether it is actually raining. TDTV holds good under most conditions, except
for when the local mast burned down spectacularly in a freak accident.
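For a rough sense of the numbers, rain attenuation is usually modelled as a
power law in the rain rate; the k and alpha below are order-of-magnitude
placeholders for Ku-band satellite frequencies, not taken from any
published table.

def rain_loss_db(rain_mm_per_h, path_km, k, alpha):
    # specific attenuation ~ k * R**alpha (dB/km), times the rain-filled path
    return k * rain_mm_per_h ** alpha * path_km

# Heavy shower, ~4 km of rain on the slant path to the satellite, ~12 GHz guesses:
print(rain_loss_db(25, 4, k=0.025, alpha=1.1))   # a few dB of fade

A few dB is nothing to an analogue picture but can be enough to push a
digital receiver over its threshold; down at UHF the rain itself
attenuates far less, which is consistent with the wet-tree explanation
above.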
On 28/02/2025 03:03, KevinJ93 wrote:<...>
My father bought a Ferguson 19" colour TV at the end of 1970 that was
fully semiconductor (it was my first term at university and he got it
just before I came back for Christmas). It seemed to work fairly well -
he would tinker with it but I don't remember it needing any significant
repair. I gather it was one of the first such sets.
It was the valve ones from the mid to late 60's that were both seriously
expensive and unreliable. So much so that most people at the time rented
them, since routine service and repair was then included. Enough money
was made on the TV rental business to fund a new Cambridge college.
https://www.robinson.cam.ac.uk/college-life/library/college-archive-and-history/sir-david-robinson
There were others in the same white goods and TV rental business, and
several of the founders were big charity donors and philanthropists.
The set in the common room at university was an older valve based one
with no blue channel but it was surprisingly watchable and it didn't
Valve sets ran hot and somehow valves would gradually work themselves
out of sockets or just plain and simple fail at switch on. I saw one or
two very old all-valve sets survive into the 80's that kept the valve
filaments warm continuously even when the set was nominally off.
Lasse Langwadt wrote:
On 2/27/25 17:40, john larkin wrote:
We only had three channels, but the programming was better than what
we have now.
https://en.wikipedia.org/wiki/Rosy_retrospection ;)
Commercial networks sometimes had operas, then even PBS stopped showing
them, then the Bravo cable channel was created for them, then it became
the Real Housewives channel.
On 28/02/2025 03:03, KevinJ93 wrote:<...>
My father bought a Ferguson 19" colour TV at the end of 1970 that was
fully semiconductor (it was my first term at university and he got it
just before I came back for Christmas). It seemed to work fairly well -
he would tinker with it but I don't remember it needing any significant
repair. I gather it was one of the first such sets.
The first domestic UK colour sets were valve-based. However, it wasn't long before transistor sets came in. See page 22 at
<https://americanradiohistory.com/UK/Practical-Television/60s/Practical-Television-1968-06.pdf#search=%22practical%20television%22>.
This was the June 1968 edition of Practical Television, and it refers to the new 19" Marconiphone Model 4701 as being "fully
transistorised".
More details can be found in Practical TV July and September 1967. What's amazing to me is the price - "284 guineas". So just
short of £300 in 1968; equivalent to £4500 today!!!
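The arithmetic, for anyone who doesn't think in guineas (1 guinea =
21 shillings = £1.05; the "today" figure is the post's own estimate):

guineas = 284
pounds_1968 = guineas * 1.05                         # just short of 300 pounds
print(round(pounds_1968, 2), round(4500 / pounds_1968, 1))   # 298.2, roughly a 15x rise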
--
Jeff
On 2/27/25 1:49 AM, Martin Brown wrote:<...>
Sony was notorious for not implementing PAL decoding fully by omitting the delay line and effectively processing PAL as if it was
NTSC.
kw
On 2/26/25 10:04 PM, Don Y wrote:
On 2/26/2025 9:52 PM, Sylvia Else wrote:
Leave aside the ghosting, which could largely be addressed by having a
decent antenna.
Analog television had two distinct issues: one was multipath problems
(bummer), the other was that you COULD eke a signal out of the ether,
even a bad one (contrast to digital, which is essentially "all or nothing").
That's exactly the problem. I told one station manager that it would be a
mistake to go full digital (well, the government made them), and even more of
a mistake to give up their VHF channel without a fight. That it would erode
viewership in the fringe areas, hence among people with more disposable
income, who would migrate to the Internet, causing advertising income to
drop, which would lead to painful staff cuts.
He didn't believe me. And then pretty much all that happened.