Broadcast engineering presented many new challenges, and the 1960s and 1970s were a period of great development and expansion: the introduction of colour TV in 1967, and the requirement for standards converters so that different nations could receive and view TV pictures from all around the world.
Some countries adopted the NTSC system from North America, some adopted PAL from Europe, and some, like France and her former colonies, had their own SECAM system. Converting from one TV system to another required racks of custom-built equipment.
The Apollo Program and the Olympics of 1964 (Tokyo) and 1968 (Mexico City) were the driving forces behind electronic standards converters. Previously, 16mm and 35mm film had been the means of transferring programme images from one format to another.
With the advent of cheaper and larger semiconductor memory in the 1970s, line stores and frame stores eventually became practical, allowing digital sampling and interpolation of video information.
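To illustrate the kind of operation a line-store standards converter performs, here is a minimal sketch (my own illustration, not any actual converter's code) of vertical resampling: generating a different number of scan lines by linearly blending adjacent source lines.

```python
# Illustrative sketch of line-rate conversion: resample a field of scan
# lines to a different line count by linear interpolation between the
# two nearest source lines. Real converters used more elaborate filters.

def resample_lines(field, out_lines):
    """field: list of scan lines, each a list of pixel values.
    Returns out_lines interpolated lines."""
    in_lines = len(field)
    out = []
    for i in range(out_lines):
        # Position of output line i in source-line coordinates.
        pos = i * (in_lines - 1) / (out_lines - 1)
        lo = int(pos)
        hi = min(lo + 1, in_lines - 1)
        frac = pos - lo
        # Blend the two nearest source lines pixel by pixel.
        out.append([(1 - frac) * a + frac * b
                    for a, b in zip(field[lo], field[hi])])
    return out
```

The same weighted-blend idea, applied between fields rather than lines, handled the 50 Hz / 60 Hz frame-rate difference.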
By the late 1970s, minicomputers were just becoming useful enough for research into image-processing techniques.
My department had a PDP-11, followed by a VAX and then a pair of MicroVAX machines, before moving towards “3M” workstations in the early 1990s. The VAX had custom built semiconductor framestores interfaced to it so that short sequences of images could be processed.
When I joined in 1986, fundamental research into HDTV coding and decoding techniques was under way, placing heavy demands on the computing resources.
An HDTV picture contained at least four times the pixel data of a standard-definition picture, so how do you compress the bandwidth sufficiently for broadcasting without visible loss of image quality?
Image processing routines were written in FORTRAN for FFT and DCT algorithms. Small 64x64 pixel blocks of monochrome images would be batch processed overnight, or even over the weekend - such were the demands of the coding tasks.
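The block-transform idea at the heart of those experiments can be sketched in a few lines. This is a modern Python illustration of a Type-II DCT applied to rows then columns of a pixel block (my own sketch, not the original FORTRAN); real coders then quantise and discard the small high-frequency coefficients.

```python
# Minimal orthonormal DCT-II / inverse DCT (Type-III) and a 2-D block
# transform built from them, illustrating block-based transform coding.
import math

def dct_1d(v):
    """Type-II DCT with orthonormal scaling."""
    N = len(v)
    out = []
    for k in range(N):
        s = sum(v[n] * math.cos(math.pi * (n + 0.5) * k / N)
                for n in range(N))
        c = math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)
        out.append(c * s)
    return out

def idct_1d(c):
    """Inverse (Type-III) DCT with matching scaling."""
    N = len(c)
    out = []
    for n in range(N):
        s = c[0] * math.sqrt(1 / N)
        s += sum(c[k] * math.sqrt(2 / N) *
                 math.cos(math.pi * (n + 0.5) * k / N)
                 for k in range(1, N))
        out.append(s)
    return out

def dct_2d(block):
    """Apply the 1-D DCT to each row, then to each column."""
    rows = [dct_1d(r) for r in block]
    cols = [dct_1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]
```

For a flat block, all the energy lands in the single DC coefficient, which is exactly why the transform compresses smooth image areas so well.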
Eventually, by about 1988/89, an experimental HDTV CODEC had been built from real hardware, capable of running at real-time video speed - it took up about three and a half full-height racks.
The CODEC was asymmetrical by design: most of the complexity was put into the coder, which would be a large, expensive machine owned by the broadcasting operator. The decoder would be considerably simpler and cheaper, and would be built into the viewer’s set-top box.
The BBC’s role was to try to maintain picture quality for the HDTV viewer, against pressure from the European set-top box manufacturers, who just wanted the decoder to be commercially viable and to boost lagging TV sales.
I remember there was an atmosphere of distrust between the BBC engineers and their European commercial partners.
Out of this research from 35 years ago came the various MPEG video compression standards, now used on virtually all computing devices.
What was once only possible using several racks of custom TTL and ECL circuitry can now be performed on a fingernail-sized ARM device costing a few dollars.