I've got a copy of the original MIT HDTV proposal to the FCC (dated
February 1991), which clearly lays out the intraframe coding as subband
coding. However, the latest IEEE Micro ("Digital Video Coding Techniques
for U.S. High-Definition TV") describes the now-renamed MIT/GI system as
using a DCT. The reference for the system is an April 1992 document, so I
assume that somewhere between 2/91 and 4/92, MIT decided that the earlier
system with subband coding wasn't going to work out. Is this what happened?
Does anyone know why? Finally, is the current compression method in the
MIT/GI Channel Compatible DigiCipher MIT's or GI's?
----------
MIT did not have the necessary money to fund the prototyping and FCC
testing of their system. MIT's test slot at the ATTC (Advanced
Television Test Center) was considered valuable since it was the last
one. General Instruments more or less bought the test slot from MIT.
The CCDC (Channel Compatible DigiCipher) is a merging of both GI and MIT
technology. One of the biggest differences between the DC and CCDC is
that DC (DigiCipher) is interlace scan and CCDC is progressive scan. At
the time that GI bought MIT's test slot progressive vs. interlace was
one of the biggest issues of contention. GI hedged its bet by proposing
one system of each type. The major algorithm improvements of the CCDC
over the DC are adaptive motion block size and vector-coded DCT
coefficient selection.