Problems with real-time captioning

For some advocates of closed captioning, the amount of service available is just fine. Quality, they say, is another matter.

According to Jim Roots, executive director of the Canadian Association of the Deaf, the CAD launched captioning in Canada by running a ‘Captioned Films for the Deaf’ series in the 1970s. Having made its point that there was an audience for captioning, the CAD then collaborated with the federal Department of Communications and the CRTC to form the Canadian Captioning Development Agency. The CCDA was an independent corporation that provided and promoted captioning for Canadian TV.

In the late 1980s, independent captioning houses arose in competition with the CCDA, and broadcasters started up their own in-house captioning services. As a result, the CCDA was forced to close its doors in the early 1990s. Nevertheless, Roots maintains that the CAD has remained active in lobbying for increased captioning.

‘The CRTC now requires that all licence applicants include their plans and expenditures for captioning,’ Roots writes in an e-mail interview. ‘We have also used human rights complaints to push broadcasters to provide full captioned programming schedules.’

Roots explains that the standards set forth by the CRTC are based on a broadcaster’s revenues: large licensees are ‘required’ to provide captions for at least 95% of their programming, whereas medium and small licensees are ‘encouraged’ to provide captions for 90% and 85%, respectively. He adds, however, that no broadcaster has ever been punished for not meeting these criteria. But of greater concern is the quality of the captioning that already exists.

‘The use of real-time captioning equipment or live captioning often results in incomprehensible captions,’ he laments. ‘The equipment is set up on a phonetic programming basis – one keystroke prints a pre-programmed syllable. So a word like “holodeck” might end up as “hole on deck”.’

Toronto-based journalist Joe Clark has paid close attention to the development of closed captioning in Canada over the years, and has written a book about Web accessibility, Building Accessible Websites, soon to be released.

Clark cites figures from the 1991 Census indicating that 11,000 people had access to closed captioning, and an additional 5,000 needed it. Accessibility increased dramatically when the U.S. Television Decoder Circuitry Act of 1990 took effect in 1993, requiring caption-decoding chips in all new TV sets with screens 13 inches or larger. Since almost all TV manufacturers have one production run for both the U.S. and Canada, the results are also felt north of the 49th.

Clark estimates that today, nearly half of Canadians have decoder-equipped sets, compared with approximately 16,000 decoder users (based on the 10-year-old data). The bottom line: most of those watching closed captioning are not hearing impaired. Some TV manufacturers offer a feature whereby closed captioning comes on every time the viewer hits the ‘mute’ button. It’s a pretty handy feature for hearing people – you can still get Bob Cole’s play-by-play even if you turn the volume off to answer the phone.

Clark is outspoken in his view that the emphasis in captioning has been misdirected.

‘British Columbia lawyer Henry Vlug just won his complaint against CBC at the Canadian Human Rights Tribunal,’ Clark points out. ‘The CHRT ordered CBC to caption every second of its programming. The issue of quantity of captioning has been won. The elephant in the room, [however], is quality.’

While Roots’ concerns lie primarily with real-time captioning, Clark takes the opposite view.

‘Captioning of pre-recorded programming is crap, but Canadian live captioning in English is very good,’ he says. ‘There are many issues involved in real-time captioning of sports [for example], and many approaches have been tried over the last 15 years. What you see now on hockey games is captioning only after the play stops or if something really significant happens during a play.’

Although Clark says he supports this current approach, Vlug complained to the CHRT, stating that whenever there’s speech in a broadcast, there should be words on the screen. Like Roots, Clark believes that the process of real-time captioning makes this hard to achieve.

‘A court reporter is sitting there with a stenographic keyboard,’ he explains. ‘They have about 20 keys – one bank on the bottom, two banks on the top. They’re not typing by letters – they’re typing phonetically. You have to press several keys at once to produce anything from a single sound to a single word.’

The steno-captioner is watching the video, listening to the audio, digesting what they see and hear and translating that into the correct finger motions, which are in turn translated into English text by software. Complicating matters are homophones such as ‘there,’ ‘their,’ and ‘they’re,’ which must be kept separate, and proper names, which must be entered in the computer dictionary up front.
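
How that translation goes wrong is easy to sketch. The snippet below is a minimal, hypothetical illustration in Python – the stroke spellings and dictionary entries are invented, and real steno software is far more elaborate – of dictionary-based stroke translation, showing why an out-of-dictionary word like ‘holodeck’ can come out as ‘hole on deck’ until someone adds it to the dictionary up front.

    # Hypothetical sketch: dictionary-based steno translation.
    # Stroke spellings and entries are invented for illustration only.
    STROKE_DICTIONARY = {
        ("HOEL",): "hole",
        ("ON",): "on",
        ("DEBG",): "deck",
    }

    def translate(strokes):
        """Greedily match the longest known run of strokes, left to right."""
        words = []
        i = 0
        while i < len(strokes):
            match = None
            for length in range(len(strokes) - i, 0, -1):  # longest entry first
                candidate = tuple(strokes[i:i + length])
                if candidate in STROKE_DICTIONARY:
                    match = STROKE_DICTIONARY[candidate]
                    i += length
                    break
            if match is None:
                # No entry at all: fall back to the raw stroke
                # (real software has its own handling for untranslated strokes).
                match = strokes[i].lower()
                i += 1
            words.append(match)
        return " ".join(words)

    strokes = ["HOEL", "ON", "DEBG"]   # the captioner strokes the syllables
    print(translate(strokes))          # -> "hole on deck"

    # Adding the word to the dictionary ahead of air time fixes it:
    STROKE_DICTIONARY[("HOEL", "ON", "DEBG")] = "holodeck"
    print(translate(strokes))          # -> "holodeck"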

Clark says few steno-captioners have the typing aptitude (180-200 words per minute) needed to work in live TV.

‘Canada is actually a good source for this,’ Clark says. ‘[The Northern Alberta Institute of Technology] has a really good court reporting program, and the Captioning Group in Calgary [and Studio City, Calif.] gets almost all the graduates from there.’ *

-www.cad.ca

-www.joeclark.org