But why is there such a legal requirement?
To add another angle: for people who use BSL, it's effectively their mother tongue.
Being able to watch content with signing is like having a TV show dubbed into your native language, rather than relying on subtitles.
Edit: I just checked, and it's actually mentioned in the Ofcom guidelines:
Subtitle users reflect the full range of proficiency in English; some profoundly deaf people regard BSL as their first language, and are less fluent in English.
The UK has always been pretty inclusive, and this law has been around for decades, since well before subtitles were practical, or even legible, on crappy old black-and-white CRT screens.
So it's just because they haven't bothered updating some guideline booklet to account for new technologies?
Nobody has said: "By the way, this new thing called subtitles could actually replace the sign language requirements, especially now that we have colour TVs"?
That said, I can imagine sign language being better for real-time interpretation than someone typing out the speech, unless they use some really good transcription software.
I'll just reply to this one too: we have fairly detailed recommendations and guidelines on access services in the UK. If you're curious, they're summarised really well in this document (10 pages).
Live subtitles always used to be produced with a stenotype or similar, though from a quick look, speech-to-text now seems more common. Since I happened upon it too, here is an interesting white paper by BBC R&D on inserting a longer delay into live events so that the subtitles can follow the audio more closely.
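The delay idea can be sketched roughly like this: buffer the outgoing frames for a few seconds so that by the time each frame is played out, the subtitler has already produced the cue for it. This is just a toy Python illustration under my own assumptions (made-up frame rate and delay length), not the actual BBC implementation:

```python
from collections import deque

DELAY_FRAMES = 250  # hypothetical: 10 s of delay at 25 fps

def delayed_playout(frames, subtitle_cues):
    """Delay video frames so that late-arriving subtitle cues can be
    attached to the frame they actually describe.

    `frames` is an iterable of (timestamp, frame) pairs arriving live;
    `subtitle_cues` maps a timestamp to its caption text as the
    subtitler produces it (here modelled as a plain dict).
    """
    buffer = deque()
    for ts, frame in frames:
        buffer.append((ts, frame))
        if len(buffer) > DELAY_FRAMES:
            out_ts, out_frame = buffer.popleft()
            # By playout time the subtitler has (hopefully) caught up,
            # so the cue for this older timestamp already exists.
            caption = subtitle_cues.get(out_ts, "")
            yield out_ts, out_frame, caption
```

The trade-off is exactly the one the white paper discusses: a longer buffer gives the subtitles more time to catch up, at the cost of the "live" broadcast lagging further behind real time.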