DGC, ACTRA discuss ‘existential’ threat of AI in Bill C-27 study

Representatives from the unions also called for stronger creator protections in their testimony to the Standing Committee on Industry and Technology.

Artificial intelligence (AI) tools represent an existential threat to creative industry workers, representatives of the Directors Guild of Canada (DGC) and the Alliance of Canadian Cinema, Television and Radio Artists (ACTRA) told a parliamentary committee on Monday (Feb. 12).

DGC national executive director David Forget was joined by Samuel Bischoff, the DGC’s manager of policy and regulatory affairs, as well as ACTRA national president Eleanor Noble and executive director Marie Kelly in testimony to the Standing Committee on Industry and Technology in Ottawa, which is currently studying AI-focused legislation Bill C-27.

“Generative AI threatens the ecosystem of creativity on an existential level,” said Forget. “As we stand at a crossroads of regulating AI, creators should have the right to consent and be compensated whenever an AI entity uses their copyrighted content.”

Broadly, Bill C-27 contains three proposed Acts — the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act (AIDA) — which would regulate the use of AI systems.

Forget said transparency in AI systems should be a “prerequisite” to protecting the rights of creators, adding that large language model developers are extensively reproducing creative works for commercial purposes without authorization or fair compensation to the original creators of those works.

“Copyright remains an essential framework law for governing our industry. Any unauthorized copying to train AI models is theft,” said Forget.

He added that, in its current form, AIDA is failing to “protect and uphold fundamental copyright principles.”

Forget said the European Union Artificial Intelligence Act offered a framework that could be mirrored. He presented three recommendations for AIDA to ensure protection for creators: require the authorization of rights holders for the use of copyrighted content; require general purpose AI systems like ChatGPT to be transparent and provide descriptions of the materials used for training purposes; and require providers of general purpose AI models in the Canadian market to comply with Canadian copyright laws, including obtaining consent for data mining purposes.

ACTRA’s Noble said a performer’s livelihood depends on their name, image and likeness, as well as their reputation, but Bill C-27 does not encompass harm to reputation.

To highlight some of the harms performers can potentially face, she gave examples of two actors whose voices and likenesses were manipulated using AI to “say obscene things” and, in one case, to portray the performer engaging in sexual acts.

“Sadly, reputational harm is not currently encompassed by Bill C-27. The definition of harm to include psychological harm or economic loss to an individual does not sufficiently encompass the reputational harm we experience. Due to the nature of our business, we might not be able to show an exact circumstance of loss of work due to a deep fake or manipulation. But there is no doubt that damage to a performer’s reputation means a real and tangible loss for our careers,” said Noble.

Noble said ACTRA had submitted to the committee “proposed language to rectify this gap under the legislation.”

A recent ACTRA survey found that 98% of member respondents are worried about the potential misuse of their name, image and likeness rights, while 93% of those surveyed think the tech will eventually replace human actors in certain roles and performances in the entertainment industry.

Also present in the standing committee meeting were representatives from Music Canada, the Coalition for the Diversity of Cultural Expressions and l’Association nationale des éditeurs de livres.

Music Canada CEO Patrick Rogers said the music industry is already making use of “positive elements” of AI to help artists “make more intriguing and interesting” work, but echoed the recommendations of both the DGC and ACTRA.