Considering the human factor in adopting AI

Canadian unions and industry organizations say the time is now to set regulations around the use of artificial intelligence.

This two-part feature explores the opportunities and concerns about the rise of artificial intelligence in the film and TV industry. See part one for an examination of how the technology is currently being used. Part two, below, sees labour experts discuss the potential consequences of adopting it too quickly.


Artificial intelligence (AI) tools have already begun to play an important role in content creation, but little has been done to assuage the concerns of the people who could lose their jobs to AI-driven automation.

For his part, Anil Verma (pictured right), professor emeritus of industrial relations and human resource management at the University of Toronto's Rotman School of Management, says he's not worried about robots replacing all of our jobs any time in the foreseeable future – though he tacks on a few caveats.

“We are embedding the intelligence in machines,” he explains, adding that the technology “has tremendous power to transform the way we do things,” much like computers replaced typewriters.

It could, he offers, mean fewer – if any – background performers on future movie or TV sets, since the capabilities of video and audio AI tools have grown far beyond the deep-fakes of only a few years ago. But, in that instance, Verma says the extras should be compensated every time their likeness appears on screen along the lines of “a more equitable profit-distribution stream.”

The challenge facing the industry, he adds, is for employers and the state to support individual workers and ensure that people are made whole in the AI world.

“It is the future, and we need to adapt to this new technology – and successful adaptation means finding new things for people to do,” says Verma.

Ultimately, however, he believes that “human creativity will always be valued over and above machine creativity.”

While that may turn out to be true, industry union leaders want it codified as soon as possible.

Writers Guild of Canada (WGC) assistant executive director Neal McDougall (pictured below, right) says creators have obvious questions about the use of AI. “From the human resources perspective: who benefits from it?” he says. “How much do they stand to benefit? And who loses out?

“The question from the creative perspective is that from the outputs we get [from AI-generated works], how much have they lost the humanity that formerly went into them? Even if we don’t notice that immediately, what kind of cumulative effect will that have on society?”

McDougall, a former television screenwriter, says that when social media arrived, it was difficult to predict the ways it would be used and the impact it would have.

“I think that we as a society were less aware and cautious than we could have been before we rushed into that world. And now we see that there is a downside to the social media-fication of everything,” he says. “We need to have more caution and thought about AI than we did about social media.”

And he’s not alone in his concerns.

“Our product is our voice, our face – our image, our likeness, who we are. That’s how we make a living,” says Eleanor Noble (pictured left), national president of the 28,000-member Alliance of Canadian Cinema, Television and Radio Artists (ACTRA).

“So if producers want to take that and turn it into AI and make mass profits, they’re stealing our product. When that happens, we don’t have control. We haven’t given consent, and we’re not being compensated – and that’s huge,” she says. “It’s immoral and unethical.”

AI technology has become a major point of contention for both the Writers Guild of America (WGA) and SAG-AFTRA (the Screen Actors Guild-American Federation of Television and Radio Artists) in their labour disputes with the Alliance of Motion Picture and Television Producers (AMPTP).

SAG-AFTRA has made the right to digitally replicate a performer's voice or likeness in order to create a new work – or to train AI to do the same – a mandatory subject of bargaining during the strike.

In mid-July, the AMPTP released a statement saying it agreed with SAG-AFTRA's position and noting that it had proposed first-of-its-kind protections, including "advance, specific consent from the performer required both to create and use digital replicas," a protection that requires an actor's written consent "and description of the intended use in the film."

Noble says the concern among actors is also where their digital images will be stored for later use – an issue that especially involves the aforementioned background performers, who, as she highlights, “aren’t the highest paid on set, and now we’re going to undercut them by stealing their likeness so studios and streaming services can mass produce.”

AMPTP said in July that its AI proposal includes a prohibition on later use of a digital replica unless the performer specifically consents to that new use and is paid for it, and that the studio alliance “explicitly confirmed to SAG-AFTRA that consents needed for later use of digital replicas apply to background actors as well as principal performers.”

Solutions found in the Copyright Act?

Noble would like the federal government in Canada to establish guardrails to protect the creative industry from AI’s encroachment. In 2021, ACTRA participated in a public consultation on a modern copyright framework for AI and the Internet of Things, led by the federal department of Innovation, Science and Economic Development.

The ACTRA proposal flagged the threat of deep-fake technology in content creation, and suggested that the best protection against misuse would be to grant moral rights solely to audiovisual artists in Canada’s Copyright Act – so copyright could only exist for AI-assisted works, not AI-generated works. In other words: no human, no copyright.

The WGC, which represents about 2,500 Canadian screenwriters, agreed – calling on the federal government to ensure that AI is not recognized for authorship under the Copyright Act. But McDougall stresses that his guild would also like financial support flowing from Telefilm Canada, the Canada Media Fund and the Canadian Film or Video Production Tax Credit to remain for the benefit of human creators.

“We must not divert essential limited funding from human artists to AI. These are cultural funds, not technology development funds,” he sums.

In 2022, Innovation, Science and Industry Minister François-Philippe Champagne introduced Bill C-27, the Digital Charter Implementation Act, which would, if passed by Parliament, make it an offence to use an AI system if it "is likely to cause serious physical or psychological harm to an individual or substantial damage to an individual's property" and the use "causes such harm or damage."

But copyright isn’t included in the proposed law, and that’s an important component of regulating AI, according to Brad Danks, CEO of Vancouver-headquartered OUTtv Media Global, who previously practised entertainment law.

Amending the Copyright Act “is one of the things that we should look at, especially moral rights clauses, but that alone might not be enough of what we need,” he says. “In the end, it might require more than one piece of legislation.”

The goal, says the head of OUTtv, should be to “have more creative people being productive as much as possible in our society, and do it in a sustainable way where production costs are aligned with the economic value of content.”

While the government ponders possible visions of the future and industry bodies raise red flags, the industry is already moving forward. "There is such a largely immovable industry built around keeping everything the same, but the world moves on," offers Gusto president and CEO Chris Knight.

And that’s the rub. The future is not lingering on a distant horizon. It’s already here.

“I am concerned,” sums WGC’s McDougall. “But I am also cautiously optimistic that human beings can never be fully and truly replaced. At least I’m hopeful of that.”

With files from Brendan Christie

This story originally appeared in Playback's 2023 Fall Issue

Image: Pexels