
‘Have we done ourselves out of a job?’: concerns in film and TV industry over on-set body scanning


For performers on TV or movie sets, it is not unusual to receive a request to enter a booth filled with scores of cameras ready to capture their likeness from every possible angle. Yet with the cast and crew of productions already fretting over the coming role of AI in the industry, it is an increasingly troubling undertaking.

“It happens without warning,” says Olivia Williams, who adds she has been scanned more times than she cares to remember during a career that has spanned from The Sixth Sense to Dune: Prophecy.

“You are on set. You are in costume. A friendly assistant director who is already known to you, who brings you tea and holds your phone while you’re acting, says that the VFX [visual effects] team are in today – and just after you finish the scene, could you pop over to the VFX bus? And off you go.

“Actors are, by and large, people pleasers. To have a standoff about scanning when you are in the midst of a scene annihilates your creativity, engenders fear that you will never work again, that your agent will drop you. So you comply.”

Lead and supporting actors, stunt performers and dancers have all told the Guardian of similar experiences on set, of being ushered into scanners despite being unclear on their rights relating to the biometric data produced.

Williams said performers were told that “if you want to be in the scene, or you want the scene to look cool with the alien crawling out of your brain”, then scans were needed.

Scanning ‘happens without warning’, said Olivia Williams, who said actors complied through ‘fear that you will never work again’. Photograph: David Vintiner/The Observer

The experience has caused unease for some time, but the development of “AI doubles” for performers and claims about the arrival of “AI actors” have injected urgency into clarifying exactly what is happening to the data harvested on set.

Those concerns were flushed into the open with the publicity around an AI actor called “Tilly Norwood”. It seems unlikely that the company behind the creation will generate the first AI star, but it has given a focus to an ongoing fight to clarify performers’ rights.

Williams decided to put her head above the parapet out of concern for young actors just starting out, as well as the existential threat posed to performers known in the industry as supporting artistes (SAs), who populate a show’s crowds and backdrops.

Dave Watts, an experienced SA who has appeared in numerous superhero movies and major productions, has been scanned several times. He said there were wider implications for the industry.

“I already hear crew members saying: ‘To be honest, we don’t even need to do this any more. We can just ask AI to create a crowd of 1,000 people based on information which has already been captured,’” he said.

“If you don’t have your usual crowd of 100, 200 or 500 SAs on a big production, then you also don’t need the assistant directors that look after them, and you don’t need the hair and makeup people. You don’t need the costume people, the costume fittings, all the caterers, all the drivers and location marshals. There’s a whole range of jobs there that AI effectively puts at risk.”

Images of the AI-generated actor ‘Tilly Norwood’ have heightened concerns for performers. Photograph: Reuters

A dancer, speaking anonymously out of concern that speaking out would affect their work, raised similar points about the pressure to be scanned and the use of the data. “Filming is gruelling – you’re getting up at 3am,” they said. “It’s now 8pm and you’re not allowed to go home until you’ve done it. The way it happens, you just really don’t have a choice.

“You wonder, have we all done ourselves out of a job? It makes you feel a bit of a fool.”

Alex Lawrence-Archer, a data rights lawyer from the law firm AWO, which has been working with actors on the issue, said performers were hampered by a morass of complicated, overlapping laws. He said it was crucial for them to have clearer agreements going into a production, rather than trying to reclaim their data after the fact.

“Contracts are often quite poorly drafted, often industry-standard wording that has been around for many years,” he said. “They’re really not designed with these kinds of technologies in mind. What you have is kind of a vacuum of uncertainty. In that vacuum, AI developers and studios are doing as much as they can get away with.


“It’s the future instances of training that actors and their representatives really need to turn their attention to. They need to negotiate better contracts that are clearer and that truly reflect a fair agreement between actors, studios and AI developers.”

There are now signs of a fledgling rebellion. On one recent shoot, performers were given advance notice of scans after concerns were raised.

“The cast has collectively been pushing back against the atmosphere of ambushing actors,” said one of the performers, speaking on condition of anonymity. “We succeeded in getting them to put a sort of addendum into our agreement, which basically prevents them from using the digital scans for anything other than the show without our written consent.”

Filming in Cardiff for Mr Burton. As well as actors, AI puts the jobs of assistant directors, hair and makeup artists, costume designers and wardrobe staff, caterers, drivers and location marshals at risk, says one supporting actor. Photograph: Sarah Lee/The Guardian

The battle for rights in the face of the AI industry’s thirst for data can seem hopeless. Such data can be harvested from various footage and sources that sidestep professional performers. However, there is a consensus over attempting to take back some control.

“The technology could conceivably be used in a reductionist way that drastically reduces the need for human performers, or it could be used to benefit creativity and build things out in a really positive way,” said Theo Morton, a professional stunt performer and member of the British Stunt Register. “There’s a lot of uncertainty and no one truly knows. That’s why it’s so important to create safeguards contractually, to protect against this potential erosion of control that could happen.”

Williams, however, is among those who despair that control has already been lost.

The great unknown is where exactly the data training AI models is coming from. Lawrence-Archer said that remained a closely guarded secret, but needed to be exposed. He also warned against reducing the issue to extra compensation for performers.

“The AI industry relies on large amounts of data,” he said. “Someone is gathering it. We know that these questions are very sensitive for AI developers and studios. We have supported actors to make these data access requests, trying to learn more. I personally know of actors who have been paid off by AI companies in order to withdraw those requests.

“We need to be building a world in which the human creativity, connection and performance of actors carries on being valued. If we focus only on legal and compensation questions, there’s a risk that you end up with actors becoming data gig workers, rather than the creative performers they are.”
