Eric Shamlin, Television Academy governor and chair of its AI Task Force, urged members to take on an advocacy role in developing AI policies and frameworks, during a keynote at Saturday’s TV Academy AI Summit. “We must now lead,” he asserted. “We can’t afford to sit back and wait for others to shape how AI is implemented into our industry.”
Added Shamlin, who is also CEO of AI-driven entertainment studio Secret Level, “This means advocating for responsible AI policies, collaborating with guilds, unions and studios, ensuring our members aren’t left behind in the transition, and pushing for ethical AI use that empowers creatives, not just corporations.”
The half-day program was presented to a full house at the TV Academy’s Saban Media Center Wolf Theatre. Summit topics included legal issues, the impact on jobs, and new tools, as well as use cases.
Addressing the thorny topic of job retention, several speakers opined that some jobs will go away or change while new ones will emerge, but that AI still requires creatives. Ed Ulbrich, a Digital Domain alum who is now chief content officer and president of production at Metaphysic.ai, remembered a day on the set of “The Curious Case of Benjamin Button” when director David Fincher joked, “someday, we’re going to be making CG people on a laptop in real time.” Ulbrich asserted that that time is here, but creatives are still needed. “These powerful tools in the hands of great artists are yielding amazing things,” he said, adding that all roles from actors to cinematographers are “highly relevant in optimizing AI.”
Ulbrich advocated for pros to learn AI. “Your skills are valuable,” he said, while reminding young people, “learn filmmaking, learn how to tell stories. The tools are all going to change.”
Echoed producer Christina Lee Storm, “I really feel that the craft is necessary, because that’s what’s going to elevate these tools.”
Industry vet Barbara Ford Grant noted that while there are many ways that content now reaches consumers, “I do feel like really good stories cut through, and people who really know their craft and are embracing these new tools are getting there faster.” She also noted that she is seeing a shift with “more actual filmmakers versus the team around the filmmaker directly operating these tools.”
Stephen Fefferman, executive vice president and deputy general counsel, business & legal affairs at Paramount, discussed key areas of AI concern for studios. The first, simply put, is that they don’t want to get sued for copyright infringement. He elaborated, “They’re concerned that if they use tools that have been trained on works for which an AI company, the tool maker, does not have our licenses, that that underlying material could end up in the output and then eventually get the studio sued.”
Second, he pointed out that when studios spend millions to produce a new series or movie, they obviously want to be able to monetize it. “What they really want at the end of the day is a piece of paper from the United States Copyright Office saying that they actually own the copyright in that resulting outcome, and that means that they can license it, exploit it around the world, make millions of dollars for the studio and for all people who worked on it.”
Legal topics included fair use. On this subject, Dave Davis, general manager at Protege Media, pointed out that these AI models are global. “One of the things that we say to AI companies is you can go try to figure out how to navigate 191 different copyright regimes and figure out the exceptions in each one for your models that are going worldwide, or you can just license content and sign one piece of paper and write a check. And that seems simpler.”
Throughout the summit, speakers urged industry pros to be proactive in the AI field. Warned Loyola Law School’s Julie Shapiro, “You cannot turn your back on the fact that this is happening.”