{"id":15790,"date":"2024-11-01T08:04:30","date_gmt":"2024-11-01T09:04:30","guid":{"rendered":"https:\/\/medexperts.pro\/?p=15790"},"modified":"2024-11-01T09:24:13","modified_gmt":"2024-11-01T09:24:13","slug":"what-if-a-i-is-actually-good-for-hollywood","status":"publish","type":"post","link":"https:\/\/medexperts.pro\/?p=15790","title":{"rendered":"What if A.I. Is Actually Good for Hollywood?"},"content":{"rendered":"<div><\/div>\n<div class=\"css-s99gbd StoryBodyCompanionColumn\" data-testid=\"companionColumn-0\">\n<div class=\"css-53u6y8\">\n<p class=\"css-at9mc1 evys1bk0\">The Los Angeles headquarters of Metaphysic, a Hollywood visual-effects start-up that uses artificial intelligence to create digital renderings of the human face, were much cooler in my imagination, if I\u2019m being honest. I came here to get my mind blown by A.I., and this dim three-room warren overlooking Sunset Boulevard felt more like the slouchy offices of a middling law firm. Ed Ulbrich, Metaphysic\u2019s chief content officer, steered me into a room that looked set to host a deposition, then sat me down in a leather desk chair with a camera pointed at it. I stared at myself on a large flat-screen TV, waiting to be sworn in.<\/p>\n<p class=\"css-at9mc1 evys1bk0\">But then Ulbrich clickety-clicked on his laptop for a moment, and my face on the screen was transmogrified. \u201cSmile,\u201d he said to me. \u201cDo you recognize that face?\u201d I did, right away, but I can\u2019t disclose its owner, because the actor\u2019s project won\u2019t come out until 2025, and the role is still top secret. Suffice it to say that the face belonged to a major star with fantastic teeth. \u201cSmile again,\u201d Ulbrich said. I complied. \u201cThose aren\u2019t your teeth.\u201d Indeed, the teeth belonged to Famous Actor. 
The synthesis was seamless and immediate, as if a digital mask had been pulled over my face that matched my expressions, with almost no lag time.<\/p>\n<p class=\"css-at9mc1 evys1bk0\">Ulbrich is the former chief executive of Digital Domain, James Cameron\u2019s visual-effects company, and over the course of his three-decade career he has led the VFX teams on several movies that are considered milestones in the field of computer-generated imagery, including \u201cTitanic,\u201d \u201cThe Curious Case of Benjamin Button\u201d and \u201cTop Gun: Maverick.\u201d But in Ulbrich\u2019s line of work, in the quest for photorealism, the face is the final frontier. \u201cI\u2019ve spent so much time in Uncanny Valley,\u201d he likes to joke, \u201cthat I own real estate there.\u201d<\/p>\n<p class=\"css-at9mc1 evys1bk0\">In the spring of 2023, Ulbrich had a series of meetings with the founders of Metaphysic. One of them, Chris Ume, was the visual-effects artist behind a series of <a class=\"css-yywogo\" href=\"https:\/\/www.tiktok.com\/@deeptomcruise\" rel=\"noopener noreferrer\" target=\"_blank\">deepfake Tom Cruise videos<\/a> that went viral on TikTok in early 2021, a moment many in Hollywood cite as the warning shot that A.I.\u2019s hostile takeover had commenced. But in parts of the VFX industry, those deepfake videos were greeted with far less misgiving. They hinted tantalizingly at what A.I. could soon accomplish at IMAX resolutions, and at a fraction of the production cost. That\u2019s what Metaphysic wanted to do, and its founders wanted Ulbrich\u2019s help. 
So when they met him, they showed him an early version of the demonstration I was getting.<\/p>\n<\/div>\n<\/div>\n<div data-testid=\"Dropzone-1\"><\/div>\n<div class=\"css-s99gbd StoryBodyCompanionColumn\" data-testid=\"companionColumn-1\">\n<div class=\"css-53u6y8\">\n<p class=\"css-at9mc1 evys1bk0\">Ulbrich\u2019s own career began during the previous seismic shift in the visual-effects field, from practical effects to C.G.I., and it was plain to him that another disruption was underway. \u201cI saw my career flash before my eyes,\u201d Ulbrich recalled. \u201cI could take my entire team from my former places of employment, I could put them on for eternity using the best C.G.I. tools money can buy, and you can\u2019t deliver what we\u2019re showing you here. And it\u2019s happening in milliseconds.\u201d He knew it was time to leave C.G.I. behind. As he put it: \u201cHow could I go back in good conscience and use horses and buggies and rocks and sticks to make images when this exists in the world?\u201d<\/p>\n<p class=\"css-at9mc1 evys1bk0\">Back on Sunset Boulevard, Ulbrich pecked some more at his laptop. Now I was Tom Hanks \u2014 specifically, a young Tom Hanks, he of the bulging green eyes and the look of gathering alarm on his face in \u201cSplash\u201d when he first discovers that Daryl Hannah\u2019s character is a mermaid. I can divulge Hanks\u2019s name because his A.I. debut arrived in theaters nationally on Nov. 1, in a movie called \u201cHere.\u201d Directed by Robert Zemeckis, written by Zemeckis and Eric Roth \u2014 a reunion of the creative team behind \u201cForrest Gump\u201d \u2014 and co-starring Robin Wright, \u201cHere\u201d is based on a 2014 graphic novel that takes place at a single spot in the world, primarily a suburban New Jersey living room, over several centuries. 
The story skips back and forth through time but focuses on a baby-boomer couple played by Hanks and Wright at various stages of their lives, from age 18 into their 80s, from post-World War II to the present day.<\/p>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>The Los Angeles headquarters of Metaphysic, a Hollywood visual-effects start-up that uses artificial intelligence to create digital renderings of the human face, were much cooler in my imagination, if I\u2019m being honest. I came here to get my mind blown by A.I., and this dim three-room warren overlooking Sunset Boulevard felt more like the slouchy offices of a middling law firm. Ed Ulbrich, Metaphysic\u2019s chief content officer, steered me into a room that looked set to host a deposition, then sat me down in a leather desk chair with a camera pointed at it. I stared at myself on a large flat-screen TV, waiting to be sworn in.<\/p>\n","protected":false},"author":1,"featured_media":15792,"comment_status":"close","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-15790","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-technology"],"_links":{"self":[{"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/posts\/15790","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/medexperts.pro\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=15790"}],"version-history":[{"count":2,"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/posts\/15790\/revisions"}],"predecessor-version":[{"id":15793,"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/posts\/15790\/revisions\/15793"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/media\/15792"}],"wp:attachment":[{"href":"https:\/\/medexperts.pro\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=15790"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/medexperts.pro\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=15790"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/medexperts.pro\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=15790"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}