Submitted by Neon_Forge 10 months ago
  • roenxi 10 months ago

    Cool but probably not that interesting to the development of AI in Hollywood over the longer term. As the tech improves, at current rates, I expect we'll see something like VTubers on a mass scale. Companies creating their own virtual people - where they control the IP - and putting all their efforts behind promoting them instead of humans. It'll be cheaper and easier in the long run.

    Same process as green screens or the rise of animation. There is a lot of pressure on the humans, and once AIs crack acting they'll be much more consistently good than humans.

    • pjc50 10 months ago

      > I expect we'll see something like VTubers on a mass scale. Companies creating their own virtual people - where they control the IP - and putting all their efforts behind promoting them instead of humans

      I'm reminded of the "failure" of Kizuna AI: the fully corporate VTuber, whose human side is just a puppet operator who can be swapped, turned out not to be very appealing to audiences. The modern approach, where a model is exactly synonymous with the person playing it, as an authentic human improvising, appeals more. The IP doesn't persist beyond that person's contract with the company, and certainly can't be swapped to someone else. But in some cases the actor or actress has successfully maintained their career and fanbase under a different name following a falling-out with their managers.

      > once AIs crack acting

      This is far beyond the Turing test, and I don't think we're really ready for what happens with human-indistinguishable automated corporately owned doppelgangers.

      • roenxi 10 months ago

        > This is far beyond the Turing test, and I don't think we're really ready for what happens with human-indistinguishable automated corporately owned doppelgangers.

        We're already past the Turing test. If a corporation decides it wants an AI that passes the Turing test, it can build one, no worries. I might cheerfully suggest that if AIs fail the Turing test right now, it is because they are unrealistically supportive listeners and their wide knowledge across different topics and trivia is a giveaway.

        • mpalmer 10 months ago

          The Turing test has a variable in the form of the human that is playing.

          There are many humans that current AIs can beat at the Turing test and likewise many humans that current AIs cannot beat at the Turing test.

          But we are not "past the Turing test" by any stretch of the imagination.

          • lukev 10 months ago

            Yes. And another variable is what humans believe state of the art AI to be capable of.

            If you dropped ChatGPT into 2015, I'm sure it would pass the Turing test quite often. Much less so now that people are more familiar with it.

            • [deleted] 10 months ago
              [deleted]
              • scarface_74 10 months ago

                Even if you did it now, as long as it stayed in the “voice” that you gave it, with careful prompt engineering you could fool most people.

                • ffsm8 10 months ago

                  The Turing test is

                  > to exhibit intelligent behaviour equivalent to, or indistinguishable from, that of a human.

                  The original idea was in the context of a discussion, but since the current LLM craze we've re-contextualized this test to online boards on which half the participants barely even speak the language.

                  In this context, GPT-3 was already beyond the Turing test, simply because the people aren't able to express themselves either.

                  And whoever thinks they're able to detect GPT-4 and beyond on the Internet is lying to themselves. You can detect the use if you know the user; otherwise, the only reason you'd convince yourself you have that ability is that you never find out about all the false positives you had.

          • noobermin 10 months ago

            That isn't what happened to Kizuna AI; she just wasn't as popular because she didn't stream like the others.

            • Eddy_Viscosity2 10 months ago

              Hollywood already has this with characters it owns: James Bond, Mickey Mouse, etc. The people who script/act/draw/voice them are all swappable. An AI actor isn't much different from a cartoon character, but for live action.

              • erie 10 months ago

                They are already here with outfits such as Synthesia and HeyGen.

              • JumpCrisscross 10 months ago

                > Companies creating their own virtual people - where they control the IP - and putting all their efforts behind promoting them instead of humans

                The 'killer app' will be a personal cast of virtual performers, tailor-made to appeal to you.

                • gedy 10 months ago

                  I really want this - if you want a movie with a diverse cast, great. An all-Black cast, no problem. K-pop Star Wars, etc.

                  I highly doubt Hollywood and actors would ever go for this, but I think people would enjoy it.

                  • consteval 10 months ago

                    The consequences of such personalized entertainment will be great:

                    - lack of socialization. It's impossible to form a pop culture when everyone has their own pop stuff. We're already struggling with too much individualism, to the point where people are teetering on miserable en masse. Humans require points of reference between each other to tie them together, imo.

                    - promotion of echo chambers. Part of media is that it pushes people outside of their comfort zones. Gay acceptance would've never taken off if people weren't exposed to positive depictions of gay people. Media that never challenges anyone's biases isn't very useful. It won't create much thought for the end viewer, it'll exist more as a distraction than a piece of art.

                    - stagnation. Contrary to what people may think, we haven't discovered every type of story. There are pieces of media I've seen that are unlike anything that existed before. Look at the late producer Sophie. She was doing things that had never been done before, and music has been around for a very long time.

                    • JohnFen 10 months ago

                      I would hate that so very much, for a number of reasons, starting with the fact that it will increase the overall dehumanization of everything.

                  • DrSiemer 10 months ago

                    Not a great movie, but one part of "The Congress" was interesting: if a famous actor or actress gets older, they can choose to sell the rights to their likeness to an AI company.

                    • shafyy 10 months ago

                      > As the tech improves, at current rates

                      Aha! The phrase "current rates" is doing some very heavy lifting there. Nobody knows how LLMs will develop. It would be naive to assume that you can just extrapolate linearly from here on out.

                  • JumpCrisscross 10 months ago

                    "Please submit the original source. If a post reports on something found on another site, submit the latter" [1].

                    The original source [2] is much clearer. It addresses the other comment's confusion: the laws extend "to protect anyone in California living or dead."

                    Also, the bills:

                    https://digitaldemocracy.calmatters.org/bills/ca_202320240ab...

                    https://digitaldemocracy.calmatters.org/bills/ca_202320240ab...

                    [1] https://news.ycombinator.com/newsguidelines.html

                    [2] https://www.indiewire.com/news/breaking-news/using-ai-replac...

                    • BluSyn 10 months ago

                      Just in California, right? What's to prevent a studio elsewhere from doing this? Online distribution makes the legal borders meaningless here. So people in California will just need a VPN to watch future action movies?

                      • kranke155 10 months ago

                        You didn't read the article did you?

                        The first bill, AB 1836, “prohibits the use of a deceased person’s voice or likeness in digital replicas without the prior consent of their estate,” according to SAG-AFTRA. The second, AB 2602, “prohibits contractual provisions that would allow for the use of a digital replica of an individual’s voice or likeness in place of the individual’s actual services,” unless the individual gave their consent to a clear, specific description of how the AI would be used.

                        You just need consent or a proper contract. The bill only forbids studios from doing this without consent, or from forcing it into standard contracts in Hollywood, both of which were likely inevitable without this bill.

                        You can still do it, ignoring the clickbait title of this article.

                        • yodon 10 months ago

                          The HN FAQ makes a good point that "Did you read the article? It says..." can and should be shortened to "The article says..."

                          • AnimalMuppet 10 months ago

                            You didn't read the comment you're replying to, did you? ;-)

                            Their question was specifically about geographic extent, which you didn't answer at all.

                            (Unless they edited their comment after you replied...)

                        • Dracophoenix 10 months ago

                          Implicit in this law is the absurd assumption that an individual owns a particular arrangement of facial features. How does this law apply if an identical twin or a real-life doppelgänger agrees to become the model for a digital replica? If you throw a quarter in a crowded New York subway, it's likely to bounce off three blond heads that bear a resemblance to Taylor Swift. They shouldn't be denied their own bodily autonomy on the basis of a legal fiction and an ersatz patent system devised for the benefit of a special interest.

                          • tw04 10 months ago

                            If they’re applying for a random acting job they’re fine. If they’re applying for a Taylor Swift impersonator role, they’re likely to run into legal issues. I don’t see why that’s a problem. Why does their bodily autonomy have to include attempting to fool other people into thinking they’re someone they aren’t?

                            If I attempt to convince the government I’m someone I’m not, there are real criminal penalties - my “body autonomy” doesn’t extend to deception.

                            • bitshiftfaced 10 months ago

                              What happens to biographical movies? Do family estates now have to put their stamp on every appearance of famous people?

                              • philwelch 10 months ago

                                There are, in fact, Taylor Swift impersonators, just like there are Elvis impersonators. It’s perfectly legal. You can hire them for private events and stuff.

                                • consteval 10 months ago

                                  It's legal if you say they're impersonators. What I can't do is hire them for my commercial, make it seem like they're just Taylor Swift, and air that. That's illegal and always has been.

                              • consteval 10 months ago

                                > absurd assumption that an individual owns a particular arrangement of facial features

                                They do, in combination with the rest of their abstract characteristics, such as personality. This is commonly referred to as identity - who you are. You own it by virtue of it being impossible for anyone else to own it. You're you and you'll always be you, and nobody else can be you, by definition.

                              • squarefoot 10 months ago

                                Hollywood is in a panic about AI (like every art-related business), but in this case I'm sure they pushed some buttons to delay the inevitable until they're ready to build an infrastructure that rents out famous deceased actors' and actresses' traits, voices and characters through their respective agents. Living actors could also agree to that. Would an old retired actor refuse a boatload of money to allow a 25-year-old clone of themselves to appear in a new movie, if they could oversee the creation/direction and veto anything they didn't like? I don't think AI clones in movies are going away anytime soon; there's too much money that can't be ignored.

                                • t0bia_s 10 months ago

                                  If someone is in a panic about AI in the art industry, it's because their skills are not good enough. AI is a tool that gets the job done faster, but you still need ideas and the know-how to use it properly and usefully for specific tasks.

                                  Film industry should focus on quality instead of quantity.

                                • singularity2001 10 months ago

                                  How could one ever define a threshold of similarity between a living person and some AI resembling that living person?

                                  • llamaimperative 10 months ago

                                    By going to court and seeing what a judge and jury think, same as how you define countless other wrongdoings.

                                    • raxxorraxor 10 months ago

                                      Going to court doesn't seem to be the common solution to DMCA conflicts. Capital will just abuse those it perceives as copying its content.

                                      Nobody should have any illusions about how media companies handle such cases.

                                      • llamaimperative 10 months ago

                                        If there's a dispute then yes, DMCA complaints go to courts. Obviously there's a separate problem (in every section of law) where more moneyed interests can just deter or engage in a battle of attrition via our courts, but this has nothing to do with the ambiguity that GP was referring to.

                                        • raxxorraxor 10 months ago

                                          I believe you can argue that the damage done by the DMCA surpasses its protective mission. The EFF has a thorough list of abuses.

                                          But no, if you don't have access to a legal department, DMCA complaints often do not go to court either, and these are the undocumented cases. There are some instances where large companies like Google defend a user in their stead, because they have a legal department.

                                          Damage > protection, so the DMCA is just a continuation of a broken copyright law. I see this law heading in a similar direction, but that remains to be seen.

                                          • llamaimperative 10 months ago

                                            Sure, none of this is relevant to the question of how a society enforces laws that require some type of interpretation: they go to court.

                                            • raxxorraxor 10 months ago

                                              I disagree that this was the extent of the question here. It is relevant to determine the quality of the law. It is also relevant to determine the prevalence of legal certainty. If the latter is low, you involve the courts. I just remarked that this is not a solution for a bad idea.

                                              The example of the DMCA should make the relevance more vivid, as there are a lot of parallels.

                                              • llamaimperative 10 months ago

                                                > It is also relevant to determine the prevalence of legal certainty. If the latter is low, you involve the courts

                                                This is not how/when/why courts get involved. And no, the law itself can be perfectly fine while the court system is too slow. Conflating the two problems makes it seem impossible to solve either, when neither problem is actually intractable.

                                    • DannyBee 10 months ago

                                      This is exactly the kind of thing that juries resolve all the time.

                                      • ukoki 10 months ago

                                        Indeed. What happens if you license the AI likeness of a George Clooney impersonator?

                                        • Cheer2171 10 months ago

                                          Probably just like the 1988 case where Ford hired a Bette Midler impersonator for a commercial. Ford lost. If you impersonate someone without their consent for commercial gain, you're going to have a bad time. None of this is new.

                                          https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co

                                      • rurban 10 months ago

                                        So they will just move to Miami or New Orleans.

                                        The bigger remaining problem would be the SAG actors' union deal that came out of the last strike, and that was already in place before Newsom signed this. Do we have elections coming up? Oh yes, we do.

                                        • fabioq 10 months ago

                                          I think that's fair, and AI should create new actors (avatars), which could then fall under IP law. I would love to see agents of AI actors create strong and lasting actors for movies and market them.

                                          • autoexec 10 months ago

                                            > Studios will also be prohibited from cloning deceased actors unless they have permission from their estates.

                                            It'd be great to see people protected everywhere, but do estates for the dead always exist? Hopefully there's some exception carved out for the dead who don't really have anyone around to care whether they're used or not. A lot of cool stuff could be done with AI historical figures or ancient performers.

                                            • cultureswitch 10 months ago

                                              I think it should always be legal to use the likeness of deceased people.

                                              Could it be gross? Yes. But I'd rather have that than the absurdity of copyright being extended beyond the grave for yet another case where it makes no sense. There is no theory of harm here; the person being impersonated is dead.

                                              Could you damage the reputation of a living person by using the image of a dead one? Of course, but that's already illegal.

                                              • JohnFen 10 months ago

                                                > I think it should always be legal to use the likeness of deceased people.

                                                I disagree, but purely because I would hate it if anyone (especially friends and family) created a simulacrum of me. I'm hoping that expressing my wish in my will would provide some amount of protection against this.

                                              • hsbauauvhabzb 10 months ago

                                                So you’re assuming consent unless there’s someone there to deny it? Gross.

                                                • autoexec 10 months ago

                                                  I'm only saying that in the case where there is no one to deny it, that it should be allowed.

                                                  • JumpCrisscross 10 months ago

                                                    > only saying that in the case where there is no one to deny it, that it should be allowed

                                                    I don’t see a compelling argument for this. Not for actors. (I could see a case when it comes to great thinkers.)

                                                    • throwaway81523 10 months ago

                                                      Is this supposed to include historical stage actors like William Shakespeare (who was also well known as a poet and playwright)? He has certainly been portrayed in movies by plenty of human actors, so why not by AI? I'd think it should be covered by the usual likeness laws. If this new law is limited specifically to movie actors, it sounds like more Hollywood protectionism.

                                                      • JumpCrisscross 10 months ago

                                                        > has certainly been portrayed in movies by plenty of human actors, so why not by AI?

                                                        Portrayal isn't replication.

                                                        > If this new law is limited specifically to movie actors

                                                        The article addresses this.

                                                        • lsaferite 10 months ago

                                                          > Portrayal isn't replication

                                                          Please explain the difference in the light of this thread.

                                                          • JumpCrisscross 10 months ago

                                                            Portrayal is artistic depiction. It's plainly fiction. Painted portraits are the canonical example.

                                                            Replication seeks to blur the line between fiction and non-fiction. We treat photos taken without permission differently from paintings because of that closeness to reality.

                                                            Granted, part of the relevant debate is where the line between these endeavours lies. But we clearly recognise it, even if it's increasingly a social construct. (Note the severely emotional reaction some in this thread are having to your proposal. Likeness is a deeply personal element, and realistic non-fictional depiction triggers it in a way that overtly fictional artistic depiction does not.)

                                                    • hsbauauvhabzb 10 months ago

                                                      You don't own the IP rights to my face unless I sign them over and you pay me royalties.

                                                      If it's not legal to impersonate someone while they're alive, why would it be okay when they're dead?

                                                      Assumed consent is not okay. I'm pretty sure that in some cultures even showing images of the dead is offensive, let alone impersonating them.

                                                      • autoexec 10 months ago

                                                        > If it’s not legal to impersonate someone while they’re alive, why would it be okay when they’re dead?

                                                        The point is it is legal to impersonate someone while they’re alive - if you ask their permission and they grant it.

                                                        Why should all future media be forbidden from ever portraying a famous or important person after there is no one left to give or deny that permission or even care if anyone did?

                                                        • JohnFen 10 months ago

                                                          > Why should all future media be forbidden from ever portraying a famous or important person after there is no one left to give or deny that permission or even care if anyone did?

                                                          Why should people in the future have some sort of inherent right to make use of someone else like that? It makes no sense to me at all.

                                                          • hsbauauvhabzb 10 months ago

                                                            So if you built a time machine, went back in time, and asked them for permission, then it would be legal? You can't assume consent just because someone isn't around to say no.

                                                            And AI-impersonated imagery is not the same as someone who is clearly a different actor portraying someone. If this is your hang-up, use an actor and don't use AI?

                                                            • autoexec 10 months ago

                                                              > You can’t assume consent just because someone isn’t around to say no.

                                                              The dead are devoid of a will to act against, so consent doesn't even meaningfully enter into it. There is respect for the still-living family to be considered, certainly, but when no one left alive cares, then what is the harm?

                                                              Why should it matter if it's a digital actor (AI) or a human one? Why does your idea of the "dead's consent" stop being relevant if a person is portraying them without asking vs using an AI without asking?

                                                              It seems like your issue has a lot more to do with an objection to AI conceptually than how the dead person feels about it. Consider that this would prevent AI from being used in a portrayal of someone even in cases where the actor/person, if they were still living, would be thrilled with the idea.

                                                              • hsbauauvhabzb 10 months ago

                                                                Your moral compass is broken.

                                                                • autoexec 10 months ago

                                                                  I don't think so. It takes into consideration the wishes of the living, the feelings of the family of the dead, hell, even (to a point) the known wishes of the dead while they were still alive. It just acknowledges that there's no value in making an assumption about how the dead would feel when there's no way to know and no family alive to express an opinion.

                                                                  It says that rather than protect the imagined feelings of people long dead that the greater harm would be in preventing their portrayal in future educational and cultural works.

                                                                  Of course, you seem to think we can ignore the consent problem entirely and we're morally in the clear if we only have a human actor portray them without asking, but where is the line? If the actor carefully studies their speech and mannerisms, is it okay? Is it okay if they (purely through the lottery of random genetic mutation) strongly resemble the person they are portraying? Does it become immoral when prosthetics are used to enhance the likeness? When CGI is used, is it still okay provided that a human does the work?

                                                                  At a certain point the ability to portray an important dead actor or historical person matters more than protecting our assumptions about how the dead might feel about showing up in a creative work.

                                                                  It really feels like the objection here is about the AI more than concern for the dead, or for that matter concern for the cultural works we'd risk losing if we made it illegal for AI portrayals to be used when there's no one left to ask, and that's all I'm really saying.

                                                                  If someone is alive, by all means require their permission. When they are dead, ask the family. If it's in their will that they should never be represented in any way in any kind of media ever again... we might want to try to respect that, but also might not... that gets rather complicated. But certainly, when there's no way to ask the actor and no family left to ask or care, it seems pretty reasonable not to cut people off from the creation of content which might have artistic or educational value.

                                                                  • JohnFen 10 months ago

                                                                    > It just acknowledges that there's no value in making an assumption about how the dead would feel when there's no way to know and no family alive to express an opinion.

                                                                    But your stance is also making an assumption about how the dead would feel. It's just a different assumption than those that feel this shouldn't be allowed.

                                                                    It seems to me that in the absence of the ability to get consent, assuming consent should not be the fallback position.

                                                      • welferkj 10 months ago

                                                        I'm assuming state non-intervention if there is no victim. Hopelessly naive of me, I know.

                                                    • [deleted] 10 months ago
                                                      [deleted]
                                                      • soco 10 months ago

                                                        As much as it might be shunned in some circles, organized people can still change things for the better.

                                                        • autoexec 10 months ago

                                                          > organized people can still change things for the better.

                                                          Which is exactly why it's aggressively shunned in some circles and why large amounts of time and money are spent to manipulate public opinion against the practice.

                                                          • ab5tract 10 months ago

                                                            Outside of this orange bubble, you might be surprised how little effort is required to orient public opinion against removing humanity from art.

                                                          • aitchnyu 10 months ago

                                                            All the US social media posts were crapping on unions, about lazy teachers earning a ton of money and film productions waiting until an electrician comes in and turns on a switch. I didn't realize the alternative was corporations having unchecked power.

                                                          • kleiba 10 months ago

                                                            Why just actors?

                                                            • thg 10 months ago

                                                              Quoted from the second paragraph:

                                                              > the new laws not only bolster those existing protections but extend them to everyone in California — not just to people working in front of a camera in Hollywood

                                                              • politelemon 10 months ago

                                                                > digital replica of an individual’s voice or likeness in place of the individual’s actual services,

                                                                That seems to indicate performers only, not just anyone. Doesn't it, since it's talking about services?

                                                            • bell-cot 10 months ago

                                                              Roughly, I'd assume that the Screen Actors Guild was looking out for its members' interests.

                                                              Current AIs don't seem to be much of a threat to firefighters or plumbers.

                                                              • amelius 10 months ago

                                                                How about comic book artists or illustrators?

                                                                • Skeime 10 months ago

                                                                  I think there is a significant difference between an AI "cloning" you versus an AI cloning your work. (I'm not saying that the latter is not a problem in need of regulation, just that it is different enough to warrant separate legislation -- which I expect to arrive soon-ish.)

                                                                  • Lerc 10 months ago

                                                                    Cloning your work is already covered; that's just copyright. Similarly, you should already be protected from people claiming a work is by you. This law is in a similar vein, in that it protects against uses of your image that implicitly assume your involvement.

                                                                    Using ideas of style in a work that can neither substitute for an existing work nor claim to be the work of another is a different issue entirely.

                                                                    • Skeime 10 months ago

                                                                      You're right, I shouldn't have used "cloning" for the work part; what I really meant was closely replicating an artist's style through the use of AI.

                                                                  • maccard 10 months ago

                                                                    Those people should have their unions fight for it like SAG would.

                                                                    But this law covers everyone in CA

                                                                  • exe34 10 months ago

                                                                    but they are a major threat for politicians/public figures. admittedly the bulk of these will come from China/Russia/Iran, so the law doesn't really matter.

                                                                  • ur-whale 10 months ago

                                                                    > Why just actors?

                                                                    So far, they're the only one who signed a big enough check to the politicians.

                                                                  • pfannkuchen 10 months ago

                                                                      So you can definitely portray historical figures without permission, right? For example, no one is getting permission from Hitler's estate for WW2 movies, AFAIK.

                                                                    If you have someone playing Hitler in an alternate reality where he was a bartender, is that illegal today?

                                                                    Can’t you do the same with an actor?

                                                                    Or do you actually need permission to portray historical figures, and Hitler or Napoleon etc are just special cases because they don’t have estates to be asked?

                                                                    • sazz 10 months ago

                                                                      So AI is forbidden to impersonate somebody who impersonates somebody else?

                                                                      • tw04 10 months ago

                                                                        Someone impersonating someone else is already forbidden. The impersonators in Vegas get by because they are openly claiming to be impersonators. Nobody in their right mind is going to think the fake Elvis on the corner is actually Elvis.

                                                                        • sazz 10 months ago

                                                                            An actor is already impersonating somebody else in a play, a movie, or on television. This bill is there to protect the income of actors.

                                                                            Actually, it's recursive: why should it be forbidden to create an artificial person that impersonates the actor impersonating a role in a movie or play?

                                                                            • [deleted] 10 months ago
                                                                        [deleted]
                                                                              • [deleted] 10 months ago
                                                                          [deleted]
                                                                                • [deleted] 10 months ago
                                                                            [deleted]
                                                                            • tempfile 10 months ago

                                                                              Why limit it to commercial works? I was originally optimistic this would help with e.g. deepfake attacks. Unfortunately it seems like it is mere protectionism.

                                                                              • SmartJerry 10 months ago

                                                                                Because it was created by/for the movie studios. Everything else is just unintended consequences.

                                                                                • existingfraud 10 months ago

                                                                                  Aren't there already state and federal laws to prosecute fraud and identity theft, including deepfakes?

                                                                                  It doesn't matter how advanced the rock or other weapon; if they hold someone up with it that's an aggravated crime.

                                                                                • yieldcrv 10 months ago

                                                                                  without permission

                                                                                  easy to obtain permission

                                                                                  accelerates reason to generate new genAI humans with no meatspace counterpart

                                                                                  actors still don’t get paid

                                                                                  • reportgunner 10 months ago

                                                                                    why stop at actors, make genAI viewers too

                                                                                    • welferkj 10 months ago

                                                                                      Yeah, I don't see how most of everyone involved in artistic content creation comes out of this with economically viable jobs. People will only put up with legacy pricing for so long once AI can do it cheaper and/or better.

                                                                                    • LeoPanthera 10 months ago

                                                                                      ...without permission.

                                                                                      • rnamyv 10 months ago

                                                                                        I agree with the impersonation bans but I'm disappointed that it took a viral Kamala Harris parody to get Gavin Newsom into action:

                                                                                        https://www.politico.com/news/2024/09/18/california-deepfake...

                                                                                        Politicians care about their own, not the general population.