Comments

Well, apparently this article answers the query in the title: "No, she can't".

And really, who cares? AI is a threat on so many deeper levels that articles like this are the social equivalent of the Law of Triviality, otherwise known as the bike-shed problem.

I don't give a shit where Scarlett Johansson puts her goddamned bicycle... AI is more dangerous than any of this...


Who is Scarlett and why would we care what she thinks?... Her job is to fake being someone else in films that aren't real... I'm more interested in how AI can create killing machines than I am in an actress's voice being faked...


"[Altman] cryptically tweeted the word Her—in an apparent reference to the 2013 film starring Johansson as [the voice of] an AI chatbot who falls in love with Joaquin Phoenix."

River Page, did you watch the movie, Her? It would appear not.

The lonely man played by Phoenix falls in love with the AI app that he uses, not the other way around. That is the point of the movie: the danger of 'human simulation' by a computer program supplanting the role of another human in a relationship, and how that can go awry when the human finds that the app is not going to give him what he wants (here, exclusivity).


lol, they should use HAL’s voice from 2001: A Space Odyssey


Altman openly admitted to using her voice. I don't understand the confusion. She should be paid, and the voice should be changed.


Yeah, for likeness and training, people need to be paid... eagerly awaiting the OpenAI/NYT case.


Can OpenAI license her voice from the movie studio that put out Her, rather than from the actor herself? Might be cheaper that way.


I would like to see a debate on the No AI Fraud Act here at TFP.

My initial impression is that it's wrong to deliberately mislead people into thinking that they are hearing from or seeing a specific person, celebrity or not. I wonder how it can be enforced and, of course, abused.


Labeling things AI does not change copyright/copyleft laws. AI will be in lawsuits for taking data away from creators without payment. Search engines have been playing with AI concepts for the last 20 years, and because they were a major way creators got hits and $$, their stealing of data was somewhat allowed. But when you cut the creators out of hits and $$, you're going to get a lawsuit; how they settle such suits will be interesting to see. This is just normal computing trying to push the boundaries and, of course, make money from it. Call it AI, call it search, call it clouds: it's all marketing and sales BS. From the outside it looks new, but insiders know nothing is really new!


So the one entity we don't want anywhere close to controlling digital platforms, people are now cheering on to protect us all. Is anyone watching what madness is being put out by Joe's bureaucracy?


My primitive "understanding" of AI is that it inherently depends upon being "trained" by massive amounts of appropriated data, the ultimate sources of which are largely uncompensated. If Ms Johansson chooses to challenge OpenAI and succeeds, it will be a Pyrrhic victory for the many millions of little people whose data has also been scraped but who lack the resources to do anything about it.


Yes, that is AI: there are decision trees that govern what path or paths an AI takes, then the data lookup, and then finalizing the results. All of this is done in an optimized way, i.e., this is where they cheat in order to stay fast. All of the above points the process in a direction of the creator's choosing. AI is not thinking for itself; there just isn't the processor speed that could really do that. This is important: it's not thinking, it's just following code, data, and decision trees, all programmed into code or databases.


The power of artificial learning is in creating code that creates its own decision trees. Of course, it does so based on its programming.
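A minimal sketch of that idea (my own illustration, not anything from the article or thread), assuming Python with scikit-learn: the training algorithm derives its own decision tree from example data, within limits the programmer sets, rather than following branches someone wrote by hand.

```python
# A learned decision tree: the branching rules come from the data,
# not from hand-written if/else logic. Assumes scikit-learn is installed.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

# A small, well-known example dataset (iris flower measurements).
data = load_iris()
X, y = data.data, data.target

# Fit a tree: the split features and thresholds are chosen by the
# training algorithm, within limits the programmer sets (max_depth here).
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X, y)

# Print the tree the algorithm built for itself.
print(export_text(tree, feature_names=list(data.feature_names)))
```

The programmer still chooses the algorithm and its limits, which is the point about it doing so "based on its programming."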

People who are worried about what AI can do should be more worried about what the people who write AI allow it to do outside of the code. One example is "self-driving" cars: in most cases much safer than human drivers, but the mistakes they can make are not mistakes we expect from other drivers, so it's hard to share the road with them.

Another issue, as governments in the western world are doing their best to eliminate reliable energy grids, is that data centers require more and more power. Relying on applications that use modern data centers may be dangerous because they won't work when energy is rationed.

I'm not so worried about the legality of the training of AI. Everything we know that we haven't learned first-hand comes from appropriated history. It's not so much whether data is legally scraped. The important issue is whether the data is being scraped accurately. I do not trust the tech bros in California any more than I trust the bigots at MSNBC or The New York Times.


People in IT are never worried about the law until they get served; I was in the business for 40+ years and saw it a lot. AI will be served once the owners of data start seeing their data given away. I worked next to a 40-story building full of lawyers looking for the right case to sue, and talked to a few of them about how it all worked; that building was only one of three: Chicago, Germany, and AU, a 24-hour operation. They had their hands in a lot of pockets.

I used to build out data centers as a side job. There is always a data center that will give you 24x7 power, all for a price, and these types of data centers are across the country (and a lot of the world) with three forms of uninterrupted power; this is pretty standard these days, so power will always be there for the right price. If you have the money, you can run it without much worry. If you're a startup using some cloud, then I might worry. The big guys build clouds; the little guys use them.


If they wanted an actually attractive voice they wouldn't use Booties-Walking-on-Ground-Glass-and-Getting-Shredded-at-Every-Step Johansson. They would use a great voice, a relevant voice, a voice that everyone on Earth recognizes, a voice that matters.

The computer voice from the original Star Trek TV show.


Majel Barrett was married to Gene Roddenberry and, sadly, died 15 years ago. She played several roles on Star Trek, including the computer voice, Nurse Chapel, and a major character in the pilot who was essentially replaced by Spock.

According to many sources, Amazon's Alexa was designed to sound like her and the Alexa project's internal name was Majel.

Johansson did a nice job in Her, but I thought for sure that she was trying to sound like Barrett. Perhaps the Roddenberry estate could sue her, since Hollywood seems to be all about the lawsuit these days, now that fewer people are watching their movies.
