Schmidt's Copy-and-Conquer Crashes into AI Copyright Reality Check

Update: 2024-08-23

Everyone's talking about Eric Schmidt's blunt Silicon Valley video. Still, few connect it to AI, or remember that the artists' work is in these AI models for nothing.

This episode is about how the AI industry's "copy and conquer" mentality is crashing into the reality of copyright law.

First, the AI leaders came for our content, taking it without permission. After all, it's the internet. As Eric Schmidt brazenly put it:

"The example that I gave of the TikTok competitor. And by the way, I was not arguing that you should illegally steal everybody's music.

What you would do if you're a Silicon Valley entrepreneur, which hopefully all of you will be, is if it took off, then you hire a bunch of lawyers to go clean the mess up, right?

But if nobody uses your product, it doesn't matter that you just stole all the content. Do not quote me on that, right?"

This mindset isn't just limited to social media. It's pervasive in the AI industry, especially regarding training data.

Then they came for our jobs. AI first, because who needs arrogant programmers?

“Imagine a non-arrogant programmer that actually does what you want, and you don't have to pay all that money to.”

Finally, if you get in their way, they send lawyers. In a minute, we'll share a whole bunch of examples.

The Big Tech Goliath Bias – Act Fast and Sue People Who Get in Your Way

This billionaire view of AI took a needed hit this week, not just from Eric Schmidt.

The artists' lawsuit against AI image generators like Stable Diffusion and Midjourney may force them to reveal how their black boxes work and shed the cloak of business privacy.

Here’s the actual lawsuit document:

Just like with Schmidt's video, the groundswell is rising. Listen to Adam Conover at the recent Animation Guild rally:

"But the fact is, it is a lie.

Your work makes these people hundreds of millions of dollars. You work. They need you."

That's the sound of people fed up with being used by AI leaders who claim they have the right, under what's called fair use, to grab what others create because how else would they build their business?

Poor billionaires. But the artists are really the ones being left out in the cold while the AI leaders preach about bias and ethics.

They have a ton of negative bias against people with jobs. And when it comes to their own conduct and ethics, it's out the door.

You can see it in their scraping actions without permission or compensation. That's how we do things in the valley, right?

We're trying to get at a truth that unicorn companies won't simply tell you.

Artists are the Mighty Underdog to Big Tech

The artists' lawsuit against AI image generators is a classic David versus Goliath scenario. Sarah Andersen, Kelly McKernan, Karla Ortiz, Hawke Southworth, Grzegorz Rutkowski, Gregory Manchess, Gerald Brom, Jingna Zhang, Julia Kaye, and Adam Ellis have, on behalf of all artists, accused Midjourney, Runway, Stability AI, and DeviantArt of copying their work by offering AI image generation based on the open-source Stable Diffusion AI model.

The artists allege that Stable Diffusion uses their copyrighted works in violation of the law. The model was allegedly trained on LAION-5B, a dataset of more than 5 billion images scraped from across the web in 2022.

Stability AI CEO Emad Mostaque described the scraping of artists' images without permission as

"a collaboration that we did with a whole bunch of people.

We took 100,000 GB of images and compressed it into a two-gigabyte file that can recreate any of those images and iterations of those."

If that statement is literal, isn't that a knowing violation of copyright?

Emad hasn't been the CEO since March 2024, resigning following an investor mutiny and staff exodus that left the one-time tech darling in turmoil.

This is not ethical behavior, is it? He didn't bother to pay the artists or let them know, and neither did the other defendants who used the data, like Midjourney and DeviantArt.

The AI companies' defense largely rests on the concept of fair use. They argue that their use of copyrighted material falls under section 107 of the Copyright Act, which allows for certain uses of copyrighted material without permission for purposes such as criticism, comment, news reporting, teaching, scholarship, and research.

OpenAI even went as far as to say,

"because copyright covers virtually every form of expression, it would be impossible to train today's AI models"

without using copyrighted materials, for free.

This statement reveals the arrogance and entitlement that permeates the AI industry.

So you make hundreds of millions of dollars but can't pay for it because it wouldn't work.

In fact, so far they're not really proving much of a business model. And they're the ones, like ChatGPT's maker OpenAI, making all the money.

Now, does that sound ethical to you, or just like Silicon Valley? Emphasis on the con.

Fair Use or Fair Abuse of Copyright?

Despite the odds stacked against them, the artists have scored a small but significant victory.

Judge William Orrick allowed parts of the lawsuit to proceed to the discovery phase, which means AI companies will have to open up their black boxes and reveal more details about their training datasets and technologies.

Judge Orrick stated:

"This is a case where the plaintiffs allege that Stable Diffusion is built to a significant extent on copyrighted works and that the way the product operates necessarily invokes copies or protected elements of those works."

He further added:

"Whether true and whether the result of a glitch, which is what Stability contends, or by design (plaintiffs' contention) will be tested at a later date."

This decision is crucial because it allows the artists' lawyers to examine documents from the AI image generator companies, potentially revealing more about how these systems were built and trained.

The judge's decision means two important things.

First, the allegations of induced infringement are sufficient to move the case forward to discovery.

Now, the lawyers for the artists can peer inside and examine documents from the AI image generator companies, revealing more details about their training data sets, their tech, and how we got here in the first place.

Private companies don't have to share that information unless they're credibly accused of doing something illegal, like violating copyright law.

Second, this is not a final ruling. The case has a ways to go; the decision just means the claims have enough merit to move forward into that deeper discovery.

That's what makes this a huge victory. Now they're going to have to open those AI black boxes and reveal not only how the tech works but also the decisions that went into grabbing those materials.

One Small Legal Victory, One Giant Challenge to the AI Industry

This lawsuit is a watershed moment for the AI industry. It's a crack in the wall of Big Tech dominance and could lead to more accountability in the future.

 It's about time that the AI industry started respecting the people who create the content it so readily uses without permission or compensation.

We should thank Eric Schmidt for his transparent wake-up call. His comments reveal how many in Silicon Valley talk in private: prioritizing speed and profit over ethics and fair competition.

As AI continues to evolve, will today's small victories lead to a more balanced and fair tech industry tomorrow?

We need to support these artists. Go on social media and back them; they're putting their time, and probably their money, into this effort. These small victories can lead to significant change.

There could be an uprising going on. Or maybe I'm just crazy, and AI is in control, and we should just let them attack, steal, and sue and accept that as reality. But I don't think so.

Reach out to these artists.

Let them know you care. And even more, let the AI companies know that they should reach into their pockets when they get billions of dollars of funding and find a way to pay for the content that determines their output.

That may well violate copyright law; either way, it's not ethical. It's time for the AI industry to start practicing the ethics it so often preaches about.

Maybe Ethics Matter – Will AI Do the Right Thing?

You can't make good tech with bad intentions. In their defense, two parts of the lawsuit were thrown out.

One is under the Digital Millennium Copyright Act of 1998. The DMCA claim was thrown out because it mixed up the copyright in the tech with the copyright in the content.

Stability said its tech is copyrighted, but the plaintiffs claim their content is part of that tech. The judge said those are two different copyrights.

The plaintiffs also brought a claim called unjust enrichment. I'll save you the legalese; go to my Substack and you can read it for yourself. A deep legal hole, and they lost that one as well.

And now, as the case possi


Declan Dunn