As policy makers in the UK weigh how to regulate the AI industry, Nick Clegg, former UK deputy prime minister and former Meta executive, claimed a push for artist consent would “basically kill” the AI industry.
Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models. But he claimed it wasn’t feasible to ask for consent before ingesting their work.
“I think the creative community wants to go a step further,” Clegg said according to The Times. “Quite a lot of voices say, ‘You can only train on my content, [if you] first ask’. And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data.”
“I just don’t know how you go around, asking everyone first. I just don’t see how that would work,” Clegg said. “And by the way if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight.”
Then please, ask them.
Very good. Please do that. Now.
Good
But why won’t anyone think of the AI shareholders…
I do think of them! Though, I’m lucky that thoughts aren’t subject to § 212 StGB.
Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models.
No, it should be the opposite. The creative community should have to opt in. AI can run off the uploaded pieces. Everything else is theft.
But he claimed it wasn’t feasible to ask for consent before ingesting their work.
What the fuck…?! Send a fucking email. If you don’t get an answer, then it’s a “No”. Learn to take no for an answer.
The big issue is that they don’t just “not ask” — they also actively ignore it if someone says “no” upfront, e.g. in a robots.txt.
Yeah, if they can’t be bothered to check for an opt-in, how can we trust them to respect an opt-out?
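For reference, the opt-out mentioned above is just a plain-text robots.txt file served at a site’s root. A minimal sketch blocking two AI crawlers — GPTBot and Google-Extended are real, documented user-agent tokens, though whether any given crawler honors them is exactly the point of contention here:

```text
# robots.txt — served at https://example.com/robots.txt

# Block OpenAI's training crawler
User-agent: GPTBot
Disallow: /

# Block Google's AI-training token
User-agent: Google-Extended
Disallow: /

# Everyone else may crawl normally
User-agent: *
Allow: /
```

Note that compliance is entirely voluntary: the file is a request, not an access control, which is what the comments above are complaining about.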
I’m starting to think we need to reframe this a little. Stop referring to “artists”. It’s not just lone, artistic types that are getting screwed here, it’s literally everyone who has content that’s been exposed to the Internet. Artists, programmers, scientists, lawyers, individuals, companies… everyone. Stop framing this as “AI companies versus artists” and start talking about it as “AI companies versus intellectual property right holders”, because that’s what this is. The AI companies are choosing to ignore IP law because it benefits them. If anyone, in any other context, tried to use this as a legal defense they would be laughed out of the courtroom.
And adhering to the law would kill my thriving “pay me a dollar and I allow you to club a billionaire to death”-business. So what?
Or maybe the solution is in dissolving the socio-economic class hierarchy, which can only exist as an epistemic paperclip maximizer. Rather than also kneecapping useful technology.
I feel much of the critique and repulsion comes from people without much knowledge of either art/art history or AI, nor of the problems and history of socio-economic policy.
Monkeys just want to be angry and throw poop at the things they don’t understand. No conversation, no nuance, and no understanding of how such behaviours roll out the red carpet for continued ‘elite’ abuses that shape our every aspect of life.
The revulsion is justified, but misdirected. Stop blaming technology for the problems of the system, and start going after the system that is the problem.
IMO the tech bros’ main goal for this technology is to use it to manipulate public opinion on social media. It is perfect for that, and the “daydreaming” (bullshitting) fits right in.
Notice that all the social media companies are involved in it: Twitter was “sold” to xAI, there was the recent incident with Grok going off about South African apartheid, the proposed 10-year ban on states regulating it, etc.
They talk about it increasing productivity (and are hoping it can be used for that too), but if people knew it was meant for disinformation, they would be even more against waiving copyright for it.
Well then maybe the AI industry deserves to die.
This is true almost every time someone says “but without <obviously unethical thing>, these businesses couldn’t survive!” Same deal with all the spyware that’s part of our daily lives now. If it’s not possible for you to make a smart TV without spying on me, then cool, don’t make smart TVs.
If your business model crumbles under the weight of ethics, then fuck your business model and fuck you.
There’s a big difference between generative image AI and AI for, let’s say, the medical industry (DeepMind etc.).
And yes, you can ban the first without the other.
Going after AI as a whole makes no sense, and this politician also makes it seem like it’s all the same.
Saying “AI” is like just saying “the internet” when what you actually want to ban is a specific site.
There is a very interesting dynamic occurring, where things that didn’t used to be called AI have been rebranded as such, largely so companies can claim they’re “using AI” to make shareholders happy.
I think it behooves all of us to stop referring to things blanketly as AI and instead name specific technologies and companies as the problem.
Just call it ML then, like we used to — that’s what describes it best.
Perhaps the government should collect money from the AI companies — they could call it something simple, like “taxes” — and distribute it to anyone who has ever written something that made its way onto the internet (since we can reasonably assume that everything posted online has by now been sucked into the slop machines).
What’s a fucking shocking idea right? My mind is blown and I’m sure Mr. Clegg would be ecstatic when we tell him about it! /s
Greedy dumb mfkers.
I think the primary goal of LLMs is to use them on social media to influence public opinion.
Notice that all companies that have social media are heavily invested in it. Also, the recent fiasco with Grok talking about South African apartheid without being asked shows that such functionality is being added.
I think talk of it replacing white-collar jobs is a distraction. Maybe it can replace some, but the “daydreaming” (such a nice word for bullshit) I think makes the technology not very useful in that direction.
In other news, asking Nick Clegg before emptying out his home would kill the robbery industry.
would ‘kill’ the AI industry
And nothing of value was lost.
"you would basically kill the AI industry in this country overnight.” Cutting through to the heart of the issue here, economic FOMO. “If we don’t steal this data, someone else will”.
I feel the same way about my Linux isos
Nick Clegg says asking artists for use permission would ‘kill’ the AI industry
I fail to see any downside to this.