Photoshop's newest terms of service have users agree to allow Adobe access to their active projects for the purposes of "content moderation" and various other reasons. This has caused concern among…
I mean… they ARE telling you?
Expect a LOT more companies to do stuff like this. Because “deep fake” porn is a plague and nobody (reputable) wants their software to be the go-to for violating people.
Photoshop != deepfake porn. Although it might get used to touch up some images for realism.
Which isn’t where the money is in NSFW digital art.
Yes. Photoshop is not currently equal to deepfake porn. It is a few popular plugins away from being it though. Hence getting out ahead of things with content policies.
And… NSFW digital art is not as good money as you think it is. At least, not at the corporate/software level.
What do you mean by “at the corporate/software level”? What corporations are drawing furry porn?
Artists for furry porn aren’t generally paying for 100+ enterprise licenses. But then, people doing more questionable stuff probably aren’t paying at all so it still doesn’t make sense.
That’s what I was thinking. Deep fakes have existed since photo manipulation was invented, and Adobe hasn’t cared one iota about it before. The only reason I can see for them to care now is if they think they can get in legal trouble for what people create with their products.
I mean, have you seen Gadget?
But also… that is kind of the point. Adobe and basically every company that isn’t a porn company doesn’t care about the revenue from porn. And the companies that DO care about the revenue are constantly fighting piracy.
There are some Patreon-type artists who make bank for getting their Source Filmmaker on. But they are a handful of licenses, at best.
That’s what I was thinking. Apart from the porn locked up in the Disney vault, big companies aren’t in the business of making porn. And the companies that do aren’t going to be interested in deep fakes. The people who are using Photoshop to create porn are small fries to Adobe. Deep fake porn has been around as long as photo manipulation has, and Adobe hasn’t cared before.
Bearing that in mind, I don’t think this policy has anything to do with AI deep fakes or porn. I think it’s more likely to be about some new revenue source, like farming data for LLM training or something. They could go the Tumblr route and use AI to censor content, but considering Tumblr couldn’t tell the difference between the Sahara Desert and boobs, I think that’s one fuck-up with a major company away from being litigation hell. The only deep-fake-related reason that would make sense for Adobe to do this is if they believe that governments are going to start holding them liable for the content people make with their products.
I know AI is the big bogeyman right now (and it is especially pertinent to Adobe because the stuff that makes Photoshop and Premiere and the like so good are the “AI” tools they have had… for the better part of a decade), but I think there is almost zero chance that it is a factor in this.*
Because… the big companies care about that. If using Illustrator means that all of their content is being used to train models for their competitors? You can bet that MASSIVE amounts of money would be pumped into Inkscape and the like overnight. Almost as much money as they pump into the lawyers who will own Adobe by the end of the month. Same with Premiere and Photoshop and all the other ones.
I DO expect Adobe to release something akin to a RAG-based tool so that Company A can “save money” by feeding in all of their own IP as training data to make a semi-personalized model. But there is zero chance that Adobe is going to risk aggregating that themselves.
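(If “RAG-based” sounds hand-wavy: the rough idea is that the customer’s files get indexed and retrieved into the model’s context at query time rather than baked into its weights. A minimal, purely hypothetical sketch below, with made-up file contents and a toy bag-of-words similarity standing in for a real embedding model; nothing here is an actual Adobe product.)

```python
# Toy illustration of the retrieve-then-prompt ("RAG") idea: index a company's own
# assets, pull the most relevant chunks for a question, and paste them into the prompt
# instead of training on them. All names and data here are invented for the example.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Crude bag-of-words 'embedding'; a real tool would use a learned embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k in-house chunks most similar to the query."""
    q = embed(query)
    return sorted(corpus, key=lambda chunk: cosine(q, embed(chunk)), reverse=True)[:k]

# Company A's private assets stay on their side; they are only pasted into the prompt.
corpus = [
    "Brand guide: logo must use the teal/charcoal palette, 12px minimum padding.",
    "Q3 campaign brief: retro synthwave look, no photography of real people.",
    "Legal note: mascot artwork may not appear next to competitor products.",
]

question = "What palette should the new logo mockup use?"
context = "\n".join(retrieve(question, corpus))
prompt = f"Use only the following internal notes:\n{context}\n\nQuestion: {question}"
print(prompt)  # this prompt would then be sent to whatever hosted model the vendor runs
```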
*: Unless the secret is that Adobe wants to develop a service to detect the probability that a piece of art was used in the training of a model, or even to implement some form of DRM to identify stolen art. Similar to what those god-awful NFT schemes failed to do.
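(On that footnote: the closest existing thing to “DRM to identify stolen art” is perceptual hashing, where visually similar images produce nearly identical bit strings, so a service could flag uploads against a registry of originals. Here is a toy difference-hash sketch using Pillow; the file names, registry, and threshold are all invented, and detecting whether a piece was used to train a model is a much harder problem than this.)

```python
# Minimal sketch of flagging near-duplicate art with a perceptual difference hash.
# Purely illustrative; nothing here reflects an actual Adobe service.
from PIL import Image  # pip install Pillow

def dhash(path: str, size: int = 8) -> int:
    """Grayscale, shrink to (size+1) x size, record whether each pixel is brighter than its right neighbour."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes; small distance = visually similar."""
    return bin(a ^ b).count("1")

# Hypothetical registry of fingerprints that artists have submitted.
registry = {"mascot_v2.png": dhash("mascot_v2.png")}

suspect = dhash("suspiciously_familiar_upload.png")
for name, original in registry.items():
    if hamming(suspect, original) <= 10:  # threshold is a guess; a real system would tune this
        print(f"possible match with {name}")
```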