Ew, Steve Chowder the Nazi.
In all honesty I just know the meme 🤔
Here’s a better version for you friend
next time 👌
There are two versions of Calvin & Hobbes that could be used in his place.
The person in the picture is a piece of shit right-wing commentator.
fixed
Thanks for mentioning this. I was so confused at these comments. Vincent may be a terrible guy, but he ain’t a Nazi.
i propose using the tag NSFC, short for Not Safe For Climate
Adobe is trying for the opposite. Content authenticity with digital signatures to show something is not AI (been having conversations with them on this).
And being Adobe, they will put a nice little backdoor in it for them to change the credentials so that they can take artists’ work and use it, train their AI with it, and sell it like they have been doing for years.
You can’t change the credentials if the user owns the private key. But nothing stops AI training, that’s part of the terms of service of some of their products, which operate outside the realm of this more open initiative.
Spoken like a real Adobe rep lol.
It’s called a backdoor for a reason. Also, since Adobe software nowadays has almost full access to your machine, what is to stop Adobe from simply uploading and storing your private key on their servers and using it when they like? They run their DRM client with a ton of rights to your computer on boot.
WhatsApp can do exactly the same thing and read every message you write and still claim it is “end to end encrypted” for example because key creation is through a process in their proprietary software.
Not sure why you’d say that, it’s just a factual statement. Also, I don’t even use Adobe products, and transitioned to GIMP and Shotcut many, many years ago. I work in privacy and data security, so I just happen to be involved with this initiative from the sidelines.
As for your commentary, you could say the same thing about Signal. But you wouldn’t, because you like them. Just because you don’t like a company doesn’t mean they are being nefarious.
Would I rather a privacy-focused company be doing this? Yes.
Am I pleased with what I see from Adobe (a weekly working group full of identity and open source community members)? Yes.
Does Adobe have a good chance of making this mainstream because of their ecosystem? Also yes.
When you see something better, let me know and I’ll participate there too, vs complaining about those trying.
https://community.signalusers.org/t/overview-of-third-party-security-audits/13243
Here is an entire list of years and years of independent audits
Here, go look yourself to verify that the frontend isn’t sending your encryption key back to the server.
https://www.adobe.com/trust/security.html
Please tell me where I can find the source code of Adobe’s creative cloud DRM that has full access to the computer it is installed on and their audits to verify that they aren’t sending my private keys back.
You are comparing an audited, open source program with a closed-down proprietary system that says “trust me bro, we work with ‘security partners’, no we won’t release the audits”.
Interesting comparison. It’s like comparing a local farming co-op to the agro-industrial complex of Monsanto/Bayer and saying “you could say the same about either! Monsanto is at least innovating in the seed space, no no no, ignore how they use it!!”
You’re taking that out of context. Signal is open source, but you don’t get to see what happens between GitHub and the Play Store. Adobe’s system that I am alluding to is also open, but we don’t get to see what happens in the software itself. The problem is, that’s not even what I’m talking about. I’m talking about a standard they are developing, not their software or DRM.
This isn’t just for Adobe, they’re just starting the process. Other systems can run it. Hardware can run it. Do you not use Linux because Canonical or Red Hat contributed? Do you steer developers away from Flutter because Google started it? Where is the line? Who do you think kicks off all the standards you use today? OAuth, OIDC, etc. If you want to avoid everything these companies contribute to, you’re going to have to stop using the internet.
How would that work, then? I presume most would just ignore it, because if it only verifies you used Adobe to make something, it’s pretty worthless as a “this isn’t AI” mark.
It uses cryptographic signatures in the cameras and tools. Say you take a photo with a compatible camera, it gets a signature. Then you retouch in Photoshop, it gets another signature. And this continues through however many layers. The signature is in the file’s EXIF data, so it can be read on the web. Meaning a photo on a news site could be labeled as authentic, retouched, etc.
Edit: Doesn’t require Adobe tools. Adobe runs the services, but the method is open. There are cameras on the market today that do this when you take a picture. I believe someone could add it to GIMP if they desired.
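To make the chaining idea concrete, here’s a rough sketch of how a chain of signatures could work. This is not the actual Content Credentials/C2PA format, just an illustration using the Python `cryptography` library, and the field names and tool names are made up:

```python
# Illustrative only: a made-up chained-provenance record, NOT the real
# C2PA/Content Credentials manifest format.
import hashlib, json
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def add_provenance(image_bytes: bytes, chain: list, actor: str, action: str,
                   key: Ed25519PrivateKey) -> list:
    entry = {
        "actor": actor,        # e.g. a camera model or an editing tool
        "action": action,      # e.g. "captured", "retouched"
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        # Hash of everything that came before, so earlier steps can't be
        # silently rewritten without breaking later signatures.
        "prev": hashlib.sha256(json.dumps(chain, sort_keys=True).encode()).hexdigest(),
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["signature"] = key.sign(payload).hex()
    entry["public_key"] = key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw
    ).hex()
    return chain + [entry]

# A camera signs the capture, then an editing tool signs its retouch.
camera_key, editor_key = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()
chain = add_provenance(b"raw pixels", [], "ExampleCam X100", "captured", camera_key)
chain = add_provenance(b"retouched pixels", chain, "Photo editor", "retouched", editor_key)
```

A viewer can then check each entry’s signature against the listed public key and decide for itself whether it trusts the keys involved.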
GIMP is open source, could someone then just tell it to sign anything?
Only with their private key.
As valid and informative as TwitteX’s blue mark.
Oh I’m sure Adobe has the greatest of intentions on this. Such a reputable company that has a stellar past.
I’m sure they won’t gatekeep this digital human signature in some atrocious proprietary standard along with an expensive subscription to have the honor of using it.
Don’t listen to Adobe on AI or even better don’t accept any “idea” or solution from Adobe.
Yeah pretty much.
I recall Flash, and how they absolutely controlled it. I loved Flash as a young programmer too.
But in retrospect, forcing users to go through Adobe to use something, with no alternatives? What a nightmare for an Open Internet.
Very nice idea in theory, but proving there is no AI involved in the creation of art is not something I think is remotely possible. It’s an arms race more than anything, but I’m very interested in how Adobe will tackle it. I think people will be appreciating physical art more again, but even then we could argue about the usage of AI tools.
Anyhow, people will have to come to terms with the fact that AI is here to stay, and will only get better too.
My other reply talks about how this works with cryptographic signatures, but sure, people can lie. The key to this method is if there is a signature from a reputable artist, news org, or photographer, then that origin can’t be forged. So it’s about proving the authenticity (origin) vs the negative use of AI.
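If anyone wants a feel for what “can’t be forged” means here, this is a minimal sketch (illustrative only, Ed25519 via the Python `cryptography` library, not the actual spec): a viewer verifies the claimed origin against a public key it already trusts, and anything not signed with the matching private key simply fails.

```python
# Minimal sketch: checking a claimed origin against a publisher's known public key.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey,
)

def origin_is_authentic(image_bytes: bytes, signature: bytes,
                        trusted_pubkey: Ed25519PublicKey) -> bool:
    try:
        # Raises InvalidSignature if the bytes were tampered with or the
        # signature wasn't made with the matching private key.
        trusted_pubkey.verify(signature, image_bytes)
        return True
    except InvalidSignature:
        return False

# The real photographer can produce a valid signature; an impostor cannot.
photographer = Ed25519PrivateKey.generate()
photo = b"original pixels"
sig = photographer.sign(photo)
print(origin_is_authentic(photo, sig, photographer.public_key()))      # True
print(origin_is_authentic(b"ai fake", sig, photographer.public_key())) # False
```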
Pretty cool indeed, thank you. I like the idea of a cryptographic certificate of authenticity, would definitely add value to the digital art world.
If you can’t tell, does it matter?
Yes.
Yes
I get the intent but I feel that Photoshop and CGI are just as important to label.
Why?
Because generative AI should burn in hell.
It’s here to stay. You’re a luddite with no understanding of the subject matter and should fuck off. The hate bubble is deflating
You’re free to show your AI images to your AI friends who will give you AI congratulations. I’m not sure why I need to be a part of this masturbation.
Do you think computers are generating images for other computers to look at? I applaud your optimism about AI but we’re not there yet. There are always people involved
No, I think the bitcoin nft metaverse gooners of yesteryear have found a new thing to pretend is the future.
Are we saying Yes to AI or yes to Dolores?
Because if the latter… 🤤
Yes to AI
Because edited pictures and CGI require actual effort? The artist will credit themselves, but AI “artists” most of the time don’t say it’s AI-generated.
[GIF: Westworld Season 4 finale, Lisa Joy]
I am in complete agreement with this. While you can currently tell what’s AI, it won’t be long before we’re scratching our heads wondering which way is up and which way is down. Hell, I saw an AI-generated video of a cat cooking food. It looked real, sort of.
A lot of people seem to think that all ai art is low effort garbage, which is just not true. There can be a lot of skill put into crafting the correct prompt to get the image you want from an image generator, not to mention the technical know-how of setting it up locally. The “ai art is not art” argument to me doesn’t sound any more substantiated than “electronic musicians aren’t musicians, go learn a real instrument” or “photographers aren’t really artists, all they do is push a button”. But regardless, I agree that we need good tagging, or as @ThatWeirdGuy1001 said, different communities. Even though the output looks similar, actually drawing things and wrangling prompts are two completely different skillsets, and the way we engage with the artistic product of those skills is completely different. You wouldn’t submit a photo you took to a watercolor painting contest. Same with ai art and non-ai art.
Anyway, just thought I’d share my opinion as an AI non-hater.
For art to be art you need space to express yourself through individual choices:
- play an original song on a real instrument, and you have the entire artistic spectrum to yourself
- if you make the music out of individual pre-made pieces, you narrow that range. The sounds are not yours, only their composition and the words
- when you record a cover of a rap song over someone else’s beat, you further narrow it down to your performance only. It’s still artistic expression, but to a much lesser degree than an original song
In a prompt-generated image, the image itself is not your expression. The prompt is, but comparing the number of choices you need to make for a painting versus a prompt, it’s just so… less art?
In digital art, the image itself is not your expression. The idea is, but comparing the choice of shaders you use with brush strokes done with real paint, where you can see and feel the emotions the artist wanted to express with their physical brush, it’s just so… less art?
Sure. The only problem is, it’s a people issue. Some people making AI-generated content may be honest and willing to abide by such a rule, but most are proud to not even read the rules and just blast shitty slop left and right. For this second category of people, when you point it out to them, a very small percentage goes “oh, sorry”. The vast majority just keep posting until blocked.
Granted, this experience mostly stems from every media-posting site out there, so it may be a bit biased…
Also, you can’t really regulate other countries. Especially if that country is China.
Text, sure. But I don’t get the hate towards AI generated images. If it’s a good image and it’s not meant to mislead, I am completely fine with AI content. It’s not slop if it’s good.
It’s still stolen content. Regardless of any other issues, it’s 100% stolen content.
So I assume you are morally opposed to piracy?
There’s a pretty clear difference in the two. If piracy ended in a new digital good that removes the market for the original good while eliminating the jobs of those that made the original good, then it’d be close. Even then pretty much everyone agrees not all piracy is the same; you wouldn’t pirate an indie game that hasn’t sold well unless you’re an absolute piece of subhuman shit.
I really enjoyed the “Hobbit: Extended Edition” project which condensed the three films of the Hobbit trilogy down into a single film, and as an unofficial fan-made project, is only available online for free.
Under that proposed gradient, I’m not sure where that would fall, given that it is a transformative work which uses the work of others to make them redundant (in this case, the original trilogy and the studios which would have otherwise profited from those sales).
I feel like there’s a better way to divide it, but it will be difficult to negotiate the exact line against the long-held contradictory ideas that art should both be divorced from its creator once released but also that the creator is entitled to full control and profit until the expiry of its copyright.
well uh, idk how to break it to you but it kinda does.
Piracy doesn’t equal a 1:1 lost sale; that argument is true. However, that argument works with both AI and piracy, plus it goes both ways.
The more people who get it via the free method, the fewer people who /may/ have bought it via the paid method, meaning less profit/earnings for the affected party.
However, since it goes both ways, obtaining the item via the free method does not mean that they would have purchased the paid good if the free good wasn’t available.
In both versions, the original market is still available, regardless of the method used.
I highly disagree that piracy and AI are any different, at least in the scenario you provided.
If anything, AI would hold the morally higher ground imo, as it isn’t directly taking a product; it’s making something else using other products.
That being said, I believe content creators should be paid for training usage, but that’s a whole different argument.
It’s not solely about pay, but also what your work is used for. It makes sense you don’t understand this if you’ve never created anything, artwise or otherwise. If I draw a picture I control who displays that picture and for what purpose. If someone I don’t like uses that picture without permission it reflects poorly on me, and destroys my rights.
The easy example is an art piece by a Holocaust survivor being used by a neonazi without permission.
Now imagine you steal tens of millions of artists’ work. You know for a fact you don’t have the licenses needed to ensure their work is used to their liking.
I don’t make art myself, the closest I come is software development, which is already heavily scraped and used for training AI models. So, I agree that I might not fully understand, especially since my field tends to embrace assistive tools.
That said, I think the idea that AI-generated art reflects poorly on the original artist is a bit of a misnomer / self-inflicted. When someone looks at an AI-generated piece, they’re not going to think, “Oh, that was by Liyunxiao,” because the end product isn’t a direct copy of any specific work. The models don’t store or reproduce the original source data; they learn patterns from the source material and then reapply them using what they have learned, often with a lot of randomization (as shown by their sometimes blatant inability to produce realistic-looking outputs).
While I believe we agree with the statement that work should have the artist’s permission before usage in a training model, or at the very least that artists should be paid for its usage instead of it just being scraped, I think both are comparable. One makes a new piece of art using what it’s “learned” from traits the training set had; the other copies an existing piece of art. Neither prevents anyone from using the original source (artist or game studio), and both are usually done against the wishes of the original team.
That being said, I think the example provided works better when compared to piracy, as at least at that point it’s a 1:1 clone instead of a creative work. An art piece by a Holocaust survivor being thrown into a training set for a diffusion model wouldn’t come out as the same image on the other end; only a generalization and style set is saved. At the end of the day, nobody has the ability to know where the diffusion art’s original sources came from, nor is it able to produce a picture recognizable as a specific artist’s style, whereas with piracy you have a piece of work you can look up to see who owned it.
That’s just my opinion on it all though.
this bullshit again…
Yes, just because you disagree that your new toy is literally theft and is one of the most irresponsible inventions since leaded gasoline, that doesn’t change anything.
Sorry you’re the type of person that added lead shot to your gas tank after they banned leaded gasoline.
Well that devolved quickly. People with attitudes like yours make other people really not give a shit what your argument is. Also makes me know you can’t or won’t understand that I don’t really care what happens to AI, and that since there is no data taken it cannot be stolen. But you can’t understand that I guess, and we have the same tired arguments.
At least I am somewhat happy that the corporate control is getting taken down by open source, that models are being jailbroken or freed, and that people are realizing that what we have are only LLMs and generative noise algorithms: not AI.
I am torn on that. If it’s a company making money off of it, despicable. If it’s an open source model used for memes? I’m fine with that. We shouldn’t act like artists follow some magical calling from god. Anything anyone creates is built on their education and the media they were exposed to. I don’t think generative models are any different.
Normalizing is a thing. On top of that, there are still indie markets that can be supplanted by GAN image generation. And artists still have rights to their work; if they didn’t explicitly license their works for the model, it’s theft that removes the value of the original.
Admins get on this
And it should be tagged at every level: metadata, watermark, poster, website. Redundancies will make it harder to use AI for lying.
Internet-wide, culture-wide, society-wide.
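As a hypothetical example of what the metadata level could look like, this sketch stamps a label into a JPEG’s EXIF ImageDescription field with Pillow (the filenames and the choice of tag are just assumptions, and metadata alone is trivially stripped, which is exactly why the other layers matter):

```python
# Hypothetical sketch: writing an "AI-generated" label into EXIF metadata with Pillow.
from PIL import Image

img = Image.open("generated.jpg")     # assumed input file
exif = img.getexif()
exif[0x010E] = "AI-generated"         # 0x010E = EXIF ImageDescription tag
img.save("generated_tagged.jpg", exif=exif)

# Reading the label back out:
print(Image.open("generated_tagged.jpg").getexif().get(0x010E))
```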
I think AI posts should only be posted on AI communities so I can block them all at the same time.
Seriously. Even for memes and funny stuff, AI needs to fuck off.
And the fact that everyone even calls it AI when it’s not even close to being a VI is infuriating.
Dude… or should I say, dude??? 🤖
Compling
Yeah that about sums it up
It should be fineable starting at like 500 dollars + any profits and ad revenue if it’s not labelled.
How much money is a Lemmy upvote?
At least one money?
That might work for now, while those of us who know what to look for can readily identify AI content, but there will be a time when nobody can tell anymore. How will we enforce the tagging then? Bad actors will always lie anyway. Some will accidentally post it without knowing it’s AI.
I think they should add a tag for it anyway so those who are knowingly posting AI stuff can tag it but I fear that in the next few years the AI images and videos will be inescapable and impossible to identify reliably even for people who are usually good at picking out altered or fake images and videos.
I’d even be worried that bad actors would abuse the tag by taking legitimate footage of something they want to discredit and reposting it everywhere with AI tags. They could take control of the narrative if enough people become convinced that it’s AI-generated, and use that to flip the accusations back on the people trying to spread the truth.
Yeah unfortunately bad actors ruin pretty much everything. We can do our best as a society to set things up in a way where systems can’t be abused but the sad reality is we just need to raise people better.
Lying, cheating (the academic or competitive integrity kind) and many other undesirable behaviors are part of human nature but good parenting teaches kids not to use those.
Good point.
Bugger…
Maybe all digital content just shouldn’t be trusted. It’s like some kind of demon-realm or something. Navigable by the wise but for common fools like you and I, perilous. Full of illusion.
I’m more like a moron fool, but ok.
Is this AI content?
No. I know because each hand has exactly 5 fingers.
You got me 😆