- cross-posted to:
- [email protected]
Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis::Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.
It’s not about the type of data but about how the data is organised and what operations are performed on it. I already gave you a link to Nikolic’s site; feel free to read it in its entirety. This paper has a short and sweet information-theoretic argument.
I’m trying to map your fuzzy terms to something concrete.
My mattress is an adaptable system.
All of it. Not in the AI sense but the conventional one: none of it ever happened, and none of the details make sense. When humans are asked to recall an accident they witnessed, they report something like 10% fact (what they actually saw) and 90% bullshit (what their brain confabulates to make sense of what happened). Just like human memory, the AI takes a bit of information and combines it with wild speculation into something that looks plausible but, once reasoning is applied, quickly falls apart.