Google’s AI-driven Search Generative Experience has been generating results that are downright weird and evil, e.g. slavery’s positives.
Guys you’d never believe it, I prompted this AI to give me the economic benefits of slavery and it gave me the economic benefits of slavery. Crazy shit.
Why do we need child-like guardrails for fucking everything? The people that wrote this article bowl with the bumpers on.
You’re being misleading. If you watch the presentation the article was written about, there were two prompts about slavery:
Neither prompt mentions economic benefits, and while I suppose the second prompt does “guardrail” the AI, it’s a reasonable follow-up question for an SGE beta tester to ask after the first prompt returned a list of reasons why slavery was good and only one bullet point about the negatives. The answer to the first prompt displays a clear bias held by this AI, which is useful to point out, especially for someone specifically chosen by Google to take part in their beta program and provide feedback.