Plenty of AI experts say a system can only be as good as the data it's trained on. Basically, it's garbage in, garbage out.
So with that old computer science adage in mind, what the heck is happening with Google’s AI-driven Search Generative Experience (SGE)? Not only has it been caught spitting out completely false information, but in another blow to the platform, people have now discovered it’s been generating results that are downright evil.
Case in point, noted SEO expert Lily Ray discovered that the experimental feature will literally defend human slavery, listing economic reasons why the abhorrent practice was good, actually. One pro the bot listed? That enslaved people learned useful skills during bondage — which sounds suspiciously similar to Florida’s reprehensible new educational standards.