- cross-posted to:
- becomeme@sh.itjust.works
Generative AI services like Midjourney and OpenAI’s DALL-E can produce stunning imagery from simple text prompts. Sketching complex artwork may be AI’s specialty, yet some of the simplest requests are evidently what it struggles with the most.
It’s actually surprisingly difficult to ask for an absence of anything, since the training data doesn’t normally include what isn’t in the image.
I think they call it the Giraffe Problem or something like that. If you ask an AI for “an image containing no giraffes,” you’ll end up with a bunch of giraffes. It’s all about how the training data is tagged.
That works for humans too:
“DON’T THINK ABOUT GIRAFFES!”
Or in sports, when you tell yourself “don’t throw/kick/shoot the ball at the exact thing you’re focusing on” right before doing exactly that
Or don’t drunk drive into the back of a fire truck
Gets me every time
Easy solution: pick any other animal and think about them.
Don’t think about pink elephants? Sure, here come fluffy bunnies.
That’s why they have negative and positive prompts.
Definitely worth noting, yeah.
I seriously think a lot of this is just people not knowing how it works. It’s a new tech; people need to figure out the quirks and nail down techniques.
Hell, getting people to Google basic things can be a lesson in futility sometimes, and they want to talk this down? Come on.
Absolutely. Hearing the way people interact with LLMs as if they were an all-knowing AGI is honestly a bit terrifying at times. Hopefully, the longer these systems are in the mainstream, the better a sense people will get of their boundaries.
That’s where most of the hate towards AI comes from on this platform. People here are always talking about how stupid it is, how it’s not real AI, and how it’s just a fancy autocorrect. I’m convinced that’s all user error. Anyone who has used AI correctly, and understands how to prompt it, has seen how remarkably powerful and useful it can be.