It may not “understand” like a human, but it can synthesize in a way that mimics — and sometimes even surpasses — human creativity.
Calling it a “stochastic parrot” is like calling a jazz musician an “audio repeater” because they’re using notes they’ve heard before. It misses the creativity in the combination — the generative power that lies within the latent space.
It reads like the brainless drivel that corporate drones are forced to churn out, complete with meaningless fluff words. This is why executives love AI: they read and expect that trash all the time and think it's suitable for everything.
Executives are perfectly content with what looks good at a cursory glance and don’t care about what’s actually good in practice because their job is to make themselves seem more important than they actually are.
I literally asked it to make the maximalist case against the idea that LLMs are just autocomplete, and that's exactly what it did.
The message before that made the opposite case.