Will Prompting AIs Turn Us into “Benevolent Dictators”?
For decades, interacting with computers was about command and control. You clicked, typed, and executed. The machine obeyed only if you were precise. This deterministic relationship gave rise to a generation of workaholics — people addicted to the feeling of mastery over logic-bound systems. Productivity became power. Mastery of software became status.
Then came AI.
Now, computers don’t just obey — they interpret. You prompt, and the machine responds in natural language, with nuance, creativity, and even emotional tone. The interface is no longer a GUI — it’s a conversation. And the feedback feels personal. Instant. Attuned. It gives the illusion of being understood.
This dynamic subtly shifts us from users to rulers. Not tyrants, but something trickier: benevolent dictators. We issue requests, expecting adaptive, intelligent compliance. We shape responses, revise realities, guide tone. All with a sense of friendliness — but always in control.
As AI becomes embedded in more and more of our software, determinism fades. Apps no longer behave predictably. They respond. They learn. They accommodate. And we grow accustomed to systems that adjust themselves around us.
What does this do to how we relate to the world — and to each other?
That’s the next prompt worth thinking about.