I figured out why it is that, in every story where a character gets “three wishes” from some agent of the supernatural, it always ends up badly.
It’s because they forgot to tell the genie, after “make me rich, but …”:
- don’t make me a target for crime due to that wealth
- don’t make me lose it all the next day
- keep me healthy for a long time to enjoy it
- but not so healthy that I outlive everyone and end up a solitary creature
See how the pleas get more and more clever & elaborate, and yet still offer a mischievous genie an opportunity to teach a lesson!
In politics, a variant of this phenomenon is known as the “law of unintended consequences”. Whatever the noble intentions of the politicians, something unforeseen and adverse always happens – sometimes even worse than the original problem they hoped to solve.
The technical reason: the Frame Problem
In the field of AI, the Frame Problem is the difficulty of telling a computer that models some piece of the world what changes, and what doesn’t, when something occurs. Adding a single new action can take effort proportional to the size of the knowledge base, because all of its effects and non-effects must be spelled out (see the genie plea list above). For example, teaching a program with an astronomy knowledge base what a “supernova” event is could mean painstakingly enumerating whether any particular comet on the other side of the universe is affected. Or whether dropping this ball might possibly affect a window on the other side of the street. It might, if it’s about to bounce back off an incline, and it had lots of energy, and the street was narrow, and ….
It always depends. And writing down how it depends is just too much work to do correctly.
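To make the scaling issue concrete, here is a minimal sketch in Python. Everything in it is hypothetical and invented for illustration (the knowledge base, the `apply_supernova` action, the fact names); real systems use logical formalisms rather than dictionaries, but a naive encoding runs into the same wall: every new action must say something about every existing fact.

```python
# A toy world model illustrating the Frame Problem.
# All names here are made up for illustration; no real AI system works this way.

# The knowledge base: every fact the program currently believes.
knowledge_base = {
    "comet_halley_orbit": "stable",
    "star_alpha_state": "main_sequence",
    "window_across_street": "intact",
    # ... imagine thousands more facts
}

def apply_supernova(kb, star):
    """Apply a hypothetical 'supernova' event to the model.

    The hard part is not stating what changes (one star explodes).
    It is stating, for every other fact, whether it stays the same.
    The naive encoding walks the entire knowledge base, an effort
    proportional to its size, just to assert "unchanged".
    """
    new_kb = {}
    for fact, value in kb.items():
        if fact == f"{star}_state":
            # The one intended effect.
            new_kb[fact] = "supernova_remnant"
        else:
            # A "frame axiom": explicitly assert this fact is NOT affected.
            # In a serious model this branch hides the real work: does the
            # blast reach that comet on the other side of the universe?
            # It depends, and someone has to write down how.
            new_kb[fact] = value
    return new_kb

knowledge_base = apply_supernova(knowledge_base, "star_alpha")
print(knowledge_base["star_alpha_state"])    # supernova_remnant
print(knowledge_base["comet_halley_orbit"])  # stable, but only because we said so
```

The genie plea list is this same loop written in English: each extra clause is one more frame axiom, and there are always more facts than clauses.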