(This was written in Grade 12, on 2023-12-30. I kind of fell behind on this prompt-writing assignment, which is why I didn't complete it on its original date of 2023-10-25.)
(Relevant Manifold market: https://manifold.markets/12c498e/which-of-these-universities-will-ac)
October 25th, 2023
If you met a genie [sic] what would your three wishes be and why?
(This prompt was chosen by the Archetypal Literary Theory student group.)
This concept of choosing something to “wish for” is meaningless without more specific context, so I'm a little confused as to why it is popular in media and public discourse. Simply by hearing the question “What will you wish for?”, it's not at all clear what the domain of possible wishes is supposed to be. Since this scenario would be interesting if it were to actually happen in real life [0], I'll put some more effort into explaining my exact thinking, in case the problems with the question aren't obvious.
If I met a genie who offered me three wishes, the first words out of my mouth would probably be “Am I allowed to ask questions or otherwise communicate with you before I specify my wishes?” It's possible that the genie has some strange rules about accepting wish input, under which even a clarifying question could itself be counted as a wish.
If the genie’s reaction is something along the lines of “Of course! What do you need to know or say?”, I know that I can continue with my intended line of investigation. If the genie’s reaction is something more annoying like “A question? What a strange syntactical format for a wish! Well, I’ll take that to mean that you wish to be able to ask questions or otherwise communicate with me before specifying your wishes. I can’t grant wishes that change your interface with me, so that wish will be skipped. What is your second wish?”, I'd change my approach based on the exact result.
First, it's unclear what the rules are regarding limits on the number of wishes. If the questioner intended there to be no limit, you'd expect the question to instead be phrased as “If you met a genie, what would your wishes be and why?” or simply “What things do you wish for, and why?”, without the complication of introducing an ambiguous genie. With the specific inclusion of “three wishes”, it sounds like you are only meant to ask for three things.
How is this meant to be enforced? The assumed scenario seems to be:
G: “You have three wishes. What is your first wish?”
A: “My first wish is <inconsequential wish 1>.”
G: “You have two remaining wishes. What is your second wish?”
A: “My second wish is <inconsequential wish 2>.”
G: “You have one remaining wish. What is your third wish?”
A: “My third wish is <inconsequential wish 3>.”
G: “You have no remaining wishes. I will now magically disappear. I’m not going to ask why your wishes were so absent-minded.”
This relies on a given state of reality: there exists a metaphysical system governing genies, the current state of which sets the limit of “wishes” at three. However, think about the mere concept of a wish in the first place. The wisher exists in a universe arranged in a specific state, but prefers it to be in a different state. Thus, they ask the genie to change their universe such that it aligns with their specification of the updated state.
I see no meaningful distinction between the laws governing interactions between matter and the broader laws governing the operations of the genie. Therefore, your desired change in state should also be able to apply to the genie. You can desire for the genie system to support a number of wishes equal to Graham’s number (effectively but not literally infinite) instead of three, or you can desire for yourself to have a large number of genies. The genie system would have to somehow arbitrarily forbid these changes in state, which makes it suspiciously messy and chaotic.
Even if the questioner is attempting to create rules that forbid larger numbers of wishes, I don't know exactly where their lines in the sand would be. For example, is it problematic to create artificial human brains capable of producing consciousness (or whichever threshold the genie system uses for granting an autonomous agent their three wishes), and then to use my control of those brains to make wishes on my behalf? Am I allowed to wish for selective possession of the psychologies of every human on the planet, such that I could gain (current population - 1) * 3 ≈ 24 billion wishes? You can't claim that that would render them psychological slaves who lack the autonomy to wish for themselves, because my psychology is still just a result of various social and biological pressures. There's no objective distinction between them and me; if you were willing to grant any human three wishes, you must grant every human three wishes.
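As a quick sanity check on that figure, here is a minimal sketch of the arithmetic (the population value of roughly 8.1 billion is my assumption, not part of the prompt):

```python
# Toy sanity check of the "possess everyone else's psychology" wish count.
population = 8_100_000_000   # assumed current world population (approximate)
wishes_per_person = 3

# Everyone except me contributes their three wishes.
total_wishes = (population - 1) * wishes_per_person
print(f"{total_wishes:,}")   # -> 24,299,999,997, i.e. roughly 24 billion
```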
In a different direction, you have to consider the fact that defining a “wish” as a countable concept is nonsensical in the first place. There’s no reason why “make changes X and Y” should be quantified as two wishes, but “make change Z” as one. In both cases, you are asking for changes in the state of the universe, which will simply involve making an intervention in the arrangement of matter.
The implied quantification algorithm seems to be a very loose and subjective division based on cultural concepts, which is quite silly. For example, a common mental concept exists for the act of brushing your teeth, so “I wish that my teeth were instantly converted into the state in which they would be after brushing them” would supposedly count as one wish. The concept for the act of flossing your teeth is culturally associated with brushing, so if you additionally wished to floss your teeth, you could say, “I wish that my dental hygiene were maximized to a reasonably desirable level.” In this case, you are clearly wishing for an additional action, and yet it does not cost an additional wish. If, instead of flossing, you also desired to start a business, you would not be able to relate the concepts in a way that avoids spending two wishes.
Of course, that is merely a limitation of the mental models and verbal systems of communication that we happen to use. If I created my own society in which children were taught the hypothetical verb “Michael” from a young age, defined as “to start a business after diligently brushing your teeth”, they would indeed possess such a mental concept. If one of them wished “to Michael”, the genie would have to grant what initially appears to be two wishes while charging only one. I cannot easily picture somebody acknowledging this but maintaining the opinion that the quantification of wishes is a coherent idea.
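To make the silliness concrete, here is a minimal sketch of that implied quantification algorithm. The concept vocabularies and the greedy covering rule below are entirely my own assumptions for illustration; no genie has actually specified any of this:

```python
# Toy model of "count wishes by cultural concept": each society has a
# vocabulary mapping named concepts to the actions they cover, and a wish
# costs one count per concept needed to cover the requested actions.

OUR_CONCEPTS = {
    "dental hygiene": {"brush teeth", "floss teeth"},
    "start a business": {"start a business"},
}

# The invented society additionally teaches the verb "Michael".
MICHAEL_CONCEPTS = {
    **OUR_CONCEPTS,
    "Michael": {"brush teeth", "start a business"},
}

def count_wishes(actions: set[str], concepts: dict[str, set[str]]) -> int:
    """Greedily cover the requested actions with concepts; one wish per concept used."""
    remaining = set(actions)
    wishes = 0
    while remaining:
        best = max(concepts.values(), key=lambda covered: len(covered & remaining))
        if best & remaining:
            remaining -= best
        else:
            remaining.pop()  # no concept covers this action; charge it alone
        wishes += 1
    return wishes

actions = {"brush teeth", "start a business"}
print(count_wishes(actions, OUR_CONCEPTS))      # 2 wishes in our vocabulary
print(count_wishes(actions, MICHAEL_CONCEPTS))  # 1 wish once "Michael" exists
```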
That's just what I came up with off the top of my head, though; I'm sure others could have far more creative ideas. The point is that either wishes should not be quantified, or a sophisticated system of quantification needs to be carefully described. Besides quantification, there are other unclear policies, such as what happens when you wish for a contradiction.
However, I'll try my best to interpret the question as something close to its intended meaning. In this scenario, I would wish for:
- Knowledge of how to reliably align ASI to my preferences
- An increase, in myself, of the trait that IQ attempts to measure, at a rate of 1 IQ-equivalent point per day, with the option to pause or resume at any time
I assume that both of these would be considered valid wishes. If I am allowed to withhold my third wish until a later date, I would do so after conspiring with the ASI. Otherwise, my third wish would be the alteration of thermodynamics to support the consistent and discoverable reversal of entropy.
With these wishes, literally any non-contradictory physical state of affairs that I consider a problem could be fixed, assuming that the future, less stupid version of myself does not change its terminal normative values. It's possible that even this is against the spirit of the question; perhaps you are merely intended to solve “individual” problems such as “I'm not sufficiently good at math”. In that case, the question's absurdity rises to a point that makes it completely unworthy of any semblance of consideration.
A separate and important concern to consider is the possibility that the rules governing the genie are (likely intentionally) unintuitive; you might say that because of the immense power this genie would have, the safest option would be to avoid interacting with it entirely. Otherwise, the genie could secretly whisper something like “If you breathe within the next 10 seconds, you consent to being physically and psychologically tortured for all of eternity.” and then proclaim “Welp, looks like you breathed! This is going to be fun!” when you don't hear them and continue breathing.
My opinion is that although this is a real possibility, it's not worth deliberately changing your actions to accommodate. If it were the case that a genie would play such a prank, it seems like their initial intention was to harm you. And if a supernatural, ~infinitely capable genie is intent on harming you, the fact of the matter is simple: there's nothing you can do. Trying a cheap trick like killing yourself as fast as possible or walking away is very unlikely to work.
Footnotes
[0]: My genuine, non-satirical estimate of the probability of a roughly equivalent scenario occurring is 0.5%, but that is per-universe (i.e. static), not something absurd like per-person or per-person per-year.