OC just for you! ♥️

  • howrar@lemmy.ca

    tldr: Always flip the switch

    Edited with some of TauZero’s suggested changes.


    • Let N be the size of the population that the villain abducts from
    • Let X be the event that you are abducted
    • Let R be the outcome of the villain’s roll
    • Let C be the event that you have control of the real switch

    • If 1-5 is rolled, then the probability that you are abducted is P(X|R∈{1,2,3,4,5}) = 1/N
    • If 6 is rolled, then P(X|R=6) = (N-1 choose 9)/(N choose 10) = ((N-1)!/(9! * (N-10)!)) / (N!/(10! * (N-10)!)) = 10/N
    • The probability of getting abducted at all is P(X) = P(X|R∈{1,2,3,4,5})P(R∈{1,2,3,4,5}) + P(X|R=6)P(R=6) = (1/N)*(5/6) + (10/N)*(1/6)
    • The probability that a six was rolled given that you were abducted: P(R=6|X) = P(X|R=6)P(R=6)/P(X) = (10/N)*(1/6)/((1/N)*(5/6) + (10/N)*(1/6)) = 2/3

    So as it turns out, the total population is irrelevant. If you get abducted, the probability that the villain rolled a 6 is 2/3, and the probability of rolling anything else is its complement, so 1/3.
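
    A quick Monte Carlo sketch of that posterior, assuming the setup as described above (on 1-5 one random person is abducted, on 6 ten random people are). The population size and trial count are arbitrary choices:

    ```python
    # Simulate the abduction and estimate P(R=6 | you were abducted).
    # Assumed setup: fair d6; 1-5 -> one random person abducted, 6 -> ten.
    import random

    N = 1000          # population size (should cancel out, per the derivation)
    TRIALS = 200_000
    YOU = 0           # arbitrary label for "you"

    abducted_total = 0
    rolled_six_given_abducted = 0

    for _ in range(TRIALS):
        roll = random.randint(1, 6)
        k = 10 if roll == 6 else 1
        if YOU in random.sample(range(N), k):
            abducted_total += 1
            if roll == 6:
                rolled_six_given_abducted += 1

    print("P(R=6 | X) ~", rolled_six_given_abducted / abducted_total)  # ~ 2/3
    ```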


    Let’s say you want to maximize your chances of survival. We’ll only consider the scenario where you have control of the tracks.

    • P(C|R∈{1,2,3,4,5}) = 1/10
    • P(C|R=6) = 1
    • P(C) = P(C|R∈{1,2,3,4,5})P(R∈{1,2,3,4,5}) + P(C|R=6)P(R=6) = (1/10)(5/6) + (1)(1/6) = 1/4
    • P(R=6|C) = P(C|R=6)P(R=6)/P(C) = (1)(1/6)/(1/4) = 2/3
    • P(R∈{1,2,3,4,5}|C) = P(C|R∈{1,2,3,4,5})P(R∈{1,2,3,4,5})/P(C) = (1/10)(5/6)/(1/4) = 1/3
    • If you flip the switch, you have a 1/3 chance of dying.
    • If you don’t flip it, you have a 2/3 chance of dying.

    If you want to maximize your own probability of survival, you flip the switch.
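
    The same kind of sketch for the survival numbers, taking the conditionals above at face value (P(C|R∈{1,2,3,4,5}) = 1/10, P(C|R=6) = 1) and reading the two bullets as: with control, flipping kills you only on a 1-5 roll, and not flipping kills you only on a 6:

    ```python
    # Estimate P(C), P(die | flip, C) and P(die | don't flip, C).
    import random

    TRIALS = 500_000
    have_control = 0
    die_if_flip = 0
    die_if_stay = 0

    for _ in range(TRIALS):
        roll = random.randint(1, 6)
        control = random.random() < (1.0 if roll == 6 else 0.1)
        if not control:
            continue
        have_control += 1
        if roll != 6:
            die_if_flip += 1   # assumed: flipping diverts the trolley onto you
        else:
            die_if_stay += 1   # assumed: not flipping leaves it headed your way

    print("P(C)             ~", have_control / TRIALS)       # ~ 1/4
    print("P(die | flip, C) ~", die_if_flip / have_control)  # ~ 1/3
    print("P(die | stay, C) ~", die_if_stay / have_control)  # ~ 2/3
    ```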


    As for expected number of deaths, assuming you have control of the tracks:

    • If you flip the switch, the expected number of deaths is (1/3)*1 + (2/3)*0 = 1/3 ≈ 0.33.
    • If you don’t flip it, the expected number of deaths is (1/3)*0 + (2/3)*10 = 20/3 ≈ 6.67.

    So to minimize the expected number of casualties, you still want to flip the switch.
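
    The same expected-death arithmetic done exactly, using the death counts assumed above (flipping costs 1 life on a 1-5 roll, not flipping costs 10 on a 6):

    ```python
    # Exact expected deaths, conditional on having control of the real switch.
    from fractions import Fraction

    p_low = Fraction(1, 3)   # P(R in {1,...,5} | C)
    p_six = Fraction(2, 3)   # P(R = 6 | C)

    deaths_flip = p_low * 1 + p_six * 0   # 1/3  (~0.33)
    deaths_stay = p_low * 0 + p_six * 10  # 20/3 (~6.67)

    print(deaths_flip, deaths_stay)
    ```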


    No matter what your goal is, given the information you have, flipping the switch is always the better choice.

    • iAvicenna@lemmy.world

      I am always surprised by how my first guess gets wrecked by Bayes’ rule. I would have thought that there was a 5/6 chance I was on the side track and a 1/6 chance I was on the main track.

    • TauZero@mander.xyzOP

      Excellent excellent!

      If 6 is rolled, then P(X|R=6) = (N-1 choose 9)/(N choose 10)

      Might as well reduce that to 10/N to make the rest of the lines easier to read.

      If you don’t flip it, you have a 2/3 chance of dying.

      There is also a chance that your switch is not connected and someone else has control of the real one. So there is an implicit assumption that everyone else is just as logical as you and just as selfish/altruistic as you, such that whatever logic you use to arrive at a decision, they must have arrived at the same decision.

      No matter what your goal is, given the information you have, flipping the switch is always the better choice.

      That is my conclusion too! I was surprised to learn in the comment thread with @pancake, though, that the decision may be different depending on the percentage of altruism in the population. E.g. if you are the only selfish one in an altruistic society, you’d benefit from deliberately not flipping the switch. Being a selfish one in a selfish society reduces to the prisoner’s dilemma.

      • howrar@lemmy.ca

        There is also a chance that your switch is not connected and someone else has control of the real one. So there is an implicit assumption that everyone else is just as logical as you and just as selfish/altruistic as you, such that whatever logic you use to arrive at a decision, they must have arrived at the same decision.

        Ah, yes. I forgot to account for that in my calculations. I’ll maybe rework it when I find time tomorrow.