Poll
22 votes (64.71%)
12 votes (35.29%)
34 members have voted
Quote: Wizard
I still don't get what is so hard about taking on faith that the NB will get any prediction right 90% of the time. This includes predictions of picking A, AB, and what the weather will be like tomorrow.
Here is an example that may help. The NB has ten identical looking crystal balls. Nine are always right and one is always wrong. He picks one at random with every prediction.
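If it helps to see that in numbers, here is a minimal sketch of the crystal-ball model above (purely illustrative: nine always-right balls, one always-wrong ball, one drawn uniformly at random per prediction):

```python
import random

BALLS = ["right"] * 9 + ["wrong"]  # nine always-right balls, one always-wrong

def prediction_is_correct():
    # The NB grabs one crystal ball at random for every prediction.
    return random.choice(BALLS) == "right"

trials = 100_000
hits = sum(prediction_is_correct() for _ in range(trials))
print(f"Observed accuracy: {hits / trials:.3f}")  # comes out near 0.900
```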
This is similar to free will vs predetermination. If the guy knows what you will do, then what you do doesn’t matter. Take both boxes, get all the money. If you were going to do it anyhow, then it is what it is. But, if you were only going to take A, then take both against your nature. If the guy was right you get all the money there was to get, if the guy was wrong you win 10% of the time. The guy is almost always going to predict that you take both boxes.
Quote: Mosca
This is similar to free will vs predetermination. If the guy knows what you will do, then what you do doesn't matter. Take both boxes, get all the money.
That should argue to take A only. I'd rather have $1,000,000 than $1,000.
Quote:
If you were going to do it anyhow, then it is what it is. But, if you were only going to take box A, then take both against your nature.
The NB would likely foresee you playing that trick.
Quote:
If the guy was right you get all the money there was to get, if the guy was wrong you win 10% of the time. The guy is almost always going to predict that you take both boxes.
The NB is not a "guy." He is an alien with superior predictive powers. I also don't see why he would predict I take both. Would he likely predict I would take both if I took box A only?
Quote: Wizard
That should argue to take A only. I'd rather have $1,000,000 than $1,000.
The NB would likely foresee you playing that trick.
The NB is not a "guy." He is an alien with superior predictive powers. I also don't see why he would predict I take both. Would he likely predict I would take both if I took box A only?
10% wrong. If box A had $1,000,000 before, it has it now. The guy left.
Using guy colloquially, so as to frame the argument as a common sense one rather than a logical one. Neither is better, just different ways of viewing a problem that, if it were easy, wouldn’t have been proposed.
The problem with predetermination vs free will is that both appear the same to us. That's the same to me as a set not being able to contain itself. Because we are inside reality, we are unable to ascertain its true nature.
Okay, since some are having a hard time accepting the premise of the problem, let me change the wording a bit. New wordage in bold...
Quote: Newcomb paradox
An alien, known as the Newcomb Being, or NB for short, arrived on earth several years ago with allegedly superior predictive powers. The way this has been proven is on a game show, with 90% accuracy, with the following rules:
1. The NB would meet with a contestant and ask general questions, never directly relating to the game. The contestant is allowed to lie.
2. After this questioning, the contestant would be excused. Then, the NB would place either $1,000,000 or nothing in a box labeled A and $1,000 in a box labeled B.
3. 24 hours later, the contestant would return and be asked to pick either box A or boxes A and B, keeping all money in the boxes selected.
Based on his predictive ability, the NB will place $1,000,000 in box A if he predicted the contestant would pick box A only. If he predicted the contestant would pick both boxes, he would put $0 in box A.
It has been shown through thousands of episodes that the NB has a 90% accuracy rate. This goes both ways, being 90% right with AB predictions and 90% right with A-only predictions.
You are the next contestant. Do you pick A only or A and B?
Quote: Wizard
Okay, since some are having a hard time accepting the premise of the problem, let me change the wording a bit. New wordage in bold...
It’s still predetermination vs. free will. Or, better said, you’re damned if you do and damned if you don’t.
And, I get it, if you pick box A then it means that the (mostly) omniscient one predicted you would pick box A. That would mean you accept predetermination. But if you pick both, then that means you believe you have free will independent of the prediction.
Or you could go through an endless series of predicting that the other will predict that you would predict that the being would predict... but eventually, you have to choose.
Quote: Mosca
It's still predetermination vs. free will. Or, better said, you're damned if you do and damned if you don't.
...oh, just tell me what you would do, A or AB?
Quote: unJon
Wizard, does causality still only run one direction in time? Does my choice in step #3 have a causal impact on anything that came before? In other words, will you postulate that the alien's superior predictive ability does not include a time machine?
Yes. I claim the NB cannot see the future, just predict it. There are no time machines, alternate realities or any of that jazz.
Quote: Wizard
Yes. I claim the NB cannot see the future, just predict it. There are no time machines, alternate realities or any of that jazz.
Then it’s simple:
1) Rational actors take A and B.
2) This leads to legitimate criticism of the idea that rationality is a good thing in all circumstances.
3) Rational and smart actors use the 24 hour period to get drunk and take mind altering drugs that will help them take only box A.
Quote: Wizard
...oh, just tell me what you would do, A or AB?
AB.
As a contestant, do we know this?
Quote: Wizard
Based on his predictive ability, the NB will place $1,000,000 in box A if he predicted the contestant would pick box A only. If he predicted the contestant would pick both boxes, he would put $0 in box A.
If so, it seems just taking A is the way to go. NB will predict this 90% of the time, and put the million in Box A; my EV would be $900,000.
If I were to take A & B, NB would predict this 90% of the time and put nothing in Box A (10% chance he would put the million in.); my EV in this scenario would be $101,000.
Am I missing something?
Quote: Joeman
As a contestant, do we know this?
If so, it seems just taking A is the way to go. NB will predict this 90% of the time, and put the million in Box A; my EV would be $900,000.
If I were to take A & B, NB would predict this 90% of the time and put nothing in Box A (10% chance he would put the million in.); my EV in this scenario would be $101,000.
Am I missing something?
Here’s the angle you are missing. You are treating the game as a simultaneous game. The other way to treat it is a sequential game where there are two boxes in front of you and nothing in the world can change whether A has a million dollars in it. It’s fixed. So it’s strictly better at that point to take A and B. Like past-posting in roulette.
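To put that dominance point in concrete terms, here is a minimal sketch (assuming, as in the sequential framing above, that box A's contents are already fixed when the contestant chooses and box B always holds $1,000):

```python
# Sequential framing: by the time the contestant chooses, box A already
# holds either $1,000,000 or $0, and box B always holds $1,000.
for box_a in (1_000_000, 0):
    a_only = box_a
    a_and_b = box_a + 1_000
    print(f"If A holds ${box_a:>9,}: A only pays ${a_only:,}, A and B pays ${a_and_b:,}")
# Either way, taking A and B pays exactly $1,000 more, which is the
# "strictly better" (dominance) claim.
```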
Quote: Joeman
As a contestant, do we know this?
If so, it seems just taking A is the way to go. NB will predict this 90% of the time, and put the million in Box A; my EV would be $900,000.
If I were to take A & B, NB would predict this 90% of the time and put nothing in Box A (10% chance he would put the million in.); my EV in this scenario would be $101,000.
EV(A) = 0.9*$1,000,000 + 0.1*0 = $900,000
EV(AB) = 0.9*$1,000 + 0.1*$1,001,000 = $101,000
I agree.
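A short simulation, assuming the 90% figure means the NB's prediction matches the contestant's actual choice 90% of the time whichever way the contestant goes, reproduces those figures (the function name here is just illustrative):

```python
import random

def average_payout(choice, trials=200_000, accuracy=0.9):
    """Mean payout for a contestant who always makes `choice` ('A' or 'AB'),
    assuming the NB predicts that exact choice with probability `accuracy`."""
    total = 0
    for _ in range(trials):
        prediction_correct = random.random() < accuracy
        predicted = choice if prediction_correct else ("AB" if choice == "A" else "A")
        box_a = 1_000_000 if predicted == "A" else 0
        total += box_a if choice == "A" else box_a + 1_000
    return total / trials

print(f"EV(A)  ~ ${average_payout('A'):,.0f}")   # close to $900,000
print(f"EV(AB) ~ ${average_payout('AB'):,.0f}")  # close to $101,000
```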
Quote: unJon
Here's the angle you are missing. You are treating the game as a simultaneous game. The other way to treat it is a sequential game where there are two boxes in front of you and nothing in the world can change whether A has a million dollars in it. It's fixed. So it's strictly better at that point to take A and B. Like past-posting in roulette.
Seems like the logical answer, at a cost of $799,000 in EV.
Quote: Wizard
EV(A) = 0.9*$1,000,000 + 0.1*0 = $900,000
EV(AB) = 0.9*$1,000 + 0.1*$1,001,000 = $101,000
I agree.
Would your answer be the same if box B held $790,000 every time?
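For reference, under the same 90% assumption, the numbers for that variant would work out to:
EV(A) = 0.9*$1,000,000 + 0.1*$0 = $900,000
EV(AB) = 0.9*$790,000 + 0.1*$1,790,000 = $890,000
So, on that reading, taking A only would still come out ahead, but only by $10,000.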
Quote: Wizard
Seems like the logical answer, at a cost of $799,000 in EV.
Agree. The problem is supposed to highlight issues with the concept of rationality.
Quote: unJon
Would your answer be the same if box B held $790,000 every time?
The expected values would be different, but it doesn't change the essence of the problem.
Quote: Wizard
The expected values would be different, but it doesn't change the essence of the problem.
Sure it does. It would then become a problem where most of America does the rational thing and takes A and B and only some statisticians act irrationally by taking A.
Quote: unJon
Sure it does. It would then become a problem where most of America does the rational thing and takes A and B and only some statisticians act irrationally by taking A.
Isn't that also the issue if box B had $1,000?
Quote: Wizard
Isn't that also the issue if box B had $1,000?
I think box B needs to be very low to trigger most people's sense that taking box B is wrong and to outweigh what seems like the right thing to do given no causal relationship between the act of taking B and the prize in A.
Quote: unJon
I think box B needs to be very low to trigger most people's sense that taking box B is wrong and to outweigh what seems like the right thing to do given no causal relationship between the act of taking B and the prize in A.
I agree.
Quote: Mosca
It's still predetermination vs. free will.
Good hunch.
Quote: Mosca
Or you could go through an endless series of predicting that the other will predict that you would predict that the being would predict... but eventually, you have to choose.
This is exactly how I treated the problem in my paper, and could prove that Nash equilibrium was as good as all that wiggle-wagging.
The rational (or Nash) answer believes in free will. Hence it does not believe in the NB’s ability.
Michael's (and others') answer believes in predetermination. Hence it forbids the possibility of free will.
Both are valid. Science has not yet found the answer. Recent research seems to show that free will is an illusion produced on the spot by our brain to post-justify the acts we do, faking a consciousness where there is none.
But other research shows that predetermination à la Lagrange is also hopeless, if we allow for indeterminacy à la Schrödinger.
In the end, there might be neither free will nor predetermination!!!
Quote: Wizard
Okay, since some are having a hard time accepting the premise of the problem, let me change the wording a bit.
Quote:
The way this has been proven is on a game show, with 90% accuracy.
Statistical evidence is never proof. For frequentists, it just gives a confidence interval. For Bayesians, it gives a probability of being correct, but depends on prior beliefs and/or information.
Whatever the approach, the precise conditions of the experiment are needed to form one's opinion.
Quote:
It has been shown through thousands of episodes …
What was the information of the first contestants? Did they pick at random? Did they know what was supposed to be in the boxes? Did they receive info that made them believe in the NB's ability? (Certainly not from the thousands of cases, since they are the first.)
The NB's ability is very different depending on whether his powers are common knowledge between him and the contestant, or whether the contestant chooses independently (i.e., unknowingly).
Quote:
This goes both ways, being 90% right with AB predictions and 90% with A only predictions.
How many were there of each? (For example, if there were 9000 AB and 1000 A, the confidence in the second 90% figure is lower.)
Does it mean that "when the NB predicts Z, then 90% of the time Z happens" or "when the contestant chooses Z, then 90% of the time the NB predicted it"?
It may not be evident to most, but those two are different! Only if the two events occurred exactly the same number of times are they equivalent.
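A quick numeric check of that distinction, using the hypothetical 9000 AB / 1000 A split mentioned above (the split and variable names are illustrative only):

```python
# Hypothetical counts: 9000 episodes where the NB predicted AB and 1000
# where he predicted A, each prediction right 90% of the time in the sense
# P(contestant chooses Z | NB predicted Z) = 0.9.
pred_ab, pred_a, acc = 9000, 1000, 0.9

chose_a_when_pred_a = acc * pred_a           # 900 correctly predicted A-takers
chose_a_when_pred_ab = (1 - acc) * pred_ab   # 900 mispredicted A-takers
total_chose_a = chose_a_when_pred_a + chose_a_when_pred_ab

# The reverse conditional: of those who actually chose A only,
# what fraction did the NB call correctly?
print(chose_a_when_pred_a / total_chose_a)   # 0.5, not 0.9
```

With that split, a contestant who actually chose A only was correctly predicted just half the time, even though each type of prediction comes true 90% of the time.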
Now imagine the episodes in sequence. The first contestants, having no stats about the NB's rate of success, would probably have chosen AB, apart from a few credulous ones. Suppose the assertion about the NB is true: in most of these first episodes, he will leave A empty.
After a while, the next contestants cannot determine if that state of things is due to prediction or to the NB almost systematically (or at a 90% rate) leaving A empty. Consequently, they will be even more inclined to choose AB.
By induction, fewer and fewer contestants choose A only! The sample is biased.
But the most important point is this: there is no way for the contestants (or me) to decide if the 90% numbers show actual prediction ability or some other cause. Correlation is not causation.
Their conclusion then MUST depend on their prior beliefs.
Back to step 1.
Quote: kubikulann
Good hunch.
Because we are inside reality, we cannot ascertain its true nature. We can decide that it is predetermined, or that we are free actors, but either way it will look the same, and feel the same, and play out the same from our perspective. It is essentially irrelevant to our daily lives. Therefore it is safe to ignore the question. Live.
I would figure that I have a good job, and money in the bank, and I don’t need to make a decision at this time. So, I will postpone my decision. In fact, I will leave my decision to my estate. At that time, my beneficiaries can choose to take A, or AB, depending on their needs.
"If pigs could fly, would they go supersonic?"
This is a so-called logic puzzle in which the Wiz is invoking "aliens" with 90% predictive powers, and for which the nature of the 90% predictive power is at the heart of the puzzle and which is inherently unknowable.
Here is a puzzle:
If the Wizard was an alien who had 90% predictive powers, would he ever have posted this logic puzzle in the first place?
Quote: gordonm888
"If pigs could fly, would they go supersonic?"
This is a so-called logic puzzle in which the Wiz is invoking "aliens" with 90% predictive powers, and for which the nature of the 90% predictive power is at the heart of the puzzle and which is inherently unknowable.
Here is a puzzle:
If the Wizard was an alien who had 90% predictive powers, would he ever have posted this logic puzzle in the first place?
If there was a correct answer there wouldn’t be a puzzle. It’s not like the Monty Hall Problem, because it isn’t real.
Quote: kubikulann
Statistical evidence is never proof. For frequentists, it just gives a confidence interval. For Bayesians, it gives a probability of being correct, but depends on prior beliefs and/or information.
Next question: if what The Wizard says actually happened, would our perception of what is happening become reality? How much effort and time and money would then be invested in understanding the mechanism of this part of reality? If it defied understanding, what then? It’s not a law, because sometimes it’s wrong, but it’s not random. What is it?
If I picked A, and there was nothing there; was I wrong, or was the guy wrong? Meaning, did I accidentally make the wrong choice, like sometimes happens playing video poker where you meant to discard this card but discarded that one instead?
I don’t know. If it was simple it wouldn’t have been presented. Whatever you pick, you’re both right and wrong at the same time.