<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<link rel="icon" type="image/png" href="./images/favicon-32x32.png" sizes="32x32" />
<link rel="icon" type="image/png" href="./images/favicon-16x16.png" sizes="16x16" />
<title>Noticing Confusion - SPK's Rationality Essays</title>
<link rel="stylesheet" type="text/css" href="./css/default.css" />
<link rel="stylesheet" type="text/css" href="./css/highlight.css" />
<!-- <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script> -->
<!-- <script type="text/javascript" src="/js/header-links.js"></script> -->
<script type="text/javascript" src="http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML"></script>
<link href="atom.xml" type="application/atom+xml" rel="alternate" title="Sitewide ATOM/RSS Feed" />
<!-- Google Analytics stuff -->
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-DEWF2J5BG8"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'G-DEWF2J5BG8');
</script>
<script type="text/javascript" src="https://fast.fonts.net/jsapi/f7f47a40-b25b-44ee-9f9c-cfdfc8bb2741.js"></script>
</head>
<body>
<div id="header">
<div id="logo">
<a href="./">SPK's Rationality Essays</a>
</div>
<div id="navigation">
<a href="./">Home</a>
<a href="./notes.html">Notes</a>
<!-- <a href="/about.html">About</a> -->
<a href="./archive.html">Archive</a>
<a href="./atom.xml" type="application/atom+xml" rel="alternate" title="Sitewide ATOM/RSS Feed">RSS</a>
</div>
</div>
<div id="content">
<h1 id="post-title">Noticing Confusion</h1>
<!-- <center><img src="https://fbcdn-sphotos-d-a.akamaihd.net/hphotos-ak-prn1/t31.0-8/p600x600/10257116_10202295769100492_2438594605053717342_o.jpg" height="400" width="300" class="sujeet-pic" alt="Sujeet pic" /></center> -->
<p><strong>Entry Question</strong>: How do we notice when we are confused about some topic?</p>
<hr />
<h1 id="surprise">Surprise</h1>
<p>If you are <a href="./Truth-Predictive-Power.html#surprise-surprise">surprised</a> by something, it means that you made a prediction and it was wrong. You did constrain your anticipation - you said that X would happen, and Y would <em>not</em> happen. But you were wrong - Y happened anyway. That’s cool. You can update on this evidence. You can throw out the belief that predicted Y won’t happen (cos this is very strong evidence against it).</p>
<p>However, if you aren’t surprised by wrong answers, it means you haven’t been making predictions at all.</p>
<p>Making predictions implies that you will either be reassured or be surprised by the outcome. You either got it right or you got it wrong.</p>
<p>But when you don’t make predictions, when your beliefs don’t constrain anticipation, you will never be surprised by <em>any</em> outcome.</p>
<h1 id="experiment-time">Experiment time!</h1>
<p>Ok. How do we find out whether you are confused about something?</p>
<p>You say you have a correct model about something.</p>
<p>Well, what could the reality be? You could actually have a correct model. You could have a wrong model. Or, you could be confused.</p>
<p>It’s my job as a scientist to find out the truth here.</p>
<p>What are the hypotheses?</p>
<p>Assume X is an event with two outcomes, A and B.</p>
<ul>
<li><p>H1: You’re confused about the outcome of X.</p></li>
<li><p>H2: You’re not confused about the outcome of X, and you have the <em>correct</em> model.</p>
<p>i.e., if you predict A, A will happen.</p></li>
<li><p>H3: You’re not confused about the outcome of X, but you have the <em>wrong</em> model.</p>
<p>i.e., if you predict A, A won’t happen.</p></li>
</ul>
<p>[For now consider the simple case where you’re either always right (H2) or always wrong (H3). We will deal with more complex hypotheses later.]</p>
<p>How do we design hypothesis tests to distinguish between them?</p>
<p>We look at the places where they make differing predictions.</p>
<p>We know that if you’re confused, you won’t be surprised by anything.</p>
<p>Hmmm… Could they differ in the level of surprise we would have? Let’s see.</p>
<h1 id="variables">Variables</h1>
<p>Assumption: The actual outcome of X is A.</p>
<p>There is one independent variable - the input:</p>
<ul>
<li>What I tell you (“A happened”, “B happened”)</li>
</ul>
<p>The dependent variable - the output / prediction:</p>
<ul>
<li>Whether you’re surprised</li>
</ul>
<p>We want to see if the hypotheses give different predictions for the same input.</p>
<h1 id="cases">Cases</h1>
<p>A has happened.</p>
<ul>
<li><p>I tell you that A happened</p>
<p>i.e., I tell you the actual outcome</p>
<p>H1: When you’re confused, you won’t be surprised.</p>
<p>H2: You predicted A, so you’re not surprised.</p>
<p>H3: You predicted B (cos you have the wrong model). Hence, you’re surprised that A happened!</p>
<p><strong>Note</strong>: H3 differs from H1 and H2.</p></li>
<li><p>I tell you that B happened</p>
<p>i.e., I tell you a false outcome</p>
<p>H1: When you’re confused, you won’t be surprised.</p>
<p>H2: You predicted A (correctly). However, since I’m lying about the outcome, you’re surprised when I tell you B happened.</p>
<p>H3: You predicted B (cos you have the wrong model). Hence, you’re not surprised that B happened!</p>
<p><strong>Note</strong>: H2 differs from H1 and H3.</p></li>
</ul>
<h1 id="observations">Observations</h1>
<p>Yes! We have our differing predictions!</p>
<ol style="list-style-type: upper-alpha">
<li><p>Surprised at the actual outcome: you have the wrong model.</p></li>
<li><p>Surprised at the false outcome: you have the correct model.</p></li>
<li><p>Not surprised at either outcome: you, sir, are confused!</p></li>
</ol>
<p>When you’re not confused, whether you’re right or wrong, you have the <em>possibility</em> of surprise. That’s all we need.</p>
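<p>The three verdicts above can be sketched as a tiny decision function. (A hedged sketch: the function name and verdict strings are mine, not part of the essay, and it only encodes the simple always-right / always-wrong case.)</p>

```python
# Classify someone from their surprise reactions to being told the true
# outcome ("A happened") and a false one ("B happened").
# Hypothetical helper; assumes the always-right (H2) / always-wrong (H3) case.

def diagnose(surprised_by_truth, surprised_by_falsehood):
    if surprised_by_truth and not surprised_by_falsehood:
        return "wrong model (H3)"    # you predicted B, so A surprised you
    if not surprised_by_truth and surprised_by_falsehood:
        return "correct model (H2)"  # you predicted A, so B surprised you
    if not surprised_by_truth and not surprised_by_falsehood:
        return "confused (H1)"       # no prediction, so nothing surprises you
    return "inconsistent"            # surprised by both outcomes: retest

print(diagnose(False, True))   # correct model (H2)
print(diagnose(False, False))  # confused (H1)
```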
<hr />
<h1 id="more-complex-scenarios">More complex scenarios</h1>
<p>Let’s look at a more realistic scenario. In the real world, you are not always wrong. Sometimes you are right; sometimes you are wrong. i.e., you may understand some parts of the topic well and misunderstand others.</p>
<p>There’s an event X, with possible outcomes A, B, C, and D.</p>
<p>Actual outcome: A.</p>
<p>Hypotheses:</p>
<ul>
<li><p>H1: You’re confused</p></li>
<li><p>H2: You’re not confused and have an absolutely correct model</p></li>
<li><p>H3: You’re not confused but have a partially wrong model</p></li>
<li><p>H4: You’re not confused but have a completely wrong model</p></li>
</ul>
<p>Again, the input variable is what I tell you (“A happened”, …) and the output variable is your surprise.</p>
<h1 id="cases-1">Cases</h1>
<p>This time, let’s look at the <em>likelihood</em> of not being surprised in each case.</p>
<p><strong>Note</strong>: Likelihood of surprise = 1 - Likelihood of no surprise</p>
<h3 id="i-tell-you-a-happened">I tell you A happened</h3>
<p>i.e., I tell you the actual outcome</p>
<ul>
<li><p>H1: You won’t be surprised. Likelihood of no surprise: ~100%</p></li>
<li><p>H2: You won’t be surprised (cos you correctly predicted A).</p>
<p>Likelihood of no surprise: ~100%</p></li>
<li><p>H3: You may be surprised.</p>
<p>You would be surprised if you had predicted B or C or D. If you had predicted A, you wouldn’t be surprised.</p>
<p>So, likelihood of no surprise: 1/4 = 25%</p>
<p>In general, the likelihood of not being surprised by a correct outcome would be pretty small, especially if you suspect that your model is really wrong.</p></li>
<li><p>H4: You will definitely be surprised (cos you wrongly predicted something else). Likelihood of no surprise: ~0%</p></li>
</ul>
<p>Likelihood ratio for no surprise = 100 : 100 : 25 : 0</p>
<p>If you’re not surprised by the correct answer, you certainly don’t have a completely wrong model (H4).</p>
<p>It is also pretty strong evidence that you don’t have a partially wrong model (H3).</p>
<p>Likelihood ratio for surprise = 0 : 0 : 75 : 100</p>
<p>So, if you’re surprised by the correct answer, your model is definitely wrong (partially or fully).</p>
<h3 id="i-tell-you-at-random-that-b-happened">I tell you, at random, that B happened</h3>
<p>i.e., I tell you a false outcome</p>
<ul>
<li><p>H1: You won’t be surprised. You’re never surprised. Likelihood of no surprise: ~100%</p></li>
<li><p>H2: You will be surprised (cos you correctly predicted A and I’m lying to you). Likelihood of no surprise: ~0%</p></li>
<li><p>H3: You may be surprised.</p>
<p>Since I’m picking B at random, the chances of your wrong answer coinciding with my random answer are low. Hence, low chance of not being surprised.</p>
<p>Likelihood of no surprise: 1/3 = ~33%</p>
<p>In general, the more possible outcomes there are, the lower the chance of not being surprised by any particular random wrong answer.</p></li>
<li><p>H4: You have a high chance of being surprised (cos you wrongly predicted something else).</p>
<p>You might have predicted C or D. To not be surprised, you should have predicted B.</p>
<p>Likelihood of no surprise: 1/3 = ~33%</p></li>
</ul>
<p>Likelihood ratio for no surprise = 100 : 0 : 33 : 33</p>
<p>If you’re not surprised by a wrong answer, you certainly do <em>not</em> have a correct model (H2).</p>
<p>Also, each time you’re not surprised by a random wrong answer, it becomes more and more likely that you are actually Confused (H1), not just wrong.</p>
<p>Likelihood ratio for surprise = 0 : 100 : 67 : 67</p>
<p>If you’re surprised by a wrong answer, you are not confused. Period. You could have a correct model, or a partially wrong model, or a fully wrong model. But you’re <em>not</em> confused.</p>
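<p>These likelihoods are enough to run a Bayesian update over the four hypotheses. A minimal sketch (the dictionary and function names and the uniform prior are my own; the numbers are the essay’s approximate likelihoods):</p>

```python
# P(no surprise | hypothesis) for each kind of report, per the tables above.
NO_SURPRISE = {
    "told truth":     {"H1": 1.00, "H2": 1.00, "H3": 0.25, "H4": 0.00},
    "told falsehood": {"H1": 1.00, "H2": 0.00, "H3": 0.33, "H4": 0.33},
}

def update(prior, report, surprised):
    """One Bayesian update: weight each hypothesis by P(observation | H)."""
    post = {}
    for h, p in prior.items():
        p_no = NO_SURPRISE[report][h]
        post[h] = p * ((1 - p_no) if surprised else p_no)
    total = sum(post.values())
    return {h: v / total for h, v in post.items()}

belief = {"H1": 0.25, "H2": 0.25, "H3": 0.25, "H4": 0.25}
# You sit through two random false answers without blinking:
belief = update(belief, "told falsehood", surprised=False)
belief = update(belief, "told falsehood", surprised=False)
# H2 is ruled out, and "confused" (H1) now dominates.
```

<p>With these numbers, two unblinking reactions to random falsehoods push H1 past 80% while H2 drops to zero — repeated non-surprise at random wrong answers really is strong evidence of confusion, not mere wrongness.</p>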
<h1 id="observations-1">Observations</h1>
<p>We have plenty of differing predictions here. Look at the ’0’s in the Likelihood ratios.</p>
<ol style="list-style-type: upper-alpha">
<li><p>Not surprised by a false answer: you don’t have a correct model.</p>
<p>You <em>need</em> to be surprised by every single false answer I give you.</p></li>
<li><p>Not surprised by a Random False Answer: <em>strong</em> evidence that you are confused.</p>
<p>The chances of your model coming up with the exact same answer as my random answer are very slim.</p></li>
<li><p>Surprised at some answer (whether it was right or wrong): strong evidence that you are not confused.</p>
<p>Confused people don’t usually get surprised.</p></li>
<li><p>Surprised by a correct answer: your model is definitely wrong.</p>
<p>But, on the bright side, you are definitely not confused.</p></li>
<li><p><em>Not</em> surprised by a correct answer: you don’t have a completely wrong model.</p>
<p>Congrats. Your model is not completely useless.</p></li>
</ol>
<h1 id="your-strength-as-a-rationalist">Your Strength as a Rationalist</h1>
<p>Here comes the bang!</p>
<p>If somebody claims that they <em>know</em> a topic well, then they should be able to make correct predictions, right?</p>
<p>Now, as per (A), I lie to them about the result. I give them some Random False Answer.</p>
<p>If they are surprised, it means they are not confused! Good. They <em>should</em> be surprised. I gave them the wrong result.</p>
<p>If they don’t get surprised, hey! They don’t have a correct model! They accepted a wrong answer! That is just stupid (like thinking that the lighter ball will reach the ground first. Or that the heavier ball will :P)</p>
<p>Are they confused or do they have a wrong model? As per (B), I give them more such Random False Answers. If they repeatedly fail to be surprised, it becomes extremely likely that they are confused.</p>
<p>False results should surprise you if you know the subject well.</p>
<p>And so it is said:</p>
<blockquote>
<p>Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you are equally good at explaining any outcome, you have zero knowledge.</p>
<p>– <a href="http://lesswrong.com/lw/if/your_strength_as_a_rationalist/">Your Strength as a Rationalist</a>, Eliezer Yudkowsky</p>
</blockquote>
<h1 id="i-havent-understood-x-fully">“I haven’t understood X fully”</h1>
<p>“<em>That’s</em> why I am not able to answer questions correctly. All I need to do is sit and read some more books for a while. Then, I will beat your sorry little ass easily.”</p>
<p>Oh, really? Is that so?</p>
<p>Does he have a lack of understanding about X (wrong model) or is he actually plain confused?</p>
<p>Easy to find out! Just use (B) and (D).</p>
<p>Give him a bunch of Random False Answers, maybe by casually dropping some “facts” in conversation. If he is not surprised, it means he is <em>confused</em> about the topic, not wrong. He doesn’t have any model in his head. “Sitting” and “reading books” is not gonna fix his confusion.</p>
<p>On the other hand, if you give some Random False Answers or a few correct answers, and he <em>is</em> surprised, then he just has a wrong model. He can correct it by reading up on the topic.</p>
<p>The same test can be used for anybody who says that they don’t quite <em>remember</em> what they have studied about a subject. Deploy the Random False Answers test. They may actually be confused, not forgetful.</p>
<p>Ditto for what people say about esoteric subjects (like Quantum Mechanics or Evolution). Check whether they (or you) are surprised by Random False Answers (“Cockroaches started evolving 16 billion years ago. That’s why they can survive even nuclear attacks” - wrong on so many different levels). Check whether they (or you) are surprised by correct answers. If you have a wrong model, you should be surprised at least some times. If you’re nodding your head all along the way, you were, are, and will remain confused.</p>
<h1 id="whats-the-big-difference">What’s the big difference?</h1>
<p>We wanted to identify when you say you know a topic but are actually confused. Similarly, we wanted to identify when you say you have a wrong understanding about a topic but are actually confused (not just wrong).</p>
<p>But, whether you think you’re right but are confused, or whether you think you’re wrong but are actually confused, or whether you’re just plain wrong but not confused, the outcome is the same: you get <em>poor</em> results.</p>
<p>Why do we care?</p>
<p>We care because the <strong>remedies</strong> are different.</p>
<p>When you’re wrong but <em>not</em> confused, you can improve your predictions by getting more information. You can update on the evidence. The road to recovery is straightforward.</p>
<p>But when you’re confused, whether you think you’re right or wrong, the road ahead is completely different. More information will not solve your problem. Consulting with others won’t help. Neither will banging your head against the problem.</p>
<p>When you’re confused, you won’t be surprised by fake answers. When you’re confused, you won’t be surprised by right answers that you hadn’t guessed (and may never have guessed, like the two balls reaching the ground at the same time). This means that you cannot learn from experience. You can’t improve your performance over time.</p>
<hr />
<p>This is just one way of Noticing Confusion. When you’re Confused, you don’t get surprised by anything.</p>
<p>But that is not the only way your Confusion exhibits itself.</p>
<p>(to be continued)</p>
<h1 id="ps">PS</h1>
<ul>
<li><p>I have never faced so much confusion in writing an essay. I am finding it incredibly hard to explain confusion. What does that imply? I am confused about “confusion”.</p></li>
<li><p>Was incredibly confused and demotivated about writing this essay. Was stuck over and over.</p></li>
<li><p>Finally hit upon the hypothesis tests! :D</p></li>
</ul>
<h1 id="notes">Notes</h1>
<ul>
<li>Thanks to Eliezer for teaching me about <a href="http://wiki.lesswrong.com/wiki/Mysterious_Answers_to_Mysterious_Questions">Confusion</a>.</li>
</ul>
<div class="info">Created: November 18, 2014</div>
<div class="info">Last modified: August 6, 2015</div>
<div class="info">Status: finished</div>
<div class="info"><b>Tags</b>: confusion, anticipation constraint</div>
<br />
<div id="disqus_thread"></div>
<script type="text/javascript">
/* * * CONFIGURATION VARIABLES: EDIT BEFORE PASTING INTO YOUR WEBPAGE * * */
var disqus_shortname = 'spkrationalitytrainingground'; // required: replace example with your forum shortname
var disqus_identifier = '/Noticing-Confusion.html';
var disqus_title = 'Noticing Confusion';
/* * * DON'T EDIT BELOW THIS LINE * * */
(function() {
var dsq = document.createElement('script'); dsq.type = 'text/javascript'; dsq.async = true;
dsq.src = '//' + disqus_shortname + '.disqus.com/embed.js';
(document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(dsq);
})();
</script>
<script type="text/javascript" src="https://fast.fonts.net/jsapi/f7f47a40-b25b-44ee-9f9c-cfdfc8bb2741.js"></script>
<noscript>Please enable JavaScript to view the <a href="http://disqus.com/?ref_noscript">comments powered by Disqus.</a></noscript>
<a href="http://disqus.com" class="dsq-brlink">comments powered by <span class="logo-disqus">Disqus</span></a>
</div>
<div id="footer">
Site proudly generated by
<a href="http://jaspervdj.be/hakyll">Hakyll</a>
</div>
</body>
</html>