more work added

commit 5c071fcbb2
parent 721d1d83d0

151 R/ch2.html
@@ -243,12 +243,12 @@ Probability and Likelihood
 <span id="cb7-9"><a href="#cb7-9" aria-hidden="true" tabindex="-1"></a>  gt<span class="sc">::</span><span class="fu">cols_width</span>(<span class="fu">everything</span>() <span class="sc">~</span> <span class="fu">px</span>(<span class="dv">100</span>))</span></code><button title="Copy to Clipboard" class="code-copy-button"><i class="bi"></i></button></pre></div>
 <div class="cell-output-display">
 
-<div id="veslrdaavz" style="overflow-x:auto;overflow-y:auto;width:auto;height:auto;">
+<div id="jtimbozsld" style="overflow-x:auto;overflow-y:auto;width:auto;height:auto;">
 <style>html {
   font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Helvetica Neue', 'Fira Sans', 'Droid Sans', Arial, sans-serif;
 }
 
-#veslrdaavz .gt_table {
+#jtimbozsld .gt_table {
   display: table;
   border-collapse: collapse;
   margin-left: auto;
@@ -273,7 +273,7 @@ Probability and Likelihood
   border-left-color: #D3D3D3;
 }
 
-#veslrdaavz .gt_heading {
+#jtimbozsld .gt_heading {
   background-color: #FFFFFF;
   text-align: center;
   border-bottom-color: #FFFFFF;
@@ -285,7 +285,7 @@ Probability and Likelihood
   border-right-color: #D3D3D3;
 }
 
-#veslrdaavz .gt_title {
+#jtimbozsld .gt_title {
   color: #333333;
   font-size: 125%;
   font-weight: initial;
@@ -297,7 +297,7 @@ Probability and Likelihood
   border-bottom-width: 0;
 }
 
-#veslrdaavz .gt_subtitle {
+#jtimbozsld .gt_subtitle {
   color: #333333;
   font-size: 85%;
   font-weight: initial;
@@ -309,13 +309,13 @@ Probability and Likelihood
   border-top-width: 0;
 }
 
-#veslrdaavz .gt_bottom_border {
+#jtimbozsld .gt_bottom_border {
   border-bottom-style: solid;
   border-bottom-width: 2px;
   border-bottom-color: #D3D3D3;
 }
 
-#veslrdaavz .gt_col_headings {
+#jtimbozsld .gt_col_headings {
   border-top-style: solid;
   border-top-width: 2px;
   border-top-color: #D3D3D3;
@@ -330,7 +330,7 @@ Probability and Likelihood
   border-right-color: #D3D3D3;
 }
 
-#veslrdaavz .gt_col_heading {
+#jtimbozsld .gt_col_heading {
   color: #333333;
   background-color: #FFFFFF;
   font-size: 100%;
@@ -350,7 +350,7 @@ Probability and Likelihood
   overflow-x: hidden;
 }
 
-#veslrdaavz .gt_column_spanner_outer {
+#jtimbozsld .gt_column_spanner_outer {
   color: #333333;
   background-color: #FFFFFF;
   font-size: 100%;
@@ -362,15 +362,15 @@ Probability and Likelihood
   padding-right: 4px;
 }
 
-#veslrdaavz .gt_column_spanner_outer:first-child {
+#jtimbozsld .gt_column_spanner_outer:first-child {
   padding-left: 0;
 }
 
-#veslrdaavz .gt_column_spanner_outer:last-child {
+#jtimbozsld .gt_column_spanner_outer:last-child {
   padding-right: 0;
 }
 
-#veslrdaavz .gt_column_spanner {
+#jtimbozsld .gt_column_spanner {
   border-bottom-style: solid;
   border-bottom-width: 2px;
   border-bottom-color: #D3D3D3;
@@ -382,7 +382,7 @@ Probability and Likelihood
   width: 100%;
 }
 
-#veslrdaavz .gt_group_heading {
+#jtimbozsld .gt_group_heading {
   padding-top: 8px;
   padding-bottom: 8px;
   padding-left: 5px;
@@ -407,7 +407,7 @@ Probability and Likelihood
   vertical-align: middle;
 }
 
-#veslrdaavz .gt_empty_group_heading {
+#jtimbozsld .gt_empty_group_heading {
   padding: 0.5px;
   color: #333333;
   background-color: #FFFFFF;
@@ -422,15 +422,15 @@ Probability and Likelihood
   vertical-align: middle;
 }
 
-#veslrdaavz .gt_from_md > :first-child {
+#jtimbozsld .gt_from_md > :first-child {
   margin-top: 0;
 }
 
-#veslrdaavz .gt_from_md > :last-child {
+#jtimbozsld .gt_from_md > :last-child {
   margin-bottom: 0;
 }
 
-#veslrdaavz .gt_row {
+#jtimbozsld .gt_row {
   padding-top: 8px;
   padding-bottom: 8px;
   padding-left: 5px;
@@ -449,7 +449,7 @@ Probability and Likelihood
   overflow-x: hidden;
 }
 
-#veslrdaavz .gt_stub {
+#jtimbozsld .gt_stub {
   color: #333333;
   background-color: #FFFFFF;
   font-size: 100%;
@@ -462,7 +462,7 @@ Probability and Likelihood
   padding-right: 5px;
 }
 
-#veslrdaavz .gt_stub_row_group {
+#jtimbozsld .gt_stub_row_group {
   color: #333333;
   background-color: #FFFFFF;
   font-size: 100%;
@@ -476,11 +476,11 @@ Probability and Likelihood
   vertical-align: top;
 }
 
-#veslrdaavz .gt_row_group_first td {
+#jtimbozsld .gt_row_group_first td {
   border-top-width: 2px;
 }
 
-#veslrdaavz .gt_summary_row {
+#jtimbozsld .gt_summary_row {
   color: #333333;
   background-color: #FFFFFF;
   text-transform: inherit;
@@ -490,16 +490,16 @@ Probability and Likelihood
   padding-right: 5px;
 }
 
-#veslrdaavz .gt_first_summary_row {
+#jtimbozsld .gt_first_summary_row {
   border-top-style: solid;
   border-top-color: #D3D3D3;
 }
 
-#veslrdaavz .gt_first_summary_row.thick {
+#jtimbozsld .gt_first_summary_row.thick {
   border-top-width: 2px;
 }
 
-#veslrdaavz .gt_last_summary_row {
+#jtimbozsld .gt_last_summary_row {
   padding-top: 8px;
   padding-bottom: 8px;
   padding-left: 5px;
@@ -509,7 +509,7 @@ Probability and Likelihood
   border-bottom-color: #D3D3D3;
 }
 
-#veslrdaavz .gt_grand_summary_row {
+#jtimbozsld .gt_grand_summary_row {
   color: #333333;
   background-color: #FFFFFF;
   text-transform: inherit;
@@ -519,7 +519,7 @@ Probability and Likelihood
   padding-right: 5px;
 }
 
-#veslrdaavz .gt_first_grand_summary_row {
+#jtimbozsld .gt_first_grand_summary_row {
   padding-top: 8px;
   padding-bottom: 8px;
   padding-left: 5px;
@@ -529,11 +529,11 @@ Probability and Likelihood
   border-top-color: #D3D3D3;
 }
 
-#veslrdaavz .gt_striped {
+#jtimbozsld .gt_striped {
   background-color: rgba(128, 128, 128, 0.05);
 }
 
-#veslrdaavz .gt_table_body {
+#jtimbozsld .gt_table_body {
   border-top-style: solid;
   border-top-width: 2px;
   border-top-color: #D3D3D3;
@@ -542,7 +542,7 @@ Probability and Likelihood
   border-bottom-color: #D3D3D3;
 }
 
-#veslrdaavz .gt_footnotes {
+#jtimbozsld .gt_footnotes {
   color: #333333;
   background-color: #FFFFFF;
   border-bottom-style: none;
@@ -556,7 +556,7 @@ Probability and Likelihood
   border-right-color: #D3D3D3;
 }
 
-#veslrdaavz .gt_footnote {
+#jtimbozsld .gt_footnote {
   margin: 0px;
   font-size: 90%;
   padding-left: 4px;
@@ -565,7 +565,7 @@ Probability and Likelihood
   padding-right: 5px;
 }
 
-#veslrdaavz .gt_sourcenotes {
+#jtimbozsld .gt_sourcenotes {
   color: #333333;
   background-color: #FFFFFF;
   border-bottom-style: none;
@@ -579,7 +579,7 @@ Probability and Likelihood
   border-right-color: #D3D3D3;
 }
 
-#veslrdaavz .gt_sourcenote {
+#jtimbozsld .gt_sourcenote {
   font-size: 90%;
   padding-top: 4px;
   padding-bottom: 4px;
@@ -587,64 +587,64 @@ Probability and Likelihood
   padding-right: 5px;
 }
 
-#veslrdaavz .gt_left {
+#jtimbozsld .gt_left {
   text-align: left;
 }
 
-#veslrdaavz .gt_center {
+#jtimbozsld .gt_center {
   text-align: center;
 }
 
-#veslrdaavz .gt_right {
+#jtimbozsld .gt_right {
   text-align: right;
   font-variant-numeric: tabular-nums;
 }
 
-#veslrdaavz .gt_font_normal {
+#jtimbozsld .gt_font_normal {
   font-weight: normal;
 }
 
-#veslrdaavz .gt_font_bold {
+#jtimbozsld .gt_font_bold {
   font-weight: bold;
 }
 
-#veslrdaavz .gt_font_italic {
+#jtimbozsld .gt_font_italic {
   font-style: italic;
 }
 
-#veslrdaavz .gt_super {
+#jtimbozsld .gt_super {
   font-size: 65%;
 }
 
-#veslrdaavz .gt_footnote_marks {
+#jtimbozsld .gt_footnote_marks {
   font-style: italic;
   font-weight: normal;
   font-size: 75%;
   vertical-align: 0.4em;
 }
 
-#veslrdaavz .gt_asterisk {
+#jtimbozsld .gt_asterisk {
   font-size: 100%;
   vertical-align: 0;
 }
 
-#veslrdaavz .gt_indent_1 {
+#jtimbozsld .gt_indent_1 {
   text-indent: 5px;
 }
 
-#veslrdaavz .gt_indent_2 {
+#jtimbozsld .gt_indent_2 {
   text-indent: 10px;
 }
 
-#veslrdaavz .gt_indent_3 {
+#jtimbozsld .gt_indent_3 {
   text-indent: 15px;
 }
 
-#veslrdaavz .gt_indent_4 {
+#jtimbozsld .gt_indent_4 {
   text-indent: 20px;
 }
 
-#veslrdaavz .gt_indent_5 {
+#jtimbozsld .gt_indent_5 {
   text-indent: 25px;
 }
 </style>
@@ -733,11 +733,60 @@ Probability and Likelihood
 </tr>
 </tbody>
 </table>
-<p>A couple things to note about our table (1) + (2) = .4 and (2) + (4) = .6. (1) + (2) + (3) + (4) = 1.</p>
-<ol type="1">
-<li><p><span class="math inline">\(P(A \cap B) = P(A|B)P(B)\)</span> we know the likelihood of <span class="math inline">\(L(B|A) = P(A|B)\)</span> and we also know the prior so we insert these to get <span class="math display">\[ P(A \cap B) = P(A|B)P(B) = .267 \times .4 = .1068\]</span></p></li>
-<li><p><span class="math inline">\(P(A^c \cap B) = P(A^c|B)P(B)\)</span> in this case we do know the prior <span class="math inline">\(P(B) = .4\)</span>, but we don’t directly know the value of <span class="math inline">\(P(A^c|B)\)</span>, however, we note that <span class="math inline">\(P(A|B) + P(A^c|B) = 1\)</span>, therefore we compute <span class="math inline">\(P(A^c|B) = 1 - P(A|B) = 1 - .267 = .733\)</span> <span class="math display">\[ P(A^c \cap B) = P(A^c|B)P(B) = .733 \times .4 = .2932\]</span></p></li>
-</ol>
+<p>A couple of things to note about our table: (1) + (3) = .4, (2) + (4) = .6, and (1) + (2) + (3) + (4) = 1.</p>
+<p>(1.) <span class="math inline">\(P(A \cap B) = P(A|B)P(B)\)</span>. We know the likelihood <span class="math inline">\(L(B|A) = P(A|B)\)</span> and we also know the prior, so we insert these to get <span class="math display">\[ P(A \cap B) = P(A|B)P(B) = .267 \times .4 = .1068\]</span></p>
+<p>(3.) <span class="math inline">\(P(A^c \cap B) = P(A^c|B)P(B)\)</span>. In this case we do know the prior <span class="math inline">\(P(B) = .4\)</span>, but we don’t directly know the value of <span class="math inline">\(P(A^c|B)\)</span>; however, we note that <span class="math inline">\(P(A|B) + P(A^c|B) = 1\)</span>, therefore we compute <span class="math inline">\(P(A^c|B) = 1 - P(A|B) = 1 - .267 = .733\)</span> <span class="math display">\[ P(A^c \cap B) = P(A^c|B)P(B) = .733 \times .4 = .2932\]</span></p>
+<p>We can now confirm that <span class="math inline">\(.1068 + .2932 = .4\)</span>.</p>
+<p>Moving on to (2) and (4):</p>
+<p>(2.) <span class="math inline">\(P(A \cap B^c) = P(A|B^c)P(B^c)\)</span>. In this case we know the likelihood <span class="math inline">\(L(B^c|A) = P(A|B^c)\)</span> and we know the prior <span class="math inline">\(P(B^c)\)</span>; therefore, <span class="math display">\[P(A \cap B^c) = P(A|B^c)P(B^c) = .022 \times .6 = .0132\]</span></p>
+<p>(4.) <span class="math inline">\(P(A^c \cap B^c) = P(A^c|B^c)P(B^c) = (1 - .022) \times .6 = .5868\)</span></p>
+<p>and we can confirm that <span class="math inline">\(.0132 + .5868 = .6\)</span>,</p>
+<p>so we can fill in the rest of the table:</p>
+<table class="table">
+<thead>
+<tr class="header">
+<th></th>
+<th><span class="math inline">\(B\)</span></th>
+<th><span class="math inline">\(B^c\)</span></th>
+<th>Total</th>
+</tr>
+</thead>
+<tbody>
+<tr class="odd">
+<td><span class="math inline">\(A\)</span></td>
+<td>.1068</td>
+<td>.0132</td>
+<td>.12</td>
+</tr>
+<tr class="even">
+<td><span class="math inline">\(A^c\)</span></td>
+<td>.2932</td>
+<td>.5868</td>
+<td>.88</td>
+</tr>
+<tr class="odd">
+<td>Total</td>
+<td>.4</td>
+<td>.6</td>
+<td>1</td>
+</tr>
+</tbody>
+</table>
+<p>An important concept we used above is the idea of <strong>total probability</strong>.</p>
+<div class="callout-tip callout callout-style-default no-icon callout-captioned">
+<div class="callout-header d-flex align-content-center">
+<div class="callout-icon-container">
+<i class="callout-icon no-icon"></i>
+</div>
+<div class="callout-caption-container flex-fill">
+total probability
+</div>
+</div>
+<div class="callout-body-container callout-body">
+<p>The <strong>total probability</strong> of observing a real article is the sum of its parts. Namely,</p>
+<p><span class="math display">\[P(B^c) = P(A \cap B^c) + P(A^c \cap B^c)\]</span> <span class="math display">\[=P(A|B^c)P(B^c) + P(A^c|B^c)P(B^c)\]</span> <span class="math display">\[=.0132 + .5868 = .6\]</span></p>
+</div>
+</div>
 </section>
 
 </main>
37 R/ch2.qmd
@@ -140,13 +140,44 @@ probability table:
 A couple of things to note about our table: (1) + (3) = .4, (2) + (4) = .6,
 and (1) + (2) + (3) + (4) = 1.
 
-(1) $P(A \cap B) = P(A|B)P(B)$ we know the likelihood of $L(B|A) = P(A|B)$ and we also
+(1.) $P(A \cap B) = P(A|B)P(B)$. We know the likelihood $L(B|A) = P(A|B)$ and we also
 know the prior, so we insert these to get
 $$ P(A \cap B) = P(A|B)P(B) = .267 \times .4 = .1068$$
 
-(3) $P(A^c \cap B) = P(A^c|B)P(B)$ in this case we do know the prior $P(B) = .4$, but we
+(3.) $P(A^c \cap B) = P(A^c|B)P(B)$. In this case we do know the prior $P(B) = .4$, but we
 don't directly know the value of $P(A^c|B)$; however, we note that $P(A|B) + P(A^c|B) = 1$,
 therefore we compute $P(A^c|B) = 1 - P(A|B) = 1 - .267 = .733$
 $$ P(A^c \cap B) = P(A^c|B)P(B) = .733 \times .4 = .2932$$
 
 We can now confirm that $.1068 + .2932 = .4$.
 
+Moving on to (2) and (4):
+
+(2.) $P(A \cap B^c) = P(A|B^c)P(B^c)$. In this case we know the likelihood $L(B^c|A) = P(A|B^c)$ and
+we know the prior $P(B^c)$; therefore,
+$$P(A \cap B^c) = P(A|B^c)P(B^c) = .022 \times .6 = .0132$$
+
+(4.) $P(A^c \cap B^c) = P(A^c|B^c)P(B^c) = (1 - .022) \times .6 = .5868$
+
+and we can confirm that $.0132 + .5868 = .6$,
+
+so we can fill in the rest of the table:
+
+|      | $B$   | $B^c$ | Total |
+|------|-------|-------|-------|
+|$A$   | .1068 | .0132 | .12   |
+|$A^c$ | .2932 | .5868 | .88   |
+|Total | .4    | .6    | 1     |
+
+An important concept we used above is the idea of **total probability**.
+
+:::{.callout-tip}
+## total probability
+
+The **total probability** of observing a real article is the sum of its parts.
+Namely,
+
+$$P(B^c) = P(A \cap B^c) + P(A^c \cap B^c)$$
+$$=P(A|B^c)P(B^c) + P(A^c|B^c)P(B^c)$$
+$$=.0132 + .5868 = .6$$
+:::
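The joint-probability arithmetic in this change can be sanity-checked with a short R sketch (not part of the commit; the values are the worked example's numbers, and the object name `joint` is only for illustration):

```r
# Prior and likelihoods from the worked example
p_B    <- 0.4     # prior P(B)
p_Bc   <- 0.6     # P(B^c) = 1 - P(B)
p_A_B  <- 0.267   # likelihood P(A | B)
p_A_Bc <- 0.022   # likelihood P(A | B^c)

# Joint probability table: each cell is P(A-event AND B-event)
joint <- matrix(
  c(p_A_B * p_B,       p_A_Bc * p_Bc,         # P(A ∩ B),   P(A ∩ B^c)
    (1 - p_A_B) * p_B, (1 - p_A_Bc) * p_Bc),  # P(A^c ∩ B), P(A^c ∩ B^c)
  nrow = 2, byrow = TRUE,
  dimnames = list(c("A", "Ac"), c("B", "Bc"))
)

joint            # cells: .1068 .0132 / .2932 .5868
colSums(joint)   # total probability recovers the priors: .4 and .6
rowSums(joint)   # marginals of A: .12 and .88
sum(joint)       # all four cells sum to 1
```

The column sums are exactly the total-probability identity in the callout: each prior is recovered by summing the joint cells in its column.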