more work
R/ch2.qmd | 58
@@ -11,12 +11,18 @@ format:
css: styles.css
callout-icon: false
callout-appearance: simple
toc: true
---

*Note: these notes are a work in progress.*

In this chapter we step through an example of "fake" vs "real" news to
build a framework for determining the probability that a new article
titled "The President has a secret!" is real or fake.

We then go on to build a probability model known as the Binomial model,
using the Bayesian framework.

```{r}
#| message: false
#| warning: false
@@ -34,7 +40,7 @@ fake_news <- tibble::as_tibble(fake_news)
What is the proportion of news articles that were labeled fake vs real?

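One quick way to get these proportions (a sketch; it assumes the `fake_news` data and the `dplyr` verbs are available from the setup chunk above) is to count articles by `type` and turn the counts into shares:

```{r}
# sketch: share of articles labeled fake vs real
fake_news |>
  count(type) |>
  mutate(prop = n / sum(n))
```
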
```{r}
fake_news |> glimpse()
fake_news |> head()

fake_news |>
  group_by(type) |>
@@ -316,6 +322,33 @@ articles_sim |>
)
```

## Binomial Model and the Chess Example

The example used here is a chess match between a human and the computer
"Deep Blue". The setup is that the two faced each other in 1996, a match
which the human won, and a rematch is scheduled for 1997. We would like
to model the number of games out of 6 that the human can win.

Let $\pi$ be the probability that the human wins any one game against
the computer. To simplify things greatly, we assume that $\pi$ can only
take on the values .2, .5, or .8. We also assume the following prior (we
are told in the book that we will learn how to build these later on):

| $\pi$    | .2  | .5  | .8  | total |
|----------|-----|-----|-----|-------|
| $f(\pi)$ | .10 | .25 | .65 | 1     |

:::{.callout-caution}
## Note

It's important to note here that the values of $\pi$ themselves **do
not** need to add up to 1. $\pi$ represents the chance of winning any
single game, so it can take on any value in $[0, 1]$. On the other
hand, $f$ is a function that maps each value of $\pi$ to a probability,
and those probabilities must sum to 1; this is defined next.
:::
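
As a quick sketch (the `prior` tibble below is just my own encoding of the table above, not an object from the book), we can write the prior down in R and check that the $f(\pi)$ values sum to 1:

```{r}
# sketch: the discrete prior f(pi) from the table above
prior <- tibble::tibble(
  pi   = c(0.2, 0.5, 0.8),
  f_pi = c(0.10, 0.25, 0.65)
)
prior
sum(prior$f_pi)  # the prior probabilities sum to 1
```
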
:::{.callout-note}
## Discrete Probability Model
@@ -336,4 +369,27 @@ and has the following properties
What does this mean? It's very straightforward: a pmf is a function
that takes in some value $y$ and outputs the probability that the
random variable $Y$ equals $y$.
:::
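
For a concrete (made-up) illustration of these properties, here is the pmf of a fair six-sided die written as an R function:

```{r}
# sketch: pmf of a fair six-sided die, f(y) = P(Y = y)
f <- function(y) ifelse(y %in% 1:6, 1 / 6, 0)

f(3)         # property 1: each value lies between 0 and 1
sum(f(1:6))  # property 2: the probabilities sum to 1
```
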
### The Binomial Model

:::{.callout-note}
## Conditional probability model of data $Y$

Let $Y$ be a discrete random variable that depends on some parameter
$\pi$. We define the conditional probability model of $Y$ as the
conditional pmf,

$$f(y|\pi) = P(Y = y | \pi)$$

which has the following properties:

1. $0 \leq f(y|\pi) \leq 1 \;\; \forall y$
2. $\sum_{\forall y}f(y|\pi) = 1$
:::

:::{.callout-caution}
## In Emanuel's words

This is essentially the same probability model we defined above, except
that now we are conditioning the probabilities on some parameter $\pi$.
:::
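
To tie this back to the chess rematch: assuming (as the section heading suggests) that $Y$, the number of games the human wins out of 6, is modeled as Binomial$(6, \pi)$, the conditional pmf $f(y|\pi)$ can be evaluated in R with `dbinom()`. A sketch, shown at $\pi = 0.8$:

```{r}
# sketch: f(y | pi) for Y ~ Binomial(6, pi), evaluated here at pi = 0.8
y <- 0:6
f_y_given_pi <- dbinom(y, size = 6, prob = 0.8)

round(f_y_given_pi, 3)
sum(f_y_given_pi)  # sums to 1 over all possible y, as a conditional pmf must
```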