Prove that the sum and the absolute difference of 2 Bernoulli(0.5) random variables are not independent
Let $X$ and $Y$ be independent $Bernoulli(0.5)$ random variables. Let $W = X + Y$ and $T = |X - Y|$. Show that $W$ and $T$ are not independent.
I know that I have to show that $P(W, T)$ is not equal to $P(W)P(T)$, but finding the joint distribution is hard. Please help.
probability self-study independence bernoulli-distribution
Re: "finding the joint distribution is hard:" have you made a table? Label the rows with values of $X$, the columns with values of $Y$, and in the cells put the values of $T,$ $W,$ and the associated probabilities. Collect your results into a new table with rows labeled with $T$ and columns labeled with $W:$ put the total probabilities into the entries. That depicts the entire joint distribution of $(T,W).$ You can then draw your conclusion with a visual inspection. No operation is any more difficult than computing $1/2 \times 1/2$.
– whuber♦
3 hours ago
I get it now. I was looking for an elegant, mathematical expression for the joint distribution, but I now realize that I can just enumerate the sample space and the probabilities easily. Thanks, @whuber.
– MSE
2 hours ago
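The table suggested above can also be produced programmatically. A minimal sketch, using only Python's standard library and exact fractions to avoid rounding (the variable names are illustrative): enumerating the four equally likely $(x, y)$ outcomes yields the full joint distribution of $(W, T)$.

```python
from itertools import product
from fractions import Fraction

half = Fraction(1, 2)
joint = {}  # maps (w, t) -> probability
for x, y in product([0, 1], repeat=2):  # the four equally likely outcomes
    w, t = x + y, abs(x - y)
    joint[(w, t)] = joint.get((w, t), Fraction(0)) + half * half

# Only three (w, t) pairs ever occur: (0,0), (1,1), and (2,0).
print(joint)
```

Summing over rows or columns of this table then gives the marginals of $W$ and $T$.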
asked 4 hours ago
MSE
2 Answers
The product of the marginal distributions is defined on $\{0,1,2\} \times \{0,1\}$. You can plug in any of the $6$ possible pairs and get a nonzero number out.
However, the joint distribution is supported on a smaller set:
$$
\{(0,0)\} \cup \{(1,1)\} \cup \{(2,0)\}.
$$
To disprove independence, take any $(w,t)$ pair not in this set and plug it into $P(W,T)$ and $P(W)P(T)$. You will see that, for that particular pair,
$$
P(W,T) = 0 \neq P(W)P(T).
$$
Alternatively, because you're dealing with such a small space, you can simply compute every probability and check every possible pair.
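The zero-probability argument is easy to verify by brute force. A short sketch (standard-library Python with exact fractions; the names `pW`, `pT` are illustrative): the pair $(w, t) = (2, 1)$ has zero joint probability but a positive product of marginals.

```python
from itertools import product
from fractions import Fraction

half = Fraction(1, 2)
joint = {}  # maps (w, t) -> probability
for x, y in product([0, 1], repeat=2):
    w, t = x + y, abs(x - y)
    joint[(w, t)] = joint.get((w, t), Fraction(0)) + half * half

# Marginal distributions of W and T, by summing the joint table.
pW = {w: sum(p for (w2, _), p in joint.items() if w2 == w) for w in (0, 1, 2)}
pT = {t: sum(p for (_, t2), p in joint.items() if t2 == t) for t in (0, 1)}

# (2, 1) never occurs jointly, yet the product of marginals is 1/8.
assert joint.get((2, 1), Fraction(0)) == 0
assert pW[2] * pT[1] == Fraction(1, 8)
```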
When $T = 0$, $W$ is $0$ or $2$; when $T = 1$, $W$ must be $1$. Since knowing $T$ restricts the possible values of $W$, $T$ and $W$ are not independent.
See Independence of $X+Y$ and $X-Y$
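This conditioning argument can likewise be checked numerically. A small sketch under the same setup (standard-library Python, exact fractions, illustrative names): given $T = 1$, $W$ equals $1$ with certainty, while unconditionally $P(W = 1) = 1/2$.

```python
from itertools import product
from fractions import Fraction

half = Fraction(1, 2)
joint = {}  # maps (w, t) -> probability
for x, y in product([0, 1], repeat=2):
    w, t = x + y, abs(x - y)
    joint[(w, t)] = joint.get((w, t), Fraction(0)) + half * half

# P(W = 1 | T = 1) = 1, but P(W = 1) = 1/2: conditioning on T changes W.
p_T1 = sum(p for (_, t), p in joint.items() if t == 1)
p_W1_given_T1 = joint[(1, 1)] / p_T1
p_W1 = sum(p for (w, _), p in joint.items() if w == 1)
assert p_W1_given_T1 == 1
assert p_W1 == Fraction(1, 2)
```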
I want to mark your solution as correct, too! Thanks, @user158565.
– MSE
2 hours ago
answered 4 hours ago
Taylor
answered 4 hours ago
user158565