A coin, having probability p of landing heads and probability q = (1-p) of landing tails ...
A coin has probability $p$ of landing heads and probability $q = 1-p$ of landing tails. It is flipped repeatedly until at least one head and one tail have appeared.
This is not part of a homework assignment. I am studying for a final and don't understand the professor's solutions.
a.) Find the expected number of flips needed.
Since this is clearly geometric, I would think the solution would be:
$E(N)=\sum_{i=0}^{\infty} i\,p^{n-1}q+\sum_{i=0}^{n} i\,q^{n-1}p=\frac{1}{q}+\frac{1}{p}$.
However, I am completely wrong.
The answer is
$E(N)=p\left(1+\frac{1}{q}\right)+q\left(1+\frac{1}{p}\right)$
For example, suppose the first flip comes up heads. Then we have:
$E(N\mid H)=p+p\sum_{i=0}^{\infty} n\,p^{n-1} q$ ... I am not sure why this makes sense.
I am not entirely sure where the added $1$ and the factors of $p$ and $q$ come from. Could someone carefully explain why this is the right answer?
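(A quick Monte Carlo sketch, using plain Python's random module with an arbitrary $p=0.3$, is consistent with the professor's formula rather than with my $\frac{1}{q}+\frac{1}{p}$.)

    import random

    def average_flips(p, trials=200_000):
        """Estimate E[N]: average number of flips until both a head and a tail
        have appeared. Assumes 0 < p < 1, otherwise the loop never ends."""
        total = 0
        for _ in range(trials):
            flips, seen = 0, set()
            while len(seen) < 2:
                seen.add('H' if random.random() < p else 'T')
                flips += 1
            total += flips
        return total / trials

    p = 0.3
    q = 1 - p
    print("simulation          ", average_flips(p))               # roughly 3.76
    print("p(1+1/q) + q(1+1/p) ", p * (1 + 1/q) + q * (1 + 1/p))  # 3.7619...
    print("1/p + 1/q           ", 1/p + 1/q)                      # 4.7619..., exactly one more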
Comments:
– lulu: It's all a question of the first toss. If it is $H$ then you just get one more than the expected time to get a $T$; if it is $T$ then you just get one more than the expected time to get an $H$. Your method is incorrect because the expected number of tosses needed to get one of the two is $1$.
– Ross Millikan: In both the title and first paragraph it appears there is $0$ chance of landing tails, so you will wait forever.
– lulu: Note: your sums are hard to follow. What's $n$? The upper limit of the sums should be $\infty$, and the exponent of the probability ought to be a simple function of $i$. Done correctly, your method ought to work (though it's easier to do it the other way).
2 Answers
Answer (Peter Foreman):
If you get a head with probability $p$, then the expected number of throws is $1+E(X)$, where $X$ is a geometric random variable counting the throws needed for a tail, which occurs with probability $q$, so $1+E(X)=1+\frac{1}{q}$. Similarly, if you throw a tail with probability $q$, then the expected number of throws is $1+E(Y)$, where $Y$ is a geometric random variable counting the throws needed for a head, which occurs with probability $p$, so $1+E(Y)=1+\frac{1}{p}$. This means that the overall expected number of throws is
$$p\left(1+\frac{1}{q}\right)+q\left(1+\frac{1}{p}\right)$$
because there is a probability $p$ that the expected number of throws is given by $1+E(X)$ and probability $q$ that it is given by $1+E(Y)$.
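Expanding this (straightforward algebra, added as a check) ties it back to the comment above that the first toss is shared by both waiting times:
$$p\left(1+\frac{1}{q}\right)+q\left(1+\frac{1}{p}\right)=1+\frac{p}{q}+\frac{q}{p}=\frac{1}{pq}-1=\frac{1}{p}+\frac{1}{q}-1,$$
so the correct expectation is exactly one less than the $\frac{1}{p}+\frac{1}{q}$ from the question; for a fair coin ($p=q=\tfrac12$) it equals $3$.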
Answer (leonbloy):
Let $X$ be the time of the first head, $Y$ the time of the first tail, and $W$ the first time at which both a head and a tail have been flipped.
You are right in assuming that $E[X]=\frac{1}{p}$ and $E[Y]=\frac{1}{q}$, but you are wrong in assuming that $W=X+Y$; that's simply not true. Actually $W=\max(X,Y)$.
A possible approach: let $A$ be the indicator variable of the event "the first flip was a head" (hence $X=1$).
Then use $$E[W]=E[E[W \mid A]] = P(A=1)\,E[W\mid A=1]+P(A=0)\,E[W\mid A=0]=p(E[Y]+1)+q(E[X]+1)$$
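As a consistency check on top of this (not needed for the conditioning argument): the very first flip is either the first head or the first tail, so $\min(X,Y)=1$ with certainty, and the identity $\max(X,Y)=X+Y-\min(X,Y)$ gives the same value:
$$E[W]=E[X]+E[Y]-E[\min(X,Y)]=\frac{1}{p}+\frac{1}{q}-1,$$
which agrees with $p(E[Y]+1)+q(E[X]+1)$ after simplification.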