Tip: Chebyshev Inequality for $n$ throws of a die














Say a fair die is thrown $n$ times. Show, using the Chebyshev inequality, that the probability that the number of sixes thrown lies between $\frac{1}{6}n-\sqrt{n}$ and $\frac{1}{6}n+\sqrt{n}$ is at least $\frac{31}{36}$.



Idea:



The "at least" leads me to believe that I should be working over the complement of $\{\frac{1}{6}n-\sqrt{n}< X<\frac{1}{6}n+\sqrt{n}\}=\{-\sqrt{n}< X-\frac{1}{6}n<\sqrt{n}\}$, and I note that $\mathbb{E}[X]=\frac{n}{6}$, where $X$ is the number of sixes thrown.



This looks similar to Chebyshev. So, for $\epsilon > 0$,



$P(|X-\frac{1}{6}n|< \sqrt{n})=1-P(|X-\frac{1}{6}n|\geq \sqrt{n})\leq 1-\frac{\operatorname{Var}{X}}{\epsilon^2}\iff P(|X-\frac{1}{6}n|\geq\sqrt{n})\leq\frac{\operatorname{Var}{X}}{\epsilon^2}$



But how can I calculate $\operatorname{Var}{X}$? Do I have to explicitly define the random variable and then find an appropriate density function? All I have is $\operatorname{Var}{X}=\mathbb{E}[(X-\mathbb{E}[X])^2]=\int X-\frac{1}{6}n\,\operatorname{dP}$



Any help is greatly appreciated.










probability probability-theory measure-theory probability-distributions variance






asked Jan 9 at 23:55 by SABOY

2 Answers






You're sort of on the right track, but your inequality in $1-P\left(|X-\frac{1}{6}n|\geq \sqrt{n}\right)\leq 1-\frac{\operatorname{Var}{X}}{\epsilon^2}$ is the wrong way round (it should be $1-P\left(|X-\frac{1}{6}n|\geq \sqrt{n}\right)\geq 1-\frac{\operatorname{Var}{X}}{\epsilon^2}$), you haven't identified what $\epsilon$ is, and a power of $2$ is missing from your integral for the variance. The latter should be $\int \left( X - \frac{1}{6} n \right)^2 \, dP$.



I presume the form of Chebyshev's inequality you're using is $P(|X-\frac{1}{6}n|\geq \epsilon)\leq\frac{\operatorname{Var}{X}}{\epsilon^2}$, in which case your $\epsilon$ is just $\sqrt{n}$, and your inequality becomes $P(|X-\frac{1}{6}n|\geq \sqrt{n})\leq\frac{\operatorname{Var}{X}}{n}$.



You could evaluate the integral for the variance by working out what the distribution $F_X$ of $X$ is (Hint: $F_X\left(j\right) = P\left(X=j\right)$ is the probability of getting $j$ sixes with $n$ independent throws of a fair die), but there's also a simpler way of calculating it.



If $X_i$ is the number of sixes you get on the $i^{\text{th}}$ throw, then $P\left( X_i = 1 \right) = \frac{1}{6}$, $P\left( X_i = 0 \right) = \frac{5}{6}$, and $X_1, X_2, \dots , X_n$ are independent, identically distributed random variables with $X = X_1 + X_2 + \dots + X_n$. Now there's a theorem which tells us that the variance of a sum of $n$ independent, identically distributed random variables is just $n$ times the common variance of the summands. That is, $\operatorname{Var}\left(X\right) = n \operatorname{Var}\left(X_1\right)$, so you can prove your result just by calculating the variance of the simple two-valued random variable $X_1$.
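
For concreteness, a short worked version of that calculation: since $X_1$ takes only the values $0$ and $1$, we have $X_1^2 = X_1$, so

$$\operatorname{Var}\left(X_1\right) = \mathbb{E}\left[X_1^2\right] - \left(\mathbb{E}\left[X_1\right]\right)^2 = \frac{1}{6} - \frac{1}{36} = \frac{5}{36}, \qquad \text{hence} \quad \operatorname{Var}\left(X\right) = \frac{5n}{36}.$$

With $\epsilon = \sqrt{n}$, Chebyshev then gives $P\left(|X-\frac{1}{6}n| \geq \sqrt{n}\right) \leq \frac{5n/36}{n} = \frac{5}{36}$, i.e. $P\left(|X-\frac{1}{6}n| < \sqrt{n}\right) \geq \frac{31}{36}$.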



Elaboration of hint about $F_X$:



Since $X$ can only take on one of the values $0, 1, \dots , n$, the sample space (call it $\Omega$) can be partitioned into a union of the disjoint events $E_j = \left\{ \omega \in \Omega \mid X\left(\omega\right) = j \right\}$ for $j=0, 1, \dots , n$. The integral $\int \left( X - \frac{1}{6} n \right)^2 dP$ can then be written as $\int_{\bigcup_{j=0}^n E_j}\left( X - \frac{1}{6} n \right)^2 dP = \sum_{j=0}^n \int_{E_j}\left( X - \frac{1}{6} n \right)^2 dP$. Since $X$ has the fixed value $j$ everywhere in $E_j$, we get $\int_{E_j}\left( X - \frac{1}{6} n \right)^2 dP = \left( j- \frac{1}{6} n \right)^2 \int_{E_j} dP = \left( j- \frac{1}{6} n \right)^2 P\left(E_j\right) = \left( j- \frac{1}{6} n \right)^2 F_X\left(j\right)$. So $\operatorname{Var}\left(X\right) = \sum_{j=0}^n \left( j- \frac{1}{6} n \right)^2 F_X\left(j\right)$.



As callculus noted in his answer, $X$ is $n, \frac{1}{6}$-binomially distributed, which gives you the expression for $F_X\left(j\right)$ as a function of $n$ and $j$. If you don't know this expression, you will find it (as well as its variance!) in any good text on elementary probability theory (such as Volume 1 of William Feller's classic, An Introduction to Probability Theory and Its Applications, 3rd edition, where you will find the material on pp. 147-8 and p. 230).
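
For reference, the standard binomial expression referred to here is $F_X\left(j\right) = \binom{n}{j}\left(\frac{1}{6}\right)^{j}\left(\frac{5}{6}\right)^{n-j}$ for $j = 0, 1, \dots, n$, and the well-known variance formula $np(1-p)$ again yields $\operatorname{Var}\left(X\right) = n\cdot\frac{1}{6}\cdot\frac{5}{6} = \frac{5n}{36}$.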






answered Jan 10 at 1:57 (last edited Jan 14 at 21:59) by lonza leggiera

• Can you elaborate on your hint for $F_{X}(j)$ and evaluating the integral? – SABOY, Jan 10 at 10:45










• Why is $P(X=j)=F_{X}(j)$ when $F_{X}(j)$ by definition is equal to $P(X\leq j)$? – SABOY, Jan 14 at 14:15












• Applied to a discrete-valued random variable $X$, "distribution of $X$" is normally used to mean the function defined by $F_X\left(j\right) = P\left(X=j\right)$, rather than the cumulative distribution function given by $P\left(X\le j\right)$. At least, this was the case 50 years ago when I learnt probability theory from Feller's book, where you'll find the distinction drawn on p. 213. – lonza leggiera, Jan 14 at 22:29





















Hint: The number of sixes is distributed as $X\sim \operatorname{Bin}\left( n, \frac16\right)$. The variance of $X$ is well known. Then indeed you can use the inequality



$$P\left(|X-\frac{1}{6}n|< \sqrt{n}\right)\geq 1-\frac{\operatorname{Var}{X}}{\epsilon^2}$$



I think you know what to plug in for $\epsilon^2$.
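
For readers who want a numerical sanity check, here is a minimal Monte Carlo sketch (assuming NumPy; the choice $n = 100$ and the trial count are illustrative, not part of the problem). The empirical probability should comfortably exceed $31/36 \approx 0.861$:

    import numpy as np

    rng = np.random.default_rng(0)

    n = 100           # throws per experiment (illustrative; the bound holds for every n)
    trials = 100_000  # number of simulated experiments (arbitrary)

    # Number of sixes in each experiment is Binomial(n, 1/6).
    sixes = rng.binomial(n, 1/6, size=trials)

    # Event from the question: the count lies strictly within n/6 +/- sqrt(n).
    inside = np.abs(sixes - n / 6) < np.sqrt(n)

    print(f"empirical P = {inside.mean():.4f}, Chebyshev bound = {31/36:.4f}")

The empirical value typically sits well above the bound; Chebyshev is deliberately crude, trading sharpness for needing nothing beyond the variance.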






answered Jan 10 at 3:06 by callculus












