Degrees of Freedom in Covariance: Intuition?

If we say $\operatorname{Var}(x)$ has $n-1$ degrees of freedom, which are lost after we estimate $\operatorname{Var}(x)$, this matches how $n-1$ observations are now constrained to be sufficiently close to the remaining observation of $x$. In my class, $\operatorname{Cov}(x,y)$ is also described as having $n-1$ degrees of freedom, which are lost after we estimate $\operatorname{Cov}(x,y)$. Reference for this use of "degrees of freedom" in covariance



My confusion is that the covariance does not actually seem to constrain $n-1$ of the $x$ and $y$ values, the way the variance did for $x$.



Can you relate the $n-1$ degrees of freedom in covariance to the intuition for degrees of freedom as the number of observations that are not free to change after estimating $\operatorname{Cov}(x,y)$? Is it possible to explain by counting newly restricted observations, as when explaining why the sample mean has $1$ degree of freedom, or the variance $n-1$?





  • The second, related question is about the explanation in the reference linked above:




    "Initially, we have $2n$ degrees of freedom in the bivariate data. We
    lose two by computing the sample means $m(x)$ and $m(y)$. Of the
    remaining $2n-2$ degrees of freedom, we lose $n-1$ by computing the
    product deviations. Thus, we are left with $n-1$ degrees of freedom
    total."





Do you see what the "product deviations" are, and how each one "loses" a degree of freedom?
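For concreteness, here is a small sketch (in Python; the function name is my own) of what I understand the computation to be: each observation contributes one product deviation $(x_i - \bar{x})(y_i - \bar{y})$, and the sample covariance averages them over $n-1$.

```python
def sample_cov(x, y):
    """Sample covariance built from the 'product deviations'
    (x_i - mean(x)) * (y_i - mean(y))."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    # one product deviation per observation
    d = [(xi - mx) * (yi - my) for xi, yi in zip(x, y)]
    return sum(d) / (n - 1)  # average over n - 1, not n

print(sample_cov([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0]))  # 10/3 ≈ 3.3333
```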










      statistics intuition estimation covariance






edited Oct 1 '18 at 22:05 by Michael Hardy

asked Jan 22 '16 at 16:04 by ikoiko

1 Answer






Intuitively, the deduction of one degree of freedom is necessary to resolve a problem with the biasedness of the estimator. An unbiased estimator of a (co)variance is one that is expected to equal the population (co)variance, i.e., if you take a sampling distribution of (co)variance estimates, the average (or "expected value") of that distribution is the (co)variance of the population.



The intuitive explanation for the loss of a degree of freedom in the variance and the covariance is exactly the same: it concerns the biasedness of the estimator, which is fixed by subtracting a degree of freedom.
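A rough simulation sketch (with illustrative parameters) of the unbiasedness claim: averaging the $n-1$ estimator over many samples recovers the population covariance, whereas dividing by $n$ would systematically undershoot by a factor of $(n-1)/n$.

```python
import random

random.seed(0)
n, trials = 5, 100_000
est = 0.0
for _ in range(trials):
    x = [random.gauss(0, 1) for _ in range(n)]
    y = [xi + random.gauss(0, 1) for xi in x]  # population Cov(x, y) = Var(x) = 1
    mx, my = sum(x) / n, sum(y) / n
    # sample covariance with the n - 1 denominator
    est += sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)

print(est / trials)  # close to 1.0, the population covariance
```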



I'm not sure an informal proof of this fact would be very helpful, but I'm going to cook one up (in a later edit of this post) to fine-tune my econometrics knowledge so I can tutor it effectively next semester.



Hope this helps! :)






• Thanks for your reply. As you say, the estimator with $n-1$ has the true expectation. Can you say more explicitly how it is the same issue as degrees of freedom? In particular, I'm hoping to see how finding the covariance constrains $n-1$ observations of $x$ and $y$. Or, how every "product deviation" loses 1 degree of freedom? – iko, Jan 22 '16 at 19:06

• Well, I think what you need are a few slightly technical definitions (but I promise to make them accessible). The problem with your question--and more generally with the "intuition" of statistics--is that sometimes the intuition you want does not exist, since the answer you're looking for is buried beneath a very technical equation. But once you understand the basic notion of expected value for a "product distribution", and more specifically the covariance, I think a lot of the confusion will be alleviated. – Logician6, Jan 22 '16 at 23:47

• I agree the expectation of the sample covariance (using $n-1$) equals the real covariance. Is that what you mean by "expected value for a 'product distribution', and more specifically the covariance"? Can this help me see that $n-1$ parameters (or observations) are no longer free to vary after I find the covariance? It would help if you can add more explicit detail later :) – iko, Jan 23 '16 at 0:31

• Yes, that's what I mean to answer your first question. Not sure what you mean with your follow-up question, but hopefully my further elaboration will help. Expect it sometime tomorrow if not late tonight. – Logician6, Jan 23 '16 at 0:48

• Thanks for your continued input. To clarify, I can do the calculation to show the estimator is unbiased. It is the 2nd question in my comment above that I am asking about. If we say there are $n-1$ degrees of freedom in covariance, doesn't this mean that after we estimate the covariance, we lose the ability to freely choose $n-1$ observations/parameters? – iko, Jan 23 '16 at 0:54











answered Jan 22 '16 at 17:38 by Logician6











