Conditional Expectation Decomposition in Regression Analysis
I am currently working on my understanding of regression fundamentals, and I checked this source (one can find the same, even verbatim, statement in multiple sources). In Theorem 3.1.1, the author claims to prove the so-called Conditional Expectation Function (CEF) Decomposition property, which states:

Theorem 3.1.1:

We have
$$Y_i=E(Y_i|X_i)+\epsilon_i$$
with the property that $(a)$ $E(\epsilon_i|X_i)=0$ and $(b)$ $E(f(X_i)\epsilon_i)=0$ for any function $f$.

The problem is that the proof only checks $(a)$ and $(b)$, but never establishes the existence of the decomposition $Y_i=E(Y_i|X_i)+\epsilon_i$ in the first place.

The author then claims:

This theorem says that any random variable $Y_i$ can be decomposed into a piece that's "explained by $X_i$", i.e., the CEF, and a piece left over which is orthogonal to (i.e., uncorrelated with) any function of $X_i$.

Do we have to prove this, does it come in as some sort of assumption (I couldn't find one, though one could surely just build the model like this), or is it a result from somewhere else, such as the factorization lemma? The whole thing sounds a bit sloppy (clearly details like measurability are dropped), but still: what am I missing?
probability-theory statistics regression linear-regression regression-analysis
asked Jan 10 at 9:54 by user190080

1 Answer
This follows from the equality
$$
Y_i=\mathbb E\left[Y_i\mid X_i\right]+\bigl(Y_i-\mathbb E\left[Y_i\mid X_i\right]\bigr);
$$
defining $\varepsilon_i:=Y_i-\mathbb E\left[Y_i\mid X_i\right]$ (which exists as soon as $\mathbb E\left|Y_i\right|<\infty$) gives $\mathbb E\left[\varepsilon_i\mid X_i\right]=0$, from which $\mathbb E\left[f\left(X_i\right)\varepsilon_i\right]=0$ follows by the tower property:
$$
\mathbb E\left[f(X_i)\,\varepsilon_i\right]=\mathbb E\bigl[\mathbb E\left[f(X_i)\,\varepsilon_i\mid X_i\right]\bigr]=\mathbb E\bigl[f(X_i)\,\mathbb E\left[\varepsilon_i\mid X_i\right]\bigr]=0.
$$

answered Jan 10 at 10:04 by Davide Giraudo
• Ah, that makes sense. I was able to follow the proofs of $(a)$ and $(b)$ (even without knowing this decomposition exists) but somehow missed this idea, thanks! – user190080, Jan 10 at 10:17
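For intuition, both properties are also easy to check numerically. The following is a minimal simulation sketch (the toy model and all names are illustrative choices, not from the thread): it assumes $Y_i=X_i^2+U_i$ with $U_i$ independent standard normal noise, so the CEF $E(Y_i|X_i)=X_i^2$ is known in closed form and $\epsilon_i$ can be formed exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Toy model with a known CEF: X ~ Uniform(-1, 1), Y = X**2 + U,
# U ~ N(0, 1) independent of X, so E[Y | X] = X**2.
x = rng.uniform(-1.0, 1.0, size=n)
y = x**2 + rng.normal(0.0, 1.0, size=n)
eps = y - x**2  # residual of the CEF decomposition

# (a) E[eps | X] = 0: the mean of eps within each bin of X is ~0.
edges = np.linspace(-1.0, 1.0, 21)
idx = np.digitize(x, edges)
cond_means = [eps[idx == k].mean() for k in range(1, len(edges))]
print("max |E[eps | X in bin]| :", np.abs(cond_means).max())

# (b) E[f(X) eps] = 0 for several functions f of X alone.
for f in (np.sin, np.exp, np.abs):
    print(f"E[{f.__name__}(X) eps] :", (f(x) * eps).mean())
```

All printed quantities shrink toward $0$ at the Monte Carlo rate as $n$ grows, which is exactly what $(a)$ and $(b)$ assert.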