Fisher information of the Rayleigh distribution

Problem description: Find the Fisher information of the Rayleigh distribution. I was satisfied with my solution until I saw that it disagreed with the solution obtained in one of the problem sets from Princeton (http://www.princeton.edu/~cuff/ele530/files/hw4_sn.pdf, problem 2). I calculate the Fisher information by the following theorem: $$I(\theta) = -E\left(\frac{\partial^2 \log L}{\partial \theta^2}\right),$$ where $L$ is the likelihood function of the pdf. The Princeton problem set uses another argument I am not familiar with, and they obtain $I(\theta) = \frac{n}{\theta^2}$. I will show my calculations below; maybe someone can spot the error (if there is one). $$\log L = \sum_{i} \log f(x_{i}),$$ where the $X_{i}$'s are iid Rayleigh distributed for $i=1,2,\dots,n$. I end up with $$\log L = \sum_{i}\left(\log x_{i} - 2\log\theta - \frac{x_{i}^2}{2\theta^2}\right).$$ Then I take the partial derivative with respect to $\theta$ twice and obtain $$\frac{\partial^2 \log L}{\partial \theta^2} = \frac{2n\theta^2-3\sum_{i}x_{i}^2}{\theta^4},$$ so $I(\theta) = -E\left(\frac{2n\theta^2-3\sum_{i}x_{i}^2}{\theta^4}\right)$. This is where I am confused: should this expected value be evaluated with respect to the $x_{i}^2$'s or $\theta$?
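[Editor's note, not part of the original post.] As a sanity check on the algebra above, here is a minimal symbolic sketch that computes $-E\left(\frac{\partial^2 \log f}{\partial \theta^2}\right)$ for a single observation. It assumes the scale parameterization $f(x;\theta) = \frac{x}{\theta^2}e^{-x^2/(2\theta^2)}$, which is my assumption (not stated in the post) but is the density consistent with the derivatives written above.

    # Minimal symbolic check (assumption: Rayleigh with scale parameter theta,
    # i.e. f(x; theta) = x/theta^2 * exp(-x^2 / (2 theta^2)) for x > 0).
    import sympy as sp

    x, theta = sp.symbols('x theta', positive=True)
    f = x / theta**2 * sp.exp(-x**2 / (2 * theta**2))   # assumed Rayleigh density

    d2_logf = sp.diff(sp.log(f), theta, 2)               # second derivative of the log-density

    # Fisher information per observation: I(theta) = -E[ d^2/dtheta^2 log f(X; theta) ],
    # where the expectation is taken over X ~ f(.; theta), i.e. by integrating against f.
    I_one = sp.simplify(-sp.integrate(d2_logf * f, (x, 0, sp.oo)))
    print(I_one)   # 4/theta**2

For $n$ iid observations the information adds, giving $4n/\theta^2$, which matches the calculation in the self-answer below.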










probability-distributions expectation






asked Nov 16 '14 at 13:06 by user29163

  • How do you take an expected value with respect to the $x_i^2$? – Karl, Nov 16 '14 at 13:26

  • The expectation is always taken with respect to the $x_i$'s; how could you take the expectation of a parameter $\theta$? – Sayan, Nov 16 '14 at 13:44

  • I did take them with respect to the $x_{i}$'s. I will post the result asap. – user29163, Nov 16 '14 at 13:54


















2 Answers


















Continuing the above discussion, I find the following when computing $-E\left(\frac{2n\theta^2-3\sum_{i}x_{i}^2}{\theta^4}\right)$:
$$-E\left(\frac{2n\theta^2-3\sum_{i}x_{i}^2}{\theta^4}\right) = -E\left(\frac{2n\theta^2+3\sum_{i}(-x_{i}^2)}{\theta^4}\right).$$
Now, I found that $$E(T(x)) = E(-x_{j}^2) = -2\theta^2$$ for an arbitrary $x_{j}$, so since $E$ is linear and the $x_{i}$'s are iid,
$$E\left(\frac{2n\theta^2+3\sum_{i}(-x_{i}^2)}{\theta^4}\right) = \frac{2n}{\theta^2} + \frac{3nE(T(x))}{\theta^4} = \frac{-4n}{\theta^2},$$ and hence $$-E\left(\frac{2n\theta^2+3\sum_{i}(-x_{i}^2)}{\theta^4}\right) = \frac{4n}{\theta^2}.$$

answered Nov 16 '14 at 19:12 by user29163
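[Editor's note, not part of the answer.] A quick Monte Carlo sketch of the same quantity, again under the scale parameterization assumed earlier (numpy's Rayleigh generator uses that same scale convention):

    # Monte Carlo check that -E[(2 n theta^2 - 3 sum x_i^2) / theta^4] ~ 4 n / theta^2
    import numpy as np

    rng = np.random.default_rng(0)
    theta, n, reps = 2.0, 50, 100_000

    x = rng.rayleigh(scale=theta, size=(reps, n))                   # reps samples of size n
    d2 = (2 * n * theta**2 - 3 * (x**2).sum(axis=1)) / theta**4     # second derivative above

    print(-d2.mean())         # close to 50.0
    print(4 * n / theta**2)   # exactly 50.0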






The Princeton version is correct. When you compute the Fisher information for a Rayleigh you can exploit the fact that if a r.v. $X \sim$ Rayleigh with parameter $k$, then the r.v. $Y = X^2$ has a (negative) exponential distribution with parameter $1/k$. The exact statement changes depending on the definitions you use for the two distributions, but it must work in any case.

answered Oct 14 '16 at 23:16 by Filippo Palomba
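[Editor's note, not part of the answer.] One way to reconcile the two results, under the assumption that the Princeton notes parameterize the Rayleigh by $\psi = \theta^2$ (the square of the scale parameter used in the question), is the change-of-parameter rule for Fisher information. With $\theta = \sqrt{\psi}$, so $d\theta/d\psi = 1/(2\sqrt{\psi})$ and $\theta^2 = \psi$,
$$I(\psi) = I(\theta)\left(\frac{d\theta}{d\psi}\right)^2 = \frac{4}{\theta^2}\cdot\frac{1}{4\psi} = \frac{1}{\psi^2},$$
per observation, i.e. $n/\psi^2$ for $n$ iid observations, matching the linked result, while $4n/\theta^2$ is the information for the scale parameter itself; the two answers are then consistent.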





