Let $X\sim \exp(\lambda=1)$ and $Y \sim U(1,2)$ be two independent random variables; what is the joint distribution...



























Let $X\sim \exp(\lambda=1)$ and $Y \sim U(1,2)$ be two independent random variables; what is the joint distribution function? Compute $P(X>Y)$.




I started by calculating:



$$ f(x) = \begin{cases}
e^{-x} & x\geq 0\\
0 & x < 0
\end{cases} $$



$$ f(y) = \begin{cases}
1 & 1\leq y\leq 2\\
0 & \text{otherwise}
\end{cases} $$



Now I am not sure about the joint density. I know that the independence of $X$ and $Y$ means I should somehow multiply their densities, but I am not sure what the support of the product should be.
In addition, I am not sure how to derive $P(X>Y)$ from that. I have some idea that a two-dimensional integral is involved, but I can't figure out the logic behind it or how to set it up.



Thanks!










      probability probability-distributions random-variables






asked Jan 14 at 16:47 by superuser123
edited Jan 14 at 17:17 by Song

2 Answers







Consider that
$$P(X>Y)=E\left[\mathbf{1}_{X>Y}\right],$$
where $\mathbf{1}_{X>Y}$ equals $1$ if $X>Y$, and equals $0$ otherwise.

And for any (measurable) function $g$ of $(X,Y)$, courtesy of this theorem, we have
$$E\left[g(X,Y)\right]=\iint g(x,y)\,f_{X,Y}(x,y)\,\mathrm{d}x\,\mathrm{d}y,$$
where $f_{X,Y}$ is the joint density of $(X,Y)$.

You are right that independence of $X$ and $Y$ implies that $f_{X,Y}$ is just the product of the marginal densities of $X$ and $Y$. So you have to evaluate
\begin{align}
P(X>Y)&=\iint \mathbf{1}_{x>y}\,e^{-x}\,\mathbf{1}_{x>0}\,\mathbf{1}_{1<y<2}\,\mathrm{d}x\,\mathrm{d}y
\\&=\iint \mathbf{1}_{x>y,\,x>0,\,1<y<2}\,e^{-x}\,\mathrm{d}x\,\mathrm{d}y.
\end{align}
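
For completeness, a worked continuation of this setup: integrating over $x$ first, for each fixed $y\in(1,2)$ the inner integral is $\int_y^\infty e^{-x}\,\mathrm{d}x=e^{-y}$, so the double integral reduces to
$$P(X>Y)=\int_1^2 e^{-y}\,\mathrm{d}y=e^{-1}-e^{-2}\approx 0.2325.$$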






answered Jan 14 at 17:16 by StubbornAtom (edited Jan 14 at 17:26)

For the first part, the joint density is $e^{-x}$ on the range $x\geq 0$ and $1\leq y\leq 2$, and $0$ otherwise.



            For the second part



$$\text{Pr}[X>Y]=\int_{y=1}^2\text{Pr}[X>y]\,dy=\int_{1}^2 e^{-y}\,dy$$
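
which equals $e^{-1}-e^{-2}\approx 0.2325$, the same value the double-integral approach gives. A minimal Monte Carlo sketch to check the value numerically (assuming Python with NumPy; the seed and sample size are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(0)          # arbitrary seed, for reproducibility
    n = 1_000_000                           # arbitrary sample size
    x = rng.exponential(scale=1.0, size=n)  # X ~ Exp(lambda = 1)
    y = rng.uniform(1.0, 2.0, size=n)       # Y ~ U(1, 2), independent of X
    print((x > y).mean())                   # Monte Carlo estimate of P(X > Y)
    print(np.exp(-1) - np.exp(-2))          # exact value, about 0.2325

With a sample of this size the estimate should typically agree with the exact value to about three decimal places.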






answered Jan 14 at 17:40 by BlackMath





























