Purpose of calculation not clear [Fubini's theorem; Lebesgue measure; multifractals]












On page 3 of these notes ("Why study multifractal spectra?"), starting at "We begin with...", the authors get

$$f(x)=\pmb{E}\big[\,\log[\,N(x)\,]\,\big]\quad \pmb{(1)}$$

where $\pmb{E}[\ldots]$ seems to be the expected value
$$\lim_{n\to\infty} \frac{1}{n} \sum_{j=1}^{n} [\ldots]_j$$
as written above, for a given $x$.

What is the point of the calculation that follows, and what does the outer $\pmb{E}$ in the integral stand for?

My guess is that this $\pmb{E}$ is just the expected value over different sequences $(N_1,N_2,\ldots)_1$, $(N_1,N_2,\ldots)_2$, $\ldots$, but I still don't see the purpose of the calculation.










  • What background do you have? I am not sure what I need to explain and what you already know. Glancing over it, I don't immediately see what it is about, and given that you seem not to know about random variables or expected values, I am getting the feeling that you should start with an introduction to probability theory first.
    – Felix B.
    Jan 15 at 20:05










  • I just don't know much about measures and Lebesgue-related material, but I have been and will be reading about it, so feel free to answer as convenient.
    – xihiro
    Jan 15 at 22:05
















calculus probability-theory lebesgue-measure






edited Jan 16 at 16:58 – xihiro

asked Jan 15 at 19:57 – xihiro












1 Answer
For $X$ a real-valued random variable,
$$E[X]=\int X \,dP = \int x \,dP_X$$
is the expected value of the random variable, where $P_X(A)=P(X^{-1}(A))$ is the probability distribution of $X$. Note that these are Lebesgue integrals with respect to probability measures.



In the case of a discrete random variable (with $P(X=x_n)>0$ and $P(X=x)=0$ for all $x$ not in the sequence $(x_n)$) this simplifies to
$$E[X]=\sum_{n\in \mathbb{N}}x_n\,P(X=x_n).$$
For a continuous random variable with a density $f$ it simplifies to
$$E[X]= \int x\,f(x)\,dx.$$
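Not part of the original answer, but as a quick sanity check: the discrete formula and the long-run average of independent samples agree, by the law of large numbers. A minimal Python sketch, with a made-up three-point distribution:

```python
import random

# A made-up discrete distribution: values x_n with probabilities P(X = x_n).
values = [1.0, 2.0, 5.0]
probs = [0.5, 0.3, 0.2]

# Expected value via the discrete formula E[X] = sum_n x_n P(X = x_n).
exact = sum(x * p for x, p in zip(values, probs))  # 0.5 + 0.6 + 1.0 = 2.1

# The same quantity approximated as a long-run average (1/n) sum_j x_j
# over independent draws from the distribution.
random.seed(0)
n = 200_000
samples = random.choices(values, weights=probs, k=n)
empirical = sum(samples) / n

print(exact)      # 2.1
print(empirical)  # close to 2.1
```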
Fubini's theorem allows you to switch the order of integration. Since the expected value is in fact an integral, you can use it to swap the expectation with the other integral.
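To see the swap concretely (a toy illustration, not the calculation from the notes): for a two-valued random variable $X$ and $g(x,t)=x\,t$ on $[0,1]$, the expectation of the integral equals the integral of the expectation. A sketch in Python:

```python
# Toy Fubini check: E[ integral of g(X, t) dt ] = integral of E[g(X, t)] dt.
# X takes the values 1 and 3 with probability 1/2 each; g(x, t) = x * t on [0, 1].
values, probs = [1.0, 3.0], [0.5, 0.5]

def integral_over_t(x, steps=50_000):
    """Midpoint-rule approximation of the integral of x * t over [0, 1] (= x / 2)."""
    h = 1.0 / steps
    return sum(x * (i + 0.5) * h for i in range(steps)) * h

# Left side: integrate over t first, then take the expectation over X.
lhs = sum(p * integral_over_t(x) for x, p in zip(values, probs))

# Right side: take the expectation first (E[X * t] = E[X] * t), then integrate.
expected_x = sum(p * x for x, p in zip(values, probs))  # E[X] = 2
rhs = integral_over_t(expected_x)

print(lhs, rhs)  # both approximately 1.0, since E[X] / 2 = 1
```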



"Almost all" means that the set of $x$ which do not have the property in question has measure zero. The Lebesgue measure is the usual length measure on the real numbers, so "Lebesgue-almost all" means that the set of $x$ not fulfilling the property has zero length. For example, Lebesgue-almost all real numbers are irrational, since the rationals are countable and hence have measure zero.



I can't answer the rest of the questions, because I don't understand what is happening at a glance and I am not too familiar with stochastic processes yet, so I would need some time to work through it. Given that you struggle with these basics, though, I would recommend either asking someone to distill the contents of the paper for you, or starting with an introduction to probability theory.






  • Thank you for your answer, but as you can infer from what I wrote, I already know what $E[X]$ typically stands for. My main issue is the outer $E[\ldots]$: I don't quite get how that expected value is being calculated (it is taking the average over what, exactly?). As for Fubini's theorem, I also understand its use in the calculation; I just don't understand the point of the calculation itself. I'll change the title to make it clearer.
    – xihiro
    Jan 15 at 21:57












  • @nvon Oh, so we had a communication problem. I was thrown off by you calling it an "average", which I wouldn't use in that context: it usually refers to taking the average over realized random variables, i.e. numbers, without any special weights, so just $\frac{1}{n}\sum_{k=1}^n x_k$ (otherwise I would at least call it a weighted average). $E[X]$, in contrast, is really the expected value. But maybe we are just having a language problem.
    – Felix B.
    Jan 15 at 22:41












  • Yeah, sorry for using the two interchangeably; it's now corrected. So, in the first case where $\pmb{E}[\,\cdot\,]$ appears, the "$x_k$" is $\log[N_k]$. But what about the second time: what is the "$x_k$"?
    – xihiro
    Jan 15 at 22:51










  • The first case is not indexed. $N$ is a random variable with a certain distribution. Later on you sample that random variable in every iteration step, getting copies of the same random variable; those are the $N_n(x)$. By the law of large numbers the average converges almost surely to the expected value. The integral is the difference between the average and the expected value, integrated over the interval.
    – Felix B.
    Jan 16 at 0:22










  • Oh! So they use the law of large numbers to conclude $f(x)=\pmb{E}[\log[N]]$ for a fixed $x$, and then use the integral and Fubini's theorem to prove that this is also true for (almost) all $x$ (i.e., that $f(x)$ (almost) always converges to $\pmb{E}[\log[N]]$, even though different $x$'s may correspond to different sequences $N_i$, $i = 1,2,\ldots$). Correct?
    – xihiro
    Jan 16 at 12:18












answered Jan 15 at 20:24 – Felix B.











