Showing independence of increments of a stochastic process


























The textbook on stochastic calculus I am now reading says that
if $X\colon [0,\infty)\times\Omega\rightarrow\mathbb R$ is a stochastic process such that

  1. $X(t)-X(s)\sim N(0,t-s)$ for all $t \geq s \geq 0$,

  2. $E[X(t)X(s)]=\min\{s,t\}$ for all $s,t \geq 0$,

then $X$ has independent increments, i.e. for every $0 \leq t_1<\dots<t_n$, the random variables $X(t_1)$, $X(t_2)-X(t_1)$, …, $X(t_n)-X(t_{n-1})$ are independent.

Here $X(t)$ denotes the random variable $X(t)\colon\Omega\rightarrow \mathbb R$ given by $X(t)(\omega)=X(t,\omega)$.

But I suspect this is not true as stated. I believe we need the additional assumption that $X$ is a Gaussian process. (With that assumption, the independence is easy to show.)

Am I on the right track? If so, can you give me some counterexamples?

Or can the claim be shown without assuming that $X$ is a Gaussian process?

Any hint would be appreciated! Thanks and regards.
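As a quick sanity check that the two stated conditions are at least consistent, one can simulate standard Brownian motion (which satisfies both) and verify condition 2 empirically. This is only a Monte Carlo sketch, assuming NumPy; the grid size, seed, and time points are arbitrary choices:

```python
import numpy as np

# Simulate many paths of standard Brownian motion on a uniform grid and
# check condition 2 empirically: E[X(t)X(s)] should equal min(s, t).
rng = np.random.default_rng(0)
n_paths, n_steps, dt = 200_000, 100, 0.01
steps = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X = np.cumsum(steps, axis=1)              # X[:, k] approximates X(t) at t = (k+1)*dt

s_idx, t_idx = 29, 79                     # s = 0.3, t = 0.8
emp = np.mean(X[:, s_idx] * X[:, t_idx])  # Monte Carlo estimate of E[X(t)X(s)]
print(emp)                                # should be close to min(0.3, 0.8) = 0.3
```

Of course this only confirms that the hypotheses are satisfiable; it says nothing about whether they *force* independent increments.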










  • Seems this will return to oblivion. @Mhr, could you let me know which textbook you're referring to? Maybe there's a hint in it. – AddSup, Dec 25 '18 at 19:31










  • @AddSup yes, also interested to get the original source to check for myself as well! – Ezy, Dec 29 '18 at 4:34
















probability-theory stochastic-processes brownian-motion






asked Nov 4 '18 at 13:45 by Mhr; edited Dec 21 '18 at 1:24 by Saad








2 Answers






Let $t_0=0$ and consider the sequence $Y_i=X(t_i)-X(t_{i-1})$ for $i\in[1:n]$. These are Gaussian with mean $0$ and respective variances $t_i-t_{i-1}$. Let's compute all the covariances; for $i<j$ (without loss of generality),
\begin{align*}
\mathbb{E}[Y_i Y_j] &= \mathbb{E}[(X(t_i)-X(t_{i-1}))(X(t_j)-X(t_{j-1}))]\\
&=\mathbb{E}[X(t_i)X(t_j)]-\mathbb{E}[X(t_{i-1})X(t_j)]-\mathbb{E}[X(t_i)X(t_{j-1})]+\mathbb{E}[X(t_{i-1})X(t_{j-1})]\\
&=t_i-t_{i-1}-t_i+t_{i-1}\\
&=0.
\end{align*}

So the covariance matrix of $(Y_1,\dots,Y_n)$ is diagonal, and hence the $Y_i$ are all independent. From there you should be able to conclude without needing $X$ to be a Gaussian process; the only difference is that you add $X(0)$ to the first element, and $X(0)$ is uncorrelated with all $Y_i$ for $i\in[2:n]$.

The only problem I can see is the case $t_1=0$, but then you can reduce $n$ by one and drop the first element to prove your result.

All of this is true only if $Y=(Y_1,\dots,Y_n)$ is jointly Gaussian. A sufficient condition for $Y$ to be jointly Gaussian is that $\mathbf a^T Y$ is a Gaussian random variable for every vector $\mathbf a$ of size $n$.

I am not sure how to prove that, but something that may be of use is that $Y_i+Y_{i+1}+\dots+Y_{j-1}+Y_j = X(t_j)-X(t_{i-1})$ is Gaussian for any $i<j$.

answered Nov 4 '18 at 14:10 by P. Quinton, edited Nov 4 '18 at 14:52
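The covariance computation above can be checked numerically: simulate Brownian paths on a fine grid, extract the increments $Y_i$ at a few times $t_i$, and confirm that the sample covariance matrix is approximately diagonal with variances $t_i - t_{i-1}$. A sketch assuming NumPy; the time points and seed are arbitrary:

```python
import numpy as np

# Empirical check: increments of simulated Brownian motion over disjoint
# intervals should have a (near-)diagonal sample covariance matrix.
rng = np.random.default_rng(1)
n_paths, n_steps, dt = 100_000, 150, 0.01
X = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)

idx = np.array([19, 49, 89, 139])       # times t = 0.2, 0.5, 0.9, 1.4
Y = np.diff(X[:, idx], axis=1)          # Y_i = X(t_i) - X(t_{i-1})
cov = np.cov(Y, rowvar=False)
max_off = np.abs(cov - np.diag(np.diag(cov))).max()
print(max_off)                          # near 0: increments are uncorrelated
print(np.diag(cov))                     # near the interval lengths [0.3, 0.4, 0.5]
```

As the comment below points out, uncorrelatedness alone does not give independence unless the vector is jointly Gaussian, which is exactly the gap in the argument.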






  • That's just uncorrelatedness, not independence, I guess? How do you know that $(Y_1,\dots,Y_n)$ is jointly normal? – Mhr, Nov 4 '18 at 14:13



















I believe you can use a trick similar to the one used for the Lévy characterisation of Brownian motion: apply Itô's formula to the characteristic function of $X_t$, then use iterated expectations to show the independence of the various increments $X_{t_i}-X_{t_j}$, again by showing that the expectation of the characteristic function factorizes.

See this article for details:

http://individual.utoronto.ca/normand/Documents/MATH5501/Project-3/Levy_characterization_of_Brownian_motion.pdf

answered Dec 27 '18 at 6:10 by Ezy
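To illustrate what "the characteristic function factorizes" means here, one can check numerically that for increments over disjoint intervals the joint characteristic function equals the product of the marginals. Note this simulation builds the increments independent by construction, so it only illustrates the target identity, not the proof; a sketch assuming NumPy, with arbitrary interval lengths and evaluation points $(u,v)$:

```python
import numpy as np

# For independent increments Y1, Y2, the joint characteristic function
# E[exp(i(u*Y1 + v*Y2))] factorizes as E[exp(i*u*Y1)] * E[exp(i*v*Y2)].
rng = np.random.default_rng(2)
n = 500_000
Y1 = rng.normal(0.0, np.sqrt(0.4), n)   # plays the role of X(0.4) - X(0)
Y2 = rng.normal(0.0, np.sqrt(0.6), n)   # plays the role of X(1.0) - X(0.4)
u, v = 1.3, -0.7
joint = np.mean(np.exp(1j * (u * Y1 + v * Y2)))
prod = np.mean(np.exp(1j * u * Y1)) * np.mean(np.exp(1j * v * Y2))
print(abs(joint - prod))                # small: the factorization holds
```

Both sides should also be close to the Gaussian characteristic function $\exp(-(u^2\cdot 0.4 + v^2\cdot 0.6)/2)$.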






  • Ezy, thank you, that seems promising. But it also seems that to apply Lévy's characterization, we need $X$ to be a square-integrable continuous martingale (or a continuous local martingale with $[X]_t=t$). Could you show me how we can get this condition from the two assumptions in the original post? – AddSup, Dec 27 '18 at 8:19

  • To be specific, on page 3 of the note you linked, how do we know $f(M_t,t)$ is a martingale when $M$ is only known to satisfy the two conditions in the OP? – AddSup, Dec 27 '18 at 8:55

  • @AddSup well, you can easily get the quadratic variation of the process from the assumption about the increments, and the square integrability from the second assumption. You also get the martingale relation from the assumption about increments. I believe you also get the continuity from the assumption about increments. – Ezy, Dec 27 '18 at 8:56

  • Thank you. But still lost: (i) QV: the derivation of the QV of BM I know of (e.g., Revuz and Yor, p. 29) relies on the independence of increments. (ii) Square integrability: true. (iii) Martingality: no idea. (The assumption is about unconditional distributions.) (iv) Path continuity: no idea. – AddSup, Dec 27 '18 at 9:42

  • @AddSup for the QV part I don't think you need independence, just that moments of increments up to order 4 factorize, which I believe you can do similarly to the derivation made in the other answer above. See p. 4 of ocw.mit.edu/courses/sloan-school-of-management/… . For continuity I agree I went perhaps too quickly, but I believe jump processes would have quadratic variation always strictly higher than $t$ due to the jump contribution. But I agree I need to find a more convincing proof for that (if that's correct). – Ezy, Dec 27 '18 at 12:09










