How to prove that exponential grows faster than polynomial?

In other words, how to prove:

For all real constants $a$ and $b$ such that $a > 1$,
$$\lim_{n\to\infty}\frac{n^b}{a^n} = 0.$$

I know the definition of limit, but I feel that it is not enough to prove this theorem.

Tags: calculus, limits, exponential-function
Comments:

– gary (Aug 4 '11 at 1:41): Have you tried expressing both in terms of the exponential?

– jnm2 (Aug 4 '11 at 2:36): You could write the exponential as an infinite power series. You could also see what happens when you take higher and higher order derivatives of $n^b$ and $a^n$: the polynomial drops by one degree each time, while the exponential is only multiplied by a constant factor ($\ln a$) each time.
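Before turning to proofs, a quick numerical check makes the claim plausible. The sketch below is in Python; the values $a = 1.01$ and $b = 100$ are only illustrative and are not taken from the thread. Working in logarithms avoids overflow: the polynomial is ahead for a long while, but the exponential eventually dominates.

```python
# Numerical sanity check (illustrative only): compare n^b with a^n by looking at
# log(n^b / a^n) = b*log(n) - n*log(a), which avoids overflowing floats.
import math

a, b = 1.01, 100.0   # a barely above 1, b deliberately large

for n in (10**3, 10**5, 10**7, 10**9):
    log_ratio = b * math.log(n) - n * math.log(a)
    print(f"n = {n:>10}:  log(n^b / a^n) = {log_ratio:.3e}")

# The printed value is positive for small n (polynomial ahead) and then
# becomes hugely negative, i.e. n^b / a^n -> 0.
```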
13 Answers
Answer 1 (JavaMan):

We could prove this by induction on integers $k$:
$$\lim_{n \to \infty} \frac{n^k}{a^n} = 0.$$
The case $k = 0$ is straightforward. I will leave the induction step to you. To see how this implies the statement for all real $b$, just note that every real number is less than some integer. In particular, $b \leq \lceil b \rceil$. Thus,
$$0 \leq \lim_{n \to \infty} \frac{n^b}{a^n} \leq \lim_{n \to \infty} \frac{n^{\lceil b \rceil}}{a^n} = 0.$$
The first inequality holds since all the terms are positive; the last equality follows from the induction established previously.

Comments:

– ablmf (Aug 4 '11 at 2:55): Could this really work? I mean $\lim_{n\to\infty} n = \infty$. How can I do the induction?

– JavaMan (Aug 4 '11 at 3:27): What you said is true, but I don't see how it enters the picture. As for the induction: when $k = 0$, you need to show $\lim_{n \to \infty} \frac{1}{a^n} = 0$. For the induction step, assume that the claim holds for $k$, then show that it also holds for $k+1$ (apply L'Hôpital). This is pretty much the entire solution, but again, I will let you fill in the details.

– ablmf (Aug 4 '11 at 3:41): Oh, I didn't know L'Hôpital before. So the induction step is $\lim_{n\to\infty}\frac{n^{k+1}}{a^n} = \lim_{n\to\infty}\frac{(k + 1)\, n^k}{\ln(a)\,a^n} = 0$. Is that right?

– JavaMan (Aug 4 '11 at 3:45): Yes, that's right. Since $k+1$ and $\ln(a)$ are constants, we can pull them out of the limit to get $\frac{k+1}{\ln(a)}\lim_{n \to \infty} \frac{n^{k}}{a^n} = \frac{k+1}{\ln(a)} \cdot 0 = 0$.

– JavaMan (Aug 4 '11 at 3:48): Also, I should point out that when using TeX, if you enclose your exponent in braces, as in "a^{2x}", the entire term is placed in the exponent and it looks like $a^{2x}$. Without braces, you get $a^2x$.
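Putting the comment thread together, the induction step for integer exponents can be written in one chain (a sketch; L'Hôpital is applied to the continuous variable $x$, and the sequence limit follows from the function limit):
$$\lim_{x \to \infty} \frac{x^{k+1}}{a^{x}} = \lim_{x \to \infty} \frac{(k+1)\,x^{k}}{(\ln a)\,a^{x}} = \frac{k+1}{\ln a}\,\lim_{x \to \infty} \frac{x^{k}}{a^{x}} = \frac{k+1}{\ln a}\cdot 0 = 0,$$
while the base case $k=0$ is just $\lim_{x\to\infty} a^{-x} = 0$ for $a>1$.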
Answer 2 (André Nicolas):

We can confine attention to $b \ge 1$. This is because, if $0<b<1$, then $n^b \le n$; if we can prove that $n/a^n$ approaches $0$, it will follow that $n^b/a^n$ approaches $0$ for any positive $b\le 1$. So from now on we take $b \ge 1$.

Now look at $n^b/a^n$, and take the $b$-th root. We get
$$\frac{n}{(a^{1/b})^n}$$
or equivalently
$$\frac{n}{c^n}$$
where $c=a^{1/b}$.

Note that $c>1$. If we can prove that $n/c^n$ approaches $0$ as $n\to\infty$, we will be finished. For our original sequence consists of the $b$-th powers of the new sequence $(n/c^n)$: if $n/c^n$ has limit $0$, then after a while $n/c^n \le 1$, and so, after a while, the old sequence is, term by term, $\le$ the new sequence. (Recall that $b\ge 1$.)

Progress: we need only look at $n/c^n$.

How do we continue? Any of the ways suggested by the other posts. Or else, let $c=1+d$. Note that $d$ is positive. Note also that, by the Binomial Theorem, if $n \ge 2$ we have
$$c^n=(1+d)^n \ge 1+dn+d^2 n(n-1)/2 \gt d^2 n(n-1)/2.$$
It follows that
$$\frac{n}{c^n}< \frac{n}{d^2 n(n-1)/2}=\frac{2}{d^2(n-1)},$$
and it is clear that $\dfrac{2}{d^2(n-1)} \to 0$ as $n\to\infty$.

Comments:

– Jyrki Lahtonen (Aug 4 '11 at 8:03): +1: I really, honestly and whole-heartedly think that this is the way to do this problem. From scratch. No l'Hôpital, no properties of logarithms. Nothing shaky :-). Just the binomial theorem (or Bernoulli's inequality) to do the case $b=1$.

– ablmf (Aug 4 '11 at 14:04): This is an elegant proof that can easily be understood. I hope one day I can solve these problems as well as you do.

– YoTengoUnLCD (Jun 9 '16 at 23:56): Excellent answer.
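For a concrete feel for the bound (the numbers here are only illustrative), take $a = 2$ and $b = 3$. Then $c = 2^{1/3} \approx 1.26$, $d = c - 1 \approx 0.26$, and the estimate above gives
$$\frac{n}{c^n} < \frac{2}{d^2(n-1)} \approx \frac{29.6}{n-1},$$
so at $n = 10^4$ we already have $n/c^n < 0.003$ and hence $n^3/2^n = (n/c^n)^3 < 2.7\times 10^{-8}$. The true value is far smaller, but the crude bound is all the limit needs.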
Answer 3 (Gerry Myerson):

First, notice that $(n+1)^b/n^b$ equals $1$ plus terms in negative powers of $n$, so it goes to $1$ as $n\to\infty$ (with $b$ fixed). Now, going from $n^b/a^n$ to $(n+1)^b/a^{n+1}$, you multiply the numerator by something that is going to $1$, but the denominator by something that isn't (namely, by $a\gt 1$).

Comments:

– Qiaochu Yuan (Aug 4 '11 at 2:59): This is the simplest solution I can think of.

– Bill Dubuque (Aug 4 '11 at 3:02): This is a special case of multiplicative telescopy. See my post here for much further discussion of this example. Similar ideas work more generally for hyperrational functions.

– ablmf (Aug 4 '11 at 3:18): That means $\lim_{n\to\infty}\frac{n^b}{a^n}=\lim_{n\to\infty}\frac{(n+1)^b}{a^{n+1}}$, so it must be zero. Is my understanding right?

– Gerry Myerson (Aug 4 '11 at 3:50): Yes, that's one good way to understand it.

– k.stm (Jun 2 '13 at 17:55): @GerryMyerson I'm just reading this right now, but that's a bad way to understand it: it isn't clear yet that the limit exists at all, and for any converging sequence $\lim a_n = \lim a_{n+1}$ holds, so one cannot deduce anything about the limit from this, or am I missing something? Maybe ablmf meant $\lim_{n \to \infty} {n^b}/{a^n} = \lim_{n \to \infty} {n^b}/{a^{n+1}}$? But here again you first need to know that the limit exists.
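The ratio idea can be made fully rigorous without assuming in advance that the limit exists (a short sketch, in the spirit of the ratio test): since
$$\frac{(n+1)^b/a^{n+1}}{n^b/a^n}=\frac{1}{a}\left(1+\frac{1}{n}\right)^{b}\longrightarrow \frac{1}{a}<1,$$
pick any $r$ with $\frac{1}{a}<r<1$; then there is an $N$ such that $\frac{(n+1)^b}{a^{n+1}} \le r\,\frac{n^b}{a^n}$ for all $n\ge N$, and iterating gives
$$0 \le \frac{n^b}{a^n} \le r^{\,n-N}\,\frac{N^b}{a^N}\longrightarrow 0,$$
which also answers the objection raised in the last comment.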
Answer 4 (kuch nahi):

You can expand (taking $b$ to be a positive integer here; the general case then follows by comparing with $\lceil b\rceil$, as in Answer 1):
$$a^n = e^{n\log a} = 1+n\log a +\cdots +\frac{(n\log a)^b}{b!}+\cdots$$
$$\frac{a^n}{n^b} =\frac{1}{n^b}+\frac{1}{n^{b-1}}\log a+\cdots+\frac{(\log a)^{b}}{b!}+n\,\frac{(\log a)^{b+1}}{(b+1)!}+\cdots$$
So
$$\frac{a^n}{n^b}\geq n\,\frac{(\log a)^{b+1}}{(b+1)!},$$
which becomes arbitrarily large as $n$ grows. Therefore,
$$\lim_{n\to\infty}\frac{a^n}{n^b} = \infty, \qquad\text{or equivalently}\qquad \lim_{n\to\infty}\frac{n^b}{a^n} = 0.$$
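The same series idea can be compressed into a single inequality (a sketch, stated for a positive integer exponent $m$): since every term of the exponential series is nonnegative, $e^{x}\ge \frac{x^{m+1}}{(m+1)!}$ for $x\ge 0$, and hence
$$\frac{n^{m}}{a^{n}}=\frac{n^{m}}{e^{n\ln a}}\le \frac{(m+1)!\,n^{m}}{(n\ln a)^{m+1}}=\frac{(m+1)!}{(\ln a)^{m+1}}\cdot\frac{1}{n}\longrightarrow 0.$$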
Answer 5 (user14222):

Regard the exponential function as the unique solution to the initial value problem $f(0) = 1$ with $f' = f$. The equivalent integral formulation is
$$f(t) = 1 + \int_0^t f(s)\, ds.$$
First prove that $f$ is an increasing function. For $t \geq 0$, you can use a method-of-continuity argument by considering the set $\{ s : f \text{ increases on } [0, s] \}$; the supremum of this set cannot be finite. For $t < 0$ you can also consider the ODE satisfied by $f^2$, but you're not really concerned with that. Now that we know $f$ increases, we also know $f$ grows at least linearly:
$$f(t) \geq 1 + \int_0^t 1\, ds = 1 + t.$$
It also grows at least quadratically:
$$f(t) \geq f(t/2) + \int_{t/2}^t f(s)\, ds \geq (1 + t/2)\, f(t/2),$$
and repeating yields
$$f(t) \geq (1 + t/2)^2.$$
In general, $f(t) \geq (1 + t/n)^n$ by the same method. It is even true for $n$ not an integer; in that case, the function $(1 + t/n)^n$ is defined in the first place to be $e^{n \log( 1 + t/n) }$. Then, since $\log(1+x)$ is given by the integral $\int_1^{1+x} \frac{1}{t}\, dt = x \int_0^1 (1 + sx)^{-1}\, ds$, we can rewrite
$$(1 + t/n)^n = e^{n \log (1 + t/n) } = \exp\left(t \int_0^1 \left(1 + \frac{s t}{n}\right)^{-1} ds \right) \leq e^t.$$

EDIT: This answer may not be so helpful to you. I didn't read the end of your question, so you may not know how to work with most of the things I've been writing about. The main reason the exponential grows faster than a polynomial is that if $f$ is exponential, then $f(n+1)$ is at least a constant multiple of $f(n)$, whereas when $f$ is a polynomial, $f(n+1)$ is roughly the same size as $f(n)$ when $n$ is large. After all, you can hardly tell the difference between the volume of a huge cube and a huge cube one unit longer in each direction.

Comment:

– ablmf (Aug 4 '11 at 2:59): Oops, seems that I won't be able to understand this before I finish my calculus course.
Answer 6 (gary):

Try expressing both in terms of the exponential $e^x$: $n^b=e^{b\ln(n)}$ and $a^n=e^{n\ln(a)}$, so that the quotient becomes
$$\frac{n^b}{a^n}= \frac{e^{b\ln(n)}}{e^{n\ln(a)}}=e^{b\ln(n)-n\ln(a)}=e^{b\ln(n)-kn}=\frac{e^{b\ln(n)}}{e^{kn}},$$
where $k=\ln(a)$ is a positive constant. Can you tell which of $\ln(n)$ or $n$ grows faster?

Comments:

– ablmf (Aug 4 '11 at 2:58): So the question becomes to prove $\lim_{n\to\infty}(b\ln(n)-kn) = -\infty$, right? BTW, you made some typos.

– gary (Aug 4 '11 at 3:46): Yes, I think so too, if you use that $\ln(n)$ grows more slowly than $n$. I will check for typos.
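To finish this approach, the single fact $\frac{\ln n}{n}\to 0$ suffices (a sketch): writing
$$b\ln(n)-n\ln(a) = -n\ln(a)\left(1-\frac{b}{\ln(a)}\cdot\frac{\ln n}{n}\right),$$
the factor in parentheses tends to $1$ while $-n\ln(a)\to-\infty$, so $b\ln(n)-n\ln(a)\to-\infty$, and therefore $n^b/a^n=e^{\,b\ln(n)-n\ln(a)}\to 0$.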
Answer 7:

This is inspired by André Nicolas's answer, but instead of taking $b$-th roots, I'll take $n$-th roots.

The limit of the sequence of $n$-th roots of the terms in the original sequence is
$$\lim_{n\to\infty}\frac{(n^b)^{1/n}}{a}=\lim_{n\to\infty}\frac{\left(n^{1/n}\right)^b}{a}=\frac{\left(\lim\limits_{n\to\infty} n^{1/n}\right)^b}{a}=\frac{1^b}{a}=\frac{1}{a}.$$
(I used here the fact that $\lim\limits_{n\to\infty}n^{1/n}=1$, which was the subject of another question on this site, and which can also be proved in many ways.)

If $c$ is any number such that $\frac{1}{a}<c<1$, then the previous limit implies that there is an $N>0$ such that for all $n>N$,
$$\left(\frac{n^b}{a^n}\right)^{1/n}\lt c.$$
Then for all $n>N$,
$$\frac{n^b}{a^n}\lt c^n.$$
Since $c^n\to 0$, this shows that the original sequence goes to zero.

Alternatively, once you know that the sequence of $n$-th roots converges to a number less than $1$, you can apply the root test to conclude that $\sum\limits_{n=1}^\infty\frac{n^b}{a^n}$ converges, which implies that the terms go to $0$. (The proof of the root test actually just uses the inequality derived above and the convergence of the geometric series with common ratio $c$.)
Answer 8:

Since it is almost always better to expand around zero, let $a = 1+c$, where $c > 0$. So we want to show $\lim_{n\to\infty} \frac{n^b}{(1+c)^n} = 0$.

The ratio of consecutive terms is $\frac{(1+1/n)^b}{1+c}$, so if we can show that $(1+1/n)^b < 1+c/2$ for large enough $n$, we are done. But (for $b>0$) this holds exactly when
$$n > \frac{1}{(1+c/2)^{1/b}-1}.$$
Restating, letting $N = \frac{1}{(1+c/2)^{1/b}-1}$ and $r = \frac{1+c/2}{1+c}$: if $n > N$ then $\frac{(1+1/n)^b}{1+c} < \frac{1+c/2}{1+c}$, which shows that
$$\frac{n^b}{(1+c)^n} < \frac{N^b}{(1+c)^N}\,r^{n-N},$$
which goes to $0$ as $n \to \infty$, since $N$ is fixed and $0 < r < 1$.

Note: an elementary proof that $r^n \to 0$ is in "What is Mathematics?" by Courant and Robbins. Let $r = 1/(1+s)$, where $s > 0$. By Bernoulli's inequality,
$$r^n = \frac{1}{(1+s)^n} < \frac{1}{1+s n} = \frac{1}{1+ n (1/r-1)}.$$
They similarly show that, if $a > 1$, then $a^n \to \infty$, like this: let $a = 1+b$, where $b > 0$. Then
$$a^n = (1+b)^n > 1+n b = 1+n(a-1).$$
The keys in both of these are expanding around zero and using Bernoulli's inequality. Archimedes' axiom (for any positive reals $x$ and $y$, there is an integer $n$ such that $x < n y$) does the rest.
Answer 9:

Tom Apostol, in volume I of his two-volume text on calculus, gives what I consider to be the canonical proof of this assertion. He calls it the theorem of "the slow growth of the logarithm". By making the appropriate substitution (namely, $\exp(x)$ for $x$), you get what we can call the theorem on "the fast growth of the exponential".

I think you will have no difficulty agreeing that, as far as comparing a polynomial to the exponential goes, the only part of the polynomial that we have to consider is the term involving the highest power, and that we can even disregard its coefficient. (This simplification is currently reflected in your question, but I believe that is the result of someone else's edit, and so I am answering what I believe your original question was.) That is what that theorem in Apostol does: it shows that if $a > 0$ and $b > 0$, then $\frac{(\log x)^a}{x^b} \to 0$ as $x\to\infty$.

You might be interested in knowing a significant application of this fact, namely, in the proof of the recursion formula for what is known as the Gamma function (which is the canonical continuous extension of the factorial function but, for historical reasons, shifted by one unit). That is, in proving that $\Gamma(x + 1) = x\,\Gamma(x)$, one typically applies integration by parts, and one piece of the resulting expression is a power divided by an exponential, which goes to $0$ as the upper limit of the integral goes to infinity.

Another application of this fact is the shape of the graph of $y = x e^x$: as $x$ goes to negative infinity, $y$ goes to $0$.
Answer 10:

As several of the previous answers have noted, it is enough to show that $\lim_{x\to \infty} \frac{x}{a^x}=0$; to prove this, in turn, it is enough to prove that $\lim_{x \to \infty} \frac{a^x}{x} = \infty$.

Let's compute the first and second derivatives of $f(x) = \frac{a^x}{x}$. We find that
$$f'(x) = \frac{a^x(x \ln(a)-1)}{x^2} \qquad\text{and}\qquad f''(x) = \frac{a^x(x^2 \ln^2(a)-2\ln(a)\, x+2)}{x^3}.$$
We see that for large enough values of $x$, both $f'(x)$ and $f''(x)$ are positive. Thus $f$ not only grows, it grows at an increasing rate. The only possible conclusion is that $\lim_{x\to \infty} \frac{a^x}{x} = \infty$.

With a little work this can be made into a completely rigorous proof.
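The "little work" can be done as follows (a sketch): pick $x_0$ large enough that $f'(x)>0$ and $f''(x)>0$ for all $x\ge x_0$, and set $m=f'(x_0)>0$. Since $f''>0$ on $[x_0,\infty)$, $f'$ is increasing there, so $f'(x)\ge m$ for $x\ge x_0$, and hence
$$f(x)\;\ge\; f(x_0)+m\,(x-x_0)\;\longrightarrow\;\infty \qquad (x\to\infty),$$
which gives $\lim_{x\to\infty} a^x/x=\infty$.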
Answer 11:

Another method is the following (this method was, I believe, originally used by W. Rudin in his Principles of Mathematical Analysis). Write $m = |\lceil b\rceil| + 1$, one more than the absolute value of the ceiling of $b$. For $n\geq 2m$ we have, using the binomial theorem,
$$\begin{aligned}a^n&=(1+(a-1))^n\\&>\binom{n}{m}(a-1)^{m}\\&=\frac{n(n-1)\cdots(n-m+1)}{m!}\,(a-1)^{m}\\&>\frac{n^{m}}{2^{m}\,m!}\,(a-1)^{m},\end{aligned}$$
so
$$\left(\frac{2^{m}\,m!}{(a-1)^{m}}\right)\cdot\frac{1}{n}>\frac{n^{m-1}}{a^n}\geq\frac{n^b}{a^n}$$
and hence
$$\lim_{n\to\infty}\frac{n^b}{a^n}=0.$$
Answer 12:

The case $b\le 0$ is immediate, since then $n^b \le 1$ and, because $a>1$,
$$\frac{n^b}{a^n} \le \frac{1}{a^n} =\left(\frac{1}{a}\right)^n\to 0.$$
So assume $b>0$. Then, writing $u= n\frac{\ln a}{b}$,
$$\frac{n^b}{a^n}=\left(\frac{b}{\ln a} \right)^b\left(n\frac{\ln a}{b}\, e^{-n\frac{\ln a}{b}}\right)^b =\left(\frac{b}{\ln a} \right)^b\left(u\, e^{-u}\right)^b .$$
Hence
$$\lim_{n\to\infty}\frac{n^b}{a^n}= \lim_{u\to\infty}\left(\frac{b}{\ln a} \right)^b\left(u\, e^{-u}\right)^b=0.$$
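This uses $\lim_{u\to\infty} u\,e^{-u}=0$; for completeness, a one-line justification (a sketch): from the exponential series, $e^{u}\ge \frac{u^{2}}{2}$ for $u\ge 0$, so
$$0\;\le\; u\,e^{-u}\;\le\;\frac{2}{u}\;\longrightarrow\;0.$$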
Answer 13 (Marco Bellocchi):

You could also see the following:
$$\lim_{n\to\infty}\frac{n^b}{a^n} = \lim_{n\to\infty}e^{\log \frac{n^b}{a^n}}.$$
Now
$$\lim_{n\to\infty}\log \left(\frac{n^b}{a^n}\right)=\lim_{n\to\infty}\left(\log (n^b) - \log (a^n)\right)=\lim_{n\to\infty} \left(b\log(n)-n\log(a)\right)=-\infty,$$
assuming $a>1$. So
$$\lim_{n\to\infty}e^{\log \frac{n^b}{a^n}}=0=\lim_{n\to\infty}\frac{n^b}{a^n}.$$

Just to clarify the comment below, I am adding the proof that
$$\lim_{n\to\infty} \left(b\log (n)-n\log (a)\right)=-\infty.$$
First, by applying L'Hôpital's rule we get
$$\lim_{n\to\infty} \frac{n}{\log(n)}=\lim_{n\to\infty} \frac{1}{1/n}=\lim_{n\to\infty} n = \infty.$$
We then have
$$\lim_{n\to\infty} \left(b\log (n)-n\log (a)\right)=\lim_{n\to\infty} \log(n)\left(b-\log(a)\,\frac{n}{\log(n)}\right)=-\infty.$$

Comments:

– ablmf (Apr 15 '18 at 9:18): Then you need an upper bound of $\log(n)$.

– Marco Bellocchi (Apr 15 '18 at 10:03): I assumed, obviously, that it is a known fact that $\log (n) \to \infty$ as $n \to \infty$, and that you know how to prove $\lim_{n\to\infty} (b\log n-n\log a)=-\infty$ assuming $a>1$.
Your Answer
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f55468%2fhow-to-prove-that-exponential-grows-faster-than-polynomial%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
13 Answers
13
active
oldest
votes
13 Answers
13
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
We could prove this by induction on integers $k$:
$$
lim_{n to infty} frac{n^k}{a^n} = 0.
$$
The case $k = 0$ is straightforward. I will leave the induction step to you. To see how this implies the statement for all real $b$, just note that every real number is less than some integer. In particular, $b leq lceil b rceil$. Thus,
$$
0 leq lim_{n to infty} frac{n^b}{a^n} leq lim_{n to infty} frac{n^{lceil b rceil}}{a^n} = 0.
$$
The first inequality follows since all the terms are positive. The last equality follows from the induction we established previously.
$endgroup$
$begingroup$
Could this really work? I mean $lim_{nrightarrowinfty}n = infty$. How can I do the induction?
$endgroup$
– ablmf
Aug 4 '11 at 2:55
$begingroup$
What you said is true, but I don't see how it enters the picture. As far as the induction process: when $k = 0$, you need to show $lim_{n to infty} frac{1}{a^n} =0$. As far as the induction step, assume that it holds for $k$, then show that it also holds for $k+1$ (Apply L'Hopital). This is pretty much the entire solution, but again, I will let you fill in the details.
$endgroup$
– JavaMan
Aug 4 '11 at 3:27
2
$begingroup$
Oh, I didn't know L'Hopital before. So the induction is $lim_{nrightarrowinfty}frac{n^(k+1)}{a^n} = lim_{nrightarrowinfty}frac{(k + 1) n^k}{ln(a)a^n} = 0$. Is that right?
$endgroup$
– ablmf
Aug 4 '11 at 3:41
$begingroup$
Yes that's right. Since $k+1$ and $ln(a)$ are constants, we can pull them out of the limit to get $frac{k+1}{ln(a)}lim_{n to infty} frac{n^{k}}{a^n} = frac{k+1}{ln(a)} cdot 0 = 0$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:45
$begingroup$
Also, I should point out that when using TeX, if you enclose your exponent with braces like "a^{2x}", it will exponentiation the entire term and it will look like $a^{2x}$. Without braces, you get $a^2x$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:48
add a comment |
$begingroup$
We could prove this by induction on integers $k$:
$$
lim_{n to infty} frac{n^k}{a^n} = 0.
$$
The case $k = 0$ is straightforward. I will leave the induction step to you. To see how this implies the statement for all real $b$, just note that every real number is less than some integer. In particular, $b leq lceil b rceil$. Thus,
$$
0 leq lim_{n to infty} frac{n^b}{a^n} leq lim_{n to infty} frac{n^{lceil b rceil}}{a^n} = 0.
$$
The first inequality follows since all the terms are positive. The last equality follows from the induction we established previously.
$endgroup$
$begingroup$
Could this really work? I mean $lim_{nrightarrowinfty}n = infty$. How can I do the induction?
$endgroup$
– ablmf
Aug 4 '11 at 2:55
$begingroup$
What you said is true, but I don't see how it enters the picture. As far as the induction process: when $k = 0$, you need to show $lim_{n to infty} frac{1}{a^n} =0$. As far as the induction step, assume that it holds for $k$, then show that it also holds for $k+1$ (Apply L'Hopital). This is pretty much the entire solution, but again, I will let you fill in the details.
$endgroup$
– JavaMan
Aug 4 '11 at 3:27
2
$begingroup$
Oh, I didn't know L'Hopital before. So the induction is $lim_{nrightarrowinfty}frac{n^(k+1)}{a^n} = lim_{nrightarrowinfty}frac{(k + 1) n^k}{ln(a)a^n} = 0$. Is that right?
$endgroup$
– ablmf
Aug 4 '11 at 3:41
$begingroup$
Yes that's right. Since $k+1$ and $ln(a)$ are constants, we can pull them out of the limit to get $frac{k+1}{ln(a)}lim_{n to infty} frac{n^{k}}{a^n} = frac{k+1}{ln(a)} cdot 0 = 0$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:45
$begingroup$
Also, I should point out that when using TeX, if you enclose your exponent with braces like "a^{2x}", it will exponentiation the entire term and it will look like $a^{2x}$. Without braces, you get $a^2x$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:48
add a comment |
$begingroup$
We could prove this by induction on integers $k$:
$$
lim_{n to infty} frac{n^k}{a^n} = 0.
$$
The case $k = 0$ is straightforward. I will leave the induction step to you. To see how this implies the statement for all real $b$, just note that every real number is less than some integer. In particular, $b leq lceil b rceil$. Thus,
$$
0 leq lim_{n to infty} frac{n^b}{a^n} leq lim_{n to infty} frac{n^{lceil b rceil}}{a^n} = 0.
$$
The first inequality follows since all the terms are positive. The last equality follows from the induction we established previously.
$endgroup$
We could prove this by induction on integers $k$:
$$
lim_{n to infty} frac{n^k}{a^n} = 0.
$$
The case $k = 0$ is straightforward. I will leave the induction step to you. To see how this implies the statement for all real $b$, just note that every real number is less than some integer. In particular, $b leq lceil b rceil$. Thus,
$$
0 leq lim_{n to infty} frac{n^b}{a^n} leq lim_{n to infty} frac{n^{lceil b rceil}}{a^n} = 0.
$$
The first inequality follows since all the terms are positive. The last equality follows from the induction we established previously.
answered Aug 4 '11 at 2:38
JavaManJavaMan
11.2k12855
11.2k12855
$begingroup$
Could this really work? I mean $lim_{nrightarrowinfty}n = infty$. How can I do the induction?
$endgroup$
– ablmf
Aug 4 '11 at 2:55
$begingroup$
What you said is true, but I don't see how it enters the picture. As far as the induction process: when $k = 0$, you need to show $lim_{n to infty} frac{1}{a^n} =0$. As far as the induction step, assume that it holds for $k$, then show that it also holds for $k+1$ (Apply L'Hopital). This is pretty much the entire solution, but again, I will let you fill in the details.
$endgroup$
– JavaMan
Aug 4 '11 at 3:27
2
$begingroup$
Oh, I didn't know L'Hopital before. So the induction is $lim_{nrightarrowinfty}frac{n^(k+1)}{a^n} = lim_{nrightarrowinfty}frac{(k + 1) n^k}{ln(a)a^n} = 0$. Is that right?
$endgroup$
– ablmf
Aug 4 '11 at 3:41
$begingroup$
Yes that's right. Since $k+1$ and $ln(a)$ are constants, we can pull them out of the limit to get $frac{k+1}{ln(a)}lim_{n to infty} frac{n^{k}}{a^n} = frac{k+1}{ln(a)} cdot 0 = 0$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:45
$begingroup$
Also, I should point out that when using TeX, if you enclose your exponent with braces like "a^{2x}", it will exponentiation the entire term and it will look like $a^{2x}$. Without braces, you get $a^2x$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:48
add a comment |
$begingroup$
Could this really work? I mean $lim_{nrightarrowinfty}n = infty$. How can I do the induction?
$endgroup$
– ablmf
Aug 4 '11 at 2:55
$begingroup$
What you said is true, but I don't see how it enters the picture. As far as the induction process: when $k = 0$, you need to show $lim_{n to infty} frac{1}{a^n} =0$. As far as the induction step, assume that it holds for $k$, then show that it also holds for $k+1$ (Apply L'Hopital). This is pretty much the entire solution, but again, I will let you fill in the details.
$endgroup$
– JavaMan
Aug 4 '11 at 3:27
2
$begingroup$
Oh, I didn't know L'Hopital before. So the induction is $lim_{nrightarrowinfty}frac{n^(k+1)}{a^n} = lim_{nrightarrowinfty}frac{(k + 1) n^k}{ln(a)a^n} = 0$. Is that right?
$endgroup$
– ablmf
Aug 4 '11 at 3:41
$begingroup$
Yes that's right. Since $k+1$ and $ln(a)$ are constants, we can pull them out of the limit to get $frac{k+1}{ln(a)}lim_{n to infty} frac{n^{k}}{a^n} = frac{k+1}{ln(a)} cdot 0 = 0$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:45
$begingroup$
Also, I should point out that when using TeX, if you enclose your exponent with braces like "a^{2x}", it will exponentiation the entire term and it will look like $a^{2x}$. Without braces, you get $a^2x$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:48
$begingroup$
Could this really work? I mean $lim_{nrightarrowinfty}n = infty$. How can I do the induction?
$endgroup$
– ablmf
Aug 4 '11 at 2:55
$begingroup$
Could this really work? I mean $lim_{nrightarrowinfty}n = infty$. How can I do the induction?
$endgroup$
– ablmf
Aug 4 '11 at 2:55
$begingroup$
What you said is true, but I don't see how it enters the picture. As far as the induction process: when $k = 0$, you need to show $lim_{n to infty} frac{1}{a^n} =0$. As far as the induction step, assume that it holds for $k$, then show that it also holds for $k+1$ (Apply L'Hopital). This is pretty much the entire solution, but again, I will let you fill in the details.
$endgroup$
– JavaMan
Aug 4 '11 at 3:27
$begingroup$
What you said is true, but I don't see how it enters the picture. As far as the induction process: when $k = 0$, you need to show $lim_{n to infty} frac{1}{a^n} =0$. As far as the induction step, assume that it holds for $k$, then show that it also holds for $k+1$ (Apply L'Hopital). This is pretty much the entire solution, but again, I will let you fill in the details.
$endgroup$
– JavaMan
Aug 4 '11 at 3:27
2
2
$begingroup$
Oh, I didn't know L'Hopital before. So the induction is $lim_{nrightarrowinfty}frac{n^(k+1)}{a^n} = lim_{nrightarrowinfty}frac{(k + 1) n^k}{ln(a)a^n} = 0$. Is that right?
$endgroup$
– ablmf
Aug 4 '11 at 3:41
$begingroup$
Oh, I didn't know L'Hopital before. So the induction is $lim_{nrightarrowinfty}frac{n^(k+1)}{a^n} = lim_{nrightarrowinfty}frac{(k + 1) n^k}{ln(a)a^n} = 0$. Is that right?
$endgroup$
– ablmf
Aug 4 '11 at 3:41
$begingroup$
Yes that's right. Since $k+1$ and $ln(a)$ are constants, we can pull them out of the limit to get $frac{k+1}{ln(a)}lim_{n to infty} frac{n^{k}}{a^n} = frac{k+1}{ln(a)} cdot 0 = 0$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:45
$begingroup$
Yes that's right. Since $k+1$ and $ln(a)$ are constants, we can pull them out of the limit to get $frac{k+1}{ln(a)}lim_{n to infty} frac{n^{k}}{a^n} = frac{k+1}{ln(a)} cdot 0 = 0$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:45
$begingroup$
Also, I should point out that when using TeX, if you enclose your exponent with braces like "a^{2x}", it will exponentiation the entire term and it will look like $a^{2x}$. Without braces, you get $a^2x$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:48
$begingroup$
Also, I should point out that when using TeX, if you enclose your exponent with braces like "a^{2x}", it will exponentiation the entire term and it will look like $a^{2x}$. Without braces, you get $a^2x$.
$endgroup$
– JavaMan
Aug 4 '11 at 3:48
add a comment |
$begingroup$
We can confine attention to $b ge 1$. This is because, if $0<b<1$, then $n^b le n$. If we can prove that $n/a^n$ approaches $0$, it will follow that $n^b/a^n$ approaches $0$ for any positive $ble 1$. So from now on we take $b ge 1$.
Now look at $n^b/a^n$, and take the $b$-th root. We get
$$frac{n}{(a^{1/b})^n}$$
or equivalently
$$frac{n}{c^n}$$
where $c=a^{1/b}$.
Note that $c>1$. If we can prove that $n/c^n$ approaches $0$ as $ntoinfty$, we will be finished.
For our original sequence consists of the $b$-th powers of the new sequence $(n/c^n)$. If we can show that $n/c^n$ has limit $0$, then after a while, $n/c^n le 1$, and so, after a while, the old sequence is, term by term, $le$ the new sequence. (Recall that $bge 1$.)
Progress, we need only look at $n/c^n$.
How do we continue? Any of the ways suggested by the other posts. Or else, let
$c=1+d$. Note that $d$ is positive. Note also that from the Binomial Theorem, if $n ge 2$ we have
$$c^n=(1+d)^n ge 1+dn+d^2n(n-1)/2gt d^2(n)(n-1)/2.$$
It follows that
$$frac{n}{c^n}< frac{n}{d^2(n)(n-1)/2}=frac{2}{d^2(n-1)}.$$
and it is clear that $dfrac{2}{d^2(n-1)} to 0$ as $ntoinfty$.
$endgroup$
8
$begingroup$
+1: I really, honestly and whole-heartedly think that this is the way to do this problem. From scratch. No l'Hospital, no properties of logarithms. Nothing shaky :-). Just the binomial theorem (or Bernoulli's inequality) to do the case $b=1$.
$endgroup$
– Jyrki Lahtonen
Aug 4 '11 at 8:03
$begingroup$
This is an elegant proof that could easily be understood. I will one day I could solve these problems as well as you do.
$endgroup$
– ablmf
Aug 4 '11 at 14:04
$begingroup$
Excellent answer.
$endgroup$
– YoTengoUnLCD
Jun 9 '16 at 23:56
add a comment |
$begingroup$
We can confine attention to $b ge 1$. This is because, if $0<b<1$, then $n^b le n$. If we can prove that $n/a^n$ approaches $0$, it will follow that $n^b/a^n$ approaches $0$ for any positive $ble 1$. So from now on we take $b ge 1$.
Now look at $n^b/a^n$, and take the $b$-th root. We get
$$frac{n}{(a^{1/b})^n}$$
or equivalently
$$frac{n}{c^n}$$
where $c=a^{1/b}$.
Note that $c>1$. If we can prove that $n/c^n$ approaches $0$ as $ntoinfty$, we will be finished.
For our original sequence consists of the $b$-th powers of the new sequence $(n/c^n)$. If we can show that $n/c^n$ has limit $0$, then after a while, $n/c^n le 1$, and so, after a while, the old sequence is, term by term, $le$ the new sequence. (Recall that $bge 1$.)
Progress, we need only look at $n/c^n$.
How do we continue? Any of the ways suggested by the other posts. Or else, let
$c=1+d$. Note that $d$ is positive. Note also that from the Binomial Theorem, if $n ge 2$ we have
$$c^n=(1+d)^n ge 1+dn+d^2n(n-1)/2gt d^2(n)(n-1)/2.$$
It follows that
$$frac{n}{c^n}< frac{n}{d^2(n)(n-1)/2}=frac{2}{d^2(n-1)}.$$
and it is clear that $dfrac{2}{d^2(n-1)} to 0$ as $ntoinfty$.
$endgroup$
8
$begingroup$
+1: I really, honestly and whole-heartedly think that this is the way to do this problem. From scratch. No l'Hospital, no properties of logarithms. Nothing shaky :-). Just the binomial theorem (or Bernoulli's inequality) to do the case $b=1$.
$endgroup$
– Jyrki Lahtonen
Aug 4 '11 at 8:03
$begingroup$
This is an elegant proof that could easily be understood. I will one day I could solve these problems as well as you do.
$endgroup$
– ablmf
Aug 4 '11 at 14:04
$begingroup$
Excellent answer.
$endgroup$
– YoTengoUnLCD
Jun 9 '16 at 23:56
add a comment |
$begingroup$
We can confine attention to $b ge 1$. This is because, if $0<b<1$, then $n^b le n$. If we can prove that $n/a^n$ approaches $0$, it will follow that $n^b/a^n$ approaches $0$ for any positive $ble 1$. So from now on we take $b ge 1$.
Now look at $n^b/a^n$, and take the $b$-th root. We get
$$frac{n}{(a^{1/b})^n}$$
or equivalently
$$frac{n}{c^n}$$
where $c=a^{1/b}$.
Note that $c>1$. If we can prove that $n/c^n$ approaches $0$ as $ntoinfty$, we will be finished.
For our original sequence consists of the $b$-th powers of the new sequence $(n/c^n)$. If we can show that $n/c^n$ has limit $0$, then after a while, $n/c^n le 1$, and so, after a while, the old sequence is, term by term, $le$ the new sequence. (Recall that $bge 1$.)
Progress, we need only look at $n/c^n$.
How do we continue? Any of the ways suggested by the other posts. Or else, let
$c=1+d$. Note that $d$ is positive. Note also that from the Binomial Theorem, if $n ge 2$ we have
$$c^n=(1+d)^n ge 1+dn+d^2n(n-1)/2gt d^2(n)(n-1)/2.$$
It follows that
$$frac{n}{c^n}< frac{n}{d^2(n)(n-1)/2}=frac{2}{d^2(n-1)}.$$
and it is clear that $dfrac{2}{d^2(n-1)} to 0$ as $ntoinfty$.
$endgroup$
We can confine attention to $b ge 1$. This is because, if $0<b<1$, then $n^b le n$. If we can prove that $n/a^n$ approaches $0$, it will follow that $n^b/a^n$ approaches $0$ for any positive $ble 1$. So from now on we take $b ge 1$.
Now look at $n^b/a^n$, and take the $b$-th root. We get
$$frac{n}{(a^{1/b})^n}$$
or equivalently
$$frac{n}{c^n}$$
where $c=a^{1/b}$.
Note that $c>1$. If we can prove that $n/c^n$ approaches $0$ as $ntoinfty$, we will be finished.
For our original sequence consists of the $b$-th powers of the new sequence $(n/c^n)$. If we can show that $n/c^n$ has limit $0$, then after a while, $n/c^n le 1$, and so, after a while, the old sequence is, term by term, $le$ the new sequence. (Recall that $bge 1$.)
Progress, we need only look at $n/c^n$.
How do we continue? Any of the ways suggested by the other posts. Or else, let
$c=1+d$. Note that $d$ is positive. Note also that from the Binomial Theorem, if $n ge 2$ we have
$$c^n=(1+d)^n ge 1+dn+d^2n(n-1)/2gt d^2(n)(n-1)/2.$$
It follows that
$$frac{n}{c^n}< frac{n}{d^2(n)(n-1)/2}=frac{2}{d^2(n-1)}.$$
and it is clear that $dfrac{2}{d^2(n-1)} to 0$ as $ntoinfty$.
edited Aug 4 '11 at 5:27
answered Aug 4 '11 at 3:23
André NicolasAndré Nicolas
455k36432822
455k36432822
8
$begingroup$
+1: I really, honestly and whole-heartedly think that this is the way to do this problem. From scratch. No l'Hospital, no properties of logarithms. Nothing shaky :-). Just the binomial theorem (or Bernoulli's inequality) to do the case $b=1$.
$endgroup$
– Jyrki Lahtonen
Aug 4 '11 at 8:03
$begingroup$
This is an elegant proof that could easily be understood. I will one day I could solve these problems as well as you do.
$endgroup$
– ablmf
Aug 4 '11 at 14:04
$begingroup$
Excellent answer.
$endgroup$
– YoTengoUnLCD
Jun 9 '16 at 23:56
add a comment |
8
$begingroup$
+1: I really, honestly and whole-heartedly think that this is the way to do this problem. From scratch. No l'Hospital, no properties of logarithms. Nothing shaky :-). Just the binomial theorem (or Bernoulli's inequality) to do the case $b=1$.
$endgroup$
– Jyrki Lahtonen
Aug 4 '11 at 8:03
$begingroup$
This is an elegant proof that could easily be understood. I will one day I could solve these problems as well as you do.
$endgroup$
– ablmf
Aug 4 '11 at 14:04
$begingroup$
Excellent answer.
$endgroup$
– YoTengoUnLCD
Jun 9 '16 at 23:56
8
8
$begingroup$
+1: I really, honestly and whole-heartedly think that this is the way to do this problem. From scratch. No l'Hospital, no properties of logarithms. Nothing shaky :-). Just the binomial theorem (or Bernoulli's inequality) to do the case $b=1$.
$endgroup$
– Jyrki Lahtonen
Aug 4 '11 at 8:03
$begingroup$
+1: I really, honestly and whole-heartedly think that this is the way to do this problem. From scratch. No l'Hospital, no properties of logarithms. Nothing shaky :-). Just the binomial theorem (or Bernoulli's inequality) to do the case $b=1$.
$endgroup$
– Jyrki Lahtonen
Aug 4 '11 at 8:03
$begingroup$
This is an elegant proof that could easily be understood. I will one day I could solve these problems as well as you do.
$endgroup$
– ablmf
Aug 4 '11 at 14:04
$begingroup$
This is an elegant proof that could easily be understood. I will one day I could solve these problems as well as you do.
$endgroup$
– ablmf
Aug 4 '11 at 14:04
$begingroup$
Excellent answer.
$endgroup$
– YoTengoUnLCD
Jun 9 '16 at 23:56
$begingroup$
Excellent answer.
$endgroup$
– YoTengoUnLCD
Jun 9 '16 at 23:56
add a comment |
$begingroup$
First, notice that $(n+1)^b/n^b=1$ plus terms in negative powers of $n$, so it goes to 1 as $ntoinfty$ (with $b$ fixed). Now going from $n^b/a^n$ to $(n+1)^b/a^{n+1}$, you multiply the numerator by something that's going to 1 but the denominator by something that isn't (namely, by $agt1$).
$endgroup$
3
$begingroup$
This is the simplest solution I can think of.
$endgroup$
– Qiaochu Yuan
Aug 4 '11 at 2:59
1
$begingroup$
This is a special case of multiplicative telescopy. See my post here for much further discussion of this example. Similar ideas work more generally for hyperrational functions.
$endgroup$
– Bill Dubuque
Aug 4 '11 at 3:02
$begingroup$
That means $lim_{nrightarrowinfty}frac{n^b}{a^n}=lim_{nrightarrowinfty}frac{(n+1)^b}{a^(n+1)}$, so it must be zero. Is my understanding right?
$endgroup$
– ablmf
Aug 4 '11 at 3:18
$begingroup$
Yes, that's one good way to understand it.
$endgroup$
– Gerry Myerson
Aug 4 '11 at 3:50
$begingroup$
@GerryMyerson I’m just reading this right now, but that’s a bad way to understand it: It isn’t clear yet that the limit exists at all and for any converging sequence $lim a_n = lim a_{n+1}$ holds, so one cannot deduce anything about the limit from this or am I missing something? Maybe ablmf meant $lim_{n → ∞} {n^b}/{a^n} = lim_{n → ∞} {n^b}/{a^{n+1}}$? But here again you first need to know that the limit exists.
$endgroup$
– k.stm
Jun 2 '13 at 17:55
|
show 1 more comment
$begingroup$
First, notice that $(n+1)^b/n^b=1$ plus terms in negative powers of $n$, so it goes to 1 as $ntoinfty$ (with $b$ fixed). Now going from $n^b/a^n$ to $(n+1)^b/a^{n+1}$, you multiply the numerator by something that's going to 1 but the denominator by something that isn't (namely, by $agt1$).
$endgroup$
3
$begingroup$
This is the simplest solution I can think of.
$endgroup$
– Qiaochu Yuan
Aug 4 '11 at 2:59
1
$begingroup$
This is a special case of multiplicative telescopy. See my post here for much further discussion of this example. Similar ideas work more generally for hyperrational functions.
$endgroup$
– Bill Dubuque
Aug 4 '11 at 3:02
$begingroup$
That means $lim_{nrightarrowinfty}frac{n^b}{a^n}=lim_{nrightarrowinfty}frac{(n+1)^b}{a^(n+1)}$, so it must be zero. Is my understanding right?
$endgroup$
– ablmf
Aug 4 '11 at 3:18
$begingroup$
Yes, that's one good way to understand it.
$endgroup$
– Gerry Myerson
Aug 4 '11 at 3:50
$begingroup$
@GerryMyerson I’m just reading this right now, but that’s a bad way to understand it: It isn’t clear yet that the limit exists at all and for any converging sequence $lim a_n = lim a_{n+1}$ holds, so one cannot deduce anything about the limit from this or am I missing something? Maybe ablmf meant $lim_{n → ∞} {n^b}/{a^n} = lim_{n → ∞} {n^b}/{a^{n+1}}$? But here again you first need to know that the limit exists.
$endgroup$
– k.stm
Jun 2 '13 at 17:55
|
show 1 more comment
$begingroup$
First, notice that $(n+1)^b/n^b=1$ plus terms in negative powers of $n$, so it goes to 1 as $ntoinfty$ (with $b$ fixed). Now going from $n^b/a^n$ to $(n+1)^b/a^{n+1}$, you multiply the numerator by something that's going to 1 but the denominator by something that isn't (namely, by $agt1$).
$endgroup$
First, notice that $(n+1)^b/n^b=1$ plus terms in negative powers of $n$, so it goes to 1 as $ntoinfty$ (with $b$ fixed). Now going from $n^b/a^n$ to $(n+1)^b/a^{n+1}$, you multiply the numerator by something that's going to 1 but the denominator by something that isn't (namely, by $agt1$).
answered Aug 4 '11 at 1:59
Gerry MyersonGerry Myerson
148k8152306
148k8152306
3
$begingroup$
This is the simplest solution I can think of.
$endgroup$
– Qiaochu Yuan
Aug 4 '11 at 2:59
1
$begingroup$
This is a special case of multiplicative telescopy. See my post here for much further discussion of this example. Similar ideas work more generally for hyperrational functions.
$endgroup$
– Bill Dubuque
Aug 4 '11 at 3:02
$begingroup$
That means $lim_{nrightarrowinfty}frac{n^b}{a^n}=lim_{nrightarrowinfty}frac{(n+1)^b}{a^(n+1)}$, so it must be zero. Is my understanding right?
$endgroup$
– ablmf
Aug 4 '11 at 3:18
$begingroup$
Yes, that's one good way to understand it.
$endgroup$
– Gerry Myerson
Aug 4 '11 at 3:50
$begingroup$
@GerryMyerson I’m just reading this right now, but that’s a bad way to understand it: It isn’t clear yet that the limit exists at all and for any converging sequence $lim a_n = lim a_{n+1}$ holds, so one cannot deduce anything about the limit from this or am I missing something? Maybe ablmf meant $lim_{n → ∞} {n^b}/{a^n} = lim_{n → ∞} {n^b}/{a^{n+1}}$? But here again you first need to know that the limit exists.
$endgroup$
– k.stm
Jun 2 '13 at 17:55
|
show 1 more comment
3
$begingroup$
This is the simplest solution I can think of.
$endgroup$
– Qiaochu Yuan
Aug 4 '11 at 2:59
1
$begingroup$
This is a special case of multiplicative telescopy. See my post here for much further discussion of this example. Similar ideas work more generally for hyperrational functions.
$endgroup$
– Bill Dubuque
Aug 4 '11 at 3:02
$begingroup$
That means $lim_{nrightarrowinfty}frac{n^b}{a^n}=lim_{nrightarrowinfty}frac{(n+1)^b}{a^(n+1)}$, so it must be zero. Is my understanding right?
$endgroup$
– ablmf
Aug 4 '11 at 3:18
$begingroup$
Yes, that's one good way to understand it.
$endgroup$
– Gerry Myerson
Aug 4 '11 at 3:50
$begingroup$
@GerryMyerson I’m just reading this right now, but that’s a bad way to understand it: It isn’t clear yet that the limit exists at all and for any converging sequence $lim a_n = lim a_{n+1}$ holds, so one cannot deduce anything about the limit from this or am I missing something? Maybe ablmf meant $lim_{n → ∞} {n^b}/{a^n} = lim_{n → ∞} {n^b}/{a^{n+1}}$? But here again you first need to know that the limit exists.
$endgroup$
– k.stm
Jun 2 '13 at 17:55
3
3
$begingroup$
This is the simplest solution I can think of.
$endgroup$
– Qiaochu Yuan
Aug 4 '11 at 2:59
$begingroup$
This is the simplest solution I can think of.
$endgroup$
– Qiaochu Yuan
Aug 4 '11 at 2:59
1
1
$begingroup$
This is a special case of multiplicative telescopy. See my post here for much further discussion of this example. Similar ideas work more generally for hyperrational functions.
$endgroup$
– Bill Dubuque
Aug 4 '11 at 3:02
$begingroup$
This is a special case of multiplicative telescopy. See my post here for much further discussion of this example. Similar ideas work more generally for hyperrational functions.
$endgroup$
– Bill Dubuque
Aug 4 '11 at 3:02
$begingroup$
That means $lim_{nrightarrowinfty}frac{n^b}{a^n}=lim_{nrightarrowinfty}frac{(n+1)^b}{a^(n+1)}$, so it must be zero. Is my understanding right?
$endgroup$
– ablmf
Aug 4 '11 at 3:18
$begingroup$
That means $lim_{nrightarrowinfty}frac{n^b}{a^n}=lim_{nrightarrowinfty}frac{(n+1)^b}{a^(n+1)}$, so it must be zero. Is my understanding right?
$endgroup$
– ablmf
Aug 4 '11 at 3:18
$begingroup$
Yes, that's one good way to understand it.
$endgroup$
– Gerry Myerson
Aug 4 '11 at 3:50
$begingroup$
Yes, that's one good way to understand it.
$endgroup$
– Gerry Myerson
Aug 4 '11 at 3:50
$begingroup$
@GerryMyerson I’m just reading this right now, but that’s a bad way to understand it: It isn’t clear yet that the limit exists at all and for any converging sequence $lim a_n = lim a_{n+1}$ holds, so one cannot deduce anything about the limit from this or am I missing something? Maybe ablmf meant $lim_{n → ∞} {n^b}/{a^n} = lim_{n → ∞} {n^b}/{a^{n+1}}$? But here again you first need to know that the limit exists.
$endgroup$
– k.stm
Jun 2 '13 at 17:55
$begingroup$
@GerryMyerson I’m just reading this right now, but that’s a bad way to understand it: It isn’t clear yet that the limit exists at all and for any converging sequence $lim a_n = lim a_{n+1}$ holds, so one cannot deduce anything about the limit from this or am I missing something? Maybe ablmf meant $lim_{n → ∞} {n^b}/{a^n} = lim_{n → ∞} {n^b}/{a^{n+1}}$? But here again you first need to know that the limit exists.
$endgroup$
– k.stm
Jun 2 '13 at 17:55
|
show 1 more comment
$begingroup$
You can expand as$$a^n = e^{nlog a} = 1+nlog a +cdots +frac{(nlog a)^b}{b!}+cdots$$
$$frac{a^n}{n^b} =frac{1}{n^b}+frac{1}{n^{b-1}}log a+cdots+frac{(log a)^{b}}{b!}+nfrac{(log a)^{b+1}}{(b+1)!}+cdots $$
So, $$frac{a^n}{n^b}geq nfrac{(log a)^{b+1}}{(b+1)!}$$. This becomes arbitrarily large with large $n$
Therefore, $$lim_{nrightarrowinfty}frac{a^n}{n^b} = infty$$ Or $$lim_{nrightarrowinfty}frac{n^b}{a^n} = 0$$
$endgroup$
add a comment |
$begingroup$
You can expand as$$a^n = e^{nlog a} = 1+nlog a +cdots +frac{(nlog a)^b}{b!}+cdots$$
$$frac{a^n}{n^b} =frac{1}{n^b}+frac{1}{n^{b-1}}log a+cdots+frac{(log a)^{b}}{b!}+nfrac{(log a)^{b+1}}{(b+1)!}+cdots $$
So, $$frac{a^n}{n^b}geq nfrac{(log a)^{b+1}}{(b+1)!}$$. This becomes arbitrarily large with large $n$
Therefore, $$lim_{nrightarrowinfty}frac{a^n}{n^b} = infty$$ Or $$lim_{nrightarrowinfty}frac{n^b}{a^n} = 0$$
$endgroup$
add a comment |
$begingroup$
You can expand as$$a^n = e^{nlog a} = 1+nlog a +cdots +frac{(nlog a)^b}{b!}+cdots$$
$$frac{a^n}{n^b} =frac{1}{n^b}+frac{1}{n^{b-1}}log a+cdots+frac{(log a)^{b}}{b!}+nfrac{(log a)^{b+1}}{(b+1)!}+cdots $$
So, $$frac{a^n}{n^b}geq nfrac{(log a)^{b+1}}{(b+1)!}$$. This becomes arbitrarily large with large $n$
Therefore, $$lim_{nrightarrowinfty}frac{a^n}{n^b} = infty$$ Or $$lim_{nrightarrowinfty}frac{n^b}{a^n} = 0$$
$endgroup$
You can expand as$$a^n = e^{nlog a} = 1+nlog a +cdots +frac{(nlog a)^b}{b!}+cdots$$
$$frac{a^n}{n^b} =frac{1}{n^b}+frac{1}{n^{b-1}}log a+cdots+frac{(log a)^{b}}{b!}+nfrac{(log a)^{b+1}}{(b+1)!}+cdots $$
So, $$frac{a^n}{n^b}geq nfrac{(log a)^{b+1}}{(b+1)!}$$. This becomes arbitrarily large with large $n$
Therefore, $$lim_{nrightarrowinfty}frac{a^n}{n^b} = infty$$ Or $$lim_{nrightarrowinfty}frac{n^b}{a^n} = 0$$
edited Aug 4 '11 at 4:00
answered Aug 4 '11 at 3:54
kuch nahikuch nahi
3,44853267
3,44853267
add a comment |
add a comment |
$begingroup$
Regard the exponential function as the unique solution to the initial value problem $f(0) = 1$ with $f' = f$. The equivalent integral formulation is
$f(t) = 1 + int_0^t f(s) ds$
First prove that $f$ is an increasing function. For $t geq 0$, you can use a method of continuity argument by considering the set ${ s ~:~ f mbox{ increases on } [0, s] }$. The supremum of this set cannot be finite. For $s < 0$ you can also consider the ODE satisfied by $f^2$, but you're not really concerned with that. Now that we know $f$ increase, we also know $f$ grows at least linearly:
$f(t) geq 1 + int_0^t 1 ds = (1 + t)$
It also grows at least quadratically..
$f(t) geq f(frac{t}{2}) + int_{t/2}^t f(s) ds geq (1 + t/2) f(t/2)$
And repeating yields
$f(t) geq (1 + t/2)^2$
In general, $f(t) geq (1 + t/n)^n$ by the same method. It's even true for $n$ not an integer. In this case, the function $(1 + t/n)^n$ is defined in the first place to be $e^{n log( 1 + t/n) }$. Then, since log x is defined by the integral $int_1^x frac{1}{t} dt = x int_0^1 (1 + sx)^{-1} ds $, we can rewrite
$(1 + t/n)^n = e^{n log (1 + t/n) } = mbox{exp}(t int_0^1 (1 + frac{s t}{n})^{-1} ds ) leq e^t$
EDIT: This answer may not be so helpful to you. I didn't read the end of your question, so you may not know how to work with most of the things I've been writing about. The main reason why the exponential grows faster than a polynomial is because if $f$ is exponential, then $f(n+1)$ is at least a constant times $f(n)$, whereas when $f$ is a polynomial, $f(n+1)$ is roughly the same size as $f(n)$ when $n$ is large. After all, you can hardly tell the difference between the volume of a huge cube, and a huge cube 1 unit longer in each direction.
$endgroup$
$begingroup$
Oops, seems that I won't be able to understand this before I finish my calculus course.
$endgroup$
– ablmf
Aug 4 '11 at 2:59
add a comment |
$begingroup$
Regard the exponential function as the unique solution to the initial value problem $f(0) = 1$ with $f' = f$. The equivalent integral formulation is
$f(t) = 1 + int_0^t f(s) ds$
First prove that $f$ is an increasing function. For $t geq 0$, you can use a method of continuity argument by considering the set ${ s ~:~ f mbox{ increases on } [0, s] }$. The supremum of this set cannot be finite. For $s < 0$ you can also consider the ODE satisfied by $f^2$, but you're not really concerned with that. Now that we know $f$ increase, we also know $f$ grows at least linearly:
$f(t) geq 1 + int_0^t 1 ds = (1 + t)$
It also grows at least quadratically..
$f(t) geq f(frac{t}{2}) + int_{t/2}^t f(s) ds geq (1 + t/2) f(t/2)$
And repeating yields
$f(t) geq (1 + t/2)^2$
In general, $f(t) geq (1 + t/n)^n$ by the same method. It's even true for $n$ not an integer. In this case, the function $(1 + t/n)^n$ is defined in the first place to be $e^{n log( 1 + t/n) }$. Then, since log x is defined by the integral $int_1^x frac{1}{t} dt = x int_0^1 (1 + sx)^{-1} ds $, we can rewrite
$(1 + t/n)^n = e^{n log (1 + t/n) } = mbox{exp}(t int_0^1 (1 + frac{s t}{n})^{-1} ds ) leq e^t$
EDIT: This answer may not be so helpful to you. I didn't read the end of your question, so you may not know how to work with most of the things I've been writing about. The main reason why the exponential grows faster than a polynomial is because if $f$ is exponential, then $f(n+1)$ is at least a constant times $f(n)$, whereas when $f$ is a polynomial, $f(n+1)$ is roughly the same size as $f(n)$ when $n$ is large. After all, you can hardly tell the difference between the volume of a huge cube, and a huge cube 1 unit longer in each direction.
$endgroup$
$begingroup$
Oops, seems that I won't be able to understand this before I finish my calculus course.
$endgroup$
– ablmf
Aug 4 '11 at 2:59
add a comment |
$begingroup$
Regard the exponential function as the unique solution to the initial value problem $f(0) = 1$ with $f' = f$. The equivalent integral formulation is
$f(t) = 1 + int_0^t f(s) ds$
First prove that $f$ is an increasing function. For $t geq 0$, you can use a method of continuity argument by considering the set ${ s ~:~ f mbox{ increases on } [0, s] }$. The supremum of this set cannot be finite. For $s < 0$ you can also consider the ODE satisfied by $f^2$, but you're not really concerned with that. Now that we know $f$ increase, we also know $f$ grows at least linearly:
$f(t) geq 1 + int_0^t 1 ds = (1 + t)$
It also grows at least quadratically..
$f(t) geq f(frac{t}{2}) + int_{t/2}^t f(s) ds geq (1 + t/2) f(t/2)$
And repeating yields
$f(t) geq (1 + t/2)^2$
In general, $f(t) geq (1 + t/n)^n$ by the same method. It's even true for $n$ not an integer. In this case, the function $(1 + t/n)^n$ is defined in the first place to be $e^{n log( 1 + t/n) }$. Then, since log x is defined by the integral $int_1^x frac{1}{t} dt = x int_0^1 (1 + sx)^{-1} ds $, we can rewrite
$(1 + t/n)^n = e^{n log (1 + t/n) } = mbox{exp}(t int_0^1 (1 + frac{s t}{n})^{-1} ds ) leq e^t$
EDIT: This answer may not be so helpful to you. I didn't read the end of your question, so you may not know how to work with most of the things I've been writing about. The main reason why the exponential grows faster than a polynomial is because if $f$ is exponential, then $f(n+1)$ is at least a constant times $f(n)$, whereas when $f$ is a polynomial, $f(n+1)$ is roughly the same size as $f(n)$ when $n$ is large. After all, you can hardly tell the difference between the volume of a huge cube, and a huge cube 1 unit longer in each direction.
$endgroup$
Regard the exponential function as the unique solution to the initial value problem $f(0) = 1$ with $f' = f$. The equivalent integral formulation is
$f(t) = 1 + int_0^t f(s) ds$
First prove that $f$ is an increasing function. For $t geq 0$, you can use a method of continuity argument by considering the set ${ s ~:~ f mbox{ increases on } [0, s] }$. The supremum of this set cannot be finite. For $s < 0$ you can also consider the ODE satisfied by $f^2$, but you're not really concerned with that. Now that we know $f$ increase, we also know $f$ grows at least linearly:
$f(t) geq 1 + int_0^t 1 ds = (1 + t)$
It also grows at least quadratically..
$f(t) geq f(frac{t}{2}) + int_{t/2}^t f(s) ds geq (1 + t/2) f(t/2)$
And repeating yields
$f(t) geq (1 + t/2)^2$
In general, $f(t) geq (1 + t/n)^n$ by the same method. It's even true for $n$ not an integer. In this case, the function $(1 + t/n)^n$ is defined in the first place to be $e^{n log( 1 + t/n) }$. Then, since log x is defined by the integral $int_1^x frac{1}{t} dt = x int_0^1 (1 + sx)^{-1} ds $, we can rewrite
$(1 + t/n)^n = e^{n log (1 + t/n) } = mbox{exp}(t int_0^1 (1 + frac{s t}{n})^{-1} ds ) leq e^t$
EDIT: This answer may not be so helpful to you. I didn't read the end of your question, so you may not know how to work with most of the things I've been writing about. The main reason why the exponential grows faster than a polynomial is because if $f$ is exponential, then $f(n+1)$ is at least a constant times $f(n)$, whereas when $f$ is a polynomial, $f(n+1)$ is roughly the same size as $f(n)$ when $n$ is large. After all, you can hardly tell the difference between the volume of a huge cube, and a huge cube 1 unit longer in each direction.
answered Aug 4 '11 at 2:13
user14222user14222
1612
1612
$begingroup$
Oops, seems that I won't be able to understand this before I finish my calculus course.
$endgroup$
– ablmf
Aug 4 '11 at 2:59
add a comment |
$begingroup$
Oops, seems that I won't be able to understand this before I finish my calculus course.
$endgroup$
– ablmf
Aug 4 '11 at 2:59
$begingroup$
Oops, seems that I won't be able to understand this before I finish my calculus course.
$endgroup$
– ablmf
Aug 4 '11 at 2:59
$begingroup$
Oops, seems that I won't be able to understand this before I finish my calculus course.
$endgroup$
– ablmf
Aug 4 '11 at 2:59
add a comment |
Try expressing both in terms of the exponential $e^x$: $n^b = e^{b\ln(n)}$ and $a^n = e^{n\ln(a)}$, so that the quotient becomes
$$\frac{n^b}{a^n} = \frac{e^{b\ln(n)}}{e^{n\ln(a)}} = e^{b\ln(n)-n\ln(a)} = e^{b\ln(n)-kn} = \frac{e^{b\ln(n)}}{e^{kn}},$$
where $k = \ln(a) > 0$ is a real constant. Can you tell which of $\ln(n)$ or $n$ grows faster?
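An illustrative computation of that exponent (sample values $a=1.1$, $b=10$ assumed): $b\ln(n)-n\ln(a)$ is positive at first but eventually heads to $-\infty$.

    import math

    # Sample constants (chosen only for illustration): a > 1, b fixed.
    a, b = 1.1, 10.0
    for n in [10, 100, 1_000, 10_000]:
        print(f"n={n:6d}  b*ln(n) - n*ln(a) = {b*math.log(n) - n*math.log(a):12.2f}")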
edited Aug 4 '11 at 3:48
answered Aug 4 '11 at 1:52 – gary
So the question becomes to prove $\lim_{n\rightarrow\infty}(b\ln(n)-kn) = -\infty$, right? BTW, you made some typos.
– ablmf
Aug 4 '11 at 2:58
Yes, I think so, if you use that $\ln(n)$ grows slower than $n$. I will check for typos.
– gary
Aug 4 '11 at 3:46
This is inspired by André Nicolas's answer, but instead of taking $b^{\text{th}}$ roots, I'll take $n^{\text{th}}$ roots.
The limit of the sequence of $n^{\text{th}}$ roots of the terms in the original sequence is
$$\lim_{n\to\infty}\frac{(n^b)^{1/n}}{a}=\lim_{n\to\infty}\frac{\left(n^{1/n}\right)^b}{a}=\frac{\left(\lim\limits_{n\to\infty} n^{1/n}\right)^b}{a}=\frac{1^b}{a}=\frac{1}{a}.$$
(I used here the fact that $\lim\limits_{n\to\infty}n^{1/n}=1$, which was the subject of another question on this site and which can be proved in many ways.)
If $c$ is any number such that $\frac{1}{a}<c<1$, then the previous limit implies that there is an $N>0$ such that for all $n>N$, $$\left(\frac{n^b}{a^n}\right)^{1/n}< c.$$ Then for all $n>N$, $$\frac{n^b}{a^n}< c^n.$$ Since $c^n\to 0$, this shows that the original sequence goes to zero.
Alternatively, once you know that the sequence of $n^{\text{th}}$ roots converges to a number less than $1$, you can apply the root test to conclude that $\sum\limits_{n=1}^\infty\frac{n^b}{a^n}$ converges, which implies that the terms go to $0$. (The proof of the root test actually just uses the inequality derived above and the convergence of the geometric series with common ratio $c$.)
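A small numerical sketch of the $n^{\text{th}}$-root computation (sample values $a=2$, $b=3$ assumed), working in log space to avoid overflowing $a^n$:

    import math

    # Illustrative check: the n-th root of n^b / a^n tends to 1/a (here 0.5), albeit slowly.
    a, b = 2.0, 3.0
    for n in [10, 100, 1_000, 10_000]:
        root = math.exp((b * math.log(n) - n * math.log(a)) / n)
        print(f"n={n:6d}  (n^b/a^n)^(1/n) = {root:.6f}")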
edited Apr 13 '17 at 12:20 by Community♦
answered Aug 4 '11 at 8:36 – Jonas Meyer
Since it is almost always better to expand around zero, let $a = 1+c$, where $c > 0$. So we want to show $\lim \frac{n^b}{(1+c)^n} = 0$.
The ratio of consecutive terms is $\frac{(1+1/n)^b}{1+c}$, so if we can show that $(1+1/n)^b < 1+c/2$ for large enough $n$, we are done. But this holds exactly when
$$n > \frac{1}{(1+c/2)^{1/b}-1}.$$
Restating, letting $N = \frac{1}{(1+c/2)^{1/b}-1}$ and $r = \frac{1+c/2}{1+c}$, if $n > N$ then $\frac{(1+1/n)^b}{1+c} < \frac{1+c/2}{1+c} = r$, which shows that
$$ \frac{n^b}{(1+c)^n} < \frac{N^b}{(1+c)^N}\, r^{\,n-N},$$
which goes to $0$ as $n \rightarrow \infty$ since $N$ is fixed and $0 < r < 1$.
Note: an elementary proof that $r^n \rightarrow 0$ when $0 < r < 1$ is in "What is Mathematics?" by Courant and Robbins.
Let $r = 1/(1+s)$, where $s > 0$. By Bernoulli's inequality,
$$r^n = \frac{1}{(1+s)^n} < \frac{1}{1+sn} = \frac{1}{1+ n(1/r-1)}.$$
They similarly show that, if $a > 1$, then $a^n \rightarrow \infty$ like this:
Let $a = 1+d$, where $d > 0$. Then
$$a^n = (1+d)^n > 1+nd = 1+n(a-1).$$
The key in both cases is expanding around zero and using Bernoulli's inequality.
Archimedes' axiom (for any positive reals $x$ and $y$, there is an integer $n$ such that $x < ny$) does the rest.
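A quick numerical sketch of the ratio argument (sample values $a=1.5$, $b=4$ assumed): the ratio of consecutive terms eventually drops below $r=\frac{1+c/2}{1+c}<1$.

    # Illustrative check of the consecutive-term ratio against r = (1 + c/2)/(1 + c).
    a, b = 1.5, 4.0
    c = a - 1
    r = (1 + c / 2) / (1 + c)
    term = lambda n: n**b / a**n
    for n in [5, 20, 50, 100]:
        print(f"n={n:4d}  t(n+1)/t(n) = {term(n + 1) / term(n):.4f}   r = {r:.4f}")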
answered Aug 8 '11 at 13:43 – marty cohen
Tom Apostol, in volume I of his 2-volume text on calculus, gives what I consider to be the canonical proof of this assertion. He calls it the theorem of “the slow growth of the logarithm”. By making the appropriate substitution (namely, $\exp(x)$ for $x$), you can get what we can call the theorem on “the fast growth of the exponential”.
I think you will have no difficulty agreeing that as far as comparing a polynomial to the exponential goes, the only part of the polynomial that we have to consider is the term involving the highest power, and that we can even disregard its coefficient. (This agreement is currently shown in your question, but I believe that is the result of someone else’s edit, and so I am answering what I believe your original question was.) That is what that theorem in Apostol does: it shows that if $a > 0$ and $b > 0$, then $\lim_{x\to\infty} \frac{(\log x)^a}{x^b} = 0$.
You might be interested in knowing a significant application of this fact, namely, in the proof of the recursion formula for the Gamma function (which is the canonical continuous extension of the factorial function but, for historical reasons, shifted one unit). That is, in proving that $\Gamma(x + 1) = x\,\Gamma(x)$, one typically applies integration by parts, and one piece of the resulting expression is a power divided by an exponential, which goes to $0$ as the upper limit of the integral goes to infinity.
Another application of this fact is the shape of the graph of $y = x e^x$, namely, that as $x$ goes to negative infinity, $y$ goes to $0$.
edited Aug 7 '11 at 21:00
answered Aug 4 '11 at 3:55 – Mike Jones
As several of the previous answers have noted, it is enough to show that $\lim_{x\to \infty} \frac{x}{a^x}=0$. To prove this, in turn, it is enough to prove that $\lim_{x \to \infty} \frac{a^x}{x} = \infty$.
Let's compute the first and second derivatives of $f(x) = \frac{a^x}{x}$. We find that
$$f'(x) = \frac{a^x(x \ln(a)-1)}{x^2} \quad\text{and}\quad f''(x) = \frac{a^x(x^2 \ln^2(a)-2\ln(a)\,x+2)}{x^3}.$$
We see that for large enough values of $x$, both $f'(x)$ and $f''(x)$ are positive. Thus, $f$ not only grows, it grows at an increasing rate. The only possible conclusion is that $\lim_{x\to \infty} \frac{a^x}{x} = \infty$.
With a little work this can be made into a completely rigorous proof.
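A numerical sketch of that behaviour (sample value $a=2$ assumed): the values of $f(x)=a^x/x$ grow, and each unit step adds more than the previous one.

    # Illustrative check: f(x) = a^x / x eventually increases, and at an increasing rate.
    a = 2.0
    f = lambda x: a**x / x
    for x in [2, 5, 10, 20]:
        print(f"x={x:3d}  f(x) = {f(x):12.1f}   f(x+1) - f(x) = {f(x + 1) - f(x):12.1f}")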
answered Feb 18 '15 at 19:36 – Johan
Another method is the following (a method used by W. Rudin in his Principles of Mathematical Analysis, where it may have originated):
For $n\geqslant 2(|\lceil b\rceil|+1)$ (the absolute value of the ceiling of $b$) we have, using the binomial theorem:
$$\begin{aligned}a^n&=(1+(a-1))^n\\&>\binom{n}{|\lceil b\rceil|+1}(a-1)^{|\lceil b\rceil|+1}\\&=\frac{n(n-1)\cdots(n-|\lceil b\rceil|)}{(|\lceil b\rceil|+1)!}(a-1)^{|\lceil b\rceil|+1}\\&>\frac{n^{|\lceil b\rceil|+1}}{2^{|\lceil b\rceil|+1}(|\lceil b\rceil|+1)!}(a-1)^{|\lceil b\rceil|+1}\end{aligned}$$
so
$$\left(\frac{2^{|\lceil b\rceil|+1}(|\lceil b\rceil|+1)!}{(a-1)^{|\lceil b\rceil|+1}}\right)\cdot\frac{1}{n}>\frac{n^{|\lceil b\rceil|}}{a^n}\geqslant\frac{n^b}{a^n}$$
and hence
$$\lim_{n\to\infty}\frac{n^b}{a^n}=0.$$
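An illustrative check of the final bound (sample values $a=1.5$, $b=2.3$ assumed), with $m=|\lceil b\rceil|+1$ and $C = \frac{2^{m}\, m!}{(a-1)^{m}}$:

    import math

    # Illustrative check: n^b / a^n < C/n for n >= 2m, with C as in the argument above.
    a, b = 1.5, 2.3
    m = abs(math.ceil(b)) + 1
    C = 2**m * math.factorial(m) / (a - 1) ** m
    for n in [2 * m, 20, 100, 500]:
        print(f"n={n:4d}  n^b/a^n = {n**b / a**n:.6e}   C/n = {C / n:.6e}")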
answered Dec 7 '16 at 3:14 – CIJ
The case $b \le 0$ is immediate: $n^b \le 1$ for $b \le 0$, and since $a>1$ we have $$\frac{n^b}{a^n} \le \frac{1}{a^n} = \left(\frac{1}{a}\right)^n \to 0.$$
Now assume $b>0$. Then
$$\frac{n^b}{a^n}=\left(\frac{b}{\ln a}\right)^b\left(\frac{n\ln a}{b}\, e^{-n\ln a/b}\right)^b \overset{u\,=\,n\ln a/b}{=}\left(\frac{b}{\ln a}\right)^b\left(u e^{-u}\right)^b.$$
Hence $$\lim_{n\to\infty}\frac{n^b}{a^n}= \lim_{u\to\infty}\left(\frac{b}{\ln a}\right)^b\left(u e^{-u}\right)^b=0,$$ using the fact that $u e^{-u}\to 0$ as $u\to\infty$.
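A short numerical check of the substitution identity (sample values $a=3$, $b=2$ assumed): both sides agree and shrink as $u=n\ln(a)/b$ grows.

    import math

    # Illustrative check: n^b/a^n equals (b/ln a)^b * (u e^{-u})^b with u = n ln(a)/b.
    a, b = 3.0, 2.0
    for n in [1, 5, 10, 20]:
        u = n * math.log(a) / b
        rhs = (b / math.log(a)) ** b * (u * math.exp(-u)) ** b
        print(f"n={n:3d}  n^b/a^n = {n**b / a**n:.3e}   substituted form = {rhs:.3e}")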
edited Jan 21 '18 at 19:51
answered Jan 1 '18 at 11:25 – Guy Fsone
You could also see the following:
$$\lim_{n\rightarrow\infty}\frac{n^b}{a^n} = \lim_{n\rightarrow\infty}e^{\log \frac{n^b}{a^n}}.$$
Now $\lim_{n\rightarrow\infty}\log \left(\frac{n^b}{a^n}\right)=\lim_{n\rightarrow\infty}(\log (n^b) - \log (a^n))=\lim_{n\rightarrow\infty} (b\log(n)-n\log(a))=-\infty$, assuming $a>1$.
So $\lim_{n\rightarrow\infty}e^{\log \frac{n^b}{a^n}}=0=\lim_{n\rightarrow\infty}\frac{n^b}{a^n}$.
Just to clarify the comment, I am adding the proof that $$\lim_{n\rightarrow\infty} (b\log (n)-n\log (a))=-\infty.$$
First, by applying L'Hôpital's rule we get
$$\lim_{n\rightarrow\infty} \frac{n}{\log(n)}=\lim_{n\rightarrow\infty} \frac{1}{1/n} = \infty.$$
We then have
$$\lim_{n\rightarrow\infty} (b\log (n)-n\log (a))=\lim_{n\rightarrow\infty} \log(n)\left(b-\log(a)\,\frac{n}{\log(n)}\right)=-\infty.$$
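A tiny numerical sketch of the key fact used above, that $n/\log(n)$ grows without bound:

    import math

    # Illustrative check: n / log(n) -> infinity, which drives b*log(n) - n*log(a) to -infinity.
    for n in [10, 100, 10_000, 1_000_000]:
        print(f"n={n:9d}   n/log(n) = {n / math.log(n):12.1f}")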
edited Jan 18 at 22:07 by kccu
answered Apr 14 '18 at 13:26 – Marco Bellocchi
Then you need an upper bound of $\log(n)$.
– ablmf
Apr 15 '18 at 9:18
I assumed obviously that it is a known fact that $\log (n) \to \infty$ as $n \to \infty$, and that you know how to prove $\lim_{n\rightarrow\infty} (b\log n-n\log a)=-\infty$ assuming $a>1$.
– Marco Bellocchi
Apr 15 '18 at 10:03