RE: Studying Mathematics Thread
March 7, 2018 at 10:43 am
(This post was last modified: March 7, 2018 at 10:44 am by polymath257.)
So, first: the central idea behind derivatives is linear approximation.
Think of it like this. Suppose we have a function f(x) and an x-value x=a. We want to approximate f(x) as well as possible by a linear function y=mx+b. For convenience, we
will rewrite that as y=m(x-a) + c. So, for x values close to x=a, we want f(x) approximately equal to m(x-a) + c. Now, it is easy to plug in and demand that f(a)=m(a-a)+c, so c=f(a).
But what about the slope of that line? Well, the slope of the line through (a, f(a)) and (x, f(x)) is {f(x)-f(a)}/{x-a}. Unfortunately, we cannot just plug in x=a here because we then get 0/0, which is an indeterminate form (such expressions can come out to be anything). So, we take the next best approach and take a limit.
So, if we let m be the *limit* of {f(x)-f(a)}/{x-a} as x->a, then we should get the 'best' straight line approximation to f(x).
If this limit exists (see the discussion above), then we say the function f(x) is differentiable at x=a and define f'(a) to be that limit. So, the derivative f'(a) is the slope of the line that best approximates the function f(x) close to x=a.
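To make this concrete, here is a minimal numerical sketch in Python (the choice f(x)=x^2 and the point a=1 are mine, just for illustration). The difference quotients home in on f'(1)=2:

    def f(x):
        return x**2  # example function; any differentiable f would do

    a = 1.0  # the point where we want the slope

    # Difference quotient {f(x) - f(a)}/{x - a} for x closer and closer to a:
    for h in [0.1, 0.01, 0.001, 0.0001]:
        x = a + h
        print(h, (f(x) - f(a)) / (x - a))
    # Printed values approach 2.0, the slope of the best line at a = 1.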
Now, if this derivative exists, then the function f(x) *has* to be continuous at x=a. Why? Because f(x)-f(a) = [{f(x)-f(a)}/{x-a}]*(x-a), and as x->a the first factor tends to f'(a) while the second tends to 0, so f(x)-f(a)->0 as well. Put the other way around: if f(x)-f(a) did not go to 0 while x-a does, the quotient would blow up and the limit defining the derivative could not exist.
On the other hand, it is *quite* possible for a function to be continuous at a point but not differentiable there. Take the function f(x)=x if x>=0 and f(x)=-x if x<0 (that is, f(x)=|x|). This graphs as a 'V' shape with the corner at x=0. And, in fact, the 'derivative from the left' is -1 and the 'derivative from the right' is +1. Since the two one-sided limits disagree, the derivative does not exist (just like jump discontinuities for ordinary limits).
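A quick numerical check of this, in the same sketchy spirit as above (abs plays the role of the 'V' function):

    f = abs  # f(x) = |x|, the 'V' shaped function
    a = 0.0

    for h in [0.1, 0.01, 0.001]:
        right = (f(a + h) - f(a)) / h     # difference quotient from the right
        left = (f(a - h) - f(a)) / (-h)   # difference quotient from the left
        print(h, left, right)
    # left stays at -1.0 and right at +1.0: no single limit, so no derivative.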
Much stranger is the fact that there are continuous functions that do not have derivatives *anywhere*. For the example above, the derivative failed only at x=0. But there are functions (first discovered by Weierstrass) where the derivative exists at no point at all. In some sense, they are 'all corners'.
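To get a feel for one of these, here is a sketch that evaluates partial sums of Weierstrass's series W(x) = sum of a^n cos(b^n pi x). The values a=0.9, b=7 and the 15-term cutoff are my illustrative choices satisfying his conditions (0 < a < 1, b an odd integer, ab > 1 + 3pi/2):

    import math

    # Partial sum of W(x) = sum_{n>=0} a^n * cos(b^n * pi * x)
    def W(x, a=0.9, b=7, terms=15):
        return sum(a**n * math.cos(b**n * math.pi * x) for n in range(terms))

    # Difference quotients at x = 0: instead of settling toward one slope,
    # they keep growing in magnitude as h shrinks.
    for h in [0.1, 0.01, 0.001, 0.0001]:
        print(h, (W(h) - W(0)) / h)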
Next, we can go to functions of more than one variable, f(x,y), and ask for the 'best' linear approximation in exactly the same way. Only now the linear approximation is in two variables and it looks like Ax + By + C for constants A, B, and C. If we want the approximation to work at (x,y)=(a,b), it is easier to write the linear function as A(x-a) + B(y-b) + D. Again, just as before, D=f(a,b).
But now the coefficients A and B turn out to be limits in which we vary only x or only y (respectively) as we approach (a,b). This leads to the idea of 'partial derivatives'.
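Again as a sketch (f(x,y) = x^2*y and the point (a,b) = (1,2) are just my choices for illustration):

    def f(x, y):
        return x**2 * y  # example function of two variables

    a, b = 1.0, 2.0
    h = 1e-6

    # Vary only x: approximates A, the partial derivative in x (exact value 2*a*b = 4)
    A = (f(a + h, b) - f(a, b)) / h
    # Vary only y: approximates B, the partial derivative in y (exact value a**2 = 1)
    B = (f(a, b + h) - f(a, b)) / h
    print(A, B)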
Clearly, now, this can be generalized to any number of variables. It turns out that we can even do this for *infinitely* many variables if we are working in a Hilbert or Banach space. In every case, the ultimate idea for derivatives is as linear approximations.
(March 7, 2018 at 10:19 am)robvalue Wrote: Ahah, clever!
Sure, I love derivatives! One of my favourite things. It's such an incredibly powerful tool. It was Newton who set them in motion, I believe? (Pun not intended but taken credit for anyway)
Well, Newton was one of the first. Leibniz, Fermat, and Bolzano contributed to the overall theory.