This section explains the basic idea of a limit, which you need to understand
in order for the rest of this topic to make sense. Briefly, the limit of a function
f(x) as x approaches a, written $\lim_{x\to a} f(x)$, is the value that f(x) gets close to as x gets close to a.
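As a quick illustration (a standard example, not one from the text itself): the values of $x^2$ settle down to 4 as $x$ gets close to 2, which is written
\[
\lim_{x \to 2} x^2 = 4.
\]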
Some functions don't have a real limit as x approaches a, but it may still be possible to make sense of what happens to the function f(x) as you approach a from only one side.
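A standard example of this situation (illustrative, not taken from the text): $f(x) = |x|/x$ has no limit as $x \to 0$, but it does have a limit from each side,
\[
\lim_{x \to 0^-} \frac{|x|}{x} = -1, \qquad \lim_{x \to 0^+} \frac{|x|}{x} = 1.
\]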
The notion of ``infinity'' has been much-abused by philosophers and science-fiction writers. But, really, all that infinity ($\infty$) means mathematically is a certain kind of limit. Two distinct kinds of limits, actually.
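Presumably the two kinds meant here are limits at infinity and infinite limits; standard examples of each (not drawn from the text) are
\[
\lim_{x \to \infty} \frac{1}{x} = 0 \qquad \text{and} \qquad \lim_{x \to 0} \frac{1}{x^2} = \infty.
\]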
Once you have some grounding in the idea of what limits are supposed to be, this section gives a number of methods for actually computing limits in various cases. Some of these methods will come back to haunt us later on, since we really do need to compute limits to make sense of some of the later formulas we develop. This section also includes some basic theorems about limits.
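A typical computation of this sort (an illustrative example, not necessarily one of the methods the section itself uses) is cancelling a common factor before taking the limit:
\[
\lim_{x \to 2} \frac{x^2 - 4}{x - 2} = \lim_{x \to 2} (x + 2) = 4.
\]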
The definition of a limit used in the first section (1.1) of this topic was correct, but stated in imprecise terms; the formal definition uses language that you probably haven't seen in a mathematics course before. The idea of a limit was around a lot longer than any formal definition, but this definition does put it on a firm mathematical base. This section deals with that formal definition, and works through some examples of how to apply it.
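For reference, the usual formal ($\epsilon$-$\delta$) definition, which is presumably the one this section develops, reads:
\[
\lim_{x \to a} f(x) = L \ \text{ means: for every } \epsilon > 0 \text{ there is a } \delta > 0 \text{ so that } 0 < |x - a| < \delta \text{ implies } |f(x) - L| < \epsilon.
\]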
Simply put, a function f(x) is continuous (at a point x = a) if f(a) has the value that it should have, namely the value the limit predicts:
\[
\lim_{x \to a} f(x) = f(a).
\]
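To see what can go wrong (a standard example, not taken from the text): $g(x) = \frac{x^2 - 1}{x - 1}$ is not continuous at $x = 1$, since $g(1)$ is not even defined, even though the limit there exists:
\[
\lim_{x \to 1} \frac{x^2 - 1}{x - 1} = \lim_{x \to 1} (x + 1) = 2.
\]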
These basic theorems seem to be obvious, but aren't quite as obvious as they seem, since they only need to be true when the function is continuous. They provide the theoretical basis for much of the theory of calculus.
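The text does not say which theorems it has in mind, but one representative result of this kind is the Intermediate Value Theorem:
\[
\text{if } f \text{ is continuous on } [a, b] \text{ and } y \text{ lies between } f(a) \text{ and } f(b), \text{ then } f(c) = y \text{ for some } c \text{ in } [a, b].
\]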
Copyright (c) by David L. Johnson, last modified 4/25/00.