Anyway, the definition is plain enough if you're comfortable with metric spaces. Just in case you're not, we'll start with the definition on the real number line with the standard Euclidean distance (the absolute difference between two numbers):
If f is a function from R to R, then f is Lipschitz continuous if there exists a constant K > 0 such that, for all x1, x2 in R, |f(x1) - f(x2)| ≤ K|x1 - x2|.

Basically, this says that the slope of the graph can't get too crazy steep. The generalization to arbitrary metric spaces simply replaces the absolute values with whatever distance metric the space uses, that is:
If (X, dX) and (Y, dY) are metric spaces, then f: X → Y is Lipschitz continuous if there exists K > 0 such that, for all x1, x2 in X, dY(f(x1), f(x2)) ≤ K dX(x1, x2).

This condition is fairly independent of other seemingly related conditions. For example, a function can be Lipschitz continuous without being differentiable everywhere (e.g., |x|), and plenty of differentiable functions are not Lipschitz continuous (e.g., x² on all of R, where the slope grows without bound). Analytic functions may or may not be Lipschitz continuous. The bottom line is: you have to check.
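Checking is often easy to do numerically on the real line: just look at the difference quotients |f(x1) - f(x2)| / |x1 - x2| over a set of points and see whether they stay bounded. Here's a small sketch of that idea (the helper `slope_ratios` is just for illustration, not any standard library function):

```python
def slope_ratios(f, points):
    """Difference quotients |f(x1)-f(x2)| / |x1-x2| over all pairs of distinct points."""
    ratios = []
    for i, x1 in enumerate(points):
        for x2 in points[i + 1:]:
            ratios.append(abs(f(x1) - f(x2)) / abs(x1 - x2))
    return ratios

xs = [x / 10 for x in range(-100, 101)]  # grid on [-10, 10]

# |x| is Lipschitz with K = 1: every difference quotient is at most 1,
# even though |x| is not differentiable at 0.
print(max(slope_ratios(abs, xs)))

# x**2 is not Lipschitz on all of R: the quotients equal |x1 + x2|,
# which grows as the interval grows (here it already reaches ~20).
print(max(slope_ratios(lambda x: x * x, xs)))
```

Of course this only ever rules a function out (or fails to); a bounded ratio on a sample is evidence, not a proof, that a Lipschitz constant exists.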
One useful result is that if f maps a complete metric space to itself and K is less than 1, the function is called a contraction mapping, and it can be shown that f has a unique fixed point: there exists x* in X such that f(x*) = x*. (Completeness matters here; without it, the iterates can shrink toward a point that isn't in the space.) Intuitively this makes sense: if a function squeezes every part of the space into a smaller region, there must be a point it converges to. The proof, originally from Banach in 1922, is less than intuitive but really quite elegant. It's in any analysis text, or online for those interested in looking it up.
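The proof is also constructive: you find the fixed point by iterating f from any starting point, and the error shrinks by a factor of K each step. A minimal sketch of that iteration (the function `fixed_point` and its tolerance parameters are illustrative choices, not from any particular library):

```python
def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x_{n+1} = f(x_n) until successive iterates agree to within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("did not converge")

# f(x) = x/2 + 1 is a contraction on R with K = 1/2;
# its fixed point solves x = x/2 + 1, i.e., x* = 2.
f = lambda x: x / 2 + 1
print(fixed_point(f, x0=100.0))  # converges to 2.0, from any starting point
```

Since K = 1/2 here, each iteration halves the distance to x*, so convergence is geometric no matter where you start.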