
Lecture 3: Function Spaces


One of the neat things about all of this is that the same techniques carry over to infinite-dimensional spaces, which include spaces of functions. Now our ingredients are not vectors in the usual sense, but functions. Suppose we have a set of ingredient functions $\{p_j(t)\}$. We want to find a representation of some other function $f(t)$ in terms of these functions:
$$f(t) = \sum_j c_j p_j(t).$$
We can ask the same kinds of questions as before: What kinds of functions can be represented? How do we choose the set of coefficients $c_j$? To get the whole thing going, we need to introduce some kind of geometry into the problem. We will define the inner product between two functions $f$ and $g$ as
$$\langle f, g \rangle = \int_{-\infty}^{\infty} f(t)\, g(t)\, dt$$
(the limits will usually be dropped for ease of writing, indicating that the integral is to be taken over all applicable values). Compare this inner product with the inner product from before: in both cases we are multiplying things together and adding them up. Two functions are said to be orthogonal if
$$\langle f, g \rangle = 0.$$
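As a quick numerical sanity check (an illustrative sketch, not part of the lecture: the interval $[0, 2\pi]$, the sample count, and the choice of $\sin$ and $\cos$ are all assumptions), the inner product of two functions can be approximated by sampling and summing:

```python
import numpy as np

# Approximate <f, g> = integral of f(t) g(t) dt on [0, 2*pi] by sampling
# and summing. The interval and sample count are illustrative choices.
def inner(f, g, a=0.0, b=2 * np.pi, n=100000):
    t = np.linspace(a, b, n, endpoint=False)
    dt = (b - a) / n
    return np.sum(f(t) * g(t)) * dt

print(inner(np.sin, np.cos))  # ~ 0: sin and cos are orthogonal on [0, 2*pi]
print(inner(np.sin, np.sin))  # ~ pi: the squared length of sin on [0, 2*pi]
```

Note that this is exactly the "multiply things together and add them up" recipe: discretizing the integral turns the function inner product back into the vector dot product.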
The projection of one function onto another one is defined as before:
$$\operatorname{proj}_g f = \frac{\langle f, g \rangle}{\langle g, g \rangle}\, g.$$
The length of a function is defined by
$$\|f\| = \sqrt{\langle f, f \rangle},$$
and the distance between two functions is defined as
$$d(f, g) = \|f - g\|.$$
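These definitions can be exercised numerically. In the sketch below (the interval, sample count, and the functions $f(t) = t$ and $g(t) = \sin t$ are all illustrative assumptions), we project $f$ onto $g$ and measure the distance from $f$ to its projection:

```python
import numpy as np

# Discretized inner product <f, g> = integral of f(t) g(t) dt on [0, 2*pi].
t = np.linspace(0.0, 2 * np.pi, 100000, endpoint=False)
dt = t[1] - t[0]

def inner(u, v):
    return np.sum(u * v) * dt

def length(u):
    return np.sqrt(inner(u, u))

f = t               # f(t) = t       (illustrative choice)
g = np.sin(t)       # g(t) = sin(t)  (illustrative choice)

# Projection of f onto g: (<f, g> / <g, g>) g
proj = (inner(f, g) / inner(g, g)) * g

# The residual f - proj is orthogonal to g, and its length is the
# distance from f to the nearest multiple of g.
print(inner(f - proj, g))   # ~ 0
print(length(f - proj))
```

The residual being orthogonal to $g$ is the one-function version of the orthogonality theorem used below to find the coefficients.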

Getting back to our representation,
$$f(t) = \sum_j c_j p_j(t),$$
the kinds of functions that we can represent without error depend upon the kinds of basis functions that we choose. If the set of basis functions is able to represent all functions in a given class, then the set of functions is said to be complete (even if it cannot produce all possible functions; analogy: a set of ingredients for cakes may be complete, even if it cannot produce all kinds of shampoo).

To find the coefficients, we proceed as we did in the vector case: we want the error between the function and its representation to be as small as possible. This again produces the orthogonality theorem: the error is orthogonal to the data. This leads to the following equation for the coefficients:
$$\sum_j c_j \langle p_j, p_k \rangle = \langle f, p_k \rangle \qquad \text{for each } k.$$
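As a concrete sketch (the non-orthogonal basis $\{1, t, t^2\}$, the interval $[0, 1]$, and the target $f(t) = e^t$ are all illustrative assumptions), the coefficient equations form a linear system whose matrix holds the inner products of the basis functions:

```python
import numpy as np

# Solve sum_j c_j <p_j, p_k> = <f, p_k> for a NON-orthogonal basis
# {1, t, t^2} on [0, 1], approximating f(t) = e^t (illustrative choices).
t = np.linspace(0.0, 1.0, 200001)
dt = t[1] - t[0]

def inner(u, v):
    w = u * v
    return (0.5 * w[0] + w[1:-1].sum() + 0.5 * w[-1]) * dt  # trapezoid rule

basis = [np.ones_like(t), t, t**2]
f = np.exp(t)

G = np.array([[inner(p, q) for q in basis] for p in basis])  # Gram matrix
rhs = np.array([inner(f, p) for p in basis])
c = np.linalg.solve(G, rhs)

# Orthogonality theorem: the error f - sum_j c_j p_j is orthogonal
# to every basis function.
approx = sum(cj * p for cj, p in zip(c, basis))
err = f - approx
print(c)
print([inner(err, p) for p in basis])  # each ~ 0
```

Because the basis is not orthogonal, the coefficients are coupled and must be solved for together; the next paragraphs show how orthogonality decouples them.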

Comparison of this equation with the one above reveals that they are identical in form: It doesn't matter whether you are dealing with vectors or functions: the result is the same. This means that you can use any geometric insight that you might obtain from vectors and apply it to functions when represented in this way. This is an extremely powerful notion and, in a sense, forms the very heart of digital communications and a good part of signal processing theory.

As before, it is often convenient to deal with orthogonal functions. Suppose that the set of basis functions that we choose is orthogonal, so that
$$\langle p_j, p_k \rangle = 0 \qquad \text{for } j \neq k.$$

Then the equation for the coefficients reduces down to
$$c_k \langle p_k, p_k \rangle = \langle f, p_k \rangle.$$
The coefficients may then be solved for as
$$c_k = \frac{\langle f, p_k \rangle}{\langle p_k, p_k \rangle}.$$
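For instance (a sketch with illustrative choices: the orthogonal basis $p_k(t) = \sin(kt)$ on $[0, 2\pi]$ and a target built from two of the basis functions), each coefficient can be computed independently of the others:

```python
import numpy as np

# Orthogonal basis p_k(t) = sin(k t) on [0, 2*pi]: each coefficient is
# c_k = <f, p_k> / <p_k, p_k>, with no linear system to solve.
t = np.linspace(0.0, 2 * np.pi, 100000, endpoint=False)
dt = t[1] - t[0]

def inner(u, v):
    return np.sum(u * v) * dt

# Target built from the basis itself, so the coefficients are known.
f = np.sin(t) + 0.5 * np.sin(3 * t)

coeffs = {}
for k in range(1, 5):
    p = np.sin(k * t)
    coeffs[k] = inner(f, p) / inner(p, p)

print(coeffs)  # c_1 ~ 1, c_3 ~ 0.5, the others ~ 0
```

This decoupling is exactly what makes the Fourier series of the next lecture computable term by term.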

Now we are ready to deal with Fourier series!

Copyright 2008, by the Contributing Authors. admin. (2006, May 22). Lecture 3: Function Spaces. Retrieved January 07, 2011, from Free Online Course Materials — USU OpenCourseWare Web site: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Signals_and_Systems/3_3node3.html. This work is licensed under a Creative Commons License.