
Linear Algebra

  • 25-05-2011 11:28pm
    #1
    Closed Accounts Posts: 183 ✭✭


    Hi,

    I have posted something similar to this in the physics thread.

    I wish to discuss the inner product of two vectors. Are there any simple thought experiments that can be performed to demonstrate that when taking the inner-scalar product of two vectors, those vectors cannot be in the same vector space?


Comments

  • Registered Users, Registered Users 2 Posts: 3,745 ✭✭✭Eliot Rosewater


Mathematically, an inner product is a function mapping two vectors in a vector space V to the underlying field (i.e. the real numbers, complex numbers, etc.). The vectors have to be in the same vector space.

    I can't speak for what goes on in the physics forum! :D
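As a quick illustration (my own sketch, not from the thread): the standard inner product on ℝ³ takes two vectors from the same space and returns a scalar in the underlying field.

```python
# Minimal sketch (names are mine): the standard inner product on R^3.
# Both arguments must come from the same space; the result is a scalar.
def dot(u, v):
    if len(u) != len(v):
        raise ValueError("vectors must live in the same space")
    return sum(ui * vi for ui, vi in zip(u, v))

x = [1.0, 2.0, 3.0]
y = [4.0, -1.0, 2.0]
s = dot(x, y)  # 1*4 + 2*(-1) + 3*2 = 8.0, an element of the field R
```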


  • Closed Accounts Posts: 183 ✭✭pvt6zh395dqbrj


    Hi,

Say I define a vector space with lots of vectors in it, and let's say I make all of those vectors row vectors. How do I take the dot product and end up with a scalar?


  • Registered Users, Registered Users 2 Posts: 2,481 ✭✭✭Fremen


I think you need to read up on linear algebra. There should be a good few texts in your library. I always thought Anton's book was good, though some posters here disagree with me.

    If X and Y are row vectors, and A is a positive definite matrix, then

    <X,Y> = XAY' is an inner product, where Y' means Y transpose.

    The most common inner product you tend to see has A as the identity matrix.
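That formula can be sketched in plain Python (my own example, not from the thread); taking A to be the identity recovers the ordinary dot product.

```python
# Sketch of <X,Y> = X A Y' for row vectors X, Y and a square matrix A.
# With A the identity matrix this is the ordinary dot product.
def inner(x, A, y):
    # first form the column vector A y, then take x . (A y)
    Ay = [sum(A[i][j] * y[j] for j in range(len(y))) for i in range(len(A))]
    return sum(x[i] * Ay[i] for i in range(len(x)))

I = [[1, 0], [0, 1]]
A = [[2, 1], [1, 2]]        # symmetric positive definite (eigenvalues 1 and 3)
x, y = [1, 2], [3, 4]

standard = inner(x, I, y)   # 1*3 + 2*4 = 11, the usual dot product
weighted = inner(x, A, y)   # x^T A y = 32
```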


  • Registered Users, Registered Users 2 Posts: 3,038 ✭✭✭sponsoredwalk


    Are there any simple thought experiments that can be performed to demonstrate that when taking the inner-scalar product of two vectors, those vectors cannot be in the same vector space?

What does the inner-scalar product say about two vectors? When you calculate the inner product of two vectors, you re-express the result in a manner that gives you some infӨrmatiӨn relating the two vectors. This information I'm thinking of seems to me to be an irrefutable argument for why the vectors must be contained in the same vector space, & it can most certainly be visualized. In fact, if the vectors were not in the same vector space, this thing I'm thinking of wouldn't make sense (as far as I can see). Do you know what I'm talking about? :pac:
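For what it's worth, the hint can be made concrete: the inner product determines the angle θ between two vectors via cos θ = <x,y> / (|x||y|), which only makes sense if both vectors live in the same space. A small sketch (my own, not from the thread):

```python
import math

# The inner product encodes the angle between two vectors:
# cos(theta) = <x,y> / (|x| |y|)
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def angle(u, v):
    return math.acos(dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v))))

theta = angle([1, 0], [0, 1])   # perpendicular vectors: theta = pi/2
```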


  • Closed Accounts Posts: 6,081 ✭✭✭LeixlipRed


I see what you did there ;)


  • Closed Accounts Posts: 183 ✭✭pvt6zh395dqbrj


    Fremen wrote: »
    I think you need to read up on linear algebra. There should be a good few texts in your library. I always thought Anthon's book was good, though some posters here disagree with me.

    If X and Y are row vectors, and A is a positive definite matrix, then

    <X,Y> = XAY' is an inner product, where Y' means Y transpose.

    The most common inner product you tend to see has A as the identity matrix.


    Hi,

    Which library? Would the public libraries have linear algebra books? They are the only ones I can access...


  • Posts: 0 [Deleted User]


    Hi,

    Which library? Would the public libraries have linear algebra books? They are the only ones I can access...

    Yes, at your public library. You can search their catalogues here.


  • Closed Accounts Posts: 183 ✭✭pvt6zh395dqbrj


    Hi,

Ok, so the two vectors have to be in the same vector space. The way I learned it was:

Every vector space V has a dual vector space V*, which contains all the linear functionals on V.

For ℝ³, the dual and the space are equivalent.

Any time you took the dot product of two vectors in V, what you were actually doing was mapping two vectors from V onto a field of scalars. But you had to choose one of the vectors from the dual space.

What bit did I learn wrong?


  • Registered Users, Registered Users 2 Posts: 2,481 ✭✭✭Fremen


That's all correct. The Riesz representation theorem says that a continuous linear functional f can be represented as an inner product.

    That is, if f is any functional and x is an element of the vector space, then

    f(x) = <x,y>, where y is some other element of the vector space, and <.,.> denotes inner product.

    This theorem holds for any finite-dimensional vector space, and any Hilbert space. There are more general infinite-dimensional vector spaces known as Banach spaces where this does not hold - there is no sensible notion of inner product on a Banach space, and Banach spaces are not self-dual (that is, the space of continuous linear functionals is not isomorphic to the original space).
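In finite dimensions the representing vector y is easy to construct: evaluate f on the standard basis. A sketch (my own, not from the thread):

```python
# Sketch of the finite-dimensional Riesz idea: a linear functional f on R^n
# satisfies f(x) = <x, y>, where y_i = f(e_i) for the standard basis e_i.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def riesz_vector(f, n):
    # evaluate f on each standard basis vector to recover y
    return [f([1.0 if j == i else 0.0 for j in range(n)]) for i in range(n)]

f = lambda v: 2 * v[0] - v[1] + 5 * v[2]   # an arbitrary linear functional on R^3
y = riesz_vector(f, 3)                      # [2.0, -1.0, 5.0]
```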


  • Registered Users, Registered Users 2 Posts: 3,038 ✭✭✭sponsoredwalk


I was looking at some elementary quantum mechanics over the past few days & I came up against an issue like this. I'm also trying to deal with another issue like this in my thread on "Forms" on this forum. What I'm saying is probably only half right & I'm trying to get it 100% right, so be wary, but I think this will help you out a little at least.

Basically we have to get our definitions & notation as explicit as possible or else we won't get anywhere, & that's my main problem. I think if we can crack this we'll both get an answer:

A linear functional is just a linear map f : V → F.
The dual space of V is the vector space L(V,F) = (V)*, i.e. the space of linear functionals, i.e. maps from V to F.
L(V,F) = {f | f : V → F}.
That's supposed to read "the set of all functions f such that f is a map from V to F"; if you have a better, clearer way to express that, please post it!

What I think is going on here is a linear transformation
T : V → L(V,F),
i.e.
T : V → (V)*. Reading the start of Spivak's Calculus on Manifolds, he mentions this explicitly and more, as below. This is what I find so weird: you're mapping an element of a vector space to a map, not just to some element, but to an element of a vector space which is itself a map from the vector space to the field...

If (ℝⁿ)* is the dual space to ℝⁿ, then with x ∈ ℝⁿ,
define φx ∈ (ℝⁿ)* by φx(y) = <x,y>, and
define T : ℝⁿ → (ℝⁿ)* by T(x) = φx. This comes from Spivak's CoM; he's clearly written that φx ∈ (ℝⁿ)*, but I get the feeling that sometimes the scalar value <x,y> is also supposed to be in (ℝⁿ)* & other times not. But to put it into my notation of:

    T : ℝⁿ → (ℝⁿ)* | x ↦ T(x) = φx.

the element φx is itself a map from ℝⁿ to ℝ, so you pick a vector from the dual space and map it onto the inner product. To spell it all out:

    T : ℝⁿ → L(ℝⁿ,ℝ) | x ↦ T(x) = φx : ℝⁿ → ℝ | y ↦ φx(y) = <x,y>.

Here x ∈ ℝⁿ is mapped onto T(x) = φx ∈ (ℝⁿ)* = L(ℝⁿ,ℝ), but this itself maps y ∈ ℝⁿ to <x,y> ∈ ℝ. So I think when you said "But you had to choose one of the vectors from the dual space", that vector is the y vector I'm using. Still, that vector is in the same vector space we're using all along, ℝⁿ, hence there will be an angle between the vectors etc... I think that is what is causing you confusion. Honestly though, the notation just doesn't sit right with me; is it right?

Maybe somebody could dissect what I've said, whether it applies to your question in the way I think it does, or show why it doesn't. In any case this stuff is bat-shit crazy at this stage & wrecking my head :(
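The map T above can be modelled directly in code, which may make the "element that is itself a map" point less weird: T(x) returns a function, and only applying that function to some y produces the scalar. A sketch (my own, not from the thread):

```python
# Sketch of T : R^n -> (R^n)*, T(x) = phi_x, with phi_x(y) = <x, y>.
# T(x) is itself a function (an element of the dual space), not a number.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def T(x):
    def phi_x(y):           # phi_x : R^n -> R, an element of (R^n)*
        return dot(x, y)
    return phi_x

phi = T([1, 2, 3])          # phi is a map, not a scalar
value = phi([4, 5, 6])      # only now do we get the scalar <x, y> = 32
```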

