
Big Oh vs Big Theta For Complexity Theory

  • 13-12-2010 6:57pm
    #1
    Registered Users Posts: 505 ✭✭✭stiofan85


    Hi guys,

    I hope the Maths forum is the correct place to post this. I'm looking at complexity theory in Algorithms and am having a hard time getting my head around Big Oh and Big Theta notation.

    I think I understand Big Oh, g(x) = O(f(x)), but I'm not too sure about g(x) = Θ(f(x)).

    My understanding is that for g(x) = O(f(x)), g(x) must be bounded above by f(x), i.e., there exists some constant C > 0 such that g(x) <= C*f(x).

    Then for g(x) = Θ(f(x)), g(x) must be bounded above and below by f(x), i.e., there exist constants C1, C2 > 0 such that C1*f(x) <= g(x) <= C2*f(x).

    I think this is correct?

    So, for the following:

    f(x) = x^(3/2)
    g(x) = 2x^(3/2) + (x+1)^-1

    g(x) = O(f(x)) is false, as no matter what positive C is chosen, g(x) can never be bounded from above by C*f(x)?

    g(x) = Θ(f(x)) is false, as the only C1 that would make this true has to be negative?

    I've attached a graph of the functions to try to make this clearer.

    My head is spinning with all the symbols - if anyone could help me out, I'd be very grateful!


Comments

  • Registered Users Posts: 1,595 ✭✭✭MathsManiac


    stiofan85 wrote: »
    I think I understand Big Oh, g(x) = O(f(x)), but I'm not too sure about g(x) = Θ(f(x)).

    My understanding is that for g(x) = O(f(x)), g(x) must be bounded above by f(x), i.e., there exists some constant C > 0 such that g(x) <= C*f(x).

    Then for g(x) = Θ(f(x)), g(x) must be bounded above and below by f(x), i.e., there exist constants C1, C2 > 0 such that C1*f(x) <= g(x) <= C2*f(x).

    I think this is correct?

    Assuming you're using the same notation as here, you're almost right. You've left out the important point that they only need to be asymptotically bounded in this way: the bound doesn't have to hold for all x, it only has to hold eventually. That is, there has to exist some X such that it holds for all x > X. (In computation theory this is usually stated with n rather than x, since you're dealing with integers.)

    So, in the example you give, g(x) is actually O(f(x)). As x gets large, the second term of g gets very small, so the first is the dominant term.

    If you choose any constant C>2, then you'll be able to find a value of X such that, for all x>X, g(x) < C*f(x).

    You don't have to find the smallest such X, or any particularly small C; any bound of the right form will do. For example, if you can satisfy yourself that, for all x > 100, g(x) < 3*f(x), then you've shown that g(x) is O(f(x)). Of course, I could get a much tighter bound than that, but it doesn't matter.
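
    If it helps to see the constants in action, here's a minimal numerical sanity check in Python - a sketch, not a proof, with C = 3 and X = 1 as one hand-picked (not derived) choice of constants:

    def f(x):
        return x ** 1.5  # f(x) = x^(3/2)

    def g(x):
        return 2 * x ** 1.5 + 1 / (x + 1)  # g(x) = 2x^(3/2) + (x+1)^-1

    # Check that g(x) <= C*f(x) holds for every integer x with X < x <= 100000.
    C, X = 3, 1
    assert all(g(x) <= C * f(x) for x in range(X + 1, 100001))
    print("g(x) <= 3*f(x) held for every integer x with 1 < x <= 100000")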


  • Registered Users Posts: 505 ✭✭✭stiofan85


    MathsManiac wrote: »
    So, in the example you give, g(x) is actually O(f(x)). As x gets large, the second term of g gets very small, so the first is the dominant term.

    If you choose any constant C>2, then you'll be able to find a value of X such that, for all x>X, g(x) < C*f(x).

    You don't have to find the smallest such X, or any particularly small C; any bound of the right form will do. For example, if you can satisfy yourself that, for all x > 100, g(x) < 3*f(x), then you've shown that g(x) is O(f(x)). Of course, I could get a much tighter bound than that, but it doesn't matter.


    Ah, yes I see now. I follow you. Thanks for that!

    The lecturer's notes didn't specify "asymptotically", so that's why I left it out, but now that you've pointed that out, it makes a lot more sense.

    I've spent most of the day on this now, and my brain's a bit fried, so I'll come back to it in the morning and hopefully it'll all still make sense. If I've any more questions I'll be looking for help!

    Muchos thanks!


  • Registered Users Posts: 505 ✭✭✭stiofan85


    MathsManiac wrote: »
    Assuming you're using the same notation as here, you're almost right. You've left out the important point that they only need to be asymptotically bounded in this way: the bound doesn't have to hold for all x, it only has to hold eventually. That is, there has to exist some X such that it holds for all x > X. (In computation theory this is usually stated with n rather than x, since you're dealing with integers.)

    So, in the example you give, g(x) is actually O(f(x)). As x gets large, the second term of g gets very small, so the first is the dominant term.

    If you choose any constant C>2, then you'll be able to find a value of X such that, for all x>X, g(x) < C*f(x).

    You don't have to find the smallest such X, or any particularly small C; any bound of the right form will do. For example, if you can satisfy yourself that, for all x > 100, g(x) < 3*f(x), then you've shown that g(x) is O(f(x)). Of course, I could get a much tighter bound than that, but it doesn't matter.

    :D CLICK! It all just clicked into place!

    thanks so much!


  • Registered Users Posts: 505 ✭✭✭stiofan85


    stiofan85 wrote: »
    So, for the following:

    f(x) = x^(3/2)
    g(x) = 2x^(3/2) + (x+1)^-1

    It all came apart on me again - could I just clarify: is g(x) O(f(x)) because there is no C for which g(x) is bounded below by C*f(x)?

    i.e. can g(x) only be bounded above asymptotically by C*f(x), C > 0?

    :confused:

    thanks again!

    edit: I think this video has cleared it up for me, again!

    http://www.youtube.com/watch?v=6Ol2JbwoJp0


  • Registered Users Posts: 1,595 ✭✭✭MathsManiac


    That's a great video, and exactly what you needed, by the sound of it. I think your edit indicates that you don't need your question answered, but just in case:

    g(x) is O(f(x)), because g(x) can eventually be bounded above by a constant multiple of f(x).

    g(x) is OMEGA(f(x)), because g(x) can eventually be bounded below by a constant multiple of f(x). (This is more obvious than the previous statement, as g(x) is clearly greater than f(x) for all x>1.)

    Because g(x) is both O(f(x)) and OMEGA(f(x)), it is therefore THETA(f(x)).

    By the way, it might also be helpful to note that Big-oh and Big-omega are flip sides of each other: for any functions f and g, the statement that g(x) is O(f(x)) is equivalent to the statement that f(x) is OMEGA(g(x)). This is because if f(x) is eventually bounded above by c*g(x), then, from the same value of x onwards, g(x) is bounded below by (1/c)*f(x).
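
    To make the sandwich concrete, here's a minimal numerical check in the same spirit as before - a sketch, with C1 = 2, C2 = 3 and X = 1 hand-picked rather than derived:

    def f(x):
        return x ** 1.5  # f(x) = x^(3/2)

    def g(x):
        return 2 * x ** 1.5 + 1 / (x + 1)  # g(x) = 2x^(3/2) + (x+1)^-1

    # THETA means sandwiched: C1*f(x) <= g(x) <= C2*f(x) for all x > X.
    C1, C2, X = 2, 3, 1
    assert all(C1 * f(x) <= g(x) <= C2 * f(x) for x in range(X + 1, 100001))

    # The flip side: since g(x) <= C2*f(x), dividing by C2 gives
    # f(x) >= (1/C2)*g(x), i.e. f is OMEGA(g) from the same X onwards.
    assert all(f(x) >= (1 / C2) * g(x) for x in range(X + 1, 100001))
    print("Consistent with g = THETA(f) and f = OMEGA(g)")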


  • Registered Users Posts: 505 ✭✭✭stiofan85


    MathsManiac wrote: »
    That's a great video, and exactly what you needed, by the sound of it. I think your edit indicates that you don't need your question answered, but just in case:

    g(x) is O(f(x)), because g(x) can eventually be bounded above by a constant multiple of f(x).

    g(x) is OMEGA(f(x)), because g(x) can eventually be bounded below by a constant multiple of f(x). (This is more obvious than the previous statement, as g(x) is clearly greater than f(x) for all x>1.)

    Because g(x) is both O(f(x)) and OMEGA(f(x)), it is therefore THETA(f(x)).

    By the way, it might also be helpful to note that Big-oh and Big-omega are flip sides of each other: for any functions f and g, the statement that g(x) is O(f(x)) is equivalent to the statement that f(x) is OMEGA(g(x)). This is because if f(x) is eventually bounded above by c*g(x), then, from the same value of x onwards, g(x) is bounded below by (1/c)*f(x).


    Thanks for your patience - I think I've got the theory down completely now, so one final question if that's alright?

    if f(n) = n^2

    give an example of a function g(n) that is THETA(f(n))
    and a function h(n) that is O(f(n)) but not THETA(f(n))

    So my example is g(n) = 2n^2 + (n+1)^-1 --- this seems straightforward enough, and I've a graph that shows it can be bounded above and below by constant multiples of f(n).

    I can't find an h(n) that is O(f(n)) but not THETA(f(n)), though - all of the ones I try out can be bounded both above and below! Would h(n) = n^(1/2) be an option?

    Thanks again, I really appreciate the help!


  • Registered Users Posts: 1,595 ✭✭✭MathsManiac


    Any power lower than n^2 will do for h(n).

    So yes, your answer is correct, as is h(n)=n, or h(n)=n^(1.9).
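
    If a sanity check helps here too, the ratio h(n)/f(n) tells the story - a quick sketch, taking h(n) = n^(1/2) as in your suggestion:

    f = lambda n: n ** 2
    h = lambda n: n ** 0.5  # h(n) = n^(1/2)

    # h(n)/f(n) = n^(-3/2) shrinks toward 0, so h(n) <= 1*f(n) eventually
    # (h is O(f)), but no C1 > 0 can keep C1*f(n) <= h(n) for all large n,
    # so h is not THETA(f).
    for n in [10, 1000, 100000]:
        print(n, h(n) / f(n))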


  • Registered Users Posts: 505 ✭✭✭stiofan85


    MathsManiac wrote: »
    Any power lower than n^2 will do for h(n).

    So yes, your answer is correct, as is h(n)=n, or h(n)=n^(1.9).

    Thank you so much MM. Really appreciate this - I asked a lad doing a PhD in Comp. Science and his reply was "ah man, I hate that Big O sh*te - I couldn't tell you!", so to say I was a bit stumped is an understatement!

    Exam is tomorrow, so hopefully this'll come up!

    Thanks again! :)

