Wednesday, February 23, 2011

Introduction to Algorithms Lecture 2

Available on YouTube at:

http://www.youtube.com/user/basaratali?feature=mhum#p/a/A7062CB4B95A85BA/1/whjt_N9uYFI

Here is what is covered in it:

  • The big O notation 
    • Yes, he is correct in the beginning when he writes 2n^2 = O(n^3). The notation itself only promises an upper bound, so the function (2n^2) being < or = the bound is fine; but (being engineers) we like to keep the O result as close to the original function as possible. When we insist on a tight bound we are actually using the Theta notation, which is a valid O as well.
  • The Omega notation 
  • The Theta notation 
    • Engineering definition covered in lecture one: drop the leading constant and lower-order terms :)
  • The little o and little omega notation. The difference from their big brothers is that these are strict: < and > rather than <= and >=. (The formal definitions of all five are spelled out right after this list.)
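For reference, here are the formal definitions, written from memory in the standard CLRS style (so double-check the fine print against the book). The little versions make the inequality strict and require it for every constant c rather than just some c:

  f(n) = O(g(n))      : there exist c > 0 and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0
  f(n) = Omega(g(n))  : there exist c > 0 and n0 such that 0 <= c*g(n) <= f(n) for all n >= n0
  f(n) = Theta(g(n))  : f(n) = O(g(n)) and f(n) = Omega(g(n))
  f(n) = o(g(n))      : for every c > 0 there is an n0 such that 0 <= f(n) < c*g(n) for all n >= n0
  f(n) = omega(g(n))  : for every c > 0 there is an n0 such that 0 <= c*g(n) < f(n) for all n >= n0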
And then the cool stuff 
  • Asymptotic analysis of recursive algorithms. The running time of the algorithm is defined by a recurrence T(n) (call this expression 1).
    Methods:
    • Using the substitution method. It uses mathematical induction: guess a bound for T(k) in terms of k (an assumed expression ... call this expression 2), assume the guess holds for everything smaller than n, and claim T(n) < (expression 2 with n plugged in). The induction is carried out by substituting the guess into expression 1 and verifying that claim. If you find this tricky the lecture is awesome for explaining it, and there is a worked example after this list. 
    • Using a recursion tree. The instructor's favorite method. You need to draw it to explain it, so watch the video :) It was also used in lecture one for merge sort. 
    • Using the master method: the recurrence must split the problem into equal-sized subproblems, e.g. in merge sort we divide the original problem into two equal-sized (i.e. halved) subproblems. It will not work on something like T(n) = T(n/2) + T(n/3). You can also do non-recursive work, denoted f(n). So in short, the limitation is that T(n) must be of the form aT(n/b) + f(n).
      The method itself is three cases (where you do a substitution in T(n), expression 1, and observe the result) that need to be remembered, and it immediately gives you the running time. The cases are written out after this list.
      The lecture also presents an intuitive derivation of why the master method works (COOL). The guy who pointed out that the height of the recursion tree is log (base b) of n beat me to it. I felt a bit hurt that it didn't come to me immediately (sad pause). The fact itself is quick to re-derive: at depth d the subproblem size is n/b^d, which reaches the base case when b^d = n, i.e. at depth d = log (base b) of n.
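To make the substitution method concrete, here is a minimal worked example of my own (using the merge sort recurrence, which is not necessarily the example from the lecture). Take T(n) = 2T(n/2) + n and guess T(k) <= c*k*lg(k) for all k < n, where lg is log base 2. Substituting the guess into the recurrence:

  T(n) = 2T(n/2) + n
      <= 2 * c * (n/2) * lg(n/2) + n
       = c*n*(lg n - 1) + n
       = c*n*lg n - (c - 1)*n
      <= c*n*lg n              for any c >= 1

so the guess holds and T(n) = O(n lg n).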
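And since the three master method cases are worth having next to the notes, here they are as I remember them from CLRS (treat the fine print, especially the regularity condition in case 3, as from memory). For T(n) = aT(n/b) + f(n) with a >= 1 and b > 1, compare f(n) against n^(log_b a):

  Case 1: f(n) = O(n^(log_b a - e)) for some e > 0    =>  T(n) = Theta(n^(log_b a))
  Case 2: f(n) = Theta(n^(log_b a))                   =>  T(n) = Theta(n^(log_b a) * lg n)
  Case 3: f(n) = Omega(n^(log_b a + e)) for some e > 0,
          and a*f(n/b) <= c*f(n) for some c < 1       =>  T(n) = Theta(f(n))

Merge sort lands in case 2: a = 2, b = 2, so n^(log_2 2) = n = Theta(f(n)), giving Theta(n lg n).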
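As a throwaway sanity check of case 2 (my own sketch, not from the lecture), this computes the merge sort recurrence exactly and watches T(n)/(n lg n) settle toward a constant:

  import math
  from functools import lru_cache

  @lru_cache(maxsize=None)
  def T(n):
      # Merge-sort-style recurrence: T(n) = 2T(n/2) + n, T(1) = 1,
      # restricted to exact powers of 2 so the halving stays clean.
      if n == 1:
          return 1
      return 2 * T(n // 2) + n

  for k in range(1, 21):
      n = 2 ** k
      print(n, T(n) / (n * math.log2(n)))  # the ratio creeps toward 1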
Enjoy!
