Monthly Archives: July 2013

Building a Personal Website.

This site is built using

  • Ubuntu
  • AWS : EC2, Elastic IP
  • Apache: 3 sites, virtual hosts, a2ensite, a2enmod, mod_proxy_http, customized DocumentRoot
  • JRuby on Rails
  • jQuery
  • Twitter Bootstrap: JS and CSS
  • Google Drive: Docs and Presentations
  • Atlassian: Jira and GreenHopper
  • Git
  • Google Analytics
  • WordPress

Below is the sequence of tasks I have completed thus far:

  • Set up an EC2 node
    • Purchase a reserved instance of the m1.large type
    • The choice of m1.large was based on three factors:
      • Considerable memory may be required for the large number of applications that will be set up
      • Higher CPU is likely unnecessary, given that the load will not be high
      • micro, small, and medium may be too small for the memory requirements, and m2.xlarge would be too large
    • Purchase a reserved instance for the shortest term possible (in this case I purchased a node from a 3rd party for a 1-month term)
    • Consider a heavy-utilization instance, since the node has to be up and running all the time and the load cannot be predicted
    • Use the Ubuntu 13.04 AMI
  • Set up Apache
  • Set up MySQL
  • Purchase and configure a domain name
  • Purchase a Jira Starter license and set up Jira with MySQL as the backend
  • Set up WordPress with MySQL as the backend
  • Set up a private repository to host the website project
  • Configure the website with the Twitter Bootstrap starter template
  • Configure the website to centralize all content
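
The Apache items above (several sites behind virtual hosts, with mod_proxy_http in front of the Rails app) can be sketched roughly as below. This is a minimal illustration, not the actual configuration of this site; the server names, port, and paths are placeholders.

```apache
# /etc/apache2/sites-available/personal-site.conf (hypothetical names and paths)

# WordPress site served from a customized DocumentRoot
<VirtualHost *:80>
    ServerName blog.example.com
    DocumentRoot /var/www/blog
</VirtualHost>

# mod_proxy_http forwards requests for a second site to the JRuby on Rails app
<VirtualHost *:80>
    ServerName app.example.com
    ProxyPass        / http://localhost:3000/
    ProxyPassReverse / http://localhost:3000/
</VirtualHost>
```

The site is then enabled with a2enmod proxy_http, a2ensite personal-site, and an Apache reload.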


Big O comparison

logx < x < x*logx < x^2 < 2^x

  1. Each part of the above statement is true beyond a certain value of x, and
  2. the whole statement is also true for x > 4 (if log is taken at base 2; x > 2 is not enough, since at x = 3 we have x^2 = 9 > 2^3 = 8)
  3. Now, what's missing above is how 2^logx compares to these values. Hmmm, how about an image?
  4. Here's how: below is the diagram comparing 2^logx to the rest (Octave code). Remember that a^logb is the same as b^loga; hence 2^logx is the same as x^log2, which is less than x, since log2 to the base e is less than 1.

logx < 2^logx < x < x*logx < x^2 < 2^x
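
A quick numeric sanity check of this ordering (note that for the 2^logx term the log must be the natural log; with base-2 logs, 2^logx equals x exactly). A minimal Python sketch, evaluating each function at x = 100:

```python
import math

x = 100.0
# the six growth functions, listed in the claimed order (log = natural log)
values = [
    math.log(x),       # logx            ~ 4.61
    2 ** math.log(x),  # 2^logx = x^log2 ~ 24.34
    x,                 # x               = 100
    x * math.log(x),   # x*logx          ~ 460.52
    x ** 2,            # x^2             = 10000
    2 ** x,            # 2^x             ~ 1.27e30
]
print(values == sorted(values))  # True: the ordering holds at x = 100
```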


Why is the Big O of merge sort not 2^logn?

Even after hearing about, learning, and programming merge sort several times in the past, I am suddenly caught in a logical knot today. I know that the Big O of merge sort is n*logn, but

  • since there is at least 1 comparison at the last step, and
  • 2 comparisons at the last-but-one step, and
  • 4 comparisons at the last-but-two step, and so on..
  • isn't 1 + 2 + 4 + … + n/2 = (1 – 2^logn)/(1 – 2) = 2^logn – 1
  • i.e. it seems merge sort is O(2^logn).

Here's how I untangled it, by writing down a breakdown of all the steps involved.

  1. Let's say values is an int array of size n.
  2. During merge sort we break it into two halves of size n/2, and
  3. continue to perform the same operation on each of the halves until we get arrays of size 1.
  4. At this point we merge two arrays of size 1 by comparing them, and we get an array of size 2.
  5. Similarly, 1 comparison is required at least n/2 times;
  6. this is because we are breaking n into n/2 arrays of size 2 recursively.
  7. From the above, it seems we need n/2 single comparisons and n copy statements in the worst case, i.e. n/2 + n is the total number of operations required to come up with n/2 sorted arrays of size 2.
  8. Taking this further, the next level needs (n/4)*2 + n operations; here 2 comparisons are required per merge, and this continues.
  9. Given the above, the total number of steps = (n/2 + n) + ((n/4)*2 + n) + …
  10. i.e. n*logn + (n/2)*logn
  11. i.e. (3n/2)*logn, which is O(n*logn).
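
The breakdown above can be checked empirically by instrumenting merge sort with a comparison counter. A minimal Python sketch (the function and variable names are mine, not from any particular library):

```python
import math
import random

comparisons = 0  # global count of element comparisons

def merge(left, right):
    """Merge two sorted lists into one, counting comparisons."""
    global comparisons
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        comparisons += 1
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])   # copy whatever remains on either side
    out.extend(right[j:])
    return out

def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

random.seed(42)
n = 1024  # a power of two, so there are log2(n) = 10 merge levels
data = [random.random() for _ in range(n)]
assert merge_sort(data) == sorted(data)

levels = int(math.log2(n))
# the count lands between (n/2)*logn (best case) and n*logn (worst case)
print((n // 2) * levels <= comparisons <= n * levels)  # True
```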

QED, and my mind is restored. The logical mistake I committed was assuming that 1 comparison was required only once, 2 comparisons were required only once, etc., but in fact

  1. 1 comparison was required n/2 times,
  2. 2 comparisons were required n/4 times, and so on until
  3. n/2 comparisons, which were required only once,
  4. i.e. n/2 + n/2 + … = (n/2)*logn comparisons in total.
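
This tally, with each of the logn levels contributing n/2 comparisons, can be checked numerically. A short Python sketch with n = 1024:

```python
import math

n = 1024
levels = int(math.log2(n))  # 10 merge levels

# level k (k = 1..logn) has n / 2^k merges, each charged 2^(k-1) comparisons
total = sum((n // 2**k) * 2**(k - 1) for k in range(1, levels + 1))

print(total == (n // 2) * levels)  # True: every level contributes exactly n/2
```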