The Fibonacci sequence is the sequence of integers in which each number is the sum of the previous two. While there are numerous examples in nature and many applications, this post will focus on algorithms for computing any given number in the sequence. Determining the nth Fibonacci number is a common exercise in computer science courses and interviews.

The first method is a direct translation of the recurrence relation definition.

func recursive(n int) int {
  switch n {
  case 0:
    return 0
  case 1:
    return 1
  default:
    return recursive(n-1) + recursive(n-2)
  }
}

The computational complexity of this method is exponential, roughly O(1.618^n), which can be seen by counting the stack frames generated in an example run: each call spawns two more until it reaches a base case. Many of these function calls repeat the same work and therefore could benefit from memoization.
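To make the blow-up concrete, here is a quick sketch (my own instrumentation, not part of the original post) that adds a call counter to the recursive version:

```go
package main

import "fmt"

// calls counts how many times recursive is invoked.
var calls int

func recursive(n int) int {
	calls++
	switch n {
	case 0:
		return 0
	case 1:
		return 1
	default:
		return recursive(n-1) + recursive(n-2)
	}
}

func main() {
	// The call count grows exponentially: it equals 2*F(n+1) - 1.
	for _, n := range []int{10, 20, 30} {
		calls = 0
		fmt.Println(n, recursive(n), calls)
	}
}
```

Running this prints 177 calls for n=10, 21,891 for n=20, and 2,692,537 for n=30, which makes the exponential growth easy to see.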

func memoize(n int) int {
  cache := make(map[int]int)
  cache[0] = 0
  cache[1] = 1
  return memoizeInternal(n, cache)
}

func memoizeInternal(n int, cache map[int]int) int {
  if res, ok := cache[n]; ok {
    return res
  }
  res := memoizeInternal(n-1, cache) + memoizeInternal(n-2, cache)
  cache[n] = res
  return res
}

Memoization reduces the complexity to O(n): one addition per number in the sequence. Every other call returns a cached answer immediately. The memoized method also needs linear memory for the cache. The next approach can compute F(n) in linear time with constant memory.

func iterative(n int) int {
  a := 0
  b := 1
  for i := 0; i < n; i++ {
    a, b = b, a+b
  }
  return a
}

By using a different form of the definition for the Fibonacci sequence, the result can be computed in O(lg(n)) time.

| 1 1 |^n   | F(n+1)  F(n)   |
|     |   = |                |
| 1 0 |     | F(n)    F(n-1) |

Taking advantage of the fact that the matrix is always 2x2, we can compute the answer without a fully generic implementation of matrix chain multiplication.

type matrix [][]int

func qmatrix(n int) int {
    cache := make(map[int]matrix)
    return qmatrixInternal(n, cache)
}

func qmatrixInternal(n int, cache map[int]matrix) int {
    if n < 2 {
        return n
    }

    var q matrix
    q = append(q, []int{1, 1})
    q = append(q, []int{1, 0})

    // Decompose n into powers of two and collect Q^(2^i) for each set bit.
    var matrices []matrix
    i := 0
    for n > 0 {
        exp := n & 0x1
        n = n >> 1
        pow := exp << uint(i)
        i += 1
        if exp != 0 {
            matrices = append(matrices, matrixPowerOfTwo(q, pow, cache))
        }
    }

    // Multiply the collected powers together to obtain Q^n.
    for len(matrices) > 1 {
        m1 := matrices[len(matrices)-1]
        matrices = matrices[:len(matrices)-1]
        m2 := matrices[len(matrices)-1]
        matrices = matrices[:len(matrices)-1]
        matrices = append(matrices, matrixMultiply(m1, m2))
    }
    return matrices[0][0][1]
}

func matrixMultiply(a, b matrix) matrix {
    c11 := a[0][0]*b[0][0] + a[0][1]*b[1][0]
    c12 := a[0][0]*b[0][1] + a[0][1]*b[1][1]
    c21 := a[1][0]*b[0][0] + a[1][1]*b[1][0]
    c22 := a[1][0]*b[0][1] + a[1][1]*b[1][1]
    var c matrix
    c = append(c, []int{c11, c12})
    c = append(c, []int{c21, c22})
    return c
}

func matrixPowerOfTwo(m matrix, p int, cache map[int]matrix) matrix {
    if p == 1 {
        return m
    }
    if val, ok := cache[p]; ok {
        return val
    }

    k := matrixPowerOfTwo(m, p/2, cache)
    res := matrixMultiply(k, k)
    cache[p] = res
    return res
}
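Because every matrix involved is 2x2, the whole scheme can also be written as a compact, self-contained sketch (my own variant, not from the post) that squares Q on the fly instead of caching its powers:

```go
package main

import "fmt"

// mat is a 2x2 matrix stored as [row][col].
type mat [2][2]int

// mul returns the product of two 2x2 matrices.
func mul(a, b mat) mat {
	return mat{
		{a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]},
		{a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]},
	}
}

// fib computes F(n) by raising the Q-matrix to the nth power with
// exponentiation by squaring: O(lg n) matrix multiplications.
func fib(n int) int {
	result := mat{{1, 0}, {0, 1}} // identity matrix
	q := mat{{1, 1}, {1, 0}}
	for ; n > 0; n >>= 1 {
		if n&1 == 1 {
			result = mul(result, q) // include this power of two
		}
		q = mul(q, q) // square Q for the next bit
	}
	return result[0][1] // Q^n[0][1] == F(n)
}

func main() {
	for n := 0; n <= 10; n++ {
		fmt.Print(fib(n), " ")
	}
	fmt.Println()
}
```

The loop examines one bit of n per iteration, so it performs at most 2*lg(n) multiplications of constant-size matrices.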

All of the code samples in this post can be found at

Yesterday I stumbled across a project on Twitter that fascinated me, so I figured I would share it here. Handmade Hero is a quest to build a video game from scratch, completely live-streamed. Building from scratch is how I like to do everything, especially learning.

Handmade Hero streams live every weekday at 8 PM PST¹ and has an archive of all previous sessions on YouTube. Don't worry, the project started on 17 Nov 2014, so there are only a handful of episodes currently in the backlog.

Go watch the trailer and preorder the game for full access to the source code.

  1. Pretty sure it means 'whatever time zone Seattle is in,' not strictly PST, but I just quoted the website.

Previously, I had looked at calculating the Levenshtein distance between two strings. In that post I did some hand waving and claimed that the dynamic implementation would not register with the time command. Well, I decided to test that claim, and here is the proof.

#include <iostream>
#include <string>
using namespace std;
int minimum(int d, int i, int s);

int main(){
 string s, t;
 cin >> s >> t;
 int sl = s.length()+1;
 int tl = t.length()+1;
 int* previous = new int[sl];
 int* current = new int[sl];
 //initialize current
 for(int j=0; j<sl; ++j) *(current+j) = j;
 for(int j=1; j<tl; ++j){
  //Swap current and previous
  int* tmp = previous;
  previous = current;
  current = tmp;
  *(current) = j;
  for(int k=1; k<sl; ++k){
   if(s[k-1] == t[j-1]) *(current+k) = *(previous+k-1);
   else *(current+k) = minimum(*(previous+k),*(current+k-1),*(previous+k-1))+1;
  }
 }
 cout << *(current+sl-1) << endl;
 delete[] previous;
 delete[] current;
 return 0;
}

int minimum(int d, int i, int s){
 int min = d;
 if(i < min) min = i;
 if(s < min) min = s;
 return min;
}

The Levenshtein distance is a measure of the number of changes required to transform one string into another. The possibilities are insertion, deletion, and replacement. Below we calculate the distance between the last names of two of the NFL's longest-named players, Roethlisberger and Houshmandzadeh.

Both names contain 14 letters, so the Levenshtein distance must be between 0 and 14. That makes fifteen possibilities for the correct answer. Let's see how many function calls this recursive implementation takes to determine the answer.


#include <iostream>
#include <cstring>
using namespace std;

typedef unsigned long long int u64;
u64 nFc=0;

u64 lev(const char *s, const char *t){
 ++nFc; //count every call
 if(!*s) return strlen(t);
 if(!*t) return strlen(s);
 u64 min = 1 + lev(s+1, t);   //deletion
 u64 c = 1 + lev(s, t+1);     //insertion
 if(min > c) min = c;
 c = (*s != *t) + lev(s+1, t+1); //replacement (free if characters match)
 if(min > c) min = c;
 return min;
}

int main(){
 cout << lev("roethlisberger","houshmandzadeh") << endl;
 cout << nFc << endl;
 return 0;
}

11 character changes between "roethlisberger" and "houshmandzadeh".

This implementation only took 11,885,772,379 calls and 92 seconds to determine the correct answer. Amazing. Now if we were to build a table of all possible changes, it would be a 15 by 15 matrix: 225 cells. This would take approximately 225 function calls to fill, or about .000002% of the recursive implementation's call count. Total time to run: .0000195 seconds. Not even enough to register with the Unix time command.
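To make the cell count concrete, here is a sketch of the two-row dynamic approach with a counter bolted on (written in Go rather than the post's C++; the counter and function names are mine). For two 14-letter names the table is (14+1) by (14+1), so exactly 225 cells get filled:

```go
package main

import "fmt"

// levenshtein computes the edit distance between s and t using two
// rolling rows, and also reports how many table cells were filled in.
func levenshtein(s, t string) (dist, cells int) {
	prev := make([]int, len(s)+1)
	curr := make([]int, len(s)+1)
	for j := range curr {
		curr[j] = j // distance from the empty prefix of t
		cells++
	}
	for i := 1; i <= len(t); i++ {
		prev, curr = curr, prev // reuse the rows instead of allocating
		curr[0] = i
		cells++
		for j := 1; j <= len(s); j++ {
			cost := 1
			if s[j-1] == t[i-1] {
				cost = 0 // matching characters are free
			}
			curr[j] = min(min(prev[j]+1, curr[j-1]+1), prev[j-1]+cost)
			cells++
		}
	}
	return curr[len(s)], cells
}

func min(a, b int) int {
	if a < b {
		return a
	}
	return b
}

func main() {
	d, c := levenshtein("roethlisberger", "houshmandzadeh")
	fmt.Println(d, c) // distance, then 225 cells
}
```

Each cell is one constant-time computation, which is why the dynamic version finishes before the time command can notice.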

The Levenshtein distance problem is a classic example of dynamic programming. For a more detailed analysis, check out Wikipedia.