I'm trying to learn/understand some basic algorithms, and today I decided to write a binary tree in Go. This is what the structure looks like:
type Node struct {
	Value int
	Left  *Node
	Right *Node
}
Here's my function to check if the tree contains an int:
func (tree *Node) Contains(val int) bool {
	if val == tree.Value {
		return true
	}
	if val > tree.Value {
		if tree.Right != nil {
			return tree.Right.Contains(val)
		}
		return false
	}
	// val < tree.Value: ==, >, and < are exhaustive for ints,
	// so no other branch is needed
	if tree.Left != nil {
		return tree.Left.Contains(val)
	}
	return false
}
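My Insert() isn't shown above; for context, a minimal recursive insert consistent with this struct looks roughly like this (a sketch, not necessarily my exact code; the struct is repeated so it compiles on its own):

```go
package main

import "fmt"

// Node repeated here so the sketch is self-contained.
type Node struct {
	Value int
	Left  *Node
	Right *Node
}

// Insert walks down the tree and attaches val as a new leaf.
// Duplicates are ignored, matching the Contains logic.
func (tree *Node) Insert(val int) {
	if val > tree.Value {
		if tree.Right == nil {
			tree.Right = &Node{Value: val}
		} else {
			tree.Right.Insert(val)
		}
	} else if val < tree.Value {
		if tree.Left == nil {
			tree.Left = &Node{Value: val}
		} else {
			tree.Left.Insert(val)
		}
	}
}

func main() {
	root := &Node{Value: 10}
	for _, v := range []int{5, 15, 3} {
		root.Insert(v)
	}
	fmt.Println(root.Left.Value, root.Right.Value) // 5 15
}
```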
I wrote a test function to time different operations on the tree. It takes my Insert() function 34ms to insert 100,000 random integers, and it takes my Contains() function 33ms to check whether the tree contains 100,000 random integers. If I bump the count up to 1,000,000 random integers, Insert() still takes 34ms to run, but Contains() suddenly takes 321ms.

Why does the Contains() run time increase so drastically, while Insert() stays practically the same?
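For reference, my timing harness has roughly this shape (a simplified, self-contained sketch: the tree code is repeated so it runs on its own, and the value range passed to rand.Intn is an assumption, not my exact setup):

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

type Node struct {
	Value int
	Left  *Node
	Right *Node
}

func (tree *Node) Insert(val int) {
	if val > tree.Value {
		if tree.Right == nil {
			tree.Right = &Node{Value: val}
		} else {
			tree.Right.Insert(val)
		}
	} else if val < tree.Value {
		if tree.Left == nil {
			tree.Left = &Node{Value: val}
		} else {
			tree.Left.Insert(val)
		}
	}
}

func (tree *Node) Contains(val int) bool {
	if val == tree.Value {
		return true
	}
	if val > tree.Value {
		if tree.Right != nil {
			return tree.Right.Contains(val)
		}
		return false
	}
	if tree.Left != nil {
		return tree.Left.Contains(val)
	}
	return false
}

func main() {
	const n = 100000 // bumped to 1000000 for the slow case

	root := &Node{Value: rand.Intn(n)}

	start := time.Now()
	for i := 0; i < n; i++ {
		root.Insert(rand.Intn(n))
	}
	fmt.Println("Insert:  ", time.Since(start))

	start = time.Now()
	for i := 0; i < n; i++ {
		root.Contains(rand.Intn(n))
	}
	fmt.Println("Contains:", time.Since(start))
}
```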