I just started learning Go and found myself writing a simple program that:

- Creates an `int` array of size `SIZE` (expected to be >= 1000)
- Iterates through its elements from 0 to 999, setting them to 0
- Prints how long everything took
Something like this:
```go
package main

import (
	"fmt"
	"time"
)

const SIZE = 1000

func main() {
	start := time.Now()
	a := [SIZE]int{}
	for i := 0; i < 1000; i++ {
		a[i] = 0
	}
	fmt.Println("Time: ", time.Since(start))
}
```
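For reference, single `time.Since` measurements can be noisy, so I also tried averaging over many runs with Go's benchmark harness. This is just a sketch; the `benchFill` helper is my own name, and the `_ = a` line is there on the assumption that it helps keep the loop from being optimized away:

```go
package main

import "testing"

const SIZE = 1000

// benchFill runs the fill loop under testing.Benchmark, which repeats it
// b.N times and reports the average time per iteration in nanoseconds.
func benchFill() int64 {
	res := testing.Benchmark(func(b *testing.B) {
		for n := 0; n < b.N; n++ {
			a := [SIZE]int{}
			for i := 0; i < 1000; i++ {
				a[i] = 0
			}
			_ = a // keep a referenced so the work isn't trivially eliminated
		}
	})
	return res.NsPerOp()
}

func main() {
	println(benchFill(), "ns/op")
}
```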
I got the following results after running it on my machine 5 times:

```
3.375µs
2.831µs
2.698µs
2.655µs
2.59µs
```
However, if I increase `SIZE` to `100000` (100x), the program becomes slower. These are the observed results on the same machine:

```
407.844µs
432.607µs
397.67µs
465.959µs
445.101µs
```
Why is the value of `SIZE` making such a big difference? The number of iterations will always be the same (1000)...