Coming from a Python background and just starting with Go, I found myself looking for the equivalent of Python's map() and reduce() functions. I didn't find them, so I fell back on for loops. For example, this is what I used instead of map(), where mapFunction is defined elsewhere:
data := make([]byte, 1024)
count, err := input.Read(data) // error handling removed from this snippet
for i := 0; i < count; i++ {
    data[i] = mapFunction(data[i])
}
and this is what I used instead of reduce(), where two state variables keep track of the quoting of fields in a CSV as the code moves through each byte in the slice:
data := make([]byte, 1024)
count, err := input.Read(data) // error handling removed from this snippet
for i := 0; i < count; i++ {
    data[i], stateVariable1, stateVariable2 =
        reduceFunction(data[i], stateVariable1, stateVariable2)
}
Here are my questions:
- Are there built-in capabilities for this that I missed?
- Is it appropriate to use mutable slices for each of these?
- Would it be a good idea to use goroutines for the map()? Would that decouple the I/O of reading the file from the process that runs the mapping function on each item, and therefore allow parallelization?
- Is it correct to say that goroutines would not be appropriate for the reduce(), because the two state variables depend on all of the preceding data, so the computation must proceed sequentially? In other words, is it true that this sequential process cannot benefit from a concurrent architecture?
Thanks!
ps - the full code is here: https://github.com/dbro/csvquote/blob/go/csvquote.go