First of all, let me explain the problem.
I have a stream of JSON records coming into my Golang app, which basically forwards them to a data store (InfluxDB). The JSON contains some integer values and some float values, and it is essential that each value reaches the data store with its original type. If it doesn't, there will be type conflicts and the write operation will fail.
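For concreteness, an incoming record might look something like this (the field names are made up for illustration; the string case is explained further down):

{ "a": 123, "b": 12.3, "c": "123" }

Here "a" has to be written as an integer field, "b" as a float field, and "c" as a string field.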
The Ruby JSON parser has no trouble doing this:
require 'json'
obj = { "a" => 123, "b" => 12.3 }
parsed = JSON.parse(obj.to_json)
print parsed["a"].class # => Integer
print parsed["b"].class # => Float
The encoding/json package in Golang, on the other hand, does have some trouble (all numbers are unmarshaled as float64):
package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    str := `{"a":123,"b":12.3}`
    var parsed map[string]interface{}
    if err := json.Unmarshal([]byte(str), &parsed); err != nil {
        panic(err)
    }
    for key, val := range parsed {
        switch val.(type) {
        case int:
            fmt.Println("int type:", key)
        case float64:
            fmt.Println("float type:", key)
        default:
            fmt.Println("unknown type:", key)
        }
    }
}
Which prints:
float type: a
float type: b
I need a way to parse ints as ints, and floats as floats, in the way the Ruby JSON parser does.
It is not feasible in this case to treat every value as a string and check whether it contains a decimal point: certain values actually arrive as JSON strings, such as "123", and those need to be forwarded as strings.
I do not have structs for the parsed objects, nor is that an option. The Golang app doesn't actually care about the schema; it just forwards the input as it receives it, roughly like the sketch below.
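This is not my real code, just a hypothetical sketch of the pass-through shape, with writeToInflux standing in for the actual InfluxDB client call:

package main

import (
    "encoding/json"
    "fmt"
)

// writeToInflux is a hypothetical stand-in for the real InfluxDB client call.
func writeToInflux(fields map[string]interface{}) error {
    fmt.Println(fields)
    return nil
}

// forward shows the shape of the pass-through: no structs, no schema;
// whatever types Unmarshal produces are written straight through as fields.
func forward(raw []byte) error {
    var record map[string]interface{}
    if err := json.Unmarshal(raw, &record); err != nil {
        return err
    }
    return writeToInflux(record)
}

func main() {
    _ = forward([]byte(`{"a":123,"b":12.3,"c":"123"}`))
}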
I tried the approach outlined here: Other ways of verifying reflect.Type for int and float64 (using the reflect package), but it did not correctly identify the int:
t := reflect.TypeOf(parsed["a"])
k := t.Kind()
fmt.Println(k == reflect.Int)     // false
fmt.Println(k == reflect.Float64) // true
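As far as I can tell that result is expected: when unmarshaling into an interface{}, encoding/json stores every JSON number as a float64, so by the time reflect inspects the value the original int/float distinction is already gone. Continuing from the example above:

fmt.Println(reflect.TypeOf(parsed["a"])) // float64, not int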