I do a lot of queries that result in a map or a slice/array of maps, something like this:
// package M
type SX map[string]interface{}
type IX map[int64]interface{}
type IAX map[int64][]interface{}
type SAX map[string][]interface{}
type SS map[string]string
type SF map[string]float64
type II map[int64]int64
type IB map[int64]bool
type SI map[string]int64
type IS map[int64]string
type SB map[string]bool
// package A
type X []interface{}
type MSX []map[string]interface{}
So I can declare a value like this:
// import `gitlab.com/kokizzu/gokil/A`
// import `gitlab.com/kokizzu/gokil/M`
values := M.SX{
	`orderId`:  `1-12-1`,
	`apiKey`:   `16313c061a8e3288528123bd8`,
	`country`:  `360`,
	`currency`: `360`,
	`payType`:  1,
	`items`: A.MSX{
		M.SX{
			`code`:  `subscription for 7 days`,
			`name`:  `Bla bla`,
			`price`: price,
		},
	},
	`profile`: M.SX{
		`entry`: A.MSX{
			M.SX{
				`key`:   `need_mno_id`,
				`value`: `yes`,
			},
			M.SX{
				`key`:   `foo`,
				`value`: `bar`,
			},
		},
	},
	`profiles`: A.MSX{ // renamed from a second `profile` key, which would not compile
		M.SX{`foo`: `bar`, `age`: 123},
		M.SX{`foo`: `wow`, `age`: 234, `currency`: 360},
		M.SX{`foo`: `such`, `age`: 45, `is_admin`: true},
		M.SX{`foo`: `wow`, `age`: 57, `is_deleted`: true},
	},
}
Which of the libraries listed below, other than encoding/gob and encoding/json, support this kind of serialization (without needing to generate a struct/schema)?
github.com/alecthomas/binary
github.com/davecgh/go-xdr/xdr
github.com/Sereal/Sereal/Go/sereal
github.com/ugorji/go/codec
gopkg.in/vmihailenco/msgpack.v2 (has an example for encoding/decoding a map)
labix.org/v2/mgo/bson
github.com/tinylib/msgp (code generator for msgpack)
github.com/golang/protobuf (generated code)
github.com/gogo/protobuf (generated code, optimized version of goprotobuf)
github.com/DeDiS/protobuf (reflection based)
github.com/google/flatbuffers
github.com/hprose/hprose-go/io
github.com/glycerine/go-capnproto
zombiezen.com/go/capnproto2
github.com/andyleap/gencode
github.com/pascaldekloe/colfer
Note: there is nothing wrong with Gob (I currently use it); I just need to prepare an alternative (or start with the best one) for when Gob no longer suffices (is not fast or small enough), since I use it to cache database query results (with an ever-changing schema) in RAM.