dounou9751 2013-11-29 22:14

Building a generic handler function using reflection with structs

I'm having trouble building a function that can dynamically work with parametrized structs. Because of that, my code has 20+ functions that are identical except for, basically, one type. Most of my experience is with Java, where I'd just write basic generic functions, or use plain Object as the parameter (and reflection from that point on). I need something similar in Go.

I have several types like:

// The List structs are mostly needed for json marshalling
type OrangeList struct {
    Oranges []Orange
}

type BananaList struct {
    Bananas []Banana
}

type Orange struct {
    Orange_id string
    Field_1 int
    // The fields are different for different types, I am simplifying the code example
}

type Banana struct {
    Banana_id string
    Field_1 int
    // The fields are different for different types, I am simplifying the code example
}

Then I have a function like this for each list type:

// In the end there are 20+ of these, the only difference is basically in two types! 
// This is very un-DRY!
func buildOranges(rows *sqlx.Rows) ([]byte, error) {
    oranges := OrangeList{}     // This type changes
    for rows.Next() {
        orange := Orange{}      // This type changes
        err := rows.StructScan(&orange)   // This can handle each case already, could also use reflect myself too
        checkError(err, "rows.Scan")
        oranges.Oranges = append(oranges.Oranges, orange)
    }
    checkError(rows.Err(), "rows.Err")
    jsontext, err := json.Marshal(oranges)
    return jsontext, err
}

Yes, I could switch the sql library for a more intelligent ORM or framework, but that's beside the point. I want to learn how to build one generic function that can handle the same logic for all my different types.

I got this far, but it still doesn't work properly (I think the target isn't the expected struct):

func buildWhatever(rows *sqlx.Rows, tgt interface{}) ([]byte, error) {
    tgtValueOf := reflect.ValueOf(tgt)
    tgtType := tgtValueOf.Type()
    targets := reflect.SliceOf(tgtValueOf.Type())
    for rows.Next() {
        target := reflect.New(tgtType)
        err := rows.StructScan(&target) // At this stage target still isn't a 1:1 similar struct, so the StructScan fails... It's some "Value" object instead. Meh.
        // Removed appending to the list because the solutions for that would be similar
        checkError(err, "rows.Scan")
    }
    checkError(rows.Err(), "rows.Err")
    jsontext, err := json.Marshal(targets)
    return jsontext, err
}

So, umm, I'd need to pass the list type and the plain type as parameters, then build one of each, and the rest of my logic should be fixable quite easily from there.


1 answer

  • dpb35161 2013-11-30 05:56

    Turns out there's an sqlx.StructScan(rows, &destSlice) function that will do your inner loop for you, given a slice of the appropriate type. The sqlx docs mention caching the results of reflection operations, so it may have some optimizations over what you'd write yourself.
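    To make that concrete, here's roughly what your buildOranges could collapse to with sqlx.StructScan doing the loop. This is just a sketch (it assumes your existing OrangeList/Orange types and isn't runnable without a live database), but the shape should be right:

    ```go
    // Sketch: buildOranges rewritten around sqlx.StructScan, which fills
    // the whole destination slice from the rows in one call.
    func buildOranges(rows *sqlx.Rows) ([]byte, error) {
        oranges := OrangeList{}
        if err := sqlx.StructScan(rows, &oranges.Oranges); err != nil {
            return nil, err
        }
        return json.Marshal(oranges)
    }
    ```

    You'd still have one such function per type, but each shrinks to a few lines, which may already be DRY enough.
    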

    Sounds like the immediate question you're actually asking is "how do I get something out of my reflect.Value that rows.StructScan will accept?" And the direct answer is target.Interface(); it should return an interface{} representing an *Orange that you can pass directly to StructScan (no additional & needed). Then, I think targets = reflect.Append(targets, reflect.Indirect(target)) will turn your target into a reflect.Value representing an Orange and append it to the slice. (Note that reflect.SliceOf only gives you a slice *type*; you need reflect.MakeSlice to get an appendable value.) Finally, targets.Interface() should get you an interface{} representing an []Orange that json.Marshal understands. I say all these 'should's and 'I think's because I haven't tried that route.
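    Putting those pieces together, a self-contained sketch of the reflection plumbing looks like the following. The DB is faked out: fakeScan (a name I made up) stands in for rows.StructScan and just unmarshals JSON into the struct pointer, because the reflection calls around it are the part in question:

    ```go
    package main

    import (
        "encoding/json"
        "fmt"
        "reflect"
    )

    type Orange struct {
        Orange_id string
        Field_1   int
    }

    // fakeScan stands in for rows.StructScan: it fills dest (a pointer to
    // a struct) from one prepared "row".
    func fakeScan(row []byte, dest interface{}) error {
        return json.Unmarshal(row, dest)
    }

    // buildWhatever builds a slice of tgt's type, fills each element via
    // fakeScan, and marshals the result. tgt is a zero value like Orange{}.
    func buildWhatever(rows [][]byte, tgt interface{}) ([]byte, error) {
        tgtType := reflect.TypeOf(tgt)
        // SliceOf gives a type; MakeSlice gives an appendable value.
        targets := reflect.MakeSlice(reflect.SliceOf(tgtType), 0, len(rows))
        for _, row := range rows {
            target := reflect.New(tgtType) // a reflect.Value holding *Orange
            // target.Interface() is an interface{} wrapping *Orange --
            // exactly what a StructScan-style function wants; no extra &.
            if err := fakeScan(row, target.Interface()); err != nil {
                return nil, err
            }
            // reflect.Indirect(target) is the Orange value itself.
            targets = reflect.Append(targets, reflect.Indirect(target))
        }
        // targets.Interface() wraps []Orange, which json.Marshal understands.
        return json.Marshal(targets.Interface())
    }

    func main() {
        rows := [][]byte{
            []byte(`{"Orange_id":"a","Field_1":1}`),
            []byte(`{"Orange_id":"b","Field_1":2}`),
        }
        out, err := buildWhatever(rows, Orange{})
        if err != nil {
            panic(err)
        }
        fmt.Println(string(out)) // [{"Orange_id":"a","Field_1":1},{"Orange_id":"b","Field_1":2}]
    }
    ```

    Swapping fakeScan back for rows.StructScan should give you the generic function you were after, with only the zero-value parameter changing per type.
    
    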

    Reflection, in general, is verbose and slow. Sometimes it's the best or only way to get something done, but it's often worth looking for a way to get your task done without it when you can.

    So, if it works in your app, you can also convert Rows straight to JSON, without going through intermediate structs. Here's a sample program (requires sqlite3 of course) that turns sql.Rows into map[string]string and then into JSON. (Note it doesn't try to handle NULL, represent numbers as JSON numbers, or generally handle anything that won't fit in a map[string]string.)

    package main
    
    import (
        _ "code.google.com/p/go-sqlite/go1/sqlite3"
    
        "database/sql"
        "encoding/json"
        "os"
    )
    
    func main() {
        db, err := sql.Open("sqlite3", "foo")
        if err != nil {
            panic(err)
        }
        tryQuery := func(query string, args ...interface{}) *sql.Rows {
            rows, err := db.Query(query, args...)
            if err != nil {
                panic(err)
            }
            return rows
        }
        tryQuery("drop table if exists t")
        tryQuery("create table t(i integer, j integer)")
        tryQuery("insert into t values(?, ?)", 1, 2)
        tryQuery("insert into t values(?, ?)", 3, 1)
    
        // now query and serialize
        rows := tryQuery("select * from t")
        names, err := rows.Columns()
        if err != nil {
            panic(err)
        }
        // vals stores the values from one row
        vals := make([]interface{}, 0, len(names))
        for range names {
            vals = append(vals, new(string))
        }
        // rowMaps stores all rows
        rowMaps := make([]map[string]string, 0)
        for rows.Next() {
            if err := rows.Scan(vals...); err != nil {
                panic(err)
            }
            // now make value list into name=>value map
            currRow := make(map[string]string)
            for i, name := range names {
                currRow[name] = *(vals[i].(*string))
            }
            // accumulating rowMaps is the easy way out
            rowMaps = append(rowMaps, currRow)
        }
        json, err := json.Marshal(rowMaps)
        if err != nil {
            panic(err)
        }
        os.Stdout.Write(json)
    }
    

    In theory, you could build this to do fewer allocations by reusing the same rowMap each time and using a json.Encoder to append each row's JSON to a buffer as you go. You could go a step further and not use a rowMap at all, just the lists of names and values. I should say I haven't compared the speed against a reflect-based approach, though I know reflect is slow enough that it might be worth comparing them if you can put up with either strategy.
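    For what it's worth, the buffer idea might look like this. writeRows is a hypothetical name, and a plain slice stands in for the rows.Next/rows.Scan loop; the point is just that each row is serialized straight into one buffer instead of accumulating a rowMaps slice:

    ```go
    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
    )

    // writeRows streams each row's JSON into a single buffer, building the
    // array by hand, so no slice of maps is ever accumulated.
    func writeRows(names []string, rows [][]string) ([]byte, error) {
        var buf bytes.Buffer
        buf.WriteByte('[')
        for i, vals := range rows {
            if i > 0 {
                buf.WriteByte(',')
            }
            // One small map per row; json.Marshal copies what it needs.
            currRow := make(map[string]string, len(names))
            for j, name := range names {
                currRow[name] = vals[j]
            }
            b, err := json.Marshal(currRow)
            if err != nil {
                return nil, err
            }
            buf.Write(b)
        }
        buf.WriteByte(']')
        return buf.Bytes(), nil
    }

    func main() {
        out, err := writeRows([]string{"i", "j"}, [][]string{{"1", "2"}, {"3", "1"}})
        if err != nil {
            panic(err)
        }
        fmt.Println(string(out)) // [{"i":"1","j":"2"},{"i":"3","j":"1"}]
    }
    ```

    Whether the saved allocations matter is exactly the kind of thing I'd benchmark before committing to it.
    
    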

    This answer was accepted as the best answer by the asker.
