dougao1542 2019-09-18 08:04
Viewed 386 times

Why does the database connection close automatically?

I'm having an issue with Gorm / Psql where my database connection gets automatically closed.

I never call defer dbInstance.Close() in main.go (not anymore, anyway; I've removed it, since that was the only place in my code where I felt the connection could be wrongly closed), and it is never called anywhere else.

The way I'm initializing my db is with a "db" package that looks like this:

package db

import (
    "fmt"
    "github.com/jinzhu/gorm"
    _ "github.com/jinzhu/gorm/dialects/postgres"
)

var DbInstance *gorm.DB

func Init() *gorm.DB {
    if DbInstance != nil {
        return DbInstance
    }
    fmt.Println("Opening Db")
    db, err := gorm.Open("postgres", "host=localhost port=5432 user=admin dbname=dbname sslmode=disable password=")
    if err != nil {
        fmt.Println(err)
        panic("failed to connect database")
    }
    DbInstance = db // assign the package-level variable; without this line the nil check above never caches the connection
    return DbInstance
}

Then I call db.Init() in my main.go, and the rest of my program only accesses the database through db.DbInstance.

As mentioned above, I used to call defer db.DbInstance.Close() from main.go, but I tried removing it to see if that fixed the issue, and it didn't.

What's strange is that the db connection works for hours and hours across many different calls/functions, but always ends up closing at some point.

From what I understand, it should work:

gorm.Open() uses https://golang.org/pkg/database/sql/, which is threadsafe and handles connection pooling for you. Don't call gorm.Open() every request. Call it once when setting up your application, and make sure you use defer db.Close() so connections are cleanly ended when your program exits.
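One detail the quote glosses over: database/sql transparently replaces dead connections in its pool, but if the Postgres server (or a proxy such as pgbouncer) kills idle connections after a timeout, the pool can hand out a stale connection before noticing. Setting a connection lifetime shorter than any server-side idle timeout lets the pool recycle connections proactively. This is a sketch against the gorm v1 API used in this question; the specific values are assumptions to be tuned to your server's timeouts:

```go
// After Init(): configure the *sql.DB pool underneath gorm v1 so stale
// connections are recycled before the server can close them.
sqlDB := DbInstance.DB()                  // *sql.DB underlying the gorm v1 handle
sqlDB.SetConnMaxLifetime(5 * time.Minute) // assumed: shorter than any server/proxy idle timeout
sqlDB.SetMaxIdleConns(10)                 // keep a few warm connections around
sqlDB.SetMaxOpenConns(50)                 // cap total connections to Postgres
```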

Lastly, I should add that it seems (I'm not 100% sure) to close after I do batch inserts, but again, the .Close() function is never called anywhere in my program.

I'm a bit lost as to what could be happening. The garbage collector (doesn't make sense, the var is global)? The psql driver closing in the background? A configuration issue?

I'm adding the batch function for reference just in case:

func InsertWithPostGresLimitSizeV2(DB *gorm.DB, array []interface{}) {
    if len(array) == 0 {
        return
    }
    numberOfParams := len(DB.NewScope(array[0]).Fields())
    // Postgres is limited to 65535 bind parameters per statement.
    maxStructPerBulk := 65535 / numberOfParams
    currentIndex := 0
    if len(array) > maxStructPerBulk {
        for len(array) > currentIndex {
            if (maxStructPerBulk + currentIndex) < len(array) {
                slice := array[currentIndex:(currentIndex + maxStructPerBulk)]
                currentIndex += maxStructPerBulk
                // BatchInsert is the package-level function below, not a *gorm.DB method
                _, err := BatchInsert(DB, slice)
                log.Println(err)
            } else {
                slice := array[currentIndex:]
                currentIndex = len(array)
                _, err := BatchInsert(DB, slice)
                log.Println(err)
            }
        }
    } else {
        _, err := BatchInsert(DB, array)
        log.Println(err)
    }
}
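The chunking logic above can be isolated into a small helper so the 65535-parameter arithmetic is easy to test on its own. `chunkIndexes` is a hypothetical name, not part of the original code:

```go
package main

import "fmt"

// chunkIndexes returns the [start, end) index pairs that split a slice of
// length n into chunks of at most size elements.
func chunkIndexes(n, size int) [][2]int {
	var out [][2]int
	for start := 0; start < n; start += size {
		end := start + size
		if end > n {
			end = n
		}
		out = append(out, [2]int{start, end})
	}
	return out
}

func main() {
	// Postgres allows at most 65535 bind parameters per statement.
	// With e.g. 10 fields per struct, that is 6553 rows per batch.
	maxStructPerBulk := 65535 / 10
	fmt.Println(maxStructPerBulk)                      // 6553
	fmt.Println(chunkIndexes(15000, maxStructPerBulk)) // [[0 6553] [6553 13106] [13106 15000]]
}
```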

func BatchInsert(db *gorm.DB, objArr []interface{}) (int64, error) {
    if len(objArr) == 0 {
        return 0, errors.New("insert a slice length of 0")
    }

    mainObj := objArr[0]
    mainScope := db.NewScope(mainObj)
    mainFields := mainScope.Fields()
    quoted := make([]string, 0, len(mainFields))
    for i := range mainFields {
        if (mainFields[i].IsPrimaryKey && mainFields[i].IsBlank) || (mainFields[i].IsIgnored) {
            continue
        }
        quoted = append(quoted, mainScope.Quote(mainFields[i].DBName))
    }

    placeholdersArr := make([]string, 0, len(objArr))

    for _, obj := range objArr {
        scope := db.NewScope(obj)
        fields := scope.Fields()
        placeholders := make([]string, 0, len(fields))
        for i := range fields {
            if (fields[i].IsPrimaryKey && fields[i].IsBlank) || (fields[i].IsIgnored) {
                continue
            }
            var vars interface{}
            if (fields[i].Name == "CreatedAt" || fields[i].Name == "UpdatedAt") && fields[i].IsBlank {
                vars = gorm.NowFunc()
            } else {
                vars = fields[i].Field.Interface()
            }
            placeholders = append(placeholders, mainScope.AddToVars(vars))
        }
        placeholdersStr := "(" + strings.Join(placeholders, ", ") + ")"
        placeholdersArr = append(placeholdersArr, placeholdersStr)

        mainScope.SQLVars = append(mainScope.SQLVars, scope.SQLVars...)
    }
    mainScope.Raw(fmt.Sprintf("INSERT INTO %s (%s) VALUES %s",
        mainScope.QuotedTableName(),
        strings.Join(quoted, ", "),
        strings.Join(placeholdersArr, ", "),
    ))
    if err := mainScope.Exec().DB().Error; err != nil {
        return 0, err
    }
    return mainScope.DB().RowsAffected, nil
}

One last thing: I was thinking of "fixing" the issue by accessing the db through the getter below, but the ping would slow down each of my calls:

func getDb() *gorm.DB {
    err := DbInstance.DB().Ping()
    if err != nil {
        fmt.Println("Connection to db closed opening a new one")
        return Init()
    }
    return DbInstance
}

1 answer

  • dsv38843 2019-09-18 08:59

    You can do a global search for DbInstance.Close() to make sure you never call it yourself. If you don't, you can set a longer db timeout and raise the number of idle db connections.

    Finally, it's most important that your db data source supports auto-reconnecting.

    Here is the auto-reconnecting part of my code that you might refer to:

    var DB *gorm.DB

    func init() {
        dbConfig := fmt.Sprintf("host=%s user=%s dbname=%s sslmode=%s password=%s",
            "localhost",
            "postgres",
            "dbname",
            "disable",
            "password",
        )
        db, err := gorm.Open("postgres", dbConfig)
        if err != nil {
            panic(err)
        }
        db.SingularTable(true)
        db.LogMode(true)
        db.DB().SetConnMaxLifetime(10 * time.Second)
        db.DB().SetMaxIdleConns(30)
        DB = db
        // auto-reconnect: ping every 60s, re-connect on failure with intervals 3s, 3s, 15s, 30s, 60s, 60s ...
        go func(dbConfig string) {
            intervals := []time.Duration{3 * time.Second, 3 * time.Second, 15 * time.Second, 30 * time.Second, 60 * time.Second}
            for {
                time.Sleep(60 * time.Second)
                if e := DB.DB().Ping(); e != nil {
                L:
                    for i := 0; i < len(intervals); i++ {
                        e2 := RetryHandler(3, func() (bool, error) {
                            var e error
                            DB, e = gorm.Open("postgres", dbConfig)
                            if e != nil {
                                return false, errorx.Wrap(e)
                            }
                            return true, nil
                        })
                        if e2 != nil {
                            fmt.Println(e2.Error()) // log the reconnect error
                            time.Sleep(intervals[i])
                            if i == len(intervals)-1 {
                                i-- // keep retrying at the longest interval
                            }
                            continue
                        }
                        break L
                    }
                }
            }
        }(dbConfig)
    }
    

    By the way:

    // Call f up to n times, returning nil on the first success.
    func RetryHandler(n int, f func() (bool, error)) error {
        ok, er := f()
        if ok && er == nil {
            return nil
        }
        if n-1 > 0 {
            return RetryHandler(n-1, f)
        }
        return er
    }
    
