Go SQL scanned rows getting overwritten


I'm trying to read rows from a table on SQL Server and store them in string slices for later use. The issue I'm running into is that the scanned rows get overwritten every time I scan a new row, even though I've converted the mutable byte slices to immutable strings and saved the resulting slices into a slice of slices. Here is the code I'm using:

    rawResult := make([]interface{}, len(cols)) // holds one row
    result := make([]string, len(cols))         // holds the row's elements as strings
    var results [][]string                      // holds all rows as string slices
    dest := make([]interface{}, len(cols))      // temporary, passed to Scan
    for i := range rawResult {
        dest[i] = &rawResult[i] // fill dest with pointers into rawResult to pass to Scan
    }
    for rows.Next() { // for each row
        err = rows.Scan(dest...) // scan the row
        if err != nil {
            log.Fatal("failed to scan row: ", err)
        }
        for i, raw := range rawResult { // for each scanned value in the row
            switch rawType := raw.(type) { // determine the type and convert to string
            case int64:
                result[i] = strconv.FormatInt(raw.(int64), 10)
            case float64:
                result[i] = strconv.FormatFloat(raw.(float64), 'f', -1, 64)
            case bool:
                result[i] = strconv.FormatBool(raw.(bool))
            case []byte:
                result[i] = string(raw.([]byte))
            case string:
                result[i] = raw.(string)
            case time.Time:
                result[i] = raw.(time.Time).String()
            case nil:
                result[i] = ""
            default: // shouldn't be reachable since all driver types are covered
                log.Fatalf("unexpected type %T", rawType)
            }
        }
        results = append(results, result) // append the row to our slice of results
    }

I'm sure this has to do with the way Go handles variables and memory, but I can't seem to fix it. Can someone explain what I'm not understanding?

You should create a new slice for each data row. Notice that a slice holds a pointer to its underlying array, so every slice you appended to results shares the same pointer to the same backing array, and each new row overwrites the data the previous entries point at. That's why you see this behaviour.
