I'm using InfluxDB to store my time series data.
I wrote a simple Go application that reads lines from a file called time.log.
The documentation at https://github.com/influxdata/influxdb/blob/master/client/README.md#inserting-data says:
Inserting Data
Time series data aka points are written to the database using batch inserts. The mechanism is to create one or more points and then create a batch aka batch points and write these to a given database and series. A series is a combination of a measurement (time/values) and a set of tags.
In this sample we will create a batch of 1,000 points. Each point has a time and a single value as well as 2 tags indicating a shape and color. We write these points to a database called square_holes using a measurement named shapes.
NOTE: You can specify a RetentionPolicy as part of the batch points. If not provided InfluxDB will use the database default retention policy.
func writePoints(clnt client.Client) {
    sampleSize := 1000
    rand.Seed(42)

    bp, _ := client.NewBatchPoints(client.BatchPointsConfig{
        Database:  "systemstats",
        Precision: "us",
    })

    for i := 0; i < sampleSize; i++ {
        regions := []string{"us-west1", "us-west2", "us-west3", "us-east1"}
        tags := map[string]string{
            "cpu":    "cpu-total",
            "host":   fmt.Sprintf("host%d", rand.Intn(1000)),
            "region": regions[rand.Intn(len(regions))],
        }

        idle := rand.Float64() * 100.0
        fields := map[string]interface{}{
            "idle": idle,
            "busy": 100.0 - idle,
        }

        bp.AddPoint(client.NewPoint(
            "cpu_usage",
            tags,
            fields,
            time.Now(),
        ))
    }

    err := clnt.Write(bp)
    if err != nil {
        log.Fatal(err)
    }
}
But because I'm continuously reading data from the log, I'm never done reading it, so I never reach the point where the batch gets written. What is the best way for me to write the points to the InfluxDB server?
Here is my current code:
cmdBP, _ := client.NewBatchPoints(...)
for line := range logFile.Lines {
    pt := parseLine(line.Text)
    cmdBP.AddPoint(pt)
}
influxClient.Write(cmdBP)
Basically range logFile.Lines never terminates because it is based on a channel.