Creating a DB2 client container as a Prometheus scraping target in K8s
I need to mine some business metrics from a DB2 database and present them in a Grafana dashboard. To do that, a Prometheus scraping target needs to be developed.
About three years ago I published a Medium article on running DB2 queries in Go using the DB2 ODBC/CLI driver. Building on that, let's create a container image that acts as a Prometheus scraping target and run it in Kubernetes.
The app as a scraping target
Here are some Go code excerpts that describe how the data collection works.
The DB and metric structs
type MetricConfig struct {
	Name      string `yaml:"name"`
	Desc      string `yaml:"desc"`
	Sql       string `yaml:"sql"`
	Frequency string `yaml:"frequency"`
}

type DBMetricsConfig struct {
	Dsn      string         `yaml:"dsn,omitempty"`
	User     string         `yaml:"user,omitempty"`
	Password string         `yaml:"password,omitempty"`
	Metrics  []MetricConfig `yaml:"metrics,omitempty"`
}
The Dsn field is the DB2 ODBC Data Source Name, and the User and Password fields name the environment variables that hold the database credentials (the code reads them with os.Getenv). Each metric is a Prometheus Gauge whose value is set from a SQL query that returns a single row with a single column. We run the data collection as a cron job using the jobrunner library.
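To make the struct tags concrete, a configuration file for these structs might look like the YAML below. The DSN, query, metric name, and environment variable names here are made-up examples for illustration, not values from the original article.

```yaml
dsn: SAMPLEDB          # ODBC Data Source Name defined for the DB2 CLI driver
user: DB2_USER         # name of the env var holding the DB user
password: DB2_PASSWORD # name of the env var holding the DB password
metrics:
  - name: open_orders            # exposed as gauge db2_mon_open_orders
    desc: "Number of open orders"
    sql: "SELECT COUNT(*) FROM ORDERS WHERE STATUS = 'OPEN'"
    frequency: 5m                # scheduled as "@every 5m"
```

Each entry under metrics becomes one gauge and one scheduled collection job.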
The initialization function,
func ScheduleDBMetricJob(mc DBMetricsConfig) error {
	// mc.User and mc.Password name environment variables holding the credentials.
	user := os.Getenv(mc.User)
	password := os.Getenv(mc.Password)
	db, err := sqlx.Open("odbc", fmt.Sprintf("DSN=%s;uid=%s;pwd=%s", mc.Dsn, user, password))
	if err != nil {
		logrus.Errorf("Error opening database: %v", err)
		return err
	}
	db.SetMaxIdleConns(5)
	db.SetMaxOpenConns(5)

	// Register one gauge per configured metric and schedule its collection job.
	for _, met := range mc.Metrics {
		gauge := prometheus.NewGauge(prometheus.GaugeOpts{
			Namespace: "db2",
			Subsystem: "mon",
			Name:      met.Name,
			Help:      met.Desc,
		})
		prometheus.MustRegister(gauge)
		jobrunner.Schedule(fmt.Sprintf("@every %s", met.Frequency), DBMetricJob{
			db:    db,
			sql:   met.Sql,
			gauge: gauge,
		})
	}
	return nil
}
The actual data collection function shown below is straightforward, thanks to the sqlx helper function.
type DBMetricJob struct {
db…