AWS Lambda function to relay fingerprinted / normalized MySQL slow query logs to Datadog.
This function normalizes SQL as shown below so that metrics can be aggregated on Datadog.
`SELECT id, name FROM tbl WHERE id = "1000"` => `SELECT id, name FROM tbl WHERE id = ?`

`SELECT id, name FROM tbl WHERE id IN (10, 20, 30)` => `SELECT id, name FROM tbl WHERE id IN (?+)`
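The fingerprinting idea can be sketched with a few regex substitutions: collapse `IN (...)` value lists to `(?+)` and other literals to `?`. This is an illustrative sketch only, not the code shipped in the release zip (the `fingerprint` name is hypothetical):

```python
import re

def fingerprint(sql: str) -> str:
    """Illustrative sketch: collapse literals so identical query shapes aggregate."""
    # Replace IN (...) value lists with a single (?+) placeholder
    sql = re.sub(r"\bIN\s*\([^)]*\)", "IN (?+)", sql, flags=re.IGNORECASE)
    # Replace single- or double-quoted string literals with ?
    sql = re.sub(r"'[^']*'|\"[^\"]*\"", "?", sql)
    # Replace bare numeric literals with ?
    sql = re.sub(r"\b\d+\b", "?", sql)
    return sql

print(fingerprint('SELECT id, name FROM tbl WHERE id = "1000"'))
# SELECT id, name FROM tbl WHERE id = ?
print(fingerprint("SELECT id, name FROM tbl WHERE id IN (10, 20, 30)"))
# SELECT id, name FROM tbl WHERE id IN (?+)
```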
- Enable the `slow_query_log` parameter of your RDS database instance
- Modify your database instance to export the slow query log to CloudWatch Logs
- Create a Datadog API key secret in AWS Secrets Manager in PLAIN TEXT format.
  You can find your API key here: https://app.datadoghq.com/account/settings#api
- Download `function.zip` from GitHub:
  https://github.com/samitani/mysql-slow-query-datadog-lambda/releases
- Create a Lambda function with the downloaded `function.zip`.
  Specify Python 3 as the Runtime and `main.lambda_handler` as the Handler.
- Configure the Lambda environment variables below:

  | Key | Value |
  |---|---|
  | DD_API_KEY_SECRET_ARN | AWS Secrets Manager ARN of the Datadog API key, e.g. `arn:aws:secretsmanager:ap-northeast-1:XXXXXXXXX:secret:DdApiKeySecret-XXXXXXXX` |
  | DD_ENHANCED_METRICS | `false` |
  | DD_SITE | `datadoghq.com` |
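At runtime, `DD_API_KEY_SECRET_ARN` is used to fetch the plain-text API key from Secrets Manager. A minimal sketch of that lookup (illustrative, not the actual forwarder code; the client is passed in to keep the function easy to test):

```python
import os

def get_datadog_api_key(client) -> str:
    """Fetch the Datadog API key named by DD_API_KEY_SECRET_ARN.

    `client` is a boto3 Secrets Manager client, e.g.
    boto3.client("secretsmanager").
    """
    arn = os.environ["DD_API_KEY_SECRET_ARN"]
    # The secret was stored as PLAIN TEXT, so SecretString is the key itself
    return client.get_secret_value(SecretId=arn)["SecretString"]
```

Inside the Lambda this would be called as `get_datadog_api_key(boto3.client("secretsmanager"))`.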
- Edit the IAM role to allow this Lambda function to get the secret
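The required permission is roughly the statement below, attached to the function's execution role (the resource ARN is a placeholder for your own secret's ARN):

```json
{
  "Effect": "Allow",
  "Action": "secretsmanager:GetSecretValue",
  "Resource": "arn:aws:secretsmanager:ap-northeast-1:XXXXXXXXX:secret:DdApiKeySecret-XXXXXXXX"
}
```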
- Create a Lambda subscription filter against your slow query log CloudWatch Log groups
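For context, a CloudWatch Logs subscription filter delivers log events to the function as gzipped, base64-encoded JSON under `event["awslogs"]["data"]`. A minimal sketch of the decoding step (illustrative; the `decode_cwl_event` name is hypothetical, not part of the release zip):

```python
import base64
import gzip
import json

def decode_cwl_event(event: dict) -> dict:
    """Decode the gzipped, base64-encoded payload a CloudWatch Logs
    subscription filter sends to a Lambda function."""
    payload = base64.b64decode(event["awslogs"]["data"])
    return json.loads(gzip.decompress(payload))

# Build a fake subscription event to show the round trip
record = {"logGroup": "/aws/rds/instance/mydb/slowquery",
          "logEvents": [{"message": "# Query_time: 2.5 ..."}]}
fake = {"awslogs": {"data": base64.b64encode(
    gzip.compress(json.dumps(record).encode())).decode()}}
print(decode_cwl_event(fake)["logGroup"])  # /aws/rds/instance/mydb/slowquery
```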
- Generate metrics with the Grok parser below:

  ```
  SlowLogRule ^(\# Time: (%{date("yyMMdd H:mm:ss"):date}|%{date("yyMMdd HH:mm:ss"):date})\n+)?\# User@Host: %{notSpace: user1}\[%{notSpace: user2}\] @ (%{notSpace: host}| ) *\[%{regex("[0-9.]*"): ip}\] Id:[\x20\t]+%{number: id}\n+\# Query_time: %{number: query_time} *Lock_time: %{number: lock_time} *Rows_sent: %{number: rows_sent} *Rows_examined: %{number: rows_examined}\n(SET timestamp=%{number: timestamp};\n+)?%{regex("[a-zA-Z].*"):query}.
  ```
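The Grok rule above extracts the timing fields from each slow log entry. As a rough illustration of what it matches (plain Python `re`, not Datadog's Grok engine), the `Query_time` line can be parsed like this:

```python
import re

# Illustrative equivalent of the Query_time portion of the Grok rule above
STATS = re.compile(
    r"# Query_time: (?P<query_time>[\d.]+) +Lock_time: (?P<lock_time>[\d.]+)"
    r" +Rows_sent: (?P<rows_sent>\d+) +Rows_examined: (?P<rows_examined>\d+)"
)

line = "# Query_time: 2.000000  Lock_time: 0.000121 Rows_sent: 1  Rows_examined: 100000"
m = STATS.search(line)
print(m.group("query_time"), m.group("rows_examined"))  # 2.000000 100000
```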
`enhanced_lambda_metrics.py` and `lambda_function.py` were borrowed from the Datadog repository below.