5 changes: 5 additions & 0 deletions config/_default/menus/main.en.yaml
@@ -5790,6 +5790,11 @@ menu:
parent: log_collection
identifier: log_collection_opentelemetry
weight: 115
- name: Log Submission From Agent Checks
Contributor comment:
Should this be Agent Integrations?

Suggested change
- name: Log Submission From Agent Checks
- name: Agent Integrations

url: logs/log_collection/agent_checks/
parent: log_collection
identifier: log_collection_agent_checks
weight: 115.5
- name: Other Integrations
url: integrations/#cat-log-collection
identifier: other_integrations
32 changes: 30 additions & 2 deletions content/en/developers/integrations/agent_integration.md
@@ -22,7 +22,7 @@ description: Learn how to develop and publish a Datadog Agent integration.

This page walks Technology Partners through how to create a Datadog Agent integration, which you can list as out-of-the-box on the [Integrations page][23], or for a price on the [Marketplace page][24].

An Agent-based integration uses the [Datadog Agent][17] to submit data through custom checks written by developers. These checks can emit [metrics][34], [events][18], and [service checks][25] into a customer's Datadog account. While the Agent itself can submit [logs][26], this is configured outside of the check.
An Agent-based integration uses the [Datadog Agent][17] to submit data through custom checks written by developers. These checks can emit [metrics][34], [events][18], [service checks][25], and [logs][36] into a customer's Datadog account.

## When to use Agent-based integrations

@@ -201,13 +201,14 @@ At the core of each Agent-based integration is an *Agent Check* that periodicall

For Awesome, the Agent Check is composed of a [service check][25] named `awesome.search` that searches for a string on a web page. It results in `OK` if the string is present, `WARNING` if the page is accessible but the string was not found, and `CRITICAL` if the page is inaccessible.

To learn how to submit metrics with your Agent Check, see [Custom Agent Check][7].
To learn how to submit metrics with your Agent Check, see [Custom Agent Check][7]. To learn how to submit logs from your Agent Check, see [Log Submission From Agent Checks][36].

The code contained within `awesome/datadog_checks/awesome/check.py` looks something like this:

{{< code-block lang="python" filename="check.py" collapsible="true" >}}

import requests
import time

from datadog_checks.base import AgentCheck, ConfigurationError

@@ -231,14 +232,40 @@ class AwesomeCheck(AgentCheck):
except Exception as e:
# Ideally we'd use a more specific message...
self.service_check('awesome.search', self.CRITICAL, message=str(e))
# Submit an error log
self.send_log({
'message': f'Failed to access {url}: {str(e)}',
'timestamp': time.time(),
'status': 'error',
'service': 'awesome',
'url': url
})
# Page is accessible
else:
# search_string is present
if search_string in response.text:
self.service_check('awesome.search', self.OK)
# Submit an info log
self.send_log({
'message': f'Successfully found "{search_string}" at {url}',
'timestamp': time.time(),
'status': 'info',
'service': 'awesome',
'url': url,
'search_string': search_string
})
# search_string was not found
else:
self.service_check('awesome.search', self.WARNING)
# Submit a warning log
self.send_log({
'message': f'String "{search_string}" not found at {url}',
'timestamp': time.time(),
'status': 'warning',
'service': 'awesome',
'url': url,
'search_string': search_string
})
{{< /code-block >}}

To learn more about the base Python class, see [Anatomy of a Python Check][8].
Expand Down Expand Up @@ -504,3 +531,4 @@ In addition to any code changes, the following is required when bumping an integ
[33]: https://docs.datadoghq.com/developers/integrations/check_references/
[34]: https://docs.datadoghq.com/metrics/
[35]: https://docs.datadoghq.com/agent/guide/use-community-integrations/
[36]: https://docs.datadoghq.com/logs/log_collection/agent_checks/
15 changes: 12 additions & 3 deletions content/en/logs/log_collection/_index.md
@@ -109,7 +109,7 @@ Datadog integrations and log collection are tied together. You can use an integr

## Reduce data transfer fees

Use Datadog's [Cloud Network Monitoring][7] to identify your organization's highest throughput applications. Connect to Datadog over supported private connections and send data over a private network to avoid the public internet and reduce your data transfer fees. After you switch to private links, use Datadogs [Cloud Cost Management][8] tools to verify the impact and monitor the reduction in your cloud costs.
Use Datadog's [Cloud Network Monitoring][7] to identify your organization's highest throughput applications. Connect to Datadog over supported private connections and send data over a private network to avoid the public internet and reduce your data transfer fees. After you switch to private links, use Datadog's [Cloud Cost Management][8] tools to verify the impact and monitor the reduction in your cloud costs.

For more information, see [How to send logs to Datadog while reducing data transfer fees][9].

@@ -122,8 +122,17 @@ For more information, see [How to send logs to Datadog while reducing data trans
[7]: /network_monitoring/cloud_network_monitoring/
[8]: /cloud_cost_management/
[9]: /logs/guide/reduce_data_transfer_fees/




{{% /tab %}}

{{% tab "Agent Check" %}}

If you are developing a custom Agent integration, you can submit logs programmatically from within your Agent check using the `send_log` method. This allows your custom integration to emit logs alongside metrics, events, and service checks.

To learn how to submit logs from your custom Agent check, see [Log Submission From Agent Checks][15].
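
As a quick illustration, a minimal sketch (the check and service names are placeholders):

```python
import time

from datadog_checks.base import AgentCheck


class MyCheck(AgentCheck):
    def check(self, instance):
        # `message` is the only required key; other keys become log attributes.
        self.send_log({
            'message': 'Check ran successfully',
            'timestamp': time.time(),
            'status': 'info',
            'service': 'my-check',
        })
```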

[15]: /logs/log_collection/agent_checks/
{{% /tab %}}
{{< /tabs >}}

191 changes: 191 additions & 0 deletions content/en/logs/log_collection/agent_checks.md
@@ -0,0 +1,191 @@
---
title: Log Submission From Agent Checks
Contributor comment:
Suggested change
title: Log Submission From Agent Checks
title: Agent Integration Log Collection

further_reading:
- link: "/developers/integrations/agent_integration/"
tag: "Documentation"
text: "Create an Agent-based Integration"
- link: "/logs/log_configuration/processors"
tag: "Documentation"
text: "Learn how to process your logs"
- link: "/logs/log_configuration/parsing"
tag: "Documentation"
text: "Learn more about parsing"
- link: "/logs/explorer/"
tag: "Documentation"
text: "Learn how to explore your logs"
- link: "https://datadoghq.dev/integrations-core/base/api/#datadog_checks.base.checks.base.AgentCheck.send_log"
tag: "API Reference"
Contributor comment:
Suggested change
tag: "API Reference"
tag: "Agent Integrations API"

text: "AgentCheck.send_log API documentation"
Contributor comment:
Suggested change
text: "AgentCheck.send_log API documentation"
text: "API parameters to send logs"

---

## Overview

When developing custom Agent integrations, you can submit logs directly to Datadog's log ingestion backend using the `send_log` method. This allows your custom checks to emit logs alongside metrics, events, and service checks.

This approach is useful both for extracting log data from the monitored application or service and for capturing logs produced by the integration check itself.

## Prerequisites

- A custom Agent integration or check. See [Create an Agent-based Integration][1] for setup instructions.
- The Datadog Agent installed and running with [log collection enabled][2].

## Using the send_log method

The `send_log` method is available on any `AgentCheck` class and allows you to submit log entries to Datadog.

### Method signature

```python
send_log(data, cursor=None, stream='default')
```

### Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | `dict[str, str]` | Yes | The log data to send. Must include at least a `message` key. |
| `cursor` | `dict[str, Any]` | No | Optional metadata associated with the log, saved to disk. Can be retrieved later with `get_log_cursor()`. |
| `stream` | `str` | No | Stream name associated with the log for cursor persistence. Only used if `cursor` is provided. Defaults to `'default'`. |

### Special keys in the data dictionary

The `data` dictionary supports the following special keys that are automatically handled by the `send_log` method:

- `timestamp`: Number of seconds since Unix epoch. Defaults to the current time if not provided.
- `ddtags`: Comma-separated string of tags. If not provided, the Agent automatically adds tags from the integration instance configuration.

All other keys in the `data` dictionary are passed through as log attributes (see the sketch after this list). Common attributes to include:

- `message`: The log message content
- `status`: Log status level (such as `info`, `error`, `warning`, `debug`)
- `service`: Service name for the log
- `source`: Source of the log (typically your integration name)
- `hostname`: Hostname associated with the log
- Any custom fields relevant to your integration
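
For illustration, a minimal sketch that combines the special keys with pass-through attributes (the tag and attribute values are hypothetical):

```python
import time

from datadog_checks.base import AgentCheck


class MyCustomCheck(AgentCheck):
    def check(self, instance):
        self.send_log({
            'message': 'Nightly sync finished',
            # Special keys handled by send_log
            'timestamp': time.time(),           # seconds since Unix epoch
            'ddtags': 'env:prod,team:storage',  # comma-separated tags (hypothetical values)
            # All other keys are passed through as log attributes
            'status': 'info',
            'service': 'my-custom-integration',
            'source': 'custom_check',
            'records_synced': 42,
        })
```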

## Example usage

### Basic log submission

```python
from datadog_checks.base import AgentCheck
import time

class MyCustomCheck(AgentCheck):
def check(self, instance):
# Submit a simple log message
self.send_log({
'message': 'Custom check executed successfully',
'timestamp': time.time(),
'status': 'info'
})
```

### Structured logging with metadata

```python
from datadog_checks.base import AgentCheck
import time

class MyCustomCheck(AgentCheck):
def check(self, instance):
# Submit a structured log with additional fields
self.send_log({
'message': 'Database query completed',
'timestamp': time.time(),
'status': 'info',
'service': 'my-custom-integration',
'source': 'custom_check',
'query_duration_ms': 145,
'rows_returned': 1024
})
```

### Using cursors for stateful logging

Cursors allow you to persist metadata across check runs, which is useful for tracking progress or maintaining state:

```python
from datadog_checks.base import AgentCheck

class MyCustomCheck(AgentCheck):
def check(self, instance):
# Retrieve the last cursor for this stream
last_cursor = self.get_log_cursor('my_stream')
last_position = last_cursor.get('position', 0) if last_cursor else 0

        # Process logs from the last position (fetch_logs_since stands in for your own retrieval logic)
new_logs = self.fetch_logs_since(last_position)

for log in new_logs:
# Submit each log with an updated cursor
self.send_log(
data={
'message': log['message'],
'timestamp': log['timestamp'],
'status': log['level']
},
cursor={'position': log['position']},
stream='my_stream'
)
```

### Error logging

```python
from datadog_checks.base import AgentCheck
import time

class MyCustomCheck(AgentCheck):
def check(self, instance):
try:
# Your check logic here
self.perform_check()
except Exception as e:
# Log the error
self.send_log({
'message': f'Check failed: {str(e)}',
'timestamp': time.time(),
'status': 'error',
'error_type': type(e).__name__,
'service': 'my-custom-integration'
})
raise
```

## View your logs

Once submitted, logs from your custom check appear in the [Log Explorer][3] alongside other logs. You can:
Contributor comment:
Suggested change
Once submitted, logs from your custom check appear in the [Log Explorer][3] alongside other logs. You can:
After submission, logs from your custom check appear in the [Log Explorer][3]. You can:


- Filter logs by `source`, `service`, or custom tags
- Parse structured log data using [log processing pipelines][4]
- Create monitors and alerts based on log content
- Correlate logs with metrics and traces from the same integration

## Best practices

- **Use structured logging**: Include additional fields in the `data` dictionary rather than embedding all information in the message string (see the sketch after this list).
- **Set appropriate status levels**: Use `error`, `warning`, `info`, or `debug` to help with filtering and alerting.
- **Include timestamps**: Always provide a `timestamp` for accurate log ordering, especially when processing historical data.
- **Tag consistently**: Use the same tagging strategy across logs, metrics, and events from your integration.
- **Use cursors for stateful processing**: When tracking progress through log sources, use cursors to avoid reprocessing data.
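
As a sketch of the structured-logging recommendation above, compare embedding details in the message string with promoting them to attributes (the field names and values are illustrative):

```python
import time

from datadog_checks.base import AgentCheck


class MyCustomCheck(AgentCheck):
    def check(self, instance):
        # Unstructured: details are buried in the message string and are hard to filter on
        self.send_log({
            'message': 'Backup finished for db-01 in 3.2s with 0 errors',
            'timestamp': time.time(),
            'status': 'info',
        })

        # Structured: the same details as separate attributes, easy to filter and aggregate
        self.send_log({
            'message': 'Backup finished',
            'timestamp': time.time(),
            'status': 'info',
            'service': 'my-custom-integration',
            'database': 'db-01',
            'duration_s': 3.2,
            'error_count': 0,
        })
```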

## Troubleshooting

If logs are not appearing in Datadog:

1. Verify that log collection is enabled in the Datadog Agent configuration.
2. Check the Agent logs for errors related to log submission.
3. Ensure your `data` dictionary includes at least a `message` key.
4. Run the [Agent's status command][5] to confirm your check is running without errors.

## Further Reading

{{< partial name="whats-next/whats-next.html" >}}

[1]: /developers/integrations/agent_integration/
[2]: /agent/logs/?tab=tailfiles#activate-log-collection
[3]: /logs/explorer/
[4]: /logs/log_configuration/processors
[5]: /agent/configuration/agent-commands/?tab=agentv6v7#agent-status-and-information