Merged
2 changes: 1 addition & 1 deletion docs/README.md
@@ -26,7 +26,7 @@ _Not all Feature Guides will require a Tutorial._

A Tutorial is meant to provide very specific steps to accomplish complex workflows and advanced use cases that are out of scope of a Feature Guide.

```diff
- Tutorials should be written to accomodate the targeted persona, i.e. Developer, Admin, End-User, etc.
+ Tutorials should be written to accommodate the targeted persona, i.e. Developer, Admin, End-User, etc.
```

_Not all Tutorials require an associated Feature Guide._

2 changes: 1 addition & 1 deletion docs/actions/concepts.md
@@ -77,7 +77,7 @@ Each Event instance inside the framework corresponds to a single **Event Type**,

Events are produced to the framework by **Event Sources**. Event Sources may include their own guarantees, configurations, behaviors, and semantics. They usually produce a fixed set of Event Types.

```diff
- In addition to sourcing events, Event Sources are also responsible for acking the succesful processing of an event by implementing the `ack` method. This is invoked by the framework once the Event is guaranteed to have reached the configured Action successfully.
+ In addition to sourcing events, Event Sources are also responsible for acking the successful processing of an event by implementing the `ack` method. This is invoked by the framework once the Event is guaranteed to have reached the configured Action successfully.
```
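The `ack` contract described in this hunk can be sketched in miniature. Everything below is illustrative — the class and method names are stand-ins, not the Actions framework's real API:

```python
from abc import ABC, abstractmethod


class EventSource(ABC):
    """Illustrative base class: produces events, acks processed ones."""

    @abstractmethod
    def events(self):
        """Yield events to the framework."""

    @abstractmethod
    def ack(self, event) -> None:
        """Called by the framework once `event` has reached the Action."""


class InMemoryEventSource(EventSource):
    def __init__(self, pending):
        self.pending = list(pending)  # events not yet acknowledged
        self.acked = []

    def events(self):
        yield from self.pending

    def ack(self, event) -> None:
        # Mark the event as successfully processed so it is not redelivered.
        self.pending.remove(event)
        self.acked.append(event)


source = InMemoryEventSource(["EntityChangeEvent:1"])
for ev in source.events():
    pass  # the framework routes each event to the configured Action...
source.ack("EntityChangeEvent:1")  # ...then acks only on success
print(source.acked)
```

In a real source, `ack` is where at-least-once progress would be committed (for example, advancing a Kafka offset) so that unprocessed events survive a crash.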

### Transformers

@@ -105,7 +105,7 @@ These changes represent the first milestone in Metadata Service Authentication.
1. **Dynamic Authenticator Plugins**: Configure + register custom Authenticator implementations, without forking DataHub.
2. **Service Accounts**: Create service accounts and generate Access tokens on their behalf.
3. **Kafka Ingestion Authentication**: Authenticate ingestion requests coming from the Kafka ingestion sink inside the Metadata Service.
```diff
- 4. **Access Token Management**: Ability to view, manage, and revoke access tokens that have been generated. (Currently, access tokens inlcude no server side state, and thus cannot be revoked once granted)
+ 4. **Access Token Management**: Ability to view, manage, and revoke access tokens that have been generated. (Currently, access tokens include no server side state, and thus cannot be revoked once granted)
```

...and more! To advocate for these features or others, reach out on [Slack](https://datahubspace.slack.com/join/shared_invite/zt-nx7i0dj7-I3IJYC551vpnvvjIaNRRGw#/shared-invite/email).

4 changes: 2 additions & 2 deletions docs/authentication/personal-access-tokens.md
@@ -86,11 +86,11 @@ is enabled.
## Additional Resources

- Learn more about how this feature works in [Metadata Service Authentication](introducing-metadata-service-authentication.md).
```diff
- - Check out our [Authorization Policies](../authorization/policies.md) to see what permissions can be programatically used.
+ - Check out our [Authorization Policies](../authorization/policies.md) to see what permissions can be programmatically used.
```

### GraphQL

```diff
- - Have a look at [Token Management in GraphQL](../api/graphql/token-management.md) to learn how to manage tokens programatically!
+ - Have a look at [Token Management in GraphQL](../api/graphql/token-management.md) to learn how to manage tokens programmatically!
```
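As a sketch of what "programmatically" can look like here — the endpoint path and the exact mutation shape are assumptions to verify against the Token Management guide for your version — a GraphQL request to revoke a token could be assembled like this:

```python
import json

# Hypothetical deployment URL; substitute your own instance's address.
GRAPHQL_URL = "http://localhost:8080/api/graphql"

# Mutation name per the Token Management guide; treat the exact
# fields as an assumption and confirm against the guide.
MUTATION = """
mutation {
  revokeAccessToken(tokenId: "<token-id>")
}
"""


def build_request(bearer_token: str) -> dict:
    # The bearer token used here must itself carry the privilege
    # to manage access tokens.
    return {
        "url": GRAPHQL_URL,
        "headers": {
            "Authorization": f"Bearer {bearer_token}",
            "Content-Type": "application/json",
        },
        "data": json.dumps({"query": MUTATION}),
    }


req = build_request("<personal-access-token>")
# To actually send it: requests.post(**req)
```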

## FAQ and Troubleshooting

2 changes: 1 addition & 1 deletion docs/managed-datahub/release-notes/v_0_3_12.md
@@ -82,7 +82,7 @@ This release includes all changes up to and including DataHub Core v1.1.0.

- Product

```diff
- - [**AI-Generated Documentaton**](../../automations/ai-docs.md) is now in Public Beta. Turn it on by going to **Settings > AI** and enabling **Documentation AI**.
+ - [**AI-Generated Documentation**](../../automations/ai-docs.md) is now in Public Beta. Turn it on by going to **Settings > AI** and enabling **Documentation AI**.
```
- [**DataHub Slack Assistant**](../slack/saas-slack-app.md#datahub-slack-bot) is now in Private Beta. Reach out to your DataHub representative to get access.
- [**Hosted MCP Server**](../../features/feature-guides/mcp.md): DataHub is now a remote MCP server.
- **Data Health Dashboard V2** is here, featuring enhanced filtering capabilities for an improved user experience.
4 changes: 2 additions & 2 deletions docs/managed-datahub/release-notes/v_0_3_13.md
@@ -121,7 +121,7 @@ This release includes all changes up to and including DataHub Core v1.2.0.
- [**DataHub AI in Slack**](../slack/saas-slack-app.md#datahub-ai-in-slack): The AI-powered `@DataHub` Slack command is now available in public beta. Admins can enable this feature by navigating to UI → Settings → AI.
- **Customizable Home Page**: Introducing a brand new home page experience with customization to suit your personal or organizational needs! Currently in private beta behind a feature flag, this new home page allows users with permission to create or edit modules for a custom default experience for all users in your organization. Alternatively, users can individually update their own personal home page to suit their needs. Configure custom asset collections, hierarchy views, documentation, pinned links and more!
- **Entity Profile Design Updates**: Entity profile pages receive a tabs design uplift in this release with sleeker, simpler-looking tabs that bring a more consistent feel to the page.
```diff
- - **Access Worfklows**: Introducing support for creating access approval workflows with custom entry points, custom form fields, routing policies, and more using the `upsertActionWorkflow` GraphQL API. Also introduced support for creating & reviewing access workflows via the **Task Center**. This is in private beta currently, available behind a feature flag (ACTION_WORKFLOWS_ENABLED).
+ - **Access Workflows**: Introducing support for creating access approval workflows with custom entry points, custom form fields, routing policies, and more using the `upsertActionWorkflow` GraphQL API. Also introduced support for creating & reviewing access workflows via the **Task Center**. This is in private beta currently, available behind a feature flag (ACTION_WORKFLOWS_ENABLED).
```
- **Bulk Create Field Metric Smart Assertions**: When creating a field metric assertion, you now have the ability to 'Bulk create smart assertions'. This allows you to select multiple fields and metrics and spin up anomaly monitors across all of them in one go.
- **Bulk Create Freshness and Volume Smart Assertions**: On the Data Health page you can now create smart freshness and volume assertions across thousands of tables in one go, making it effortless to blanket your landscape with anomaly monitors.
- **Improved Notifications for Assertion Failures**: Slack and email alerts for assertion failures will now include context around expected vs. actual values, making it easier to separate signal from noise right where you work.
@@ -131,7 +131,7 @@ This release includes all changes up to and including DataHub Core v1.2.0.
- **Container filters on Data Health dashboard**: Filter your data health dashboard by the asset's container, making it easy to see health of specific schemas in your database.
- **Data Health Filters reflected in URL**: This makes it easy to bookmark and share links to specific filtered pages on the Data Health dashboard.
- [**MCP Server**](../../features/feature-guides/mcp.md): The search tool has been revamped to improve LLM understanding and reduce tool confusion / tool call error by ~60%.
```diff
- - [**AI-Generated Documentaton**](../../automations/ai-docs.md): We can now generate docs for tables with up to 3000 columns, increasing the previous limit of 1000.
+ - [**AI-Generated Documentation**](../../automations/ai-docs.md): We can now generate docs for tables with up to 3000 columns, increasing the previous limit of 1000.
```
- **Upstream Propagation** The tag and glossary term propagation automations now support propagating via lineage upstream. This feature is still
in open beta; reach out to your DataHub Cloud representative to get access.

2 changes: 1 addition & 1 deletion docs/managed-datahub/slack/saas-slack-app.md
@@ -96,7 +96,7 @@ Some of the most commonly used features within our Slack app are the Incidents m
The DataHub UI offers a rich set of [Incident tracking and management](https://docs.datahub.com/docs/incidents/incidents/) features.
When a Slack member or channel receives notifications about an Incident, many of these features are made accessible right within the Slack app.

```diff
- When an incident is raised, you will recieve rich context about the incident in the Slack message itself. You will also be able to `Mark as Resolved`, update the `Priorty`, set a triage `Stage` and `View Details` - directly from the Slack message.
+ When an incident is raised, you will receive rich context about the incident in the Slack message itself. You will also be able to `Mark as Resolved`, update the `Priority`, set a triage `Stage` and `View Details` - directly from the Slack message.
```

<p align="center">
  <img width="70%" alt="Example of an incident notification with management actions within Slack." src="https://raw.githubusercontent.com/datahub-project/static-assets/main/imgs/saas/slack/slack_incidents_1.png" />
</p>
2 changes: 1 addition & 1 deletion docs/managed-datahub/slack/saas-slack-setup.md
@@ -55,7 +55,7 @@ Now proceed to the [Subscriptions and Notifications page](https://docs.datahub.c

### DataHub Slack bot permissions

```diff
- The DataHub Slack bot requires a certain set of scopes (permissions) to function. We've listed them below with thier explanations.
+ The DataHub Slack bot requires a certain set of scopes (permissions) to function. We've listed them below with their explanations.
```

```
# Required for slash commands / shortcuts.
```
6 changes: 3 additions & 3 deletions docs/managed-datahub/subscription-and-notification.md
@@ -38,7 +38,7 @@ Notifying on tag changes for every asset in the platform would be noisy, and so

## Prerequisites

```diff
- Once you have [configured Slack within your DataHub instance](slack/saas-slack-setup.md), you will be able to subscribe to any Entity in DataHub and begin recieving notifications via DM.
+ Once you have [configured Slack within your DataHub instance](slack/saas-slack-setup.md), you will be able to subscribe to any Entity in DataHub and begin receiving notifications via DM.
```

To begin receiving personal notifications, go to Settings > "My Notifications". From here, toggle on Slack Notifications and input your Slack Member ID.

@@ -157,9 +157,9 @@ Then select individual assertions you'd like to subscribe to:

<p align="center">
  <img width="70%" alt="7" src="https://raw.githubusercontent.com/datahub-project/static-assets/main/imgs/saas/subscription-and-notification/s_n-assertion-sub-resub-1.jpg" />
</p>

```diff
- ## Programatically Managing Subscriptions
+ ## Programmatically Managing Subscriptions
```

```diff
- You can create and remove subscriptions programatically using the [GraphQL APIs](/docs/api/graphql/overview.md) or the [Python Subscriptions SDK](/docs/api/tutorials/subscriptions.md).
+ You can create and remove subscriptions programmatically using the [GraphQL APIs](/docs/api/graphql/overview.md) or the [Python Subscriptions SDK](/docs/api/tutorials/subscriptions.md).
```
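As a shape-only sketch of programmatic subscription management — every name below is a hypothetical stand-in, not the real SDK or GraphQL surface, which is documented in the links above:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Subscription:
    """Hypothetical model: who gets notified about which entity changes."""
    entity_urn: str
    change_types: List[str] = field(default_factory=lambda: ["ASSERTION_FAILED"])


class SubscriptionClient:
    """Hypothetical client; consult the linked SDK tutorial for real names."""

    def __init__(self):
        self._subs = {}

    def create(self, sub: Subscription) -> str:
        # Return a server-assigned id the caller can later remove by.
        sub_id = f"sub-{len(self._subs) + 1}"
        self._subs[sub_id] = sub
        return sub_id

    def remove(self, sub_id: str) -> None:
        self._subs.pop(sub_id)


client = SubscriptionClient()
sub_id = client.create(
    Subscription(
        entity_urn="urn:li:dataset:(urn:li:dataPlatform:snowflake,db.tbl,PROD)"
    )
)
client.remove(sub_id)
```

The real APIs additionally let you scope a subscription to specific change types or individual assertions, as described in the sections above.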

## FAQ

2 changes: 1 addition & 1 deletion metadata-ingestion/docs/sources/fivetran/fivetran_pre.md
@@ -65,7 +65,7 @@ grant role fivetran_datahub to user snowflake_user;
If you have multiple instances of source/destination systems referred to in your `fivetran` setup, you need to configure platform instances for these systems in the `fivetran` recipe to generate correct lineage edges. Refer to [Working with Platform Instances](https://docs.datahub.com/docs/platform-instances) to understand more about this.

When configuring the platform instance for a source system, provide the connector id as the key; for a destination system, provide the destination id as the key.
```diff
- When creating the conection details in the fivetran UI make a note of the destination Group ID of the service account, as that will need to be used in the `destination_to_platform_instance` configuration.
+ When creating the connection details in the fivetran UI make a note of the destination Group ID of the service account, as that will need to be used in the `destination_to_platform_instance` configuration.
```
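A sketch of the corresponding recipe fragment, written here as a Python dict for illustration — the two mapping field names follow the Fivetran source config, while the ids, instance names, and env values are placeholders:

```python
# Keys under sources_to_platform_instance are Fivetran *connector ids*;
# the key under destination_to_platform_instance is the *destination id*
# (the Group ID noted in the Fivetran UI).
fivetran_recipe = {
    "source": {
        "type": "fivetran",
        "config": {
            "sources_to_platform_instance": {
                "<connector-id>": {
                    "platform_instance": "my_postgres",
                    "env": "PROD",
                },
            },
            "destination_to_platform_instance": {
                "<destination-id>": {
                    "platform_instance": "my_snowflake",
                    "env": "PROD",
                },
            },
        },
    }
}
```

In an actual recipe this would be written as YAML; the nesting is the same.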
@@ -194,7 +194,7 @@ class FivetranSourceConfig(StatefulIngestionConfigBase, DatasetSourceConfigMixin

```diff
     # Configuration for stateful ingestion
     stateful_ingestion: Optional[StatefulStaleMetadataRemovalConfig] = pydantic.Field(
-        default=None, description="Airbyte Stateful Ingestion Config."
+        default=None, description="Fivetran Stateful Ingestion Config."
     )

     # Fivetran connector all sources to platform instance mapping
```
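The `description` string fixed in this last hunk is metadata attached to a config field, which is why the Airbyte-to-Fivetran copy-paste fix matters: it surfaces in generated docs. A stdlib stand-in for the same pattern, using `dataclasses` instead of pydantic to stay dependency-free:

```python
from dataclasses import dataclass, field, fields
from typing import Optional


@dataclass
class SketchConfig:
    # Mirrors the pydantic pattern above: the description is attached
    # to the field itself and can be harvested for documentation.
    stateful_ingestion: Optional[dict] = field(
        default=None,
        metadata={"description": "Fivetran Stateful Ingestion Config."},
    )


# Harvest field descriptions the way a docs generator would.
descriptions = {f.name: f.metadata["description"] for f in fields(SketchConfig)}
print(descriptions["stateful_ingestion"])
```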