New tools for Chat Completion CE in-db function (issue #242) #243
## Overview
This PR adds a new Chat Completion module to the Teradata MCP Server that exposes the recently released OpenAI API Connectivity Functions as MCP tools and prompts.
The goal is to make it easy for AI assistants and MCP clients to call Teradata's `CompleteChat` function and run large language models over Teradata data, whether those models are deployed on‑prem, in the cloud, or behind custom gateways.

## What's included
### 1. New `chat_cmplt` tool module

- `chat_cmplt_completeChat` – runs the `CompleteChat` function over a user‑provided SQL query that returns a `txt` column.
- `chat_cmplt_aggregatedCompleteChat` – runs `CompleteChat` and returns aggregated results: unique `response_txt` values with counts.

Both tools are driven by a simple YAML config (`chat_cmplt_config.yml`) where you specify:

- the endpoint (`base_url`) and model (`model`)
- the database (`function_db`) where `CompleteChat` is installed

If required values are missing, the module logs a clear configuration error and leaves the tools disabled rather than failing the server.
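As a rough illustration of that configuration contract, here is a minimal Python sketch of loading and validating `chat_cmplt_config.yml`. The file name and the `base_url`, `model`, and `function_db` keys come from this PR; the function and logger names below are illustrative, not the module's actual code.

```python
# Minimal sketch of the config loading/validation behavior described above.
import logging
from pathlib import Path

import yaml

logger = logging.getLogger("teradata_mcp_server.chat_cmplt")

REQUIRED_KEYS = ("base_url", "model", "function_db")


def load_chat_cmplt_config(path: str = "chat_cmplt_config.yml") -> dict | None:
    """Return the config dict, or None (tools stay disabled) if it is unusable."""
    cfg_path = Path(path)
    if not cfg_path.exists():
        logger.error("chat_cmplt: config file %s not found; tools disabled", path)
        return None
    cfg = yaml.safe_load(cfg_path.read_text()) or {}
    missing = [key for key in REQUIRED_KEYS if not cfg.get(key)]
    if missing:
        logger.error("chat_cmplt: missing config values %s; tools disabled", missing)
        return None
    return cfg
```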
### 2. Chat Completion prompt(s)

- `chat_cmplt_ai_mapreduce` – a prompt built around `chat_cmplt_aggregatedCompleteChat`. This gives users a "question → SQL → LLM distribution → summarized insight" flow without needing to hand‑craft prompts or SQL.
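For readers unfamiliar with MCP prompts, the snippet below is a hypothetical sketch of how a map‑reduce style prompt like this could be expressed with the MCP Python SDK's `FastMCP`. Only the `chat_cmplt_ai_mapreduce` and `chat_cmplt_aggregatedCompleteChat` names come from the PR; the prompt wording and server name are illustrative.

```python
# Hypothetical sketch of a "question -> SQL -> LLM distribution -> summary" prompt.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("teradata-chat-cmplt-demo")


@mcp.prompt(name="chat_cmplt_ai_mapreduce")
def chat_cmplt_ai_mapreduce(question: str) -> str:
    """Guide the assistant from a business question to a summarized LLM insight."""
    return (
        f"User question: {question}\n"
        "1. Write a Teradata SQL query that returns the relevant rows in a column named txt.\n"
        "2. Call the chat_cmplt_aggregatedCompleteChat tool with that query.\n"
        "3. Summarize the unique response_txt values and their counts into a single insight."
    )
```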
### 3. Wiring into the MCP server

- The new module is registered with the MCP server, including a new `llmUser` profile focused on LLM workflows.
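As a hypothetical sketch of the wiring idea (assuming the config loader from the earlier sketch and the MCP Python SDK's `FastMCP`): tools are registered only when the config loaded successfully, so a broken config disables them instead of failing the server. The repository's actual registration and profile mechanism may differ.

```python
# Hypothetical wiring sketch; not the server's actual registration code.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("teradata-demo")
config = load_chat_cmplt_config()  # from the earlier sketch

if config is not None:

    @mcp.tool(name="chat_cmplt_completeChat")
    def chat_cmplt_complete_chat(sql: str) -> str:
        """Run CompleteChat over the rows returned by `sql` (must expose a txt column)."""
        # Illustrative placeholder: the real tool would invoke the CompleteChat
        # function in config["function_db"] against config["base_url"] / config["model"].
        return f"would run CompleteChat in {config['function_db']} over: {sql}"
```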