Adding llm_chat function to starlark stored procedure. #22300

Merged: 3 commits merged into matrixorigin:main on Aug 12, 2025

Conversation

@fengttt (Contributor) commented Aug 4, 2025

User description

What type of PR is this?

  • API-change
  • BUG
  • Improvement
  • Documentation
  • Feature
  • Test and CI
  • Code Refactoring

Which issue(s) this PR fixes:

issue #22299

What this PR does / why we need it:

Add llm_chat to Starlark stored procedures.


PR Type

Enhancement


Description

  • Add LLM integration to Starlark stored procedures

  • Support mock and Ollama LLM clients

  • Enable llm_connect and llm_chat functions (a registration sketch follows this list)

  • Include comprehensive tests for LLM functionality

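The registration code itself is not quoted on this page. As a rough sketch, assuming the interpreter is built on go.starlark.net (the package the quoted review snippets use), exposing these functions on a mo module might look like the following; the function body here is a hypothetical echo stand-in, not the PR's implementation.

package main

import (
	"fmt"

	"go.starlark.net/starlark"
	"go.starlark.net/starlarkstruct"
)

// Hypothetical stand-in for the interpreter's llm_chat builtin; the real
// implementation returns a [result, error] list per the review snippets below.
func llmChat(t *starlark.Thread, b *starlark.Builtin,
	args starlark.Tuple, kwargs []starlark.Tuple) (starlark.Value, error) {
	var prompt string
	if err := starlark.UnpackPositionalArgs("llm_chat", args, kwargs, 1, &prompt); err != nil {
		return nil, err
	}
	return starlark.String("echo: " + prompt), nil // mock echo behavior
}

func main() {
	// Register the builtin on a `mo` module, as the diagram's mo.llm_chat() implies.
	mo := &starlarkstruct.Module{
		Name:    "mo",
		Members: starlark.StringDict{"llm_chat": starlark.NewBuiltin("llm_chat", llmChat)},
	}
	globals := starlark.StringDict{"mo": mo}
	v, err := starlark.Eval(&starlark.Thread{Name: "demo"}, "demo.star",
		`mo.llm_chat("hello")`, globals)
	fmt.Println(v, err) // "echo: hello" <nil>
}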

Diagram Walkthrough

flowchart LR
  A["Starlark Interpreter"] --> B["LLM Client Interface"]
  B --> C["Mock Client"]
  B --> D["Ollama Client"]
  C --> E["Echo Model"]
  D --> F["External LLM"]
  A --> G["mo.llm_connect()"]
  A --> H["mo.llm_chat()"]

File Walkthrough

Relevant files:

Enhancement (4 files)
  • starlark_interpreter.go: Add LLM functions to Starlark module (+73/-5)
  • llm.go: Core LLM client interface and factory (+63/-0)
  • mockllm.go: Mock LLM client implementation (+54/-0)
  • ollama.go: Ollama LLM client implementation (+105/-0)

Tests (4 files)
  • mockllm_test.go: Unit tests for mock LLM client (+74/-0)
  • ollama_test.go: Unit tests for Ollama client (+80/-0)
  • starlark_llm.result: Expected test results for LLM procedures (+27/-0)
  • starlark_llm.sql: Integration tests for LLM functionality (+40/-0)

Dependencies (2 files)
  • go.mod: Add langchaingo dependency for LLM support (+9/-6)
  • go.sum: Update dependency checksums (+43/-36)
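For orientation, here is a minimal sketch of the shape llm.go plausibly has, inferred from the mock/Ollama split in the diagram and the (server, addr, model, options) arguments quoted in the review below. The Client interface, NewClient factory, and echo behavior are assumptions, not the actual source.

package llm

import (
	"context"
	"fmt"
)

// Client is the assumed common interface behind the mock and Ollama backends.
type Client interface {
	Chat(ctx context.Context, prompt string) (string, error)
}

// echoClient stands in for mockllm.go's echo model: it returns the prompt.
type echoClient struct{}

func (echoClient) Chat(_ context.Context, prompt string) (string, error) {
	return prompt, nil
}

// NewClient is an assumed factory matching the (server, addr, model, options)
// arguments seen in llm_connect; only the mock path is sketched here.
func NewClient(server, addr, model, options string) (Client, error) {
	switch server {
	case "", "mock":
		return echoClient{}, nil
	case "ollama":
		return nil, fmt.Errorf("ollama client omitted in this sketch (addr=%s model=%s)", addr, model)
	default:
		return nil, fmt.Errorf("unknown llm server: %q", server)
	}
}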

@matrix-meow added the size/L label (denotes a PR that changes [500,999] lines) on Aug 4, 2025
@mergify bot added the kind/feature label on Aug 4, 2025

qodo-merge-pro bot commented Aug 4, 2025

PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 4 🔵🔵🔵🔵⚪
🧪 PR contains tests
🔒 No security concerns identified
⚡ Recommended focus areas for review

Error Handling

The moLlmConnect function passes a minimum of 3 to UnpackPositionalArgs while supplying four destinations (server, addr, model, options), which makes options an optional positional argument; if a fourth argument was meant to be required, calls that omit it will silently proceed with an empty options string.

if err := starlark.UnpackPositionalArgs("llm_connect", args, kwargs, 3, &server, &addr, &model, &options); err != nil {
	ret[1] = starlark.String(err.Error())
	return starlark.NewList(ret), nil
}
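For reference, go.starlark.net's UnpackPositionalArgs treats the integer as the minimum number of required positional arguments, so min=3 with four destinations makes the fourth optional rather than a hard mismatch. A small self-contained demonstration, with placeholder argument values:

package main

import (
	"fmt"

	"go.starlark.net/starlark"
)

func main() {
	var server, addr, model, options string
	// Three required positionals plus one optional: with min=3 and four
	// destinations, a call passing only three arguments leaves options "".
	args := starlark.Tuple{starlark.String("ollama"),
		starlark.String("127.0.0.1:11434"), starlark.String("llama3")}
	err := starlark.UnpackPositionalArgs("llm_connect", args, nil, 3,
		&server, &addr, &model, &options)
	fmt.Println(server, addr, model, options == "", err)
	// ollama 127.0.0.1:11434 llama3 true <nil>
}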
Nil Check

The moLlmChat function automatically creates a default LLM client when si.llm is nil, but this fallback behavior may not be intuitive to users and could mask configuration issues.

if si.llm == nil {
	si.llm, err = si.llmConnect("", "", "", "")
	if err != nil {
		ret[1] = starlark.String(err.Error())
		return starlark.NewList(ret), nil
	}
}
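A hedged alternative, reusing the names from the snippet above: fail fast so a missing configuration surfaces as an explicit error instead of a silent mock fallback.

// Hypothetical variant: require an explicit mo.llm_connect() first, so a
// missing configuration surfaces as an error instead of a default client.
if si.llm == nil {
	ret[1] = starlark.String("llm_chat: no client configured; call mo.llm_connect() first")
	return starlark.NewList(ret), nil
}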
JSON Parsing

The Chat method attempts to parse the prompt as JSON messages but provides no validation and no clear error message when the JSON format is invalid, which can leave users with confusing failures.

func (c *MockClient) Chat(ctx context.Context, prompt string) (string, error) {
	messages, err := stringToMessage(prompt)
	if err != nil {
		return "", err
	}
	return c.ChatMsg(ctx, messages)
}
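The prompt's wire format is not shown on this page; assuming a JSON array of {role, content} objects, a common chat-API convention, here is a sketch of a stringToMessage variant that fails with a clearer message:

package llm

import (
	"encoding/json"
	"fmt"
)

// chatMessage is an assumed shape for one entry in the prompt's JSON array.
type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// stringToMessage (hypothetical reimplementation) wraps the JSON error so the
// user sees what format was expected instead of a bare unmarshal failure.
func stringToMessage(prompt string) ([]chatMessage, error) {
	var msgs []chatMessage
	if err := json.Unmarshal([]byte(prompt), &msgs); err != nil {
		return nil, fmt.Errorf(`llm_chat: prompt must be a JSON array of {"role","content"} messages: %w`, err)
	}
	return msgs, nil
}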


mergify bot commented Aug 12, 2025

This pull request has been removed from the queue for the following reason: checks failed.

The merge conditions cannot be satisfied due to failing checks.

You may have to fix your CI before adding the pull request to the queue again.
If you update this pull request to fix the CI, it will automatically be requeued once the queue conditions match again.
If you think this was a flaky issue instead, you can requeue the pull request without updating it by posting a @mergifyio requeue comment.

@mergify bot merged commit 7c1dce1 into matrixorigin:main on Aug 12, 2025
19 checks passed
Labels: kind/feature, Review effort 4/5, size/L (denotes a PR that changes [500,999] lines)
6 participants