Copilot AI commented Aug 17, 2025

This PR implements LLM resource configuration to enable enhanced features in the DocumentDB extension, specifically the explainCommand and performanceInsight operations.

Overview

The implementation adds support for configuring Azure OpenAI and OpenAI resources through a user-friendly wizard, with automatic prompts when LLM-enhanced features are accessed without proper configuration.

Key Features

🔧 LLM Configuration Service

  • Secure Storage: API keys are stored using VS Code's built-in secret storage mechanism
  • Multi-Provider Support: Supports both Azure OpenAI and OpenAI providers
  • Global Settings: Provider and endpoint configuration stored in VS Code settings
  • Validation: Proper input validation with provider-specific requirements
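As a rough sketch of how such a service could be shaped (all names here are illustrative, not taken from the PR's actual source), with VS Code's SecretStorage reduced to the two methods used so the snippet is self-contained:

```typescript
// Minimal subset of vscode.SecretStorage, so the sketch runs standalone.
interface SecretStore {
  get(key: string): Promise<string | undefined>;
  store(key: string, value: string): Promise<void>;
}

type LlmProvider = 'azureOpenAI' | 'openAI';

class LlmConfigurationService {
  // Hypothetical secret-storage key; the real key name may differ.
  private static readonly API_KEY_SECRET = 'documentdb.llm.apiKey';

  constructor(private readonly secrets: SecretStore) {}

  // API keys go into secret storage, never into settings.json.
  async storeApiKey(key: string): Promise<void> {
    await this.secrets.store(LlmConfigurationService.API_KEY_SECRET, key);
  }

  async getApiKey(): Promise<string | undefined> {
    return this.secrets.get(LlmConfigurationService.API_KEY_SECRET);
  }
}

// In-memory stand-in for vscode.ExtensionContext.secrets, for illustration.
class InMemorySecrets implements SecretStore {
  private map = new Map<string, string>();
  async get(key: string) { return this.map.get(key); }
  async store(key: string, value: string) { this.map.set(key, value); }
}
```

In the extension itself the `secrets` parameter would be `context.secrets` from the `ExtensionContext`, while the provider and endpoint would live in regular VS Code settings as described above.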

⚙️ Configuration Wizard

  • Step-by-step Setup: Follows the project's established wizard pattern with separate prompt steps
  • Provider Selection: Clear options between Azure OpenAI and OpenAI with descriptions
  • Endpoint Validation: Ensures HTTPS endpoints and validates Azure OpenAI requirements
  • API Key Security: Password-masked input with provider-specific validation (e.g., OpenAI keys must start with "sk-")
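The validation rules above could be expressed as small pure functions along these lines (a hedged sketch; the wizard's actual checks and error messages may differ). Following the VS Code input-box convention, each validator returns an error string, or undefined when the input is valid:

```typescript
type LlmProvider = 'azureOpenAI' | 'openAI';

// Endpoint must be HTTPS; Azure OpenAI endpoints are additionally expected
// under the *.openai.azure.com domain (an assumption for this sketch).
function validateEndpoint(endpoint: string, provider: LlmProvider): string | undefined {
  if (!endpoint.startsWith('https://')) {
    return 'Endpoint must use HTTPS';
  }
  if (provider === 'azureOpenAI' && !endpoint.includes('.openai.azure.com')) {
    return 'Azure OpenAI endpoints are expected under *.openai.azure.com';
  }
  return undefined;
}

// OpenAI keys carry an "sk-" prefix; Azure OpenAI keys do not.
function validateApiKey(key: string, provider: LlmProvider): string | undefined {
  if (key.trim().length === 0) {
    return 'API key cannot be empty';
  }
  if (provider === 'openAI' && !key.startsWith('sk-')) {
    return 'OpenAI API keys must start with "sk-"';
  }
  return undefined;
}
```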

🎯 Enhanced Command Integration

Both explainCommand and performanceInsight now check for LLM configuration:

```typescript
// Before execution, commands check LLM configuration
const isLlmConfigured = await ensureLlmConfigured(context, 'explainCommand');
if (!isLlmConfigured) {
    return; // User declined or configuration failed
}
```

When LLM is not configured, users see a helpful dialog:

  • Title: "LLM Enhanced Features Not Available"
  • Message: Explains the need for LLM configuration with clear next steps
  • Options: "Configure LLM" or "Not Now"
  • Seamless Flow: Configuration completion automatically returns to the original command
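A plausible shape for the ensureLlmConfigured helper is sketched below. The dialog and wizard are injected as functions so the control flow is visible without the vscode module; in the extension, the dialog would be vscode.window.showInformationMessage and the helper would take the extension context and command name instead:

```typescript
// Injected stand-in for vscode.window.showInformationMessage.
type ShowDialog = (message: string, ...items: string[]) => Promise<string | undefined>;

async function ensureLlmConfigured(
  isConfigured: () => Promise<boolean>,   // e.g. checks settings + secret storage
  runWizard: () => Promise<boolean>,      // launches the configuration wizard
  showDialog: ShowDialog,
): Promise<boolean> {
  if (await isConfigured()) {
    return true; // already set up, proceed directly
  }
  const choice = await showDialog(
    'LLM Enhanced Features Not Available. Configure an LLM resource to continue.',
    'Configure LLM',
    'Not Now',
  );
  if (choice !== 'Configure LLM') {
    return false; // user declined or dismissed the dialog
  }
  // On success the caller resumes the original command automatically.
  return runWizard();
}
```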

📦 VS Code Integration

  • Command Palette: New command "DocumentDB: Configure LLM Resource…"
  • Settings UI: Proper configuration properties with enum dropdowns and descriptions
  • Icons: Appropriate icons for configuration commands ($(settings-gear))
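The package.json contributions might look roughly like this (the command ID, setting keys, and enum values are guesses based on the description above, not the PR's actual identifiers):

```json
{
  "contributes": {
    "commands": [
      {
        "command": "documentdb.configureLlm",
        "title": "DocumentDB: Configure LLM Resource…",
        "icon": "$(settings-gear)"
      }
    ],
    "configuration": {
      "properties": {
        "documentdb.llm.provider": {
          "type": "string",
          "enum": ["azureOpenAI", "openAI"],
          "description": "LLM provider used for enhanced features"
        },
        "documentdb.llm.endpoint": {
          "type": "string",
          "description": "HTTPS endpoint of the LLM resource"
        }
      }
    }
  }
}
```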

Implementation Details

Files Added

  • src/services/LlmConfigurationService.ts - Core service for LLM configuration management
  • src/commands/configureLlm/ - Complete wizard implementation following project patterns
  • src/utils/llmHelpers.ts - Utility functions for LLM configuration checks

Files Modified

  • src/commands/scrapbook-commands/explainCommand.ts - Added LLM configuration check
  • src/commands/performanceInsight/performanceInsight.ts - Added LLM configuration check
  • src/documentdb/ClustersExtension.ts - Registered new command
  • src/extensionVariables.ts - Added settings keys
  • package.json - Added commands and configuration properties

Code Quality

  • ✅ TypeScript strict mode compliance
  • ✅ All ESLint checks pass
  • ✅ Proper error handling and user experience
  • ✅ Internationalization support with vscode.l10n.t()
  • ✅ Follows established project patterns and conventions
  • ✅ Minimal, surgical changes to existing code

User Experience

  1. User runs explainCommand or performanceInsight without LLM configured
  2. Extension shows informative dialog asking if they want to configure LLM
  3. If user selects "Configure LLM", the wizard guides them through setup
  4. Upon successful configuration, the original command continues automatically
  5. Future command executions proceed directly with enhanced LLM features

Fixes #224.


Copilot AI changed the title [WIP] [MVP ONLY] Config LLM Resource in Extension [MVP] Implement LLM Resource Configuration for Enhanced Features Aug 17, 2025
Copilot finished work on behalf of xingfan-git August 17, 2025 17:23
Copilot AI requested a review from xingfan-git August 17, 2025 17:23