This repository contains the prototype for the Universal Discovery Interface (UDI) chatbot.
The chatbot responds to queries with visualizations of the available datasets. This interface is useful as we develop and experiment with the UDI-Grammar.
## Install the dependencies

```bash
yarn
# or
npm install
```

### Start the app in development mode

```bash
quasar dev
```

### Lint the files

```bash
yarn lint
# or
npm run lint
```

### Format the files

```bash
yarn format
# or
npm run format
```

### Build the app for production

```bash
quasar build
```

### Customize the configuration

See [Configuring quasar.config.js](https://v2.quasar.dev/quasar-cli-vite/quasar-config-js).
## Environment variables

| Variable | Description | Default |
|---|---|---|
| `VITE_LLM_API_BASE_URL` | Base URL for the LLM API. | `http://localhost` |
| `VITE_LLM_API_PORT` | Port where the LLM API is running. | `55001` |
| `VITE_DATA_PACKAGE_PATH` | Path to the data package that lists all data resources available to visualize. | `./data/hubmap_2025-05-05/datapackage_udi.json` |
| `VITE_PRODUCTION` | Toggles production mode; set to `true` to hide debug features. | `false` |
| `VITE_BENCHMARK_ENDPOINT_URL` | Path or endpoint for benchmark analysis results. | `./data/benchmark/benchmark_analysis.json` |
| `VITE_AUTH_TOKEN` | Token for validation on the backend. | `""` |
| `VITE_ENABLE_CUSTOM_API_KEY` | When `true`, allows users to provide their own API key via the UI. | `false` |
| `VITE_CUSTOM_API_KEY_VALIDATION_URL` | Endpoint used to validate user-provided API keys. | `https://api.openai.com/v1/models` |
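Following the standard Vite convention, `VITE_`-prefixed variables are exposed to client code on `import.meta.env`. A minimal sketch of composing the LLM endpoint from the two URL variables; the `buildLlmUrl` helper and its trailing-slash handling are illustrative, not taken from the repository:

```typescript
// Compose the LLM API endpoint from the two env vars above.
// In app code the values would come from import.meta.env.VITE_LLM_API_BASE_URL
// and import.meta.env.VITE_LLM_API_PORT; they are parameters here so the
// helper stays easy to test in isolation.
function buildLlmUrl(baseUrl: string, port: string): string {
  // Strip trailing slashes so "http://localhost/" and "http://localhost"
  // produce the same endpoint.
  const trimmed = baseUrl.replace(/\/+$/, "");
  return `${trimmed}:${port}`;
}

// Example with the defaults from the table:
// buildLlmUrl("http://localhost", "55001") → "http://localhost:55001"
```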
Example `.env` file:

```bash
# .env

# LLM API configuration
VITE_LLM_API_BASE_URL=http://localhost
VITE_LLM_API_PORT=8080

# Data configuration
VITE_DATA_PACKAGE_PATH=./data/my_awesome_data/datapackage.json

# Mode toggle
VITE_PRODUCTION=true

# Enable user-provided API key input in the chat UI
# VITE_ENABLE_CUSTOM_API_KEY=true
```
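When `VITE_ENABLE_CUSTOM_API_KEY` is enabled, a key entered in the UI can be checked against `VITE_CUSTOM_API_KEY_VALIDATION_URL`. A minimal sketch, assuming the endpoint accepts a Bearer token and signals validity through the HTTP status code; the `validateApiKey` helper is illustrative, not part of the repository:

```typescript
// Returns true when the validation endpoint accepts the key.
// Assumption: a 2xx response from the validation URL (e.g. the default
// https://api.openai.com/v1/models) means the Bearer token is valid,
// while 401/403 (or a network failure) means it is not.
async function validateApiKey(
  key: string,
  validationUrl: string,
): Promise<boolean> {
  try {
    const res = await fetch(validationUrl, {
      headers: { Authorization: `Bearer ${key}` },
    });
    return res.ok; // res.ok is true for statuses 200-299
  } catch {
    // Unreachable endpoint: report the key as not validated instead of throwing.
    return false;
  }
}
```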
