All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, adheres to Semantic Versioning, and is generated by Changie.
Pre-release Changes
- Improved loading/accept/reject UI for apply
- Polished chat sidebar, especially context dropdown
- Expanded prompt caching support for Anthropic models
- Updated tutorial file
- Improved styling on "More" page
- Fixed an auth bug in Continue for Teams
- Fixed a number of apply bugs
- Fixed autoscrolling behavior
- NVIDIA NIMs embeddings provider
- SambaNova LLM provider
- Support for OpenAI's new o1-preview and o1-mini models
- Use Chromium only as a fallback after asking user
- Increased default maxTokens, with autodetection for common models
- Redesigned onboarding flow
- Fixed CRLF bug causing diff streams to treat every line as changed on Windows
- Hotfix for using more than one inline context provider
- Hotfix for submenu context providers
- Improved indexing progress UI
- Improved @codebase using the repo map
- Repo map context provider
- Many small UI improvements
- Fixed "db.search is not a function" error
- Use headless browser for crawling to get better results
- TTS support in the chat window
- Improved support for WatsonX models
- Fixed several small indexing bugs
- New /onboard slash command
- Fixed problem loading config.ts
- Fixed bug causing duplicate indexing work
- Support for Llama 3.1 and gpt-4o-mini
- Support for WatsonX and Granite models
- Significant improvements to indexing performance
- Improved @codebase quality by more accurately searching over file names and paths
- Improved @codebase accuracy
- Further improvements to indexing performance
- Improved docs indexing and management
- Fixed Gemini embeddings provider
- Improved indexing reliability and testing
- Quick Actions: use CodeLens to quickly take common actions like adding docstrings
- Support for Gemini 1.5 Pro
- Link to code in the sidebar when using codebase retrieval
- Smoother onboarding experience
- .prompt files, a way of saving and sharing slash commands (sketch after this list)
- Support for Claude 3.5 Sonnet, Deepseek Coder v2, and other new models
- Support for comments in config.json (example after this list)
- Specify multiple autocomplete models and switch between them
- Improved bracket matching strategy reduces noisy completions
- Numerous reliability upgrades to codebase indexing
- Support for improved retrieval models (Voyage embeddings/reranking)
- New @code context provider
- Personal usage analytics
- Tab-autocomplete in beta
- Image support
- Full-text search index for retrieval
- Docs context provider
- CodeLlama-70b support
- config.ts now runs only in Node.js, not in the browser
- Fixed proxy setting in config.json
- Added codellama and gemini to the free trial, using the new server
- Local codebase syncing and embeddings using LanceDB
- Improved VS Code theme matching
- Updated packaging to download native modules for the current platform (lancedb, sqlite, onnxruntime, tree-sitter wasms)
- Context providers now run from the extension side (in Node.js instead of browser JavaScript)
- disableSessionTitles option in config.json
- Use the Ollama /chat endpoint by default instead of raw completions, and the /show endpoint to gather model parameters like context length and stop tokens
- Support for .continuerc.json in the root of the workspace to override config.json (example after this list)
- Inline context providers
- cmd+shift+L with new diff streaming UI for edits
- Allow certain LLM servers to handle templating
- Context items are now kept around as a part of past messages, instead of staying at the main input
- No more Python server - Continue runs entirely in TypeScript
- Migrated to the .json config file format
- Full screen mode
- StackOverflow slash command to augment with web search
- VS Code context menus: right click to add code to context, debug the terminal, or share your Continue session
- Reliability improvements to JetBrains by bringing up-to-date with the socket.io refactor
- Codebase Retrieval: Use /codebase or cmd+enter and Continue will automatically gather the most important context
- Switch from WebSockets to Socket.IO
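
For reference, a few of the configuration changes above are easiest to see by example. A .prompt file pairs optional frontmatter with a templated body; this is a minimal sketch, assuming YAML-style keys above a --- separator and a Handlebars-style {{{ input }}} variable for the selected code (check the docs for the exact keys supported by your version):

```
name: unit-tests
description: Write unit tests for the selected code
---
{{{ input }}}

Write a complete set of unit tests for the code above, covering edge cases
and using the testing framework already present in this project.
```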
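Comment support means config.json is parsed leniently, JSONC-style. A minimal sketch, assuming // line comments and the documented models schema; the model ID and API key are placeholders:

```jsonc
{
  // Models shown in the chat model dropdown
  "models": [
    {
      "title": "Claude 3.5 Sonnet",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-20240620",
      "apiKey": "<YOUR_ANTHROPIC_API_KEY>"
    }
  ]
}
```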
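Finally, .continuerc.json uses the same schema as config.json, with its keys taking precedence for that workspace. A minimal sketch pairing it with the disableSessionTitles option, assuming a shallow merge over the global config (allowAnonymousTelemetry is included only as a second illustrative key):

```jsonc
{
  // Workspace-level overrides; all other settings fall back to config.json
  "disableSessionTitles": true,
  "allowAnonymousTelemetry": false
}
```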