1 change: 1 addition & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -1 +1,2 @@
/target
lumen.config.json
40 changes: 40 additions & 0 deletions Cargo.lock


2 changes: 2 additions & 0 deletions Cargo.toml
@@ -19,6 +19,8 @@ spinoff = { version = "0.8.0", features = ["dots"] }
thiserror = "1.0"
indoc = "2.0.5"
dirs = "6.0.0"
regex = "1.11.1"
lazy_static = "1.5.0"

[profile.release]
lto = true
69 changes: 41 additions & 28 deletions README.md
@@ -10,9 +10,11 @@ A command-line tool that uses AI to streamline your git workflow - from generati
![demo](https://github.com/user-attachments/assets/0d029bdb-3b11-4b5c-bed6-f5a91d8529f2)

## GitAds Sponsored

[![Sponsored by GitAds](https://gitads.dev/v1/ad-serve?source=jnsahaj/lumen@github)](https://gitads.dev/v1/ad-track?source=jnsahaj/lumen@github)

## Table of Contents

- [Features](#features-)
- [Getting Started](#getting-started-)
- [Prerequisites](#prerequisites)
@@ -41,23 +43,27 @@ A command-line tool that uses AI to streamline your git workflow - from generati
## Getting Started 🔅

### Prerequisites

Before you begin, ensure you have:

1. `git` installed on your system
2. [fzf](https://github.com/junegunn/fzf) (optional) - Required for `lumen list` command
3. [mdcat](https://github.com/swsnr/mdcat) (optional) - Required for pretty output formatting

### Installation

#### Using Homebrew (MacOS and Linux)

```bash
brew install jnsahaj/lumen/lumen
```

#### Using Cargo
> [!IMPORTANT]
> `cargo` is a package manager for `rust`,
> and is installed automatically when you install `rust`.
> See [installation guide](https://doc.rust-lang.org/cargo/getting-started/installation.html)

```bash
cargo install lumen
```
@@ -78,7 +84,6 @@ lumen draft --context "match brand guidelines"
# Output: "feat(button.tsx): Update button color to align with brand identity guidelines"
```


### Generate Git Commands

Ask Lumen to generate Git commands based on a natural language query:
@@ -110,6 +115,7 @@ lumen explain HEAD --query "What are the potential side effects?"
```

### Interactive Mode

```bash
# Launch interactive fuzzy finder to search through commits (requires: fzf)
lumen list
@@ -126,29 +132,30 @@ lumen draft | xclip -selection c # Linux
lumen draft | tee >(pbcopy)

# Open in your favorite editor
lumen draft | code -

# Directly commit using the generated message
lumen draft | git commit -F -
```

If you are using [lazygit](https://github.com/jesseduffield/lazygit), you can add this to the [user config](https://github.com/jesseduffield/lazygit/blob/master/docs/Config.md)

```yml
customCommands:
  - key: "<c-l>"
    context: "files"
    command: "lumen draft | tee >(pbcopy)"
    loadingText: "Generating message..."
    showOutput: true
  - key: "<c-k>"
    context: "files"
    command: "lumen draft -c {{.Form.Context | quote}} | tee >(pbcopy)"
    loadingText: "Generating message..."
    showOutput: true
    prompts:
      - type: "input"
        title: "Context"
        key: "Context"
```

## AI Providers 🔅
@@ -167,26 +174,27 @@ export LUMEN_AI_MODEL="gpt-4o"

### Supported Providers

| Provider | API Key Required | Models |
| -------- | ---------------- | ------ |
| [Phind](https://www.phind.com/agent) `phind` (Default) | No | `Phind-70B` |
| [Groq](https://groq.com/) `groq` | Yes (free) | `llama2-70b-4096`, `mixtral-8x7b-32768` (default: `mixtral-8x7b-32768`) |
| [OpenAI](https://platform.openai.com/docs/guides/text-generation/chat-completions-api) `openai` | Yes | `gpt-4o`, `gpt-4o-mini`, `gpt-4`, `gpt-3.5-turbo` (default: `gpt-4o-mini`) |
| [Claude](https://claude.ai/new) `claude` | Yes | [see list](https://docs.anthropic.com/en/docs/about-claude/models#model-names) (default: `claude-3-5-sonnet-20241022`) |
| [Ollama](https://github.com/ollama/ollama) `ollama` | No (local) | [see list](https://github.com/ollama/ollama/blob/main/docs/api.md#model-names) (required) |
| [OpenRouter](https://openrouter.ai/) `openrouter` | Yes | [see list](https://openrouter.ai/models) (default: `anthropic/claude-3.5-sonnet`) |
| [DeepSeek](https://www.deepseek.com/) `deepseek` | Yes | `deepseek-chat`, `deepseek-reasoner` (default: `deepseek-reasoner`) |

## Advanced Configuration 🔅

### Configuration File

Lumen supports configuration through a JSON file. You can place the configuration file in one of the following locations:

1. Project Root: Create a lumen.config.json file in your project's root directory.
2. Custom Path: Specify a custom path using the --config CLI option.
3. Global Configuration (Optional): Place a lumen.config.json file in your system's default configuration directory:
   - Linux/macOS: `~/.config/lumen/lumen.config.json`
   - Windows: `%USERPROFILE%\.config\lumen\lumen.config.json`

Lumen will load configurations in the following order of priority:

@@ -213,19 +221,23 @@
"revert": "Reverts a previous commit",
"feat": "A new feature",
"fix": "A bug fix"
},
"trim_thinking_tags": false // Set to true to automatically remove <think>...</think> blocks from AI output.
}
}
```

### Configuration Precedence

Options are applied in the following order (highest to lowest priority):

1. CLI Flags
2. Configuration File
3. Environment Variables
4. Default options

Example: Using different providers for different projects:

```bash
# Set global defaults in .zshrc/.bashrc
export LUMEN_AI_PROVIDER="openai"
@@ -241,4 +253,5 @@ export LUMEN_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxx"
# Or override using CLI flags
lumen -p "ollama" -m "llama3.2" draft
```

<!-- GitAds-Verify: OE99H8YHI6ACIS31OJLLV19T6QOB4J3P -->
15 changes: 10 additions & 5 deletions src/ai_prompt.rs
@@ -4,6 +4,7 @@ use crate::{
};
use indoc::{formatdoc, indoc};
use thiserror::Error;
use serde_json;

#[derive(Error, Debug)]
#[error("{0}")]
@@ -65,23 +66,23 @@ impl AIPrompt {
None => match &command.git_entity {
GitEntity::Commit(_) => formatdoc! {"
{base_content}

Provide a short explanation covering:
1. Core changes made
2. Direct impact
"
},
GitEntity::Diff(Diff::WorkingTree { .. }) => formatdoc! {"
{base_content}

Provide:
1. Key changes
2. Notable concerns (if any)
"
},
GitEntity::Diff(Diff::CommitsRange { .. }) => formatdoc! {"
{base_content}

Provide:
1. Core changes made
2. Direct impact
@@ -122,6 +123,9 @@ impl AIPrompt {
"".to_string()
};

let commit_types_json = serde_json::to_string(&command.draft_config.commit_types)
.map_err(|e| AIPromptError(format!("Failed to serialize commit types: {}", e)))?;

let user_prompt = String::from(formatdoc! {"
Generate a concise git commit message written in present tense for the following code diff with the given specifications below:

@@ -139,7 +143,8 @@
{diff}
```
",
commit_types = commit_types_json,
diff = diff,
});

Ok(AIPrompt {
@@ -156,7 +161,7 @@
"});
let user_prompt = formatdoc! {"
Generate Git command for: {query}

<command>Git command</command>
<explanation>Brief explanation</explanation>
<warning>Required for destructive commands only - omit for safe commands</warning>
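The switch above from interpolating `command.draft_config.commit_types` directly to serializing it first matters because a Rust map's default formatting is not valid JSON, so the prompt could confuse the model. A dependency-free sketch of the shape `serde_json::to_string` produces for such a map (the map contents here are illustrative, not taken from the PR):

```rust
use std::collections::BTreeMap;

// Std-only sketch: hand-roll the JSON object that serde_json::to_string
// would emit for a string-to-string map. Real code should use serde_json,
// which also handles the full JSON string-escaping rules.
fn to_json_object(map: &BTreeMap<String, String>) -> String {
    let escape = |s: &str| s.replace('\\', "\\\\").replace('"', "\\\"");
    let body: Vec<String> = map
        .iter()
        .map(|(k, v)| format!("\"{}\":\"{}\"", escape(k), escape(v)))
        .collect();
    format!("{{{}}}", body.join(","))
}

fn main() {
    let mut commit_types = BTreeMap::new();
    commit_types.insert("feat".to_string(), "A new feature".to_string());
    commit_types.insert("fix".to_string(), "A bug fix".to_string());
    // BTreeMap iterates in sorted key order, so the output is deterministic.
    println!("{}", to_json_object(&commit_types));
}
```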
13 changes: 11 additions & 2 deletions src/command/draft.rs
@@ -1,6 +1,8 @@
use std::io::Write;

use async_trait::async_trait;
use regex::Regex;
use lazy_static::lazy_static;

use crate::{
config::configuration::DraftConfig, error::LumenError, git_entity::GitEntity,
@@ -9,6 +11,10 @@ use crate::{

use super::Command;

lazy_static! {
static ref THINK_TAG_REGEX: Regex = Regex::new(r"(?s)<think>.*?</think>\s*").unwrap();
}

pub struct DraftCommand {
pub git_entity: GitEntity,
pub context: Option<String>,
@@ -18,8 +24,11 @@ pub struct DraftCommand {
#[async_trait]
impl Command for DraftCommand {
async fn execute(&self, provider: &LumenProvider) -> Result<(), LumenError> {
let mut result = provider.draft(self).await?;
if self.draft_config.trim_thinking_tags {
result = THINK_TAG_REGEX.replace_all(&result, "").to_string();
result = result.trim().to_string();
}
print!("{result}");
std::io::stdout().flush()?;
Ok(())
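The `trim_thinking_tags` change above strips `<think>...</think>` blocks that reasoning models (e.g. `deepseek-reasoner`) emit before the commit message. A std-only sketch of what the PR's `THINK_TAG_REGEX` pattern `(?s)<think>.*?</think>\s*` does, for readers who want the behavior without the `regex` crate (an assumption-level equivalent, not the PR's actual code):

```rust
// Drop each <think>...</think> block plus the whitespace that follows it
// (the `\s*` part of the pattern), then trim the final result, mirroring
// the non-greedy `.*?` by always matching the nearest closing tag.
fn trim_thinking_tags(input: &str) -> String {
    let mut out = String::new();
    let mut rest = input;
    while let Some(start) = rest.find("<think>") {
        out.push_str(&rest[..start]);
        match rest[start..].find("</think>") {
            Some(end) => {
                // Skip past the closing tag, then past trailing whitespace.
                rest = rest[start + end + "</think>".len()..].trim_start();
            }
            None => {
                // Unclosed tag: keep the remainder untouched, just as the
                // regex would fail to match it.
                out.push_str(&rest[start..]);
                rest = "";
            }
        }
    }
    out.push_str(rest);
    out.trim().to_string()
}

fn main() {
    let raw = "<think>model reasoning...</think>\nfeat(ui): update button color";
    assert_eq!(trim_thinking_tags(raw), "feat(ui): update button color");
    println!("{}", trim_thinking_tags(raw));
}
```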