
Conversation

@celo-rod

This PR introduces optional support for LLM (Large Language Model)-generated excuses using OpenRouter.ai, while preserving the existing offline behavior as a fallback.

Changes

  • Added support for language models via OpenRouter (e.g., meta-llama/llama-3.3-8b-instruct:free)
  • Added optional ?theme=... query parameter to generate excuses based on a specific theme
  • JSON responses now include a "source" field indicating "llm" or "offline"
  • Implemented automatic fallback to reasons.json if the LLM call fails or is disabled (see the sketch below)
  • Added .env support with OPENROUTER_API_KEY for OpenRouter authentication
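
A minimal sketch of how these pieces could fit together. The route path (`/no`), the prompt wording, the helper name, and the port are illustrative guesses, not code from this PR; only the model name, the `?theme=` parameter, the `source` field, and `OPENROUTER_API_KEY` come from the changes above. It assumes Node 18+ (global `fetch`) and that reasons.json is an array of strings:

```js
// Sketch only: route path, prompt text, and helper names are assumptions.
require('dotenv').config();
const express = require('express');
const reasons = require('./reasons.json'); // offline fallback (assumed array of strings)

const app = express();
app.set('trust proxy', true);

// Ask OpenRouter (OpenAI-compatible chat completions API) for a single excuse.
async function getLlmReason(theme) {
  const response = await fetch('https://openrouter.ai/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'meta-llama/llama-3.3-8b-instruct:free',
      messages: [
        {
          role: 'user',
          content: `Give me one short, creative excuse for saying no${theme ? ` with the theme "${theme}"` : ''}.`,
        },
      ],
    }),
  });
  const data = await response.json();
  if (!response.ok) {
    throw new Error(data.error?.message || `OpenRouter returned ${response.status}`);
  }
  return data.choices?.[0]?.message?.content?.trim();
}

app.get('/no', async (req, res) => {
  const theme = req.query.theme;

  // LLM path: only attempted when an API key is configured.
  if (process.env.OPENROUTER_API_KEY) {
    try {
      const reason = await getLlmReason(theme);
      if (reason) return res.json({ reason, theme, source: 'llm' });
      console.warn('LLM response was empty, falling back to local reasons.');
    } catch (err) {
      console.error('LLM call failed, falling back to local reasons:', err.message);
    }
  }

  // Offline path: random entry from reasons.json.
  const reason = reasons[Math.floor(Math.random() * reasons.length)];
  res.json({ reason, source: 'offline' });
});

app.listen(3000); // port is illustrative
```

When OPENROUTER_API_KEY is unset, the route never contacts OpenRouter and behaves exactly like the previous offline version.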

Related issues

Closes: #27 (Add some intelligence to the api results)


@gustav0d left a comment


I guess it would be great if you just kept the singleQuote format.

The rest of the code LGTM.

index.js Outdated

```diff
 const app = express();
-app.set('trust proxy', true);
+app.set("trust proxy", true);
```


@celo-rod maybe use singleQuote to match the current formatting?
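
If the repo formats with Prettier (the singleQuote wording suggests so, though that is an assumption about this project), the matching option in a .prettierrc would be:

```json
{
  "singleQuote": true
}
```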

celo-rod (Author)

Done @gustav0d

@celo-rod force-pushed the feat-basic-llm-integration branch from cbc5964 to f96f340 on June 24, 2025 at 02:12
@celo-rod force-pushed the feat-basic-llm-integration branch from f96f340 to 789468b on June 24, 2025 at 02:18
package.json Outdated
"author": "hotheadhacker",
"license": "MIT",
"dependencies": {
"axios": "^1.10.0",


One more thing, actually: I guess it would be better if you just used fetch instead of adding axios.
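
For reference, a small fetch-based helper covers the same ground without the extra dependency; fetch is global in Node 18+, and the helper name below is just illustrative:

```js
// Roughly equivalent to axios.post(url, payload, { headers }).
async function postJson(url, payload, headers = {}) {
  const response = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', ...headers },
    body: JSON.stringify(payload), // fetch does not serialize the body for you
  });
  const data = await response.json(); // ...and does not parse JSON automatically
  if (!response.ok) {
    // Unlike axios, fetch resolves on HTTP error statuses, so check ok/status yourself.
    throw new Error(data.error?.message || `Request failed with status ${response.status}`);
  }
  return data;
}
```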

celo-rod (Author)

Alright, I removed the axios usage & dependency @gustav0d

index.js Outdated
Comment on lines 61 to 68
```js
if (response.ok) {
  const reason = data.choices?.[0]?.message?.content?.trim();
  if (reason) {
    return res.json({ reason, theme, source: 'llm' });
  }
  console.warn('LLM response was empty, falling back to local reasons.');
} else {
  console.error('Fetch failed, falling back to local reasons:', data.error.message);
```


celo-rod (Author)

You're right, done! @gustav0d

@ajmas commented Nov 8, 2025

I would make the LLM an optional thing. Not everyone needs the power of 4 suns for saying no.
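
The PR already falls back to reasons.json when the key is missing or the call fails, but an explicit opt-in switch would make that more visible. The flag name below is hypothetical, not part of this PR:

```js
// Hypothetical opt-in flag: only attempt the LLM path when both are set.
const useLlm = process.env.USE_LLM === 'true' && Boolean(process.env.OPENROUTER_API_KEY);
```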


