
Identify users on channel requests #27

Open
Darkhub opened this issue Sep 25, 2024 · 2 comments
Labels
enhancement New feature or request

Comments


Darkhub commented Sep 25, 2024

Describe the feature
The bot should be able to identify which user makes a request in a channel.

Why do you think this feature should be implemented?
The bot could be added to a channel and adapt its answers to different users. For example, it would be ideal for role-play, with the bot playing a specific part among multiple users.

Additional context
I think it could be implemented as a sort of table in the .env file, for example in the following format:
RECOGNISE_USERS = 1
USERID1 Username1
USERID2 Username2
...

Then, when the bot receives a request in a channel and RECOGNISE_USERS is set, it should check the table; if a match is found, the request to Ollama could be formatted as "Username1 says...".
In the SYSTEM prompt, Ollama could be instructed to adapt its answers when a user prompt contains that specific "Username1 says" prefix.
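The proposed lookup-and-prefix step could be sketched roughly as follows. This is only an illustration of the idea, not the project's actual code: the .env encoding (a single USER_TABLE variable instead of one line per user, since .env values are single-line), the function names, and the IDs are all hypothetical.

```javascript
// Hypothetical .env contents:
//   RECOGNISE_USERS=1
//   USER_TABLE=123456:Username1,789012:Username2

function parseUserTable(raw) {
    // "id:name,id:name" -> Map of user ID -> display name
    const table = new Map();
    for (const entry of (raw || "").split(",")) {
        const [id, name] = entry.split(":");
        if (id && name) table.set(id.trim(), name.trim());
    }
    return table;
}

function formatPrompt(userId, content, table, recognise) {
    // Prefix the message with the mapped username so the model can
    // tell speakers apart; pass the content through unchanged when
    // recognition is off or the user is not in the table.
    if (!recognise) return content;
    const name = table.get(userId);
    return name ? `${name} says: ${content}` : content;
}

const table = parseUserTable("123456:Username1,789012:Username2");
console.log(formatPrompt("123456", "hello there", table, true));
// -> "Username1 says: hello there"
```

Unknown user IDs simply fall through to the plain message, so the feature would stay opt-in per user.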

@Darkhub Darkhub added the enhancement New feature or request label Sep 25, 2024
238SAMIxD (Collaborator) commented

I do not understand it exactly, but the bot already creates a chat history and provides its context to the LLM until you use the reset/clear command.


Darkhub commented Sep 25, 2024

I've checked Ollama's debug log, and it seems that no user information is passed to Ollama in the prompt.
So, for example, if two users talk to the bot in a channel, it would not be able to distinguish between them.

Can you please tell me where in the code the prompt string sent to Ollama is built? I've looked but was not able to find where this happens.

Thanks.
