
📚 AI Study Assistant & Medical-to-Layman Language Converter

An intelligent, offline AI-powered study assistant built on a local LLM (Mistral 7B) that answers syllabus-related questions, performs semantic search over study material, generates multiple-choice questions (MCQs) from PDFs, and simplifies complex medical language into layman-friendly terms.


🔍 Features

  • 💬 Ask Questions – Enter queries related to your syllabus and get instant, contextual answers.
  • 📄 Semantic Search – Uses FAISS to retrieve the most relevant text chunks from your study material.
  • 🧠 Quiz Mode – Upload any PDF and generate high-quality MCQs.
  • 🩺 Medical Language Simplifier – Converts complex medical terms and jargon into simple, layman-friendly language.
  • 🧱 Offline Capability – Uses a locally running Mistral 7B model via llama-cpp-python.
  • 🎨 GUI Interface – Built using Tkinter for easy interaction.

🧰 Tech Stack

Component                               Technology
LLM                                     Mistral 7B Instruct (GGUF format)
Vector Search                           FAISS
Embeddings                              Sentence Transformers (MiniLM)
Interface                               Tkinter (Python GUI)
Server API (for intranet deployment)    Flask (Python)
Medical Converter                       Tkinter (Python GUI)
Local LLM Interface                     llama-cpp-python
PDF Parsing                             PyMuPDF
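Mistral 7B Instruct expects its input wrapped in `[INST] … [/INST]` tags, which is how the scripts here would hand a query to llama-cpp-python. A minimal sketch of building such a prompt (the function name and system text are illustrative, not the project's actual API):

```python
def build_instruct_prompt(system: str, user: str) -> str:
    """Wrap a system note and user query in Mistral-Instruct [INST] tags."""
    return f"<s>[INST] {system}\n\n{user} [/INST]"

prompt = build_instruct_prompt(
    "You are a helpful study assistant. Answer only from the given context.",
    "What is a heuristic search?",
)
```

The resulting string would be passed as the `prompt` argument to a `llama_cpp.Llama` call against the local GGUF model.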

🚀 Getting Started

1. Clone the Repository

git clone https://github.com/AvichalTrivedi7/AI-Study-Chatbot-Assistant-Project
cd AI-Study-Chatbot-Assistant-Project

2. Install Dependencies

pip install -r requirements.txt

Important: use Python 3.11.6, which is stable, well-supported, and works out of the box with Tkinter. (You can install Python 3.11.6 into your virtual environment and add it to your PATH.)

3. Complete Folder Structure

AI Study Chatbot Assistant Project/
├── ask_question.py                                      # GUI and LLM calling for answering questions
├── quiz_mode.py                                         # GUI and LLM calling for generating MCQs from PDF
├── create_embeddings.py                                 # Script to embed syllabus text and store FAISS index
├── server.py                                            # Flask API for intranet deployment
├── medical_converter.py                                 # GUI and LLM calling for converting medical text to layman terms
├── vector_store/
│   ├── chunks.txt                                       # Stored text chunks from syllabus
│   └── study_index.faiss                                # FAISS index for semantic search
├── read_pdf.py                                          # Utility to extract text from PDF
├── .gitignore                                           # Tells Git which files/folders to ignore (.venv & _pycache_ for now)
├── .venv/                                               # Our Python virtual environment (ignored in the repo.)
├── __pycache__/                                         # Compiled Python files from some libraries required in the project (can ignore, have ignored)
├── requirements.txt                                     # All our dependencies
├── models/                                              # Contains our local LLM
│   └── mistral/                                         
│       └── mistral-7b-instruct-v0.1.Q4_K_M.gguf         # Mistral 7B Instruct GGUF model
├── Unit 1 - Artificial Intelligence Overview.pdf        # Our study material (input)
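Before `create_embeddings.py` can index anything, the extracted PDF text has to be split into chunks like those stored in `vector_store/chunks.txt`. A stdlib-only sketch of fixed-size chunking with overlap (the chunk size and overlap values are illustrative; the project's actual parameters may differ):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows for embedding."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(text), step):
        piece = text[start:start + chunk_size]
        if piece.strip():
            chunks.append(piece)
    return chunks

sample = "Artificial intelligence is the study of intelligent agents. " * 20
chunks = chunk_text(sample, chunk_size=200, overlap=20)
```

Overlap keeps sentences that straddle a chunk boundary retrievable from either side; each chunk would then be embedded with MiniLM and added to the FAISS index.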

📘 How to Use

First, set the path to your PDF in read_pdf.py and run the file. Then run create_embeddings.py to build the FAISS index. Finally, run any of the three modes:
  1. ask_question.py – answers questions using the PDF path set beforehand.
  2. medical_converter.py – uses the same path; the PDF can contain medical reports, complex terms, etc. to simplify.
  3. quiz_mode.py – lets you browse for a PDF file separately at run time.

🧠 Ask a Question

python ask_question.py
  • Enter any syllabus-related question.
  • The app searches the syllabus and gives a precise, LLM-generated answer.
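The "searches the syllabus" step is FAISS ranking stored chunk embeddings by similarity to the query embedding. A toy stdlib-only illustration of the same idea using cosine similarity (real embeddings come from MiniLM and the index from FAISS; the 3-d vectors here are made up for the example):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query: list[float], index: dict[str, list[float]], k: int = 2) -> list[str]:
    """Return the k chunk names most similar to the query vector."""
    ranked = sorted(index, key=lambda name: cosine(query, index[name]), reverse=True)
    return ranked[:k]

# Toy "embeddings" standing in for MiniLM vectors of syllabus chunks
index = {
    "search algorithms": [0.9, 0.1, 0.0],
    "neural networks":   [0.1, 0.9, 0.2],
    "agents":            [0.4, 0.4, 0.8],
}
query = [0.85, 0.15, 0.05]
results = top_k(query, index, k=2)  # most relevant chunks first
```

The top-ranked chunks are then pasted into the LLM prompt as context, which is what makes the answer "precise" rather than a generic completion.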

🩺 Simplify Medical Language

python medical_converter.py
  • Opens a Tkinter GUI.
  • Enter any medical text, or ask about the loaded report/context PDF, and the assistant will convert it into simple, layman-friendly language.

📝 Generate MCQs

python quiz_mode.py
  • Upload a syllabus PDF.
  • The model generates 5 MCQs from the uploaded content.
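The model returns the MCQs as plain text, so quiz mode has to parse questions, options, and answers out of the output. A regex-based sketch for one common output shape, numbered questions with lettered options (the project's actual output format and parsing code may differ):

```python
import re

MCQ_PATTERN = re.compile(
    r"Q\d+\.\s*(?P<question>.+?)\n"
    r"A\)\s*(?P<a>.+?)\n"
    r"B\)\s*(?P<b>.+?)\n"
    r"C\)\s*(?P<c>.+?)\n"
    r"D\)\s*(?P<d>.+?)\n"
    r"Answer:\s*(?P<answer>[A-D])"
)

def parse_mcqs(raw: str) -> list[dict]:
    """Extract question/options/answer dicts from raw model output."""
    return [m.groupdict() for m in MCQ_PATTERN.finditer(raw)]

# Hypothetical model output, not taken from an actual run
sample_output = """Q1. What does FAISS provide?
A) Image generation
B) Vector similarity search
C) Speech synthesis
D) PDF rendering
Answer: B
"""
mcqs = parse_mcqs(sample_output)
```

Constraining the prompt to this exact layout (e.g. "format each question as Q1. … Answer: X") makes the parse far more reliable than free-form output.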

Important Note: Please download the model from https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF and place it under models/mistral/. Download the Q4_K_M version only (the best balance between speed and quality).

✍🏻 Run-time Screenshots

(Screenshots captured on 2025-04-20 and 2025-04-29.)

🧠 Future Plans

  • Fine-tune Mistral on domain-specific Q&A for better contextual answers.
  • Add voice-based question input and image generation as an output option.
  • Auto-save history and generate quizzes per topic.
  • Expand medical simplification with fine-tuned datasets for better accuracy and empathy in explanations.
  • Add export-to-PDF feature for MCQs.

📜 License

MIT License. Feel free to use and modify.


Built with ❤️ to make studying smarter, faster, and fun. 🚀
