Project Evaluation Report
A project evaluation report describes how well a student did on their project. The report combines the answers from multiple completed evaluations into a summary result showing criteria and scores. Because the metrics used in an evaluation can be either quantitative or qualitative, how the report combines metric scores may vary from metric to metric. For example, the report may average quantitative scores while listing qualitative scores individually.
More generally, an evaluation report (as opposed to a project evaluation report) combines the results of any collection of completed evaluations. For more on this, see Future Ideas.
This feature covers only creating the project evaluation report. It intentionally excludes:
- How evaluations are created or filled out (See "Assigning Evaluations")
- The student-class report describing a student's placement in a class (see "Class Rank")
- Peer evaluations and the peer evaluation report (see "Groups")
- Note: Before peer evaluations can be built, "evaluation-report" needs to be completed
- How students or instructors navigate to this report (this should be included in a wireframe of the system)
# Objectives
- Create a report to see evaluation scores of a student's project.
# Requirements
- Evaluation scores are anonymous on the report
- Quantitative metrics should be aggregated into a statistic
- Qualitative metrics should be displayed individually as a collection (see the sketch below)
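A minimal sketch of the aggregation these requirements describe, assuming completed evaluations are flattened into per-criterion metric scores; the type names and shapes below are illustrative, not part of the spec:

```typescript
// Illustrative shapes only; the real models depend on the evaluation schema.
type MetricScore =
  | { kind: "quantitative"; criterionId: string; value: number; max: number }
  | { kind: "qualitative"; criterionId: string; comment: string };

interface CriterionSummary {
  criterionId: string;
  average?: number;   // mean of quantitative scores, if any
  max?: number;       // scale maximum, used later to size the progress bar
  comments: string[]; // qualitative responses, listed individually
}

// Combine the scores from all completed evaluations of one project.
function summarize(scores: MetricScore[]): CriterionSummary[] {
  const sums = new Map<string, { total: number; count: number; max: number }>();
  const comments = new Map<string, string[]>();

  for (const s of scores) {
    if (s.kind === "quantitative") {
      // Accumulate a sum and count per criterion so the average can be computed later.
      const agg = sums.get(s.criterionId) ?? { total: 0, count: 0, max: s.max };
      sums.set(s.criterionId, { total: agg.total + s.value, count: agg.count + 1, max: s.max });
    } else {
      // Qualitative responses are kept individually rather than aggregated.
      const list = comments.get(s.criterionId) ?? [];
      list.push(s.comment);
      comments.set(s.criterionId, list);
    }
  }

  const criterionIds = new Set([...sums.keys(), ...comments.keys()]);
  return [...criterionIds].map((criterionId) => {
    const agg = sums.get(criterionId);
    return {
      criterionId,
      average: agg ? agg.total / agg.count : undefined,
      max: agg?.max,
      comments: comments.get(criterionId) ?? [],
    };
  });
}
```

Because evaluator identity is never part of the input shape, the summary cannot reveal who gave which score, which keeps the report anonymous by construction.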
# Solution
- The report should be uniquely identified by evaluation & project.
- Projects should have multiple evaluations.
- Create a web page for the report using the path `/project-evaluation-report/:evaluationId/:projectId` (a handler sketch follows this list)
- On the web page, display
  - Assignment title
  - Project title
  - For each criterion
    - Criterion title
    - Summary score
- The summary score for a criterion should
  - Display quantitative metrics as an average
    - Use a progress bar ranging from 1 to the metric's maximum
    - Fill the bar to the location of the average
    - Display the average value
  - Display comments as a list
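A minimal handler sketch for this page, written in an Express style purely for illustration; the framework, the `loadProjectReport` helper, and the view name are assumptions rather than part of the spec. It also shows one way to compute the progress-bar fill described above (how far the average sits between 1 and the metric's maximum):

```typescript
import express from "express";

interface CriterionSummary {
  title: string;
  average?: number;   // mean of quantitative scores, if any
  max?: number;       // scale maximum
  comments: string[]; // qualitative responses
}

interface ProjectReport {
  assignmentTitle: string;
  projectTitle: string;
  criteria: CriterionSummary[];
}

// Hypothetical data-access helper; a real implementation would load the
// completed evaluations for this evaluation/project pair and summarize them.
declare function loadProjectReport(evaluationId: string, projectId: string): Promise<ProjectReport>;

// Fraction of the bar to fill: how far the average sits between 1 and the maximum.
function progressFill(average: number, max: number): number {
  return max > 1 ? (average - 1) / (max - 1) : 1;
}

const app = express();

// Report page keyed by evaluation and project, matching the path in the spec.
app.get("/project-evaluation-report/:evaluationId/:projectId", async (req, res) => {
  const { evaluationId, projectId } = req.params;
  const report = await loadProjectReport(evaluationId, projectId);

  // Render the assignment title, project title, and per-criterion summaries.
  res.render("project-evaluation-report", {
    assignmentTitle: report.assignmentTitle,
    projectTitle: report.projectTitle,
    criteria: report.criteria.map((c) => ({
      title: c.title,
      average: c.average,
      fill: c.average !== undefined && c.max !== undefined
        ? progressFill(c.average, c.max)
        : undefined,
      comments: c.comments,
    })),
  });
});
```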
# Future Ideas
Possible future reports:
- How well did the students collectively do on an assignment? How does performance vary?
- How do students at the beginning of a class compare at the end of the class? (assuming criteria reuse)
- How consistent are students in their evaluations? Instructors?
- Does changing a criterion have an effect on learning outcomes?
- How do past students of a course compare to current students?
- How well are students in a major doing over their four years?
- How do grades compare to peer evaluations for an assignment? For a class? Course? Major?
- Do professors and students score similarly on a criterion?
- Are course prerequisites optimal? Do students perform differently when taking classes in a different order, after accounting for individual differences in student performance?
- Do students achieve different results under different instructors?
Instructors could customize how reports are generated. Maybe a 1-100 scale should be displayed differently depending on the desired report. Maybe other statistics, such as standard deviation, become important when more evaluations are compared?
Machine learning could be used to find patterns.
Qualitative responses could be captured as a recording or video.