A mockup of Rate My Class on a laptop

Rate My Class

UX Research, UX Design, Usability Testing
Team

Katie Kunesh (me)
Christian Reyes
Aviv Roskes

Product Type

Mid-Fidelity Prototype

Timeframe

August - December 2023

My Role

Contextual Inquiry
Sketches
Low-Fidelity Wireframes
Mid-Fidelity Prototype
Usability Testing

Tools Used

Figma

Overview

In this semester-long project, my team and I were tasked with addressing a problem that we and our peers faced on campus. We chose to address the gap in information available to students about course content and teaching style when enrolling in classes, and to design an application that would help our peers find classes that were the right fit.

The Problem

When college students are deciding what classes to enroll in each semester, they are given little more than a short description of what the class entails. If a student wants to know exactly what the class covers and whether the assignments suit their learning style, they are forced to gather more information from other sources, such as peers or websites like Rate My Professor. However, Rate My Professor offers more information about specific professors than about the classes themselves.

The Goal

We aimed to design a web application that has all the information students need about the classes offered at their university in one place, including:

  • Reviews from their peers about the content of the course
  • Information about difficulty, workload, assessment type, and teaching style
  • Recommended classes based on current major or path

Contextual Inquiry

To kickstart our design process, we conducted a contextual inquiry, engaging in conversations with six Vanderbilt University students to gain insights into their current approach to selecting classes each semester. We approached students from various majors to capture diverse perspectives on the process.

Each interview consisted of:

  • An in-depth exploration of the participant’s decision-making process when enrolling in classes for the semester
  • Observation of their typical method
  • Discussion of likes, frustrations, and suggestions for making the process more streamlined


This research uncovered valuable data about the information that college students value most when enrolling in classes, as well as the use of external resources, such as spreadsheets for tracking major requirements and the website Rate My Professor for assessing class difficulty.

With the findings from the contextual inquiry, our team created an affinity diagram to find patterns in the needs of our target audience, leading us to prioritize features such as credit tracking and statistics about workload and assessment style.

Storyboards

With a clear direction in mind, we proceeded to create storyboards in the form of comic strips to illustrate the context of use within our system. I focused on depicting a scenario in which a user leaves a review for a class they enjoyed. By visualizing this interaction, each member of our team gained a deeper understanding of users' expectations when they first accessed the website and the navigation pathways they would likely follow. This enabled us to refine our design and reduce the need for significant adjustments in later stages of development.

A comic of a person leaving a review

The Process

Research

Gather information about the needs of our target users

Plan

Analyze the project's requirements and create storyboards

Design

Sketch and develop simple wireframes for each page

Improve

Iterate through gaining feedback and making adjustments

Sketches

Based on the storyboard, we identified the essential screens that our system would require and began sketching designs for key pages. Each team member individually sketched their design for each page with pencil and paper, which we then consolidated by combining our favorite elements from each. This collaborative approach ensured that our designs were well-rounded and encompassed multiple perspectives.

A sketch of the home page

Low-Fidelity Wireframes

Once our paper designs were agreed upon, we translated them into digital wireframes using Figma. Leveraging the ideas from our paper prototype, we each contributed to designing multiple screens for the system.

Our goals for the system were to:

  • Create an intuitive experience for users
  • Provide all of the information needed without feeling crowded or overwhelming
  • Provide easy navigation to view and discover different courses

A wireframe of the home page

A wireframe of leaving a review

Revision and Mid-Fidelity Prototype

Once we had wireframes for the key components of our system, we presented our designs to our classmates to gather valuable feedback and refine our work further. We then moved forward with developing our mid-fidelity prototype. To make the prototype more visually appealing and cohesive during user testing, we designed our branding. Because our site would present users with a high volume of information, we chose a simple, high-contrast color palette that lets key elements stand out.

The home page showing recommended classes

A class page showing ratings and reviews

Usability Testing

With our prototype complete, we conducted usability testing with Vanderbilt students to ensure that our application met their needs and was easy to use. Our participant group consisted of five Vanderbilt undergraduate students from diverse majors, allowing us to gather feedback from different perspectives on the class enrollment process.

We designed specific use cases for our participants, asking them to perform common tasks on the system while recording:

  • Amount of time taken per task
  • Navigation routes
  • Any errors encountered

After each session, we also sought participant feedback on our user interface, encouraging them to provide insights into any challenges faced and discuss the root causes of any errors.

To gauge overall usability, we had each participant answer the System Usability Scale (SUS) questions.

Responses from the System Usability Scale

Once all the usability testing sessions were completed, we analyzed the data. I calculated the average SUS score, which came out to an impressive 85, indicating a highly positive user experience.
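The raw questionnaire responses aren't reproduced here, but SUS scoring follows a standard formula: odd-numbered (positive) items contribute their rating minus 1, even-numbered (negative) items contribute 5 minus their rating, and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal sketch of that calculation, using made-up responses rather than our actual study data:

```python
def sus_score(answers):
    """Score one participant's ten SUS responses (1-5 Likert scale).

    Odd-numbered items (index 0, 2, ...) are positively worded: score - 1.
    Even-numbered items (index 1, 3, ...) are negatively worded: 5 - score.
    The sum of contributions is scaled by 2.5 to give a 0-100 score.
    """
    total = sum((a - 1) if i % 2 == 0 else (5 - a)
                for i, a in enumerate(answers))
    return total * 2.5

# Hypothetical responses for five participants (illustrative only).
responses = [
    [5, 1, 4, 2, 5, 1, 4, 1, 5, 2],
    [4, 2, 5, 1, 4, 2, 5, 2, 4, 1],
    [5, 1, 5, 2, 4, 1, 4, 2, 5, 1],
    [4, 1, 4, 1, 5, 2, 5, 1, 4, 2],
    [5, 2, 4, 1, 4, 1, 5, 2, 5, 1],
]

average = sum(sus_score(r) for r in responses) / len(responses)
```

A score above 68 is conventionally considered above-average usability, which is what made our average of 85 a strong result.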

Additionally, I compiled all the usability concerns and suggestions mentioned by our participants into a cost importance analysis table. This provided us with valuable insights to prioritize future improvements to the usability of our system. While we were happy with the positive feedback, we also identified several small details that could be further refined in a second iteration of the design, which was out of the scope of this project.
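A cost importance analysis amounts to ranking each issue by how much it matters to users relative to how expensive it is to fix, so high-impact, low-effort fixes rise to the top. A minimal sketch of that prioritization, with hypothetical issues and illustrative scores (the table from our study is not reproduced here):

```python
# Hypothetical usability issues; importance (severity to users, 1-5) and
# cost (estimated fix effort, 1-5) values are illustrative only.
issues = [
    {"issue": "Review form lacks a cancel button", "importance": 4, "cost": 1},
    {"issue": "Filter labels unclear on class pages", "importance": 3, "cost": 2},
    {"issue": "Search results load slowly", "importance": 5, "cost": 4},
]

# Rank by importance-to-cost ratio: quick wins first, expensive fixes last.
ranked = sorted(issues, key=lambda i: i["importance"] / i["cost"], reverse=True)

for item in ranked:
    print(f'{item["issue"]}: ratio {item["importance"] / item["cost"]:.2f}')
```

Sorting by the ratio rather than importance alone is the design choice that surfaces small, cheap refinements, which matched our situation: a well-received prototype with several minor issues worth fixing in a future iteration.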

Contact me!