User Experience Design Research


Selected de-identified sections of the report are shown here.

OVERVIEW


For the university's online learning platform, we tested five courseware platforms linked to one central Learning Management System (LMS).

We used eye tracking to understand where participants were looking, what they searched for, and what they focused on.

We tested with five vision-impaired participants to help us understand accessibility issues on the sites.

Client: A (non-Bentley) University with an Online Learning Platform

Role: UX Researcher

Length: 2 months

Skills: Usability testing, moderating, screening/recruiting, qualitative and quantitative analysis

Product Description: An online learning platform where students can take courses to get a university degree. Multiple courseware platforms were integrated into one system.

Top Takeaway: Participants wanted an “orientation” (specific to their course) with an overview of what was expected and required of them.


PROBLEM STATEMENT

  • Identify the most significant user experience issues and provide actionable design recommendations for each of the five courseware platforms.

  • Understand and identify how students navigate and make sense of courseware platforms within the LMS.

  • Understand and identify how the inclusion of courseware affects student perceptions of the learning experience.

This would help give direction for the next iteration of the design.


USERS & AUDIENCE


We created a recruitment screener and, working with a professional recruiter, recruited 40 participants, aiming for a demographic similar to the university's while also testing accessibility. Due to attrition, we tested 36 participants.

  • 5 participants had a vision impairment requiring them to use a screen reader.

  • 31 participants had vision that met the requirements for using the eye tracker.

  • Gender – Mix [60-70% Female]

  • Age – Mix between 20-50+

  • None had completed a BS degree


Structure of Testing


36 sessions, each approximately 60-90 minutes long. 

The majority of sessions were conducted in person. Three of the five participants with visual disabilities took part in remote sessions.

Each participant looked at the LMS and one courseware platform.

  • Background Questions

  • LMS Landing Page

  • Course Landing Page

  • Get and Review Syllabus

  • Registering Courseware

  • Courseware Environment Impressions

  • Courseware Activities

  • Return to LMS

  • Wrap Up


DATA COLLECTED & REPORTED

  • Initial interview questions before testing

  • Eye tracking gaze in real-time

  • Usability issues observed during testing

  • SUS Questionnaire and open-ended questions at end of session

  • Recommendations for improvement

  • Video recordings of 35 sessions, with eye-tracking gaze captured in the video.


ROLES & RESPONSIBILITIES


I was on a team of four: a project manager, two fellow graduate students, and myself. I was responsible for:

  • Helping write the recruitment screener used to recruit the participants.

  • Leading the writing of the moderation guide for two of the five platforms and helping write the others.

  • Analyzing screens independently according to usability heuristics for our expert review. Then, as per best practice, I came together with the other UX researcher to discuss the potential usability issues we had each found and synthesize our findings.

  • Designing the usability study by drafting the tasks for the moderator’s guide and receiving the client’s sign-off.

  • During the usability test, I moderated four sessions with in-person participants out of a total of 35 sessions.

  • After the usability test, I created a structure for our final usability test report, a “template” for each problem reported out so it would look consistent between all three teammates, and completed the entire website report section, executive summary, and conclusion.

  • Writing the final draft of the report, which the project manager approved and sent to the client with minor tweaks.


RESULTS

Executive Summary

  • Though many participants experienced challenges while navigating the Learning Management System and courseware, their perception of each remained positive and they felt confident that they could learn them in time.

  • Participants wanted an “orientation” (specific to their course) with an overview of what was expected and required of them.

  • The calendars on both the Learning Management System Landing page and the Course Landing page were deemed important by participants for planning, and they were disappointed when the calendar was hidden and/or did not meet their expectations.

  • Participants expected the syllabus to be more readily available, given that this is where they find information about learning materials.

  • All participants struggled with registering the courseware. They did not necessarily understand how it and the Learning Management System contributed to their overall learning experience. Inconsistent terminology between Education Institution materials, the Learning Management System and courseware contributed greatly to this lack of understanding.

  • The course table of contents page served as a helpful placeholder for participants and should remain persistent when users are at this level. When participants returned to the course page, many sought the table of contents to orient themselves.

  • After participants were redirected to the courseware tab, the launching page that loaded on the original course page significantly disoriented them when they returned from the courseware tab.


ACCESSIBILITY RESULTS

  • Most blind participants were positive about their experience; however, there is room for improvement.

“Anybody who attempts to make education accessible is to be applauded, I think they’re on the right track and that they’re starting the journey is good to see, it really is a journey it’s not an end point” (P9)

“my take is that it looks like they are aware of accessibility there has been done work done, ….they’re probably about 60% of the way there” “there are a lot of challenges when working with an external vendor, but you’re asking the user to work with the landing page and to work with Pearson or any third party vendor that’s managing the coursework, so that’s super challenging” (P9)

“I like it, I think it’s encouraging.” (P18)

“I thought it was very good,” though a few things could be better: “labeling the images, making sure the modules … are more described, opening the book in a new tab” (P21)

  • Like the other participants, blind participants found switching tabs between the Learning Management System and the courseware challenging. This is probably a bigger issue for blind participants because it is not clear that a new tab has opened when the courseware launches.

  • There are some extra heading elements that are confusing. Heading elements (H1–H6) are important for providing a mental model of the page structure. For example, there is an H2 called “D2L-visualtalk-2019” that does not provide any value. Also, the page is missing an H1. (A sketch of an automated heading and link check follows the recommendations below.)

  • The organization of the course module/theme blocks is hard to grasp, as there are too many redundant links that are too general and do not provide context, such as “Go to Module”, “continue”, and “return”.

“I’m a little confused because as far as the screen reader is concerned, it’s above the calendar” “in between it has a continue button and I’m not sure what those are supposed to go to, it’s not really telling me where it goes” (P11)

Accessibility Recommendations:

  • Simplify each block’s navigation. Remove the “Go to module” link and link only the area name, such as “Education Institution Careers”.

  • All links should be able to stand alone. Use CSS to add hidden text that only screen reader users hear.
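
To make the heading and link-text issues concrete, below is a minimal sketch of an automated audit in Python, using only the standard library’s html.parser. The HeadingAndLinkAudit class, the audit_page helper, the GENERIC_LABELS set, and the sample HTML are illustrative assumptions, not tooling used in this study; a full review would still rely on manual testing with screen readers against WCAG 2.1.

```python
from html.parser import HTMLParser

# Link labels that give no context on their own (assumed list for illustration).
GENERIC_LABELS = {"go to module", "continue", "return", "click here"}

class HeadingAndLinkAudit(HTMLParser):
    """Collects heading levels and link text from an HTML page."""

    def __init__(self):
        super().__init__()
        self.headings = []   # e.g. [("h2", "D2L-visualtalk-2019"), ...]
        self.links = []      # visible link text
        self._open_tag = None
        self._buffer = []

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6", "a"}:
            self._open_tag = tag
            self._buffer = []

    def handle_data(self, data):
        if self._open_tag:
            self._buffer.append(data.strip())

    def handle_endtag(self, tag):
        if tag == self._open_tag:
            text = " ".join(piece for piece in self._buffer if piece)
            if tag == "a":
                self.links.append(text)
            else:
                self.headings.append((tag, text))
            self._open_tag = None

def audit_page(html: str) -> list[str]:
    """Return human-readable warnings for heading and link-text issues."""
    parser = HeadingAndLinkAudit()
    parser.feed(html)
    issues = []
    if not any(level == "h1" for level, _ in parser.headings):
        issues.append("Page is missing an H1.")
    for level, text in parser.headings:
        # Crude check for headings that look like internal IDs rather than labels.
        if not text or ("-" in text and " " not in text):
            issues.append(f"{level.upper()} '{text}' does not describe the page content.")
    for text in parser.links:
        if text.lower() in GENERIC_LABELS:
            issues.append(f"Link text '{text}' cannot stand alone; add context.")
    return issues

if __name__ == "__main__":
    sample = """
    <h2>D2L-visualtalk-2019</h2>
    <a href="/module/1">Go to Module</a>
    <a href="/module/1/page/2">continue</a>
    """
    for issue in audit_page(sample):
        print(issue)
```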


System Usability Score (SUS)

Bar Chart of SUS Scores.
  • The System Usability Scale (SUS) provides a “quick and dirty” but reliable tool for measuring usability.

  • It consists of a 10-item questionnaire with five response options for respondents, from Strongly agree to Strongly disagree (a scoring sketch follows this list).

  • The SUS score was 60, which falls around the 30th percentile; an average SUS score is 68.

  • It is not surprising that such a complex system did not receive a perfect score, but there are some improvement opportunities.
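
For context, an individual participant’s SUS score is computed with a fixed formula: odd-numbered (positively worded) items contribute their response minus 1, even-numbered (negatively worded) items contribute 5 minus their response, and the summed contributions are multiplied by 2.5 to give a 0–100 score. A minimal sketch in Python; the example responses are hypothetical, not actual study data.

```python
def sus_score(responses: list[int]) -> float:
    """Compute one participant's System Usability Scale score from ten 1-5 responses.

    Odd-numbered items (1, 3, 5, 7, 9) are positively worded: contribution = response - 1.
    Even-numbered items (2, 4, 6, 8, 10) are negatively worded: contribution = 5 - response.
    The summed contributions (0-40) are multiplied by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("Expected ten responses on a 1-5 scale")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Hypothetical example (not actual study data): one participant's ten responses.
print(sus_score([4, 2, 4, 2, 4, 3, 4, 2, 3, 2]))  # 70.0
```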

“In general, it seems like a pretty easy website to maneuver around. I do think there are some small things [that could be improved]” (P16 - Score 80.0)

“Once I understood it, it made sense. But it wasn’t easy to find right away”
(P36 - Score 52.5)


Future Steps for Client

  • Review findings and recommendations in this report.

  • Create a plan for addressing issues in the Learning Management System.

  • Conduct a thorough review of Learning Management System accessibility using WCAG 2.1.

  • Create a plan for creating help/training material for students and technical support.

  • Discuss issues with courseware vendors.

