Abstract: Needs assessment and usability study of the EdAssist’s Tuition Assistance Management System.
Category: User Research, Needs Assessment, Heuristic Evaluation
Problem: Assess the needs of students who were using the system to apply for tuition assistance or seek reimbursement.
Project: I was part of a team of five students that conducted needs assessments, usability studies, and heuristic evaluations for EdAssist’s Tuition Assistance Management System (TAMS). Our assessment began with gaining an understanding of TAMS users through interviews, user personas and scenarios, and surveys. We then conducted user testing on the system based on this background information. Our analysis took into account the organizational workflow for creating marketing material, the use of external resources like databases, and the internal communication amongst JDL employees. We concluded the project with a report offering empirically based recommendations to improve the information flow onto and off the website.
Role: Project Manager & Editor
Final Solution: Summary of Research Results
Interviews: We conducted interviews with TAMS employees and with current and potential TAMS users. We used the interviews to generate user personas and scenarios to better understand the average TAMS user.
Surveys: We used these personas and scenarios to design a survey of current TAMS users, so we could understand user behavior, analyze system usage and its context, and give our client a more accurate reading of their users’ demographics.
Comparative Analysis Study: We conducted direct, indirect, parallel/partial, and analogous studies of EdAssist’s competitors to identify strengths and areas of improvement for TAMS based on functionality, target audience, aesthetics, and interaction style.
Heuristic Evaluation: We individually analyzed TAMS against Nielsen’s Ten Usability Heuristics.
Usability Study: We recruited five potential users of the TAMS site and had them complete a series of four tasks based on the actions of the average TAMS user. After the study, we had each user fill out a survey to gather information about their experiences during and after the tasks, and then debriefed the users.
* Figure 1. is the work of Fan Zhang, one of my talented group-mates.
From the compilation of these studies, we arrived at five findings:
Feedback: TAMS does not give the user appropriate feedback for actions within the system. We recommended that the system tell the user when an application is being saved rather than submitted, and that the navigation bar indicate which tab the user is currently on.
Navigation: Users commented that several links/webpages were labeled differently but linked to the same information. We recommended that redundant links and pages be consolidated and labeled consistently.
Information organization: The structure of information on the TAMS site does not fit users’ mental models. We recommended rearranging certain questions so that users answer all of the financial questions on a single page instead of flipping between pages.
Language: The language on the website did not follow users’ natural language models; users found many of the terms TAMS used confusing. We recommended adding “tool tips” or other explanatory information to clarify technical terms.
Flexibility: The TAMS site was not responsive on mobile phones or tablets. Since 40% of current users were accessing the site on a tablet or phone, we recommended making the website responsive so that users can check their applications on the go.