User-Centered/User Experience (UC/UX) Design Thinking Approach for Designing a University Information Management System

Olujimi Daniel Alao, Ezihe Amarachi Priscilla, Ruth Chinkata Amanze, Shade Oluwakemi Kuyoro, Adewale Olanrewaju Adebayo

Department of Computer Science, Babcock University Ilishan-Remo, Ogun State 121103, Nigeria

Corresponding Author Email: alaool@babcock.edu.ng

Pages: 577-590 | DOI: https://doi.org/10.18280/isi.270407

Received: 14 June 2022 | Revised: 23 July 2022 | Accepted: 5 August 2022 | Available online: 31 August 2022

© 2022 IIETA. This article is published by IIETA and is licensed under the CC BY 4.0 license (http://creativecommons.org/licenses/by/4.0/).

OPEN ACCESS

Abstract: 

University Management Information Systems (UMIS) are an essential part of a school's ecosystem. Building a functional UMIS is no longer the main challenge; as students interact with these systems to perform tasks such as course registration and school fee payment, the ease with which they do so is extremely important, and any error or confusing experience can make the process dreadful and demotivate them. This study centers on designing the User Interface (UI) and improving the User Experience (UX) of University Management Information Systems for web-based interfaces. User-Centered Design processes and the design thinking methodology were employed to solve the problem. Questionnaires were used to obtain users' pain points regarding the existing UMIS in their schools; the responses were analyzed to understand the issues users face with their current UMIS and to decipher the right features for a more usable interface. User personas and wireframes were used to make sense of the data obtained from user research. Figma, a visual design and prototyping tool, was used for the prototype and interface design. The newly created interfaces were subjected to user testing on a platform called Maze, where users interacted with the prototype and then answered questions about the developed system. Test data was used to measure usability parameters such as efficiency, effectiveness, learnability, ease of use, and simplicity. From the testing phase, the developed system achieved a System Usability Scale (SUS) score of 87, showing that users enjoyed using the system and could navigate a platform they were interacting with for the first time with little to no help. It was discovered that users prefer a simpler, responsive, and more interactive interface, and that users were able to successfully complete tasks even on an interface they had never interacted with before. This study addresses the usability issues students face while interacting with the UMIS platforms provided by their institutions and proposes a responsive, user-centered design which, if implemented, would improve students' engagement on the platform and reduce the recurring problems that arise from using these platforms.

Keywords: 

user experience, user-centered design, design thinking, usability testing, user interface design

1. Introduction

Design is a captivating part of today's world. From complex designs like buildings or cars to simple ones like a spoon, everything in life was designed to solve a particular problem or set of problems, and strategic decisions were made to ensure these products perform as they should and can easily be used by end-users. This study is geared towards evaluating the essence of product design in complex systems such as websites and applications.

Although Don Norman, a cognitive psychologist and designer, propounded the term "User Experience" in 1995, user experience as a system of design predates its name by centuries: from "Feng Shui", a Chinese philosophy dating back over 6,000 years that looked at the spatial arrangement of objects in a person's surroundings in order to optimize space and provide a user-friendly and harmonious environment, to the Ancient Greeks of the 5th century BC, who designed their work areas and tools using ergonomic principles: the scientific discipline concerned with understanding interactions between humans and other elements of a system, and the area of expertise that applies theory, principles, and methods to design in order to optimize human well-being and overall system performance. Not forgetting Henry Dreyfuss, an American industrial engineer known for placing usability at high importance in his designs, who explained his own view of User Experience (UX) in his 1955 book "Designing for People":

“The designer has failed when the point of contact between the product and the people becomes a source of friction. On the other hand, if people are made safer, more comfortable, more eager to buy, more efficient, or simply happier as a result of their interaction with the product, the designer has succeeded.”

The impact of Walt Disney on the field of UX design is highlighted by Joseph Dickerson in his article for UX Magazine, where he states that Disney's guiding principles for his 'Imagineers' were "Know your audience, put yourself in their shoes, and use color, shape, form, and texture to communicate with them".

He also envisioned a place where “the latest technology can be used to improve the lives of people”.

This look into history showcases the importance of putting the user first in the thought process when it comes to anything design-related. This study considers design thinking methods as key components in understanding how an application or website can be of value to a user, how the simplicity and flexibility of complex systems can yield the desired traffic and engagement from users, and the general emotions of users when they come in contact with a particular system.

Problem Statement:

University Management Information Systems (UMIS) employed by universities, although functional, still pose challenges to their users. These issues have led to inefficiency in the data collection and organization process. Navigation difficulties, the complex nature of the platforms, non-responsive web interfaces, and more are some of the usability challenges students are saddled with while engaging with these platforms. These issues have reduced the rate of records digitization in most institutions, since more time is spent manually fixing the problems that the usability gaps have given rise to.

This study therefore takes a closer look at the usability problems students face while using these platforms and proffers a simple, usable, and efficient interface that helps users easily carry out tasks. Some proposed features will be added to make the system more robust, and existing features that are not properly implemented will be closely examined and redesigned in a way that gives students the best experience possible.

Aim and Objectives:

This study aims to showcase the importance of User-Centered Design (UCD) and how it can be applied to improving the usability of UMIS across universities. The specific objectives include:

1. Review closely related literature on UMIS and its usability.

2. Develop a model that can be used to implement a usable, functional, and engaging University Management Information System.

3. Evaluate the model designed to ensure that users have the best experience with the proposed system.

Significance of Study:

This study addresses the usability issues students face while interacting with the UMIS platforms provided by their institutions and proposes a responsive, user-centered design which, if implemented, would improve students' engagement on the platform and reduce the constant problems that arise from the aforementioned issues. The proposed system is simple, efficient, and usable, ensuring that students can accomplish their tasks with ease and that the goals of the UMIS are met.

Methodology Overview:

To achieve the aim and objectives of this study, quantitative research would be conducted using questionnaires to pinpoint students' grievances with the current UMIS used in their various institutions; the information collected would be used to understand the true problem and propose a design-oriented means of fixing it. The information obtained from the students through the questionnaires would also be used to create user personas depicting the actual users of the designed system. These personas would be essential in designing the system and ensuring that the design process remains user-centered at all times. Figma, a UX design tool, will be used to design the system's interface as well as create the system prototype. Finally, the designed system will be tested by students, using the usability factors as test parameters, to gather feedback and ensure that the objectives are met.

University Management Information System: 

El-Bakry and Mastorakis [1] defined UMIS as a computer-based collection of hardware, software, people, data, and information that gives administrators the tools they need to organize, evaluate, and run their departments effectively.

Since academic institutions like universities have organizational structures and modes of operation different from those of companies, businesses, or even non-profit organizations, a UMIS is needed for activities such as student registration, library services, online classes, and assessments in order to facilitate the administrative management of the institution [2].

El-Bakry and Mastorakis [1] highlight the different UMIS components, which include the Student Information System (SIS), Finance System, Library Information System, and Faculty Information System; together, these components make up a University Management Information System, as shown in Figure 1.

Figure 1. Components of UMIS (El-Bakry and Mastorakis, [1])

User Experience Design (UED):

Don Norman popularized the term "User Experience" in the late 1990s while working at Apple Computer, Inc. He described it as:

“User experience encompasses all aspects of the end-users interaction with the company, its services and its products.”

It is a subset of product design that focuses on shaping the experience a user has with a particular product. ISO 9241-210, the international standard on ergonomics of human-system interaction, defines UX as a "person's perceptions and responses that result from the use or anticipated use of a product, system or service" (Mirnig et al. [3]). This covers before, during, and after the use of a product or system: UX encompasses the users' emotions, beliefs, preferences, perceptions, bodily and psychological responses, behaviors, and accomplishments.

User Interface Design (UID):

According to Ergonomics of human-system interaction — Part 210: Human-centered design for interactive systems [4], the User Interface (UI) refers to all components (software or hardware) of an interactive system that provide information and controls for the user to complete specific operations with the interactive system. The optimal user interface is one that goes unnoticed, allowing the user to concentrate on the information and task at hand rather than the mechanics that display the information and perform the task [5]. Galitz [5] viewed the user interface as the part of the computer and its software that humans can see, hear, touch, talk to, or otherwise comprehend or direct. It involves the use of color, typography, images, iconography, and other visual elements to convey information, help users accomplish tasks, and satisfy users' needs. Often confused with User Experience design, UI design is a subset of the user experience but not the experience itself: it is concerned with what the user can see, feel, and touch when interacting with a platform, while UX is concerned with the users, their journey, thought process, needs, and so on.

Jitnupong and Jirachiefpattana [6] described UI design processes as the processes involved in the design of a successful UI. They involve taking into consideration the tasks, the users, the type of platform being designed, and the environment in which the platform will be used, as well as creating a prototype of the proposed platform that can be evaluated by the potential users and the design team to obtain feedback and determine whether the platform is adequate.

User-Centered Design (UCD):

User-centered design (UCD) is a design perspective and an iterative process that involves users in all phases of the process. It is defined as both a process and a philosophy: as a design process, it is a method for planning projects as well as a set of methods to utilize in each step; as a design philosophy, its goal is to include users at every stage of the design process (Garcia-Lopez et al. [7]). Donald A. Norman proposed the term 'User-Centered Design', and the concept became widely used after his second book, "User Centered System Design: New Perspectives on Human-Computer Interaction", was published in 1986.

Liu et al. [8] established that within the framework of user-centered design, end users' needs have to be identified and prioritized in order to formulate design guidelines.

ISO 9241-210 established optimal UCD criteria for Human-Computer Interaction development. The standard describes the phases in the user-centered design process, which include:

a) understanding and specifying the context of use.

b) specifying the user requirements.

c) producing design solutions.

d) evaluating the design.

Figure 2 showcases the iterative nature of the UCD processes.

How Do User Experience (UX) and User-Centered Design (UCD) Work Together in Product Design?

User experience (UX) is one of the many aspects of UCD. It includes the user's entire experience with the product, including physical and emotional reactions. UCD involves more than making designs aesthetically pleasing: design plays an important role, but it is not the only important factor.

While user-centered design refers to the process applied in order to engineer experiences, user experience deals with the specific experience users have with the products they use. It is a reference to how a user experiences and interacts with a product or service. 

Figure 2. User-centered approach iterative flow [9]

Related Works:

Al-Hunaiyyan et al. [10] defined a Student Information System (SIS) as a management information system used by education-sector establishments to manage student data. The purpose of their research was to investigate an SIS's user experience, along with the strengths and shortcomings of the design, usability, and UX of the SIS currently in use at PAAET (the Public Authority for Applied Education and Training). It was evaluated against six key parameters for successful systems: attractiveness, efficiency, perspicuity, dependability, stimulation, and novelty. To achieve the objectives, both qualitative and quantitative analyses were used to obtain feedback on the SIS. For the qualitative analysis, 16 students were selected at random out of the 645 research participants from the five colleges in PAAET for a focus group session; with the help of a facilitator, an in-depth discussion provided insight into the students' perception and experience of the system. For the quantitative analysis, the questionnaire was adapted from the User Experience Questionnaire (UEQ), which can be used to measure user experience, and was shared across the school's faculties and students. Analysis of the data obtained from the questionnaire and the focus group showed that the participating students had a favorable impression of the SIS. Regarding the UX parameters used to evaluate the system, its perspicuity, stimulation, and dependability were scored somewhat higher than its novelty, attractiveness, and efficiency. This indicates that the SIS at PAAET, which has been in use since 2010, is no longer capable of supporting the new learning models and delivery modes that students need.

Maslov et al. [11] centered their study on investigating university students' perspectives on the learning management system (LMS), identifying the elements that influence user experience and e-learning outcomes, and providing potential solutions to issues with the UX and usability of an LMS. The focus of the study was the Modular Object-Oriented Dynamic Learning Environment (Moodle), an LMS increasingly being used to facilitate e-learning. To achieve the study's purpose, the researchers invited 10 male and 10 female students for quantitative and qualitative analysis of the platform. They adopted the 24 questions developed by Topolewski et al. [12] to obtain both qualitative and quantitative data: for qualitative data collection, the researchers conducted 20 semi-structured interviews, and for quantitative data collection, a short survey was developed using a Likert scale from 1 to 7, with 1 being "very unfulfilling with UX property" and 7 being "very fulfilling with UX property", and shared among the respondents. Analysis of the collected data revealed that students, particularly in programs where courses are mostly delivered online, rely on such learning systems. Furthermore, the use of Moodle as an LMS was assessed and regarded as a successful long-term learning option in the current environment.

For Phongphaew and Jiamsanguanwong [13], the purpose of the study was to identify the key interface concerns of a system called myCourseVille using a usability evaluation approach based on five usability factors, applied to both the student and teacher interfaces. During the research, the scholars identified that among the reasons for myCourseVille's non-adoption and user displeasure were system design issues such as difficulty of use, system complexity, and user dissatisfaction with the interface. Usability evaluation was one of the design methodologies used to discover human-computer interface difficulties and depict actual user behavior with the system in order to make recommendations for improving inadequate interface design.

Various usability concerns, such as complex design and ambiguous language, were also discovered. The researchers found that the inappropriate layout of the main function and the size of the icons on the screen posed a challenge to users in both the student and teacher interfaces; the study suggested that a redesign of the page layout and an adjustment of the menu size should be considered to improve usability.

The study by Studiyanti and Saraswati [14] aimed at increasing the satisfaction level of students at University X by evaluating and analyzing the usability of its SIS (Student Information System). To do this, the effectiveness, efficiency, and satisfaction of the SIS were measured using the following instruments: task success rate, mouse clicks, and the System Usability Scale (SUS). In selecting participants for the usability testing, Macefield [15] stated that 30 persons would identify 95% of the usability problems of a system; therefore, thirty participants, made up of 15 males and 15 females from the Industrial Technology Faculty, were selected.

The usability test was conducted three times: pilot usability testing, early usability testing, and final usability testing. The usability problems discovered during the early usability testing formed the input for the improvement of the SIS. A prototype of the improved SIS, accessible through a Local Area Network and made as a low-fidelity design, was tested again during the final usability testing; the results were compared with those of the early usability testing, and a significant increase in effectiveness, efficiency, and satisfaction with the platform was discovered. The final results of the study demonstrated that the SIS had not been performing well enough to satisfy the students and that modifications needed to be evaluated with the same participants, using the prototype created. Usability testing showed that effectiveness went from 58 percent to 85 percent, efficiency climbed from 66 percent to 92 percent, and satisfaction scores increased from 53.83 to 70.67.

From the review of related literature, the use of instruments such as efficiency, effectiveness, satisfaction, and learnability led to the identification of usability problems on various platforms. The use of qualitative or quantitative means of data collection to aid usability analysis also informed this study's research model and the research design methods adopted in this work.

2. Research Methodology

2.1 Introduction

This section discusses the research methods adopted in this study. It will also highlight the tools, functional requirements as well as the process model employed for the study.

2.2 Research model and system design

2.2.1 Research model

The diagram in Figure 3 depicts the general model and approach that was employed for the study. To successfully design the proposed interface of a University Management Information System, the highlighted process was followed.

Firstly, quantitative analysis and research were done using questionnaires to collect data on the pain points of students regarding an existing UMIS they already use. 100 respondents, all university students and potential users of the University Management Information System, answered the questionnaire, providing critical data necessary for understanding the users, following Macefield's [15] finding that 30 persons would identify 95% of the usability problems of a system. The data collected facilitated user problem identification, the creation of user personas, system feature identification, the creation of user stories, and more. The design thinking process shown in Figure 4 was utilized to design the proposed interface; its stages include:

  1. Empathy: This involves understanding the pain points or struggles of the user regarding a specified use case. In this study, it involves understanding the struggles of users (students) in their interaction with the existing University Management Information Systems in their schools. To really understand the users, questionnaires were shared and users were given the opportunity to anonymously express their grievances. The questionnaires were created and shared using Google Forms.
  2. Define: After the data was collected, it was analyzed in order to make informed design decisions, such as narrowing down users' problems and identifying users' needs. Google Forms and the UEQ data analysis tool were used to analyze the collected data.
  3. Ideate: After identifying the users' problems and struggles, a brainstorming session became fundamental in streamlining the features that would help solve the problems and make user interaction with the interface easier and swifter. Once the features were identified, user stories and user flows were created to understand clearly how each feature would affect user needs and the journey users would take through the entire system. Figma was then used to facilitate this process.
  4. Prototyping: Next, with the information obtained from the previous stages, the visual design of the interface began in Figma. User interface components and elements such as buttons, text boxes, checkboxes, and tables were created and combined to form the main interface. After the interface design, the frame layouts were linked together into a prototype to simulate an actual interaction experience; this prototype was then subjected to testing.
  5. Testing: Finally, using a tool called Maze, the prototyped platform was tested by 30 persons (male and female), ranging from first-year to final-year students. The following parameters were used to test the interface: effectiveness, efficiency, learnability, error tolerance, simplicity, and ease of use.

2.2.2 System design

In order to effectively design the proposed interface, the prototyping process model was employed. Prototyping is the process of creating a working replica of a product or system that must be designed and implemented; it provides a small-scale facsimile of the finished product and is used to gather customer feedback. In this model, a prototype of the end product is created, tested, and refined based on user feedback until a final acceptable prototype is achieved, which serves as the foundation for developing the final product. Specifically, the evolutionary prototyping model shown in Figure 5 was employed: the initial prototype is incrementally refined based on user feedback until it is finally accepted. It provides a better approach that saves both time and effort, and it makes room for updates as user feedback on the proposed UMIS interface comes through.

Figure 3. Proposed model for designing the interface for a University Management Information System. Researcher’s model (2021)

Figure 4. Design thinking stages. Researcher’s model (2021)

Figure 5. Evolutionary Prototyping Model "Understand the software development life cycle models" [16]

2.3 Data collection method

For this research, a quantitative research approach was employed. For precise data collection, an online survey was carried out to effectively obtain data on the usability issues users face with the current systems available in their institutions. The survey contained semi-structured questions sent to students for data collection.

This study employs a questionnaire as its main data collection tool. It is designed after the User Experience Questionnaire (UEQ) and was developed using a Likert scale from 1 to 5. A questionnaire is a research instrument consisting of a series of questions aimed at obtaining information and responses from participants. The following are the reasons for employing the UEQ in designing the questionnaire:

  1. It is often used in classical usability evaluation to collect quantitative data about users' impressions of their experience with a particular system.
  2. It contains heuristic parameters such as attractiveness, perspicuity, efficiency, dependability, stimulation, and novelty. Each parameter has different items, totaling 26 as shown in Figure 6, that foster the collection of data concerning the issues users face.

Figure 6. Assumed scale structure of the UEQ. Schrepp [17]

The questionnaire contained closed-ended and open-ended questions; the open-ended questions allow respondents to express themselves in more detail, providing more validity to the data used. The data collected using the questionnaire evaluates the various pain points university students face with regards to their experience with the University Management Information Systems offered by their various universities.
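To illustrate how UEQ-style responses map onto the scales in Figure 6, the following is a minimal aggregation sketch in Python. The item-to-scale assignment and the response data are hypothetical placeholders, not the study's actual questionnaire items; the real UEQ assigns its 26 items to the six scales and ships with its own analysis spreadsheet.

```python
from statistics import mean

# Hypothetical assignment of questionnaire items to UEQ-style scales.
# The real UEQ maps its 26 items onto six scales (attractiveness,
# perspicuity, efficiency, dependability, stimulation, novelty).
SCALE_ITEMS = {
    "attractiveness": ["q1", "q2"],
    "perspicuity": ["q3", "q4"],
    "efficiency": ["q5", "q6"],
}

def scale_means(responses):
    """Average each scale's item ratings per respondent, then average
    across respondents, on the study's 1-5 Likert scale."""
    results = {}
    for scale, items in SCALE_ITEMS.items():
        per_person = [mean(r[item] for item in items) for r in responses]
        results[scale] = round(mean(per_person), 2)
    return results

# Example: two respondents rating six items on a 1-5 Likert scale.
responses = [
    {"q1": 2, "q2": 3, "q3": 2, "q4": 1, "q5": 3, "q6": 2},
    {"q1": 4, "q2": 3, "q3": 2, "q4": 2, "q5": 4, "q6": 3},
]
print(scale_means(responses))
# {'attractiveness': 3.0, 'perspicuity': 1.75, 'efficiency': 3.0}
```

A low mean on a scale such as perspicuity would flag the same kind of problem that this study's respondents reported with their existing UMIS.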

2.4 Reliability and validity

Polit and Beck [18] described reliability as the constancy and preciseness of information obtained from a study. The reliability of the responses from the questionnaires was ensured in two ways: firstly, a non-biased data collector evaluated the questions to guarantee acceptable question competency; secondly, the questionnaire's questions ensured that the participants were well-versed in the specifics of the tasks undertaken. Polit and Beck [18] also described validity as a complicated notion that refers to a study's soundness and the inferential support it provides. The questionnaire was written in an unambiguous manner, using basic language, to obtain data relevant to the study's goal; all participants were told of the study's purpose and provided voluntary consent, and the supplied replies were used in the study while maintaining anonymity.

2.5 Questionnaire analysis

The questionnaire employed to understand the users' pain points made use of a Likert scale ranging from 1 to 5. A total of 15 questions were answered by 100 respondents; each question was drafted to understand their experience with the existing UMIS in their schools in relation to the UEQ metrics. Some of the questions used in the questionnaire and their outcomes are shown in Table 1.
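The percentage outcomes reported in Table 1 come from simple tabulations of the Likert responses. The sketch below shows one way to reproduce such a breakdown; the ratings are hypothetical sample data, since the actual analysis was done with Google Forms and the UEQ analysis tool.

```python
from collections import Counter

# Option labels for the 1-5 perspicuity question, as used in Table 1.
LABELS = {1: "Not Understandable", 2: "Somewhat Not Understandable",
          3: "Neutral", 4: "Understandable", 5: "Very Understandable"}

def likert_breakdown(ratings):
    """Return the percentage of respondents who chose each 1-5 option."""
    counts = Counter(ratings)
    total = len(ratings)
    return {LABELS[v]: round(100 * counts.get(v, 0) / total, 1)
            for v in sorted(LABELS)}

# Hypothetical ratings for one question from ten respondents.
ratings = [2, 2, 3, 2, 4, 1, 2, 3, 5, 2]
print(likert_breakdown(ratings))
# {'Not Understandable': 10.0, 'Somewhat Not Understandable': 50.0,
#  'Neutral': 20.0, 'Understandable': 10.0, 'Very Understandable': 10.0}
```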

Problems with the Existing System

While analyzing responses from the questionnaires, some setbacks were identified with regard to the usability of the UMIS available in universities, as shown in Figure 7.

In response to the question "Which of the following would you say are pain points while using your school's UMIS?", the following results were obtained about the pain points of the existing system:

  1. Confusing interface: 51% of respondents recorded that the general interface of their school's UMIS is very confusing and that they struggle to understand the general function and purpose of the platform.
  2. Poor interface design: 48% of respondents recorded that their school's platform has a poor interface design and that they do not find the system aesthetically pleasing.
  3. Difficulties finding necessary information (navigation issues): Given the ambiguous nature of the interface elements, respondents expressed the issues they face trying to locate certain information on the platform; moving from one page to another also poses a challenge to 63% of them. This pain point holds the highest percentage of responses, meaning users struggle with it the most.
  4. Unclear error messages: 58% of respondents recorded that the use of technical jargon and complex error codes is distressing to them. This means the system does not use simple language that everyone can understand or provide users with the help they need when they run into an error.
  5. Unresponsive system: 53% of respondents recorded that the system is not responsive across different devices.

With the results of the questionnaire analyzed, the researchers identified the need to design features such as course registration, hall of residence selection, and even login in a way that would ease users' pains.

System flows, user stories, and wireframes were used to deepen our understanding of the users' problems and of which parts of the proposed design would be helpful and which might not be. Figma was then used for the wireframing, visual interface design, and prototyping, while Maze was used for user testing.

Figure 7. Questionnaire analysis chart. Researcher’s analysis (2022)

Table 1. User research questionnaire

| Question | UEQ metric | Likert scale values | Outcome |
| --- | --- | --- | --- |
| On your first few tries with your school's UMIS, how easy was it to understand? | Perspicuity: measures whether it was easy to get familiar with the UMIS on the user's first try and whether it was easy to learn how to use the current system. | Ranged from 'Not Understandable' to 'Very Understandable', with 1 being 'Not Understandable', 2 'Somewhat Not Understandable', 3 'Neutral', 4 'Understandable', and 5 'Very Understandable'. | 46% of the respondents found their school's UMIS only somewhat understandable on their first try, showing that the existing system's perspicuity level is very low. |
| The last time you registered, how would you describe your interaction with your school's UMIS interface? | Attractiveness: measures the overall impression of the system and whether respondents like or dislike the current system. | Ranged from 'Very Unpleasant' to 'Pleasant'. | 34% of respondents stated that their interaction with their school's system is neither unpleasant nor very pleasant, and 27% reported slightly unpleasant interactions, meaning respondents do not find the system as attractive as they should. |
| On a scale of 1 to 5, how confident are you while using the UMIS in terms of the number of errors made? | Stimulation: measures the level of excitement and motivation respondents have concerning their school's system. | Ranged from 'Not Confident at All' to 'Very Confident'. | 38% of respondents have a neutral standpoint on their confidence level with their school's system, and 23% are more confident, meaning the system does not provide the features users need to be as confident as expected. |
| Does your school's current UMIS meet your expectations, in terms of interface and interaction? | Dependability: measures whether the users' expectations are met by the system. | A YES or NO question. | 80% of the responses were NO, showing that the interface and system interactions have failed to meet the respondents' expectations. |

Table 2. Table of features for the proposed interface design

| Feature | Description |
| --- | --- |
| Login | The interface shall be designed to allow users to log in using their matriculation number and password. If they cannot log in successfully, they shall be directed to get help. |
| Semester registration | The interface shall be designed to make registration easy and smooth for users by providing prompts that remind students of their registration status (registered or not), and to make the general registration process easy. |
| Residence | The interface shall be designed to make hall of residence selection easy, providing detailed information about each residence, such as residence type, available space, population, and other relevant information. |
| Academic overview | The interface shall be designed with features that allow students to view their results and keep proper track of their academic journey. |
| Financial registration | The interface shall be designed with features that facilitate the payment of fees and keeping track of finances. |
| Help/guide | The interface shall be designed to provide help and guidance on how to navigate the system and easily mitigate any issues students might face. |
| Accessibility | The interface shall be designed to be responsive on both mobile and desktop devices. Text-to-speech features will also be designed to support users with visual impairment. |

Table 3. Non-functional requirements for the proposed interface design

| Usability principle | Requirement | Metric |
| --- | --- | --- |
| Effectiveness | The user should be able to accurately complete each task. | Measured in terms of completion rate. |
| Efficiency | The user should be able to successfully complete a task within a certain time. | Measured in terms of task time and the number of clicks used to accomplish the specified task. |
| Error tolerance | The user should be able to retrace his/her steps when an error is encountered. | Measured by the number of errors (misclicks) per task. |
| Learnability | The system should be easy to learn by the user at first encounter and in subsequent use. | Measured by questionnaire (a quick survey after the testing process) and by observing the users testing the platform. |
| Simplicity | The user should find the interface simple to use to achieve their goals. | Measured by questionnaire (a quick survey after the testing process). |
| Ease of use | The user should be able to complete tasks easily without much effort. | Measured by questionnaire (a quick survey after the testing process). |

Functional Requirements:

The functional requirements/features of the system are as shown in Table 2.

Non-functional Requirements:

For this study, the non-functional requirements of the proposed interface design consist of the usability principles employed to create the design, as shown in Table 3. Non-functional requirements provide functional limitations and outline how a system should behave; these specifications guarantee the system's efficiency and usefulness. For the proposed interface design, six usability principles are considered as non-functional requirements.

2.6 Design tools & wireframe

2.6.1 Design tools

  1. Figma: It is a collaborative interface design tool that allows designers to work on design projects as a group. Web and application interfaces, visual interfaces, and wireframes can all be designed on this platform. Figma includes vector tools for complete illustration, as well as prototyping and developer hand-off tools. It enables real-time collaboration by allowing team members to log into a design project and make changes to the design at the same time. Figma makes it easy to create consistent interfaces by allowing the creation of reusable components, design systems, and style guides, and it provides designers with access to a growing library of templates, themes, plugins, and UI kits.

Figma was used in this study to create the visual design for the proposed UMIS interface, design the wireframes for the layout, and build the prototype.

  2. Maze: This tool was used to conduct the usability testing of the designed interface. It is a usability testing platform that employs a clickable prototype to obtain practical feedback from users: it collects feedback on design prototypes by conducting click tests, generating tasks for users to complete, and asking open-ended or closed survey questions. It provides options for a variety of usability tests, such as card sorting and the 5-second test, to suit the tester's needs. After the testing is completed, testers receive a UX assessment report on participant behavior and responses, which includes the number of clicks, heat maps, etc.
  3. User personas

User personas are critical in the user-centered design and design thinking processes. A persona transforms the abstract concept of a "user" into a person with thoughts and emotions, allowing the designer to better understand who they are designing for. User personas are quintessential users whose goals and characteristics are representative of the needs of a larger group of users. A persona is typically presented in a one- or two-page document, as shown in Figure 8, that includes behavior patterns, goals, skills, attitudes, and background information, as well as the environment in which the persona operates (Faller [19]). In order to keep the design of the UMIS interface user-centered, user personas were employed in this study.

Figure 8. User persona. Researcher’s design (2022)

2.6.2 Wireframe

A wireframe is a two-dimensional skeletal outline of a website or application. Wireframes provide a visual representation of the page structure, layout, information architecture, user flow, functionality, and intended behaviors. Because a wireframe usually represents the first version of a product, styling, color, and graphics are kept to a minimum; wireframes help to clarify the product's features (Hannah [20]). A wireframe of the proposed interface design is shown in Figure 9.

Figure 9. Login screen wireframe. Researcher’s design (2022)

2.7 System implementation & the proposed UMIS

2.7.1 System implementation

Figma was used to design the proposed system for a made-up school called the University of XYZ. After the design, the proposed system was subjected to usability testing to ensure that the interface provides users with the best possible experience in terms of usability and ease of use.

2.7.2 The proposed UMIS

The proposed UMIS comprises the following components of a university management information system:

  1. Student Information System: This is implemented with the addition of a profile feature, where students can update their personal information, access their medical information, and view their previous registration information. It also comprises features that allow students to register their hall of residence, meal type, courses, etc.
  2. Finance System: This is implemented with the addition of a financial section that allows students to view the fees they have to pay and their available balance; it also gives them verified access to the school's payment gateway. Users can view their financial history and much more.
  3. Faculty Information System: This is implemented with the addition of faculty information showing which instructor is taking each course, allowing users to easily select the courses they need.

2.8 Dashboard interface for the proposed UMIS design

The interface in Figure 10 is the first screen the user sees when they log in to the platform. It provides users with quick access to the necessary information they need: from the dashboard, they can easily navigate to select their hall of residence, register their courses, and check their semester results.

The dashboard interface complies with Miller's law, a user experience law stating that the average person can only keep seven (plus or minus two) items in their working memory. This is implemented clearly in the "Registration quick select" section of the interface, where only the three most important features or tasks are available on the screen. Organizing the content into smaller chunks makes it easier for users to access the part of the system they need, and it also makes processing and understanding the information on the screen easy.

Figure 10. Dashboard interface of University of XYZ. Researcher’s design (2022)

In terms of usability principles, the interface was designed after the following principles:

  1. Visibility of system status: The design should always keep users up to date on what is going on by providing appropriate feedback in a timely manner (Nielsen [21]). This is seen in the simple "Welcome to your Dashboard" statement that communicates the current context to the user.
  2. Match between system and the real world: The design should communicate in the language of the users, using words, phrases, and concepts the user is familiar with, and should adhere to real-world conventions by arranging information in a natural and logical order (Nielsen [21]). This is seen in the use of simple, plain, non-complicated language across the screen, allowing the user to easily make sense of the interface and of what each button, card, and section does. For example, the button labeled "Start Registration" uses simple wording; the user easily understands that clicking this button will let them begin registration.

2.9 Screenshots of course registration interfaces for the proposed UMIS

Figure 11. (1) Course registration interface to select level of study (Researcher’s design, 2022)

Figure 11. (2) Interface to apply for course form and select courses. (Researcher’s design, 2022)

Figure 11. (3) Interface Showing View of Selected Course List. (Researcher’s design, 2022)

Figure 11. (4) Interface Showing Application for Course Form Approval. (Researcher’s design, 2022)

2.10 Financial registration

Financial registration interface for proposed UMIS:

Figure 11. (5a) Financial registration interface of the University of XYZ. (Researcher’s design, 2022)

Figure 11. (5b) Financial registration interface of the University of XYZ. (Researcher’s design, 2022)

Figure 11. (5c) Financial registration interface of the University of XYZ. (Researcher’s design, 2022)

The interfaces from Figure 11. (1) to Figure 11. (5c) are what users come in contact with when they want to register courses. First, users select their level, and the courses for that semester and level are displayed. They simply select the courses they want based on the instructors and the available space in the class. After registering their selected courses, they are navigated to the selected course list section. In this section, users select from a drop-down whether they want to "Drop selected courses" or "Apply for course form". If they want to drop courses, they can select multiple courses at once, or select all the courses, and click the "Drop selected courses" button. If they want to get their course form, a modal pops up with summarized information about the selection process; they can then choose between previewing the selected list or applying for the course form.

These interfaces were designed with the Von Restorff effect in mind, especially the buttons that take on a different color when they are disabled, letting users know that the action cannot be carried out because a prerequisite event has not occurred. There is also visual hierarchy in the "Course form Approval" modal's Call to Action (CTA) buttons: the "Apply for course form" button is a solid button, letting users know it is the most important action, so it catches users' attention before the other button. The language used on the CTA buttons is simple and straightforward, abiding by the "help users recognize, diagnose, and recover from errors" usability principle.

There are other designed interfaces of the UMIS, such as the Hall Registration, Financial Registration, Meal Plan Selection, Help and Guide, and Post Query interfaces developed for the University of XYZ.

2.11 The post query interface

Figure 11. (6) The post query interface (Researcher’s design, 2022)

The post query interface shown in Figure 11. (6) is essential: it focuses on the usability principles of error prevention and error recovery. This interface will help users who are facing issues using the system or who need steps on how to complete a process. This screen will also reduce the rate at which users make errors.

2.12 Usability testing

After the design of the proposed interface, to ensure that the interface provides users with the best possible experience in terms of usability and ease of use, the system was subjected to testing.

Usability testing with the metrics shown in Table 4 was employed to test how easy the designed UMIS interface is to use with a group of users. In user experience design, usability testing is important because it helps to:

  1. Identify flaws in the product or service's design
  2. Identify areas for improvement
  3. Understand the target user's behavior and preferences

For this study, the type of usability testing employed is quantitative: usability principles such as efficiency, effectiveness, error tolerance, learnability, simplicity, and ease of use were measured, and the metrics for measuring them were collected. Each test metric was compared against a benchmark that had already been determined.
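As a sketch of how such benchmark comparisons can be made, the snippet below checks per-task measurements against predetermined targets. The task names, values, and click tolerance are illustrative assumptions in the spirit of Tables 5 and 6, not the study's exact data; only the 60-second ceiling is taken from the study's benchmark discussion.

```python
# Hypothetical benchmarks (expected clicks and time) and measurements,
# loosely modeled on Tables 5 and 6; not the study's exact data.
BENCHMARKS = {
    "login_and_hall": {"clicks": 7, "time_s": 32},
    "meal_plan": {"clicks": 4, "time_s": 12},
}
MEASURED = {
    "login_and_hall": {"clicks": 8, "time_s": 41},
    "meal_plan": {"clicks": 4, "time_s": 16},
}

def compare_to_benchmark(task):
    """Check whether a task stayed within its click benchmark (with a
    small assumed tolerance) and under the study's 60-second ceiling."""
    bench, got = BENCHMARKS[task], MEASURED[task]
    return {
        "task": task,
        "clicks_ok": got["clicks"] <= bench["clicks"] + 2,  # +2 click tolerance (assumed)
        "time_ok": got["time_s"] <= 60,                     # hard cap from the study
    }

for task in BENCHMARKS:
    print(compare_to_benchmark(task))
```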

A platform called Maze was used to administer the test. Maze offers a variety of test blocks, such as yes/no tests, card-sorting tests, mission tests, and 5-second tests; this study employed the mission test and the 5-second test to effectively measure usability. The test method used was in-person usability testing, in which the researchers conducted the test live with users so that they could guide them and observe their reactions and general actions. Users were encouraged to think out loud and express themselves on each task and interface they came in contact with during the test. The test was conducted with 30 persons, all university students: 15 males and 15 females. After the test with Maze, users were subjected to an after-test questionnaire where they answered questions on their general take on the tasks given.

Table 4. Usability test metrics

| Usability principle | Metric |
| --- | --- |
| Effectiveness | Measured in terms of completion rate. |
| Efficiency | Measured in terms of task time and the number of clicks used to accomplish the specified task. |
| Error tolerance | Measured by the number of errors (misclicks) per task. |
| Learnability | Measured by questionnaire (a quick survey after the testing process) and by observing the users testing the platform. |
| Simplicity | Measured by questionnaire (a quick survey after the testing process). |
| Ease of use | Measured by questionnaire (a quick survey after the testing process). |

Table 5. Maze task breakdown

| Task name | Task instruction | Expected number of clicks | Average time to complete task |
| --- | --- | --- | --- |
| Login and hall of residence registration | Attempt to log in and access the dashboard, then register "DEXTER HALL" as your hall of residence. | 7 clicks | 32 seconds |
| Meal plan registration | Register the BL meal type for the vegetarian diet option. | 4 clicks | 12 seconds |
| Course registration | Select your courses and print the course form. | 9 clicks | 52 seconds |
| Opinion scale | How easy is it to register your courses? | Likert scale, ranging from 'not easy at all' to 'very easy' | Nil |
| Financial registration and get help | Navigate to the financial section and pay fees. Also, get help on how to register for the semester and post a query. | 7 clicks | 28 seconds |
| Opinion scale | How difficult was it to find the help section? | Likert scale, ranging from 'very difficult' to 'very easy' | Nil |
| Logout | Attempt to log out from the system. | 2 clicks | 5 seconds |
| 5-second test | Focus on the image (an image of a section users had come in contact with before was displayed, to test their memory and learnability). | Nil | 5 seconds |
| Multiple choice | From the image in the last test, what would you say that screen is for? | 4 options | Nil |

Table 6. Task analysis summary

| Task | Misclick rate | Avg total task time | Avg bounce rate | Task usability score |
| --- | --- | --- | --- | --- |
| Login and hall of residence registration | 10.4% | 41 secs | 0% | 82 |
| Meal plan registration | 5.8% | 16 secs | 3.3% | 94 |
| Course registration | 2.9% | 52 secs | 0% | 82 |
| Financial registration and get help | 3.8% | 45 secs | 0% | 79 |
| Logout | 0% | 14 secs | 0% | 97 |

Table 5 gives a brief breakdown of the tasks on Maze, including the task instructions, the expected number of clicks per feature, and the average time to complete each task; this served as a benchmark for evaluating and analyzing the test data. It is important to note that the time taken to complete each task depends on factors such as network connectivity, so this parameter is only an approximation: users may complete a task before or after the expected time, but they should not spend up to 60 seconds on a particular task.

2.13 Task analysis

Table 6 above shows a breakdown of the results of the usability testing. Each task was completed through a number of screens, and testers interacted with these screens to reach the end point of the task. Each task has its average misclick rate, which is the rate at which testers misclicked while completing the task, or used a path different from the predefined one to complete it. The average bounce rate refers to the rate at which testers gave up on a task, whether due to task difficulty or other factors; for this test, the meal plan registration task had a bounce rate of 3.3%, meaning only 1 tester out of the 30 gave up. For each task screen, the screen usability score and the users' heat map were recorded. The heat map shown in Figure 12 helped the researchers understand the testers' journey through the tasks and the interface elements they interacted with to accomplish them.

Figure 12. Heat map of testers' interaction with the dashboard (Researcher's analysis, 2022)
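The misclick and bounce rates in Table 6 are simple ratios over the test sessions. Below is a minimal sketch of how such rates can be computed from per-tester logs; the session data is hypothetical, constructed only to mirror the 1-in-30 bounce observed on the meal plan task.

```python
def task_rates(sessions):
    """Compute misclick and bounce rates for one task.

    Each session dict holds 'clicks' (total clicks), 'misclicks'
    (clicks off the expected path), and 'gave_up' (True if abandoned)."""
    total_clicks = sum(s["clicks"] for s in sessions)
    misclicks = sum(s["misclicks"] for s in sessions)
    bounces = sum(1 for s in sessions if s["gave_up"])
    return {
        "misclick_rate": round(100 * misclicks / total_clicks, 1),
        "bounce_rate": round(100 * bounces / len(sessions), 1),
    }

# Hypothetical meal plan registration logs: 30 testers, one gave up.
sessions = ([{"clicks": 5, "misclicks": 0, "gave_up": False}] * 20
            + [{"clicks": 6, "misclicks": 1, "gave_up": False}] * 9
            + [{"clicks": 3, "misclicks": 2, "gave_up": True}])
print(task_rates(sessions))
# {'misclick_rate': 7.0, 'bounce_rate': 3.3}
```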

Analyzing the results under the usability metrics earlier highlighted:

1. Effectiveness: Testers were able to successfully accomplish each task within a sufficient amount of time. Except for the meal plan registration task, which had a 3.3% bounce rate, all 30 testers completed every task.

2. Efficiency: For each task, it was observed that the majority of the testers completed the task in line with the benchmarked number of clicks and test paths.

3. Learnability: This metric was tested using a simple 5-second test. Testers were shown an image of part of the interface for exactly 5 seconds and had to indicate what they had seen; all 30 testers successfully identified the screen presented to them. This metric was also measured by observing the testers during the in-person testing: the testers' ability to understand the interface layout and use that understanding to complete other tasks in the test showed how easy the platform is to learn.

4. Simplicity and ease of use: This was measured primarily with an after-test questionnaire, where testers were allowed to express themselves on the tasks they enjoyed doing, those they found easy, those they struggled with, and their interface preference when comparing the interface they had just experienced with that of their schools. Figures 13 and 14 show the results of the questions asked.

Figure 13. After-test questionnaire analysis chart (Researcher's analysis, 2022)

Figure 14. After-test questionnaire analysis chart (Researcher's analysis, 2022)

2.14 System Usability Scale (SUS) score

Figure 15. Final System Usability Scale score from Maze

After Maze's analysis of each test, the final System Usability Scale (SUS) score for the system, shown in Figure 15, is 87. This is much higher than the average SUS score, which, according to Thomas [22], is 68; a score under 68 probably indicates serious usability problems that should be addressed. With a SUS score of 87, the developed system shows that users enjoyed using it and could navigate a platform they were interacting with for the first time with little to no help.
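For reference, the standard SUS calculation over the ten questionnaire items (each rated 1-5) is sketched below. The ratings represent a hypothetical single respondent; Maze computes this score automatically, so this only illustrates the underlying arithmetic.

```python
def sus_score(responses):
    """Compute the System Usability Scale score from the 10 standard items.

    Odd-numbered items (positively worded) contribute (rating - 1);
    even-numbered items (negatively worded) contribute (5 - rating).
    The 0-40 sum of contributions is scaled by 2.5 onto 0-100."""
    assert len(responses) == 10
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 is item 1 (odd item)
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Hypothetical respondent who rates the system very favorably.
ratings = [5, 1, 5, 2, 4, 1, 5, 1, 5, 2]
print(sus_score(ratings))  # 92.5
```

Averaging such per-respondent scores across all testers yields the overall SUS; a mean of 87, as obtained in this study, sits well above the 68 benchmark.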

2.15 Key takeaway from usability testing

The testing process was carried out in person, which allowed the researchers to encourage testers to voice their thoughts as they went along with the test. During the test, testers spoke of the easy flow of the task instructions and how easy the platform was to use; the researchers' presence helped guide them through the test process without telling them what to do. Interestingly, testers were able to navigate through the test with little to no struggle on an interface they had never interacted with before; they found their way around, some quicker than others. Another key observation was that, despite each tester being informed that they were timed on each task, a significant number of users wanted to spend more time on the platform because they enjoyed the general flow, the colors, and the use of visual elements to guide them.

For each mission, the research team was able to test the efficiency and effectiveness of the interface. In terms of efficiency, users completed each task with a very low bounce rate, and most testers even discovered new paths to accomplish a task other than the one mapped out. The effectiveness of the interface faced some challenges due to lags caused by internet connectivity, but testers were still able to accomplish their tasks on time.

Finally, learnability, simplicity, and ease of use were measured using an after-test questionnaire and by observing and listening to testers voicing their thoughts during testing. Testers were able to pick up the new layout and learn how to navigate around it; they found the platform easy to use and simpler than their school's current UMIS interface.

3. Conclusion

This study was conducted to showcase the importance of User-Centered Design (UCD) and how it can be applied to improving the usability of UMIS across universities. The design thinking process was employed: user research was done using questionnaires to get students' feedback on their experience and pain points with their school's UMIS, and the information obtained was used to decipher the necessary features, and updates to existing features, that would make our model more usable. Using Figma, a UMIS for a made-up school called the University of XYZ was designed and prototyped, with usability principles and laws applied throughout the design. The newly created interfaces were subjected to user testing using a platform called Maze, followed by an after-test questionnaire; the test was carried out in person with 30 persons. The efficiency, effectiveness, ease of use, simplicity, and learnability of the proposed design were tested and measured. The design scored a SUS of 87, which is well above average, meaning that testers enjoyed using the platform and could interact and flow with the system easily, with little struggle, on a platform they were interacting with for the first time. From our findings, we conclude that to create platforms like a University Management Information System, which is made up of various systems put together, schools need to consider usability and sound user experience in developing these platforms. This study showcases the power of usability and how it should be considered before creating functionality.

4. Recommendations

Future research can investigate the design of accessibility features for users who are visually impaired or have some level of cognitive difficulty, such as dyslexia. Such features would cater for the user experience needs of these students and make accessing and using a UMIS easier and better.

In-person usability testing is also highly recommended because it allows researchers to observe testers closely and ask follow-up questions. The best way to gather data when trying to improve a product is to put it in front of the end users, ask them to use it, and observe their behavior; this method helps in understanding how real users respond to the product, specifically what they experience during usage, and in gathering insights that can improve the user experience. Finally, schools should consider providing their students with platforms that are not just functional but usable, making it easier for students to complete tasks with a small error margin.

References

[1] El-Bakry, H., Mastorakis, N. (2009). E-Learning and Management Information Systems for E-Universities. Proceedings of the 13th WSEAS International Conference on COMPUTERS, pp. 555-565.

[2] Alwan, N., Alzubi, M., Al Moaiad, Y. (2016). The impact of management information system in educational organizations processes. 2016 IEEE Conference on e-Learning, e-Management and e-Services (IC3e). http://dx.doi.org/10.1109/IC3e.2016.8009060

[3] Mirnig, A.G., Meschtscherjakov, A., Wurhofer, D., Meneweger, T., Tscheligi, M. (2015). A formal analysis of the ISO 9241-210 definition of user experience. Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, pp. 437-450. https://doi.org/10.1145/2702613.2732511

[4] Iso.org. (2019). Ergonomics of Human-System Interaction — Part 210: Human-Centred Design for Interactive Systems. https://www.iso.org/obp/ui/#iso:std:iso:9241:-210:ed-2:v1:en, accessed on September 29, 2021.

[5] Galitz, W.O. (2002). The Essential Guide to User Interface Design. John Wiley & Sons, Inc. (2nd ed., p. 2).

[6] Jitnupong, B., Jirachiefpattana, W. (2018). Information system user interface design in software services organization: A small-clan case study. MATEC Web of Conferences, 164(9): 01006. https://www.researchgate.net/publication/324698865_Information_System_User_Interface_Design_in_Software_Services_Organization_A_Small-Clan_Case_Study

[7] Garcia-Lopez, C., Mor, E., Tesconi, S. (2020). Human-centered design as an approach to create open educational resources. Sustainability, 12(18): 7397. http://dx.doi.org/10.3390/su12187397

[8] Liu, Y.D., Encelle, B., Sehaba, K. (2020). A user-centered approach to design a mobile application for chronic pain management. Modelling, Measurement and Control C, 81(1-4): 24-29. https://doi.org/10.18280/mmc_c.811-405

[9] Franzreb, D., Franzreb, P. (2016). Designing with Human Centered Usability Standards. https://www.uxbooth.com/articles/designing-usability-standards/, accessed on October 2, 2021.

[10] Al-Hunaiyyan, A., Alhajri, R., Alghannam, B., Al-Shaher, A. (2021). Student information system: Investigating user experience (UX). International Journal of Advanced Computer Science and Applications, 12(2). https://dx.doi.org/10.14569/IJACSA.2021.0120210

[11] Maslov, I., Nikou, S., Hansen, P. (2021). Exploring user experience of learning management system. The International Journal of Information and Learning Technology, 38(4): 344-363. http://dx.doi.org/10.1108/IJILT-03-2021-0046

[12] Topolewski, M., Lehtosaari, H., Krawczyk, P., Pallot, M., Maslov, I., Huotari, J. (2019). Validating a user eXperience model through a formative approach: An empirical study. IEEE International Conference on Engineering, Technology and Innovation (ICE/ITMC), pp. 1-7. https://doi.org/10.1109/ICE.2019.8792617

[13] Phongphaew, N., Jiamsanguanwong, A. (2018). Usability evaluation on learning management system. Advances in Intelligent Systems and Computing, 40-48. http://dx.doi.org/10.1007/978-3-319-60492-3_4

[14] Studiyanti, S., Saraswati, A. (2019). Usability evaluation and design of student information system prototype to increase student’s satisfaction (Case Study: X University). Industrial Engineering & Management Systems, 18(4): 676-684. https://doi.org/10.7232/iems.2019.18.4.676

[15] Macefield, R. (2009). How to specify the participant group size for usability studies: A practitioner’s guide. Journal of Usability Studies, 5(1): 34-35.

[16] Understand the Software Development Life Cycle Models. (2021). https://outsourcingvn.com/software-development-life-cycle-models/, accessed on March 14, 2022.

[17] Schrepp, M. (2019). User Experience Questionnaire Handbook. pp. 2-3. http://dx.doi.org/10.13140/RG.2.1.2815.0245

[18] Polit, D.F., Beck, C.T. (2014). Essentials of Nursing Research: Appraising Evidence for Nursing Practice. 8th Edition, Lippincott Williams & Wilkins, Philadelphia.

[19] Faller, P. (2019). What Are User Personas and Why Are They Important? https://xd.adobe.com/ideas/process/user-research/putting-personas-to-work-in-ux-design/, accessed on March 14, 2022.

[20] Hannah, J. (2021). What Is a Wireframe? The 2022 Wireframing Guide. https://careerfoundry.com/en/blog/ux-design/what-is-a-wireframe-guide/, accessed on March 14, 2022.

[21] Nielsen, J. (1994). Heuristic Evaluation. In J. Nielsen, & R. L. Mack (Eds.), Usability Inspection Methods, pp. 25-62. New York: John Wiley & Sons.

[22] Thomas, N. (2021). How to Use the System Usability Scale (Sus) to Evaluate the Usability of Your Website. https://usabilitygeek.com/how-to-use-the-system-usability-scale-sus-to-evaluate-the-usability-of-your-website/, accessed on March 20, 2022.