
潘天睿

thumbnail_landscape_dashboard.jpg

Dashboard redesign for case packing machine

This course project features a full cycle of usability research culminating in a redesign of the HMI used by factory workers at a Dutch automation company. From framing the problem and user testing to concept development and on-site evaluation, the project follows a rigorous, user-centered design process.

Personally, I grew as a UX/UI designer: conducting and analyzing user research through usability testing, and working all the way from interaction concepts to a high-fidelity UI redesign. Through collaboration with teammates and constant communication with stakeholders and supervisors, I also strengthened my communication skills while delivering design work that is both academically rigorous and aligned with industry needs.

2024

Usability research, UI design

Group project

A demonstration of the final prototype.

Part 1

Usability research

Case-packing machine produced by BPA.

machine.jpeg
Project background

BluePrint Automation (BPA) designs and manufactures case packing machines. This project redesigns the dashboard of BPA's packing machines, which serves as a terminal presenting key performance indicators, errors, and records of machine failures.

The dashboard is embedded in the machine and is also accessible through a web browser, allowing managers to monitor real-time performance.

current_interface.png

Current dashboard interface.

Context mapping

We started by mapping the dashboard's context of use: user groups, stakeholders, symbiotic products and services, the environment, and users' goals. This helped us scope the project and identify the aspects to consider in research and design.

context.png
personas.gif
Building persona and scenarios

From the two main user groups (operators and service engineers), we created five personas to capture potential user needs: experienced workers, temporary migrant workers, inexperienced young workers, motivated and ambitious young workers, and mechanical engineers.

 

Based on personas 3 and 4, we created usage scenarios to better understand the current interaction with the machine.

storyboard_1.png
storyboard_2.png

Storyboards demonstrating user struggles.

Research question

To guide the later stages of the project, including user testing, data analysis, and design, we formulated a research question that defines our scope and goal, based on our understanding of the current user experience and interaction.

Research question
What factors influence users' navigability, understanding, and confidence in utilizing the dashboard interface, particularly for first-time users?
Usability test: current interface

We conducted a usability test in a simulated scenario to assess the usability of the current interface, evaluate the existing design, and identify design directions.

Test setup

- 9 participants

- 2 scenarios

- 6 tasks

- Observation, survey

Goal:

Examine first impressions
Examine usability for first-time users
Analyze user performance on tasks
Observe users' ability to extract information
Analyze strengths and weaknesses
Analyze whether the dashboard's design is inclusive
Summarize design opportunities

test_setup.png
test_journey_1.png
test_journey_2.png

Test journey mapping

Data analysis

Based on the collected information, we defined 2 core criteria to determine if a task is fulfilled:

1. Will the user know what to do at this step?

2. If the user does the right thing, will they know they are doing the right thing toward the goal?

To summarize the collected data, we consolidated the test journeys of all 9 participants to see how well they performed on each task.

feedback_form.png

Results from the SUS forms.
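For readers unfamiliar with SUS scoring, a minimal sketch of the standard scoring rule, applied to a hypothetical set of ten 1-5 responses (not our actual test data):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 ratings.

    Odd-numbered items (1st, 3rd, ...) are positively worded and
    contribute (rating - 1); even-numbered items are negatively worded
    and contribute (5 - rating). The 0-40 sum is scaled to 0-100.
    """
    assert len(responses) == 10
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Hypothetical participant who leans positive on odd items, negative on even ones:
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

Scores above roughly 68 are conventionally read as above-average usability, which is why raw 1-5 averages alone are hard to interpret.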

Test insights

Emotional impact & usability:

  • Participants felt confused, frustrated, and lacking in confidence due to complex technical language and unclear alarm messages and graphs.

  • They found navigation difficult, which effectively blocked access to critical pages.

Task completion:

  • Participants commonly could not fully finish tasks due to various challenges.

  • Even when they found the correct information, interpreting it remained hard.

Comparison between 2 scenarios:

  • In scenario 1, participants began by checking machine effectiveness, leading to generally negative emotions. In scenario 2, the same task came later, and increased familiarity with the dashboard produced more positive emotions.

  • Overall, however, participants in scenario 2 were less positive due to the repetitive nature of the tasks.

Summary:

  • Participants consistently felt confused and frustrated.

  • Poor usability significantly hindered task completion and information interpretation.

Problem mapping

From the user test data we extracted the most common usability issues and categorized them into 5 types.

Using an impact-frequency chart, we mapped these problems to decide what to prioritize in the redesign. The top-right corner, where problems occur frequently and have serious impact, marks the issues that deserve the most resources when redesigning.

problems.png
problems_mapping.png

Problem mapping in the dimensions of frequency and impact.
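The prioritization behind such a chart can be sketched as a simple score. A minimal example with hypothetical issue data (not our actual findings), ranking issues by frequency times impact so that the chart's top-right corner comes out first:

```python
# Hypothetical usability issues: frequency = how many of the 9 participants
# hit the issue, impact = severity (1 = minor annoyance, 3 = blocks the task).
issues = [
    {"issue": "technical jargon in alarm text", "frequency": 8, "impact": 3},
    {"issue": "hidden navigation to KPI page",  "frequency": 6, "impact": 3},
    {"issue": "unclear graph axes",             "frequency": 5, "impact": 2},
    {"issue": "small touch targets",            "frequency": 3, "impact": 1},
]

# A frequency-by-impact product orders the issues for redesign priority.
ranked = sorted(issues, key=lambda i: i["frequency"] * i["impact"], reverse=True)
for item in ranked:
    print(f'{item["frequency"] * item["impact"]:>2}  {item["issue"]}')
```

In practice we placed the issues visually rather than computing a score, but the ordering logic is the same.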

Problem statement

Based on our evaluation of the dashboard's usability, we defined the core problems and proposed answers to them. By identifying which problem is the essential, core one, we aimed to turn our direction into a concrete design goal and design criteria to guide the design.

Redesign the interface with a clear overall visual language that gives users accessible information about the machine's working status. On the emotional level, we want to support users' confidence in performing tasks, leading to higher motivation and a positive feedback loop.

problem_statement_1.png
problem_statement_2.png
Design goal &
interaction vision

From our problem statement, we developed a design goal. This goal will help us explore design solutions and can later be used to examine if our final design meets our initial goal.

To illustrate the desired user experience of operators with the redesigned interface, we use a metaphor that contains qualities that we want to include in our redesign.

Based on our design goal and interaction vision, we believe that our redesign should be intuitive, efficient, empowering and accessible. 

Design goal
Our design goal is for operators to feel confident in managing the machines, empowering them to swiftly identify and solve errors. Through an intuitive dashboard interface, we aim to provide operators with easy access to relevant data for monitoring.
Interaction vision
The interaction with the redesigned dashboard interface should feel like playing games on an iPad as a child.
interaction_vision.png
Part 2

Design ideation
brainstorming.png
clusters_2.png
Brainstorming &
concept generation

Focusing on the primary solution space defined in the problem mapping, we organized a brainstorming session to cover all relevant aspects and generate design concepts.

We refined and clustered our initial ideas using the "How To" method, grounded in our problem statement. This ideation process yielded many interesting design directions, including gamification, competition, a reward system, interface personalization, and providing guidance and positive feedback.

concepts.png
concept_sketch.jpg

We converged fragments of our brainstorming insights into several solid directions based on the roles or personalities of the dashboard. From these starting points we formed 4 design strategies and framed them in a pyramid diagram based on how radical each concept was. The pyramid let us compare conservative and 'out of the box' solutions and keep an overall view of our concepts; when evaluating ideas in later stages, this scale helped us weigh the priority of each solution.

Based on our 4 main directions, we ideated on the layout and details of the dashboard interface, and sketched them out.

Concept evaluation

We used an adapted Likert scale to evaluate and compare the design concepts. Participants were interviewed and asked to rate individual design ideas on a scale of 1-5, with each element rated against one or more of our design criteria: confidence, empowerment, efficient navigation, understanding, and inclusiveness.
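Aggregating such ratings is straightforward. A minimal sketch with hypothetical concepts, criteria, and scores (not our actual data), averaging each concept's ratings per criterion:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical (concept, criterion, 1-5 score) tuples from interviews.
ratings = [
    ("alarm pop-up", "confidence", 4), ("alarm pop-up", "confidence", 5),
    ("alarm pop-up", "understanding", 3),
    ("3D error view", "confidence", 5), ("3D error view", "understanding", 4),
]

# Group scores by (concept, criterion) so each pair can be averaged.
by_key = defaultdict(list)
for concept, criterion, score in ratings:
    by_key[(concept, criterion)].append(score)

for (concept, criterion), scores in sorted(by_key.items()):
    print(f"{concept} / {criterion}: {mean(scores):.1f}")
```

Averages per criterion make it easy to see which concept wins on, say, confidence versus understanding, rather than comparing raw interview notes.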

concept_evaluation_A.gif
concept_evaluation_B.gif
Design decisions

Based on the results of the evaluation, we narrowed down the design choices for the dashboard's elements.

design_decision_1.png

1. Menu

  • The menu should be visible at all times (accessibility)

  • A combination of words and icons (intuitiveness)

  • The menu should not take up much space on the screen

design_decision_2.png

2. Alarm message

  • Should be a pop-up that appears once the machine detects an alarm

  • Simplified 3D representation for showing the location

  • Access to alarm log to check past performance

  • Suggested solutions (confidence)

  • Videos easily accessed (accessibility)

  • Button to ask for help

design_decision_3.png

3. KPI visualizations

  • Present OEE as a horizontal equation (intuitiveness and understandability)

  • Simplified shape, preferably a full circle

  • Color coding the percentages (intuitiveness)

  • A clear visual difference between products and cases

design_decision_4.png

4. KPI findability

  • Visuals/graphs for every KPI, once clicked (understandability)

  • Time selection bar, so information can be retrieved from a time frame

  • Search bar (findability)

  • A clear visual difference between products and cases

  • A less cluttered screen by organizing KPIs and making more information optional

design_decision_5.png

5. Motivational elements

  • Very minimal information to reduce cognitive load

  • With the option to get more information or a description

  • Daily rewards and building something over multiple days helps engagement and proactiveness

Part 3

Deliver & evaluation
Final prototype

After defining the design proposal on different dashboard elements, we developed a high-fidelity prototype. We focused on incorporating elements that had proven effective in our earlier tests and aligning them with a specific style.

Design decision: Colors

The color selection is based on research aimed at accommodating almost all types of color blindness. Only people who see exclusively in grey-scale are still unable to effectively distinguish the colors used; for this group, we included abbreviations in the alarm messages.

Alarms are indicated and displayed in consistent colors, so operators learn the color coding and can quickly recognize each alarm.

design_decisions_colors.png
prototype_1.png

Design decision: Menu bar

In the initial dashboard, navigating to other pages was unclear, making information hard to find. To address this, we implemented a recognizable menu bar at the top of the dashboard that remains visible at all times. Additionally, we used icons to enhance understanding, especially for visually oriented users.

prototype_2.png

Design decision: Visuals

As an addition to the KPI list, we created visuals aiming to make the technical terms more understandable. For instance, the relation between Availability, Performance, and Quality, and how together they form the OEE, is visualized as an equation. The KPIs related to case production are connected in one visual, creating a single coherent image.
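For reference, the OEE equation shown on the dashboard follows the standard industry definition, OEE = Availability × Performance × Quality. A worked example with made-up shift figures:

```python
def oee(availability, performance, quality):
    """OEE = Availability x Performance x Quality, each as a fraction 0-1."""
    return availability * performance * quality

# Hypothetical shift: the machine ran 90% of planned time, at 95% of
# ideal speed, and packed 98% of cases correctly.
print(f"{oee(0.90, 0.95, 0.98):.1%}")  # 83.8%
```

Presenting the three factors side by side as an equation lets operators see at a glance which factor is dragging the overall figure down.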

Design decision: Time setting

We decided to make the time feature present on all pages of the dashboard. This gives every page the same structure, creating consistency, and changing the time setting is relevant everywhere because all data depends on it.

Design decision: Error location

By displaying a 3D model that can be turned 360 degrees, we give the operators a clear and visual idea of where to look in the machine to find the alarm.

prototype_3.png
prototype_4.png

Design decision: Daily goal

From our previous research we observed a lack of motivation among operators. To raise motivation, we created a daily goal that showcases positive achievements.

Design evaluation

In order to evaluate our redesigned prototype and identify recommendations on future design, we formulated a research question with three sub-questions concerning the usability of the interface, emotions it brings to users, and opinions on the main strategy we took in our redesign.

We conducted a user test to obtain the information needed to answer these questions.

research_question_final.png
Test setup

- 10 participants

- 7 tasks

- Observation, survey

Goal:

Examine first impressions
Examine usability for first time users
Analyze user performance and user experience
Observe if users are able to extract information from the dashboard
Analyze strengths and weaknesses
Analyze if the dashboard’s design is inclusive
Evaluate interaction qualities
Determine further recommendations

final_test_setup.png
task_analysis.gif
feedback_form_final_1.png
feedback_form_final_2.png
Test results

Many of our design decisions, such as the alarm pop-up page, daily goals, and KPI visuals, proved helpful for participants retrieving information and fostered positive emotions.

However, there are several aspects that still need improving.

  • The way alarm information is presented still fails to give users the right information. Despite performing the tasks correctly, many participants felt unsure whether they had done the right thing. The SUS forms also showed that some of them did not understand the alarm message.

  • Misinterpretation of information is also an issue. Participants still found some information on the dashboard, such as the most frequent alarms, hard to understand.

  • The intended use flow of noticing, solving, confirming, and recapping an alarm remained unclear to some participants after the test. Some did not remember what to fill in after tackling the alarm, and some overlooked the field to fill in when logging out.

  • The team collaboration elements in our design received divided feedback from participants. How they work can still be confusing to quite a few of them. What to present about team progress, and how to present it clearly, still needs discussion.

  • Some elements of the UI design and page layout still need development. Participants rarely found the clickable content on the KPI and alarm pages, and the placement of the time selection and graphs turned out to be confusing.

Report 1/3
Report 2/3
Report 3/3