Commenting Experience:
Smartsheet Mobile
Investigating and improving the commenting experience on the Smartsheet Mobile app for remote collaborators
Timeline
10 Weeks
Deliverables
Final Presentation
Context
UW MHCI+D Usability Studies Class
Team
Songyi Han
Gayathri Killingam
Tavin Olarnsakul
Overview
Over 10 weeks, my team and I conducted a usability test with five participants to evaluate the commenting experience on the existing Smartsheet Mobile app. Our study objective was to gain a better understanding of the current commenting experience and to identify any potential usability issues that may be hindering collaboration. Our study resulted in a presentation of our findings and recommendations to relevant Smartsheet Mobile stakeholders.
My Role
UX Researcher
Conducted heuristic evaluation
Designed usability study
Conducted 2 remote usability tests
Transformed findings into recommendations
Communicated relevant information and updates with Smartsheet PM
Managed communication with participants
Background
Smartsheet is used to assign tasks, track project progress, manage calendars, share documents, and manage other work through a tabular user interface. The iOS and Android mobile apps give users convenient access to their sheets on the go, with the goal of seamless collaboration with colleagues. In Smartsheet Mobile, comments are known as conversations.
Business Need
Users are entering the conversations section of the app but exiting without leaving a comment, so an investigation into the current collaboration experience was needed.
Research Questions
After being challenged to understand how people view the current collaboration experience and identify usability issues and areas for improvement in the current experience, my team and I crafted these research questions:
What is the current Smartsheet Mobile collaborative experience like?
What are the enablers/barriers to collaboration?
How are people getting into conversations?
Do people understand the distinction between row, sheet, and all comments?
Study Overview
1. Heuristic Evaluation
Action: Each team member used Smartsheet Mobile to leave and respond to comments on a sheet, and evaluated the experience using Smartsheet's heuristics and severity scales.
Goal: Familiarize ourselves with the current experience and identify critical areas for testing.
Results: We experienced difficulty:
Understanding how to find the cell a comment was left on
Understanding the meaning of icons
Parsing information that lacked hierarchy
Smartsheet's Heuristics
Clarity
Positive Emotional Engagement
Information Hierarchy
Minimal Cognitive Overload
Simplicity & Efficiency
System Feedback
Universality & Consistency
Error Prevention & Handling
Smartsheet's Severity Scale
Cosmetic Problem: Doesn't need to be fixed unless there's extra time on the project
Minor Problem: Less important to fix
Major Problem: Very important to fix and should be given high priority
Catastrophic Problem: Imperative to fix now
Finding from heuristic evaluation: Unintuitive navigation structure to find the context of a comment.
After a user opens a comment for the first time, the app requires them to tap the back arrow to reach the cell the comment was left on, even though they have never visited that page before.
2. Usability Testing
RQ: What is Smartsheet Mobile's current collaborative experience like and what challenges do people face?
To answer this question we conducted 5 remote usability tests with both pre-existing and new Smartsheet users in order to gain a better understanding of the current commenting experience and identify any potential usability issues that may be hindering collaboration.
Participants (N = 5):
Existing Smartsheet Users (n = 2): people who use the desktop version of Smartsheet more than mobile; recruited via Smartsheet’s internal database of users
Users that are New to Smartsheet (n = 3): people who have never used Smartsheet before but are familiar with the workflow and other remote collaborative tools; recruited via my team’s personal network
Why a screener wasn't necessary
We did not use a screener in this study because we had access to demographic and mobile app usage data from Smartsheet’s mobile team, and were able to verify that our new-to-Smartsheet participants had never used Smartsheet mobile prior to the study.
Testing Plan
During the usability test, participants were asked to complete tasks using Smartsheet Mobile while sharing their screen with us via Zoom and thinking aloud about their impressions of the app. For one session, we shared our screen instead and had the participant direct our taps, because they were unfamiliar and uncomfortable with the app. During each session, participants:
Answered generative research questions about their background and current experience with Smartsheet and other collaborative tools
Were asked to imagine they worked for a non-profit charity, were communicating with their team using Smartsheet Mobile, and needed to perform 6 tasks
Answered questions about their overall experience with the tasks and how it compared to other collaborative tools
Completed a SUS questionnaire
Usability Metrics + Methods
We used the following metrics/methods in order to collect data during each usability test. Each method is described in detail in the appendix of our final report.
SEQ
Task achievement
SUS Questionnaire
Behavioral tracking
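The SUS questionnaire listed above is scored with the standard formula: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to give a 0–100 score. A minimal sketch of that calculation; the example responses are illustrative, not data from this study:

```python
def sus_score(responses):
    """Standard SUS scoring for ten 1-5 Likert responses.

    Odd items (positively worded) contribute (r - 1); even items
    (negatively worded) contribute (5 - r). The sum is scaled by 2.5
    to yield a 0-100 score.
    """
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Illustrative responses only, not collected in this study
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # prints 85.0
```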
Data synthesis & Analysis
To synthesize our qualitative data, we used an affinity map that was organized by each of the 6 tasks in the testing sessions to gather relevant comments and usability issues together. We counted the number of stoppers and slowers for each task to help us identify where users experienced the most difficulty. We also ranked the severity of each finding using Smartsheet’s severity scale. Overall, we found that users experienced the most difficulty during the following tasks:
opening the correct comment
locating the row in the sheet that the comment referenced
writing a comment at the parent level (visible to everyone)
finding a specific comment in a given row
To analyze our quantitative data we created charts and found that, in accordance with our qualitative findings, users experienced the most challenges during tasks 3, 4, and 5.
What Worked Well
Participants completed 96.67% of all tasks successfully:
73.33% of tasks were completed with ease
23.33% of tasks were completed with minor struggles
In particular, 100% of users were able to reply to a comment in the conversation thread and leave a new comment on a given row without any issues (Tasks 2 and 6).
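The percentages above follow from 5 participants each attempting 6 tasks. A small sketch of that arithmetic; the per-category counts (22 and 7) are inferred from the reported percentages rather than taken from raw study data:

```python
# Assumed counts, back-calculated from the reported percentages
participants, tasks = 5, 6
attempts = participants * tasks         # 30 task attempts in total
with_ease = 22                          # 22/30 ~ 73.33%
minor_struggles = 7                     # 7/30 ~ 23.33%

completed = with_ease + minor_struggles
print(round(100 * completed / attempts, 2))  # prints 96.67
```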
Areas for Improvement
1/ People repeatedly tapped on objects that looked like buttons but weren't.
Users tried tapping the "Row 10" label element, expecting it to take them to that row
Users did not like having to manually navigate the sheet to find the context of a comment and expressed their frustration
Recommendation:
Make the "Row 10" label a tappable element that takes users from a comment to the row it was left on
2/ People were unsure how the mentioning and notification system works.
Without a way to tag everyone, users weren't confident that all collaborators would be notified or would see the comment
Users weren't sure whether they should keep tagging the people from the initial thread, which interrupted their workflow
Recommendations:
When first-time users click on the mention icon, have a popup that informs them about how tagging works in the context of the thread
Make it easy for people to tag others, based on recent activity or the context of the thread
Add the @all/@everyone feature to the commenting experience so that users can be confident that everyone is notified
3/ People had trouble finding the information they needed to reference when the sheet was very large.
Remote collaboration often involves finding and responding to information a coworker has mentioned, which is hard to do in the mobile app
Recommendation:
Explore ways to help users effortlessly find and retrieve the information they need to respond to comments (e.g., an improved mobile view, contextual search, AI suggestions)
4/ New users were confused by row comments because they were expecting to see all of the comments in the sheet.
Participants understood terms like "row," "sheet," and "all comments" differently, which affected how they approached tasks and interpreted information on each UI screen.
Recommendation:
Rather than using an empty-state message that defines what a conversation is, consider using UX copy to articulate the purpose of this section more effectively.
5/ New users did not understand that the two comment icons opened different comments.
Participants couldn't tell which icon led to which type of comment. A few thought the drawer showed comments for the entire sheet rather than the highlighted cell, causing confusion.
Recommendation:
Add a red notification badge to the All tab when new comments exist, so users' attention is drawn to it when they tap a row and see no comments.
Consider reassessing the interaction and aligning it more closely with users' expectations.
6/ People had difficulty finding the comments they were looking for in the conversation page.
To find a specific comment without a search button, users tried several different methods, and one participant gave up in frustration
New users expected to be able to search from the conversation page, rather than navigating back to the main sheet to search
Recommendation:
Make the comment threads visually more distinct from one another
Consider allowing users to filter/search for comments in the conversation
Explore ways to help users narrow down results when the item they are searching for matches many entries
Outcome
We presented our research findings and recommendations to various Smartsheet stakeholders including designers and product managers.