Product development
A tailored digital solution aimed at optimising office utilisation in the hybrid workplace: an account of usability testing.
During the Covid-19 pandemic, working environments changed permanently. Recognising the need to adapt, Torpedo group set out to craft a bespoke digital solution to use their office more effectively. Viewteam emerged as a booking platform well suited to the demands of the hybrid workplace.
Throughout Viewteam's development, as the sole UX designer on the project, I facilitated numerous usability tests to enhance the user experience of this SaaS product. Below I outline my method, with an example of the extensive usability testing conducted for both desktop and mobile platforms.
THE CHALLENGE
The overall goal was to assess Viewteam's usability and identify areas for improvement. I concentrated on understanding how users navigate the system, book workspaces, and manage bookings on both desktop and mobile platforms.
I had two main objectives: identifying pain points and validating assumptions, all while ensuring the usability tests didn't disrupt development. I aimed to complete the evaluation within two weeks, focusing on tasks such as testing the filter feature and assessing how users interpreted the interface.
I also had to settle contested design questions, such as whether desktop users returning to the booking platform after a long absence needed a welcome page. Additionally, I needed to compare two UI options for the floor plan area, examining variations in button placement and the size of the zoomable area.
STRATEGY AND PLANNING
To ensure a thorough evaluation of the booking system's usability, I meticulously planned my approach for testing both desktop and mobile platforms.
I started by creating a task list, detailing the screens participants would interact with for each task. Then, I reordered the list to map out the user journey, ensuring a logical flow of interactions. This blueprint guided the development of the prototype and structured the test order effectively.
Given the time-intensive nature of test preparation, I enlisted the support of a fellow UX researcher. I tasked them with scripting the sessions, recruiting participants, and scheduling sessions, allowing me to focus on finalising the high-fidelity prototype in time for the tests. I provided clear guidance on the task sequence, ensuring alignment between the script and planned activities.
It was important that the prototype looked and behaved like the real product. Anticipating potential pitfalls and avoiding misleading button placements was essential to gauge participants' navigation behaviour accurately.
For participant selection, we opted to involve staff from Torpedo group, our target audience. We prioritised diversity, aiming for representation across departments, roles, and age groups. This diverse pool provided invaluable insights into the varied needs and expectations users might have when interacting with the workspace booking system.
Collaborating closely with the UX researcher, we conducted a test run of the script and prototype, refining them iteratively. Additionally, I conducted informal usability testing with other members of the UX team at Torpedo group, uncovering unforeseen pathways and refining the testing approach even further.
However, during these trial runs, I encountered an issue with the mobile version prototype repeatedly crashing on actual mobile devices. Despite exploring various solutions, I ultimately resorted to testing the mobile version on a desktop and utilising a mobile simulator to replicate real-world conditions as closely as possible.
THE SOLUTION
We recorded every usability testing session to ensure comprehensive data capture for later review. The UX researcher and I alternated the lead facilitator role, logging our observations in Google Sheets and benefiting from its timestamp feature for accurate documentation.
For desktop A/B testing, participants were divided into two groups. Each group was assigned to test one design option while providing feedback on the alternative. This approach enabled us to evaluate the effectiveness of two different design options simultaneously, streamlining the comparison and analysis process.
The same pool of participants then moved on to testing the mobile version, guided through the prototype with scenarios and tasks designed to prompt specific actions.
During sessions, I took every opportunity to delve deeper, asking follow-up questions and offering prompts when participants faced challenges. Open-ended questions let participants explore solutions independently, enriching the qualitative insights we gathered.
With two distinct desktop journeys to evaluate, we conducted a total of eight moderated usability testing sessions, with four participants assigned to each journey. Additionally, six sessions were dedicated to testing the mobile version, ensuring comprehensive coverage across both platforms.
The moderated format gave us the flexibility to engage with users in real time, enabling rich conversations and drawing out valuable insights through questioning.
THE OUTCOME
Desktop A/B testing revealed that most users confidently navigated the list view UI and comprehended room statuses. Eliminating the welcome page in favour of a more direct office selection streamlined the desk booking process, aligning with users' preference for swift reservations.
Users largely failed to notice the differences in the floor plan container, though all successfully accessed the meeting room view for booking. With no strong user preference either way, Option 2, favoured by stakeholders, was implemented. Clearer signposting for filtering and the removal of the 'Add new booking' feature were also enacted.
Some users found the office view overwhelming, prompting design adjustments to enhance scannability. A month view for the calendar was also implemented in response to user requests for better planning capabilities, and the issue of floor plan order preference was noted for future consideration.
The mobile usability testing yielded positive outcomes across the board, with assumptions largely validated and only minor adjustments needed. Participants preferred using a floor plan map when booking meeting rooms, particularly favouring filtering by space type to display only the relevant markers.
However, some found the placement of the 'view' filter unclear and recommended relocating it to the filter section. Despite this suggestion, stakeholders opted to retain the current setup, judging the adjustment minor relative to the convenience the existing placement offered.
Booking meeting rooms proved straightforward for most participants, although some expected a list view rather than a floor plan display after applying filters. When rescheduling bookings, the majority opted for the 'Make a Booking' section over 'My Bookings', likely influenced by where they were in the interface at the time and the visible booking confirmation. Editing booking times posed no difficulties.
Participants described the booking and editing process as clear and intuitive, mainly cancelling bookings through the 'My Bookings' section using the 'bin' icon. However, one user preferred 'All Bookings' for cancellations. Filtering by tags helped half of the participants locate a fire marshal, while the others required guidance.
The changes made after the usability tests were crucial in making Viewteam more user-friendly for people working in hybrid environments. Moreover, these tests provided valuable insights, helping me validate or challenge assumptions, reinforcing how diverse users' needs can be, and emphasising the significance of usability in ensuring product satisfaction.