Unmoderated usability testing synthesis
While the project initially focused on improving the tutorials experience, it grew into a larger effort to overhaul many aspects of the overall documentation experience. Design assets have been delivered, and development is set to complete the initial phase, consolidating navigation between the marketing properties and the documentation (which currently resides in the IBM Cloud console experience), by the end of Q2 2020. A second phase will then implement the framework for the upcoming IBM Cloud Docs tutorials experience.
Offering Manager + Architect + Content Strategist: Jenifer Schlotfeldt
Developers: Nick Gong + Raks Padmanabha + Grace Lo
Researcher: Mallory Anderson
Designers: Mina Adame + Missy Yarbrough
Primary Skills Utilized
wireframes + sketching
Understanding Pain Points
Tutorials are a core user need for technical documentation. IBM Cloud Docs contains hundreds of tutorials, but they are buried within their individual subcollections. Additionally, other than "Getting started" content, there is no guidance or structure for tutorials. This in turn leaves inconsistent content across 200+ services.
By taking a look through our Usabilla feedback, our team observed several opportunities to improve the tutorials experience.
Our users want to navigate tutorials with ease.
"Given that this is an intuitive AI system... How come it is so hard to navigate through the tutorials with no interactivity whatever? It is deeply frustrating and not a good advertisement for what is clearly possible. I cannot even query in plain English how to find what I want! Oh the irony"
Received November 2017
Our users want to accomplish their goal.
"simply does not work fields are read only, app name will not save, cannot enter in anything regarding tokens, urls etc... I would say more but I have already waste enough time"
Received November 2018
Our users expect documentation information to be current and correct.
"I'm spending time to go through the sight follow every step as best as i can. Next thing i find out is that an important link is not working."
Received July 2019
After Mina Adame's extensive kick-off work on a competitive analysis and an in-depth product audit, I was brought in to continue validating our understanding of user needs. With our researcher Mallory Anderson's guidance, we created and launched an unmoderated survey with the intent of observing how participants verbalized and reacted to the tasks we proposed in our initial prototype.
Find a tutorial that uses Containers, Kubernetes, and Watson Tone Analyzer using dropdown filters.
Navigate to the tutorial "Creating Kubernetes clusters."
Skim through the tutorial contents. Assess if there is missing information needed to complete the creation of Kubernetes clusters successfully.
Advance to the next step.
Change the page's content from console instructions to CLI instructions.
Eight qualified users shared their time and thoughts, each with varying exposure to tutorials throughout their careers. The testing insights revealed the following user needs for a more fulfilling experience:
Visual progress indicator
Diagrams and examples
Identify by use cases
Search bar location
Single-page vs. multi-page view
Next step advancement
Iterating for MVP
Energized by so many possibilities for a tutorials experience, Mina and I brainstormed many ideas through UI explorations and user flows. Even though we initially ideated in the Carbon Design System V9 style, we also started exploring how to design in the upcoming Carbon Design System V10 expression.
Since tutorials are such a widely used content type across many experiences, the design concepts drew an immense amount of internal feedback. From our VP of Design to our IBM Design System team, Mina and I relentlessly refined the tutorials work to reflect the upcoming Carbon V10 style changes.
What did I learn?
While we were working on this effort, our squad was also responsible for integrating IBM Cloud Docs with the existing marketing experiences, both because users had difficulty tracking down information and to improve SEO performance. The combined experience is the most cross-collaborative effort in which I have been directly involved to date. Not only were we connecting with our greater marketing counterparts, but we also drove conversations with multiple product teams and their respective content contributors. It was really cool to see how closely the two efforts interwove so that we could deliver more cohesive experiences for users.
Another area of learning was gathering feedback. By printing the iterative screens in large format (3+ feet tall), we managed to pique a lot of drive-by interest from people on their way to get snacks near our desks. The large prints allowed people to be more precise with their feedback while snacking on noms.
What would I change?
Instead of a traditional table of contents section, I would like to further explore the relationship between a user's actual progress and the progress depicted in the tutorial's indicator. While initial iterations included the concept of checkboxes, they did not pan out as a true positive indicator of a user's progress.
We also had trouble identifying the proper syntax for using natural language to filter tutorial selections. After investing a lot of time, we wound up using a more traditional filter interaction. For future iterations, it would be helpful to collaborate with someone a bit more knowledgeable about combining content design and natural language.