
A Cloud Guru’s Hands-on Labs

Background

In early 2021, Amazon announced it would include graded labs in its Cloud Certification exams, a requirement for most cloud jobs. To prepare, many students use A Cloud Guru. ACG's labs platform lets learners test their cloud skills in a risk-free lab environment, but it didn't have a graded component providing feedback.


Opportunity

One of ACG’s main OKRs in 2021 was to increase standalone lab engagement as well as foster continuous learning. With AWS adding graded labs to their certification exams in just a few months, we saw an opportunity to develop a lab experience that would better prepare students for their certifications while also helping to foster continuous learning by giving students more direct feedback when building on existing skills.


My Role

  • Facilitated sync and async discovery with product, content and marketing stakeholders

  • Conducted user and stakeholder interviews

  • Prototyped concepts and ran usability tests

  • Worked with engineers to deliver experience into production


Goals

  1. Understand gaps in our current labs and deliver an experience that can provide the best possible exam preparation for students.

  2. Improve usability of the lab experience while utilizing as many existing components as possible to deliver a practice-exam-ready lab before certifications in a few months.

  3. Develop a way to provide students with feedback on what they’re doing in the cloud.

Plan to track & measure

  • % increase in unique standalone lab usage (standalone = labs outside of a course)

  • % increase in overall standalone lab engagement


 

Understanding the current experience

Primary Tasks
To familiarize myself with the current experience, I started by identifying the primary tasks a user performs while running through a lab.

UX Evaluation
Next, I conducted a UX evaluation of each page across the experience to understand which components were reusable, what technical limitations we were under, and which areas we needed to improve.

 

Identifying student & business needs

With a solid understanding of the current experience, I worked with my PM to conduct a series of user and internal stakeholder interviews. We also reached out to AWS exam beta testers to get their perspective on what would be most helpful in preparing for the new exam.

From these interviews we determined our student and business needs, then held a cross-functional session to prioritize what we believed we could deliver for an MVP.

Determining Scope

After getting the team’s consensus on user value and level of effort for design & engineering, we clarified what was and wasn’t in scope for our fast-approaching MVP.

 

Curveball: Finding the right mental model for new labs

Leadership had become invested in splitting labs three ways. As we worked through the solution and shared it internally, I began to notice a lot of confusion around having multiple types of labs. Internally we were using names like “Graded,” “Hands-on,” and “Challenge” labs, but no one was clear on the difference, especially because all of them were technically “hands-on.”

To mitigate further confusion I facilitated an async cross-functional brainstorm session to get everyone aligned on how we were thinking of delivering these new labs and where we should position them.

Clarifying the Mental Model of “Hands-on” Labs

From that brainstorm I noticed that everyone had very distinct mental models, and it became clear many stakeholders didn’t see a need for a third lab type. I made visual diagrams to illustrate their mental models, then went back to the stakeholders to dot-vote on which model made the most sense to them.

Challenge & Guided Modes

The session produced a clear winner: a single “Hands-on Labs” mental model with two modes. In “Guided” mode, learners can check their work as they go; in “Challenge” mode, learners take the lab without help and receive a score at the end, like the AWS exam.



Usability Test Goals

After determining our MVP scope and clarifying the mental model, I explored several designs and built a prototype to conduct a usability test.

 

Key Learnings from Usability Testing

 

Hands-on Labs Designs

The Lab Overview

This is the updated lab overview page. I designed the mode switcher as a dropdown, in order to give users some extra context for the mode they’re selecting, as well as to give us some future scalability for additional modes we may add to labs.
We were also able to add the lab history section. This section gives users a sense of how they’ve done with this lab in the past and which version of the lab they’ve taken.

 

Hands-on labs in Challenge mode

This is the lab in Challenge mode. Notice the introduction and prerequisite sections. Through user testing, we found the best placement of elements on the page to give users the right starting point without a video introduction.

 

Challenge Mode Results

Below is the results page for learners who complete a lab in Challenge mode. A huge takeaway from our user testing was giving learners the option to show and hide answers they may have missed. We also worked with learners and the content team to find the best language for questions answered incorrectly; note the use of “Missed” vs. “Failed.”

 

Hands-on Labs in Guided Mode

The grader built for Challenge mode created an incredible opportunity to improve Guided mode by allowing users to check their progress as they worked through a lab with video instruction.
Now learners not only had video instruction to follow along with, but they could also check their answers as they went. This feature helped learners see where they went wrong and how to improve, and it ultimately increased the lab completion rate from 79% to 92%.

 

Outcomes

  • Challenge mode helped spark a 33% increase in unique lab starts

  • Monthly active engagement with standalone labs went from 20% to 43% by the end of H1

  • After adding the ability for learners to check their answers as they went, we saw a 17% bump in users completing labs in Guided mode


 

Next Steps

  • After successfully launching Challenge mode, we were also able to build and integrate our labs with practice exams, delivering an experience that gave learners extra practice preparing for their AWS certification exams.

  • Looking forward, we plan to improve the discovery of labs. To continue observing engagement, I have begun watching FullStory session recordings and conducting stakeholder interviews to learn how we might improve the labs discovery experience and help users find the content they’re looking to learn.