During my time at Kollin.io, I conducted several kinds of research: structured and unstructured interviews, user tests, and evaluation tests. The process is difficult to showcase effectively, but it informed most, if not all, of the product's development.

Below is one of the main scripts compiled to instruct other team members on how to conduct interviews. It contains examples specific to Kollin and draws on several online and offline resources that reflect industry best practices.

Script

  1. Preparation

  2. Decide which part of your product or website you want to test

    1. Go to the course page
    2. Impressions of the course page
      • Graph
      • Recommended
      • Side/Navbar content
    3. Tasks – the user’s most common goals when interacting with Kollin
      • Go to one exercise of cluster X and category, and self-assess
      • Create a test
      • Look for an exam from 2019
    4. Impressions of the homepage
      • Message (copy)
      • Testimonials
    5. Impressions of the course index
  3. Set a standard for success

    1. Go to one exercise of cluster “X” and category, and self-assess
      • Sidebar
      • Category exercise click
      • Click on one self-assessment
    2. Create a test (“mock exam”)
      • Go to the page (tab)
      • Start filling the information
      • Create the test
    3. Look for an exam from 2019
      • Go to tab
      • Select or filter
  4. Have a plan and script.

    Moderators should follow the same script in each user session.

    At the beginning, talk about:

  5. Delegate roles.

    During your usability study, the moderator has to remain neutral, carefully guiding the participants through the tasks while strictly following the script.

    Note-taking during the study is just as important. If there’s no recorded data, you can’t extract any insights that will prove or disprove your hypothesis.

  6. Conduct the study.

    During the actual study, ask your participants to complete one task at a time, without your help or guidance. If a participant asks you how to do something, don’t say anything; you want to see how long it takes users to figure out your interface.

    Asking participants to “think out loud” is also an effective tactic — you’ll know what’s going through a user’s head when they interact with your product or website.

    After they complete each task, ask for their feedback, e.g.:

  7. Analyze your data.

    When you analyze your data, pay attention to both the quantitative measures (task completion, time on task) and the qualitative observations (think-aloud comments, points of confusion).
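
To make the quantitative side of that analysis concrete, the sketch below tallies completion rate and average time on task per task. The session records, task names, and numbers are hypothetical examples, not data from the Kollin studies.

```python
# Minimal sketch of the quantitative side of the analysis step.
# All records below are hypothetical examples, not real study data.

sessions = [
    # one entry per participant and task: did they finish, and how long did it take?
    {"participant": "P1", "task": "self-assess an exercise", "completed": True,  "seconds": 95},
    {"participant": "P2", "task": "self-assess an exercise", "completed": False, "seconds": 240},
    {"participant": "P1", "task": "create a mock exam",      "completed": True,  "seconds": 130},
    {"participant": "P2", "task": "create a mock exam",      "completed": True,  "seconds": 180},
]

for task in sorted({record["task"] for record in sessions}):
    results = [r for r in sessions if r["task"] == task]
    completion_rate = sum(r["completed"] for r in results) / len(results)
    avg_time = sum(r["seconds"] for r in results) / len(results)
    print(f"{task}: {completion_rate:.0%} completed, {avg_time:.0f}s on average")
```

The numbers only show where to look; the qualitative notes explain why participants struggled.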

Extra: Select your participants.

Screening and recruiting the right participants is the hardest part of usability testing. Most usability experts suggest testing only five participants per study, but those participants should also closely resemble your actual user base, which is hard to achieve with such a small sample.
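
The five-participant guideline is usually justified with Nielsen and Landauer's problem-discovery model: the share of usability problems found by n participants is roughly 1 − (1 − L)^n, where L is the probability that a single participant surfaces a given problem, commonly cited as about 31%. The sketch below runs that calculation; the value of L is an assumption drawn from that literature, not a number measured at Kollin.

```python
# Expected share of usability problems uncovered by n participants,
# following Nielsen & Landauer's problem-discovery model.
# L = 0.31 is the commonly cited average probability that a single
# participant surfaces a given problem; it varies by product and task.

def problems_found(n_participants: int, L: float = 0.31) -> float:
    return 1 - (1 - L) ** n_participants

for n in (1, 3, 5, 10):
    print(f"{n:>2} participants -> ~{problems_found(n):.0%} of problems found")
# With L = 0.31, five participants uncover roughly 84% of the problems,
# which is why small studies are usually run in several iterative rounds.
```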

To recruit the ideal participants for your study, create as detailed and specific a persona as you can, and incentivize participation with a gift card or another monetary reward.