
Briefing

You are a junior marketing analyst (or any other non-software professional you've dreamed of being) who has recently been asked by your boss to learn how to code. "Your next assignment will involve considerable coding," your boss adds. You have never coded a day in your life and don't quite know where to start. You also have no other information about what you might be asked to code in the assignment. Having taken a MOOC before, you head over to rethinkED to see if there's a computer science course that suits your needs. You hope the class doesn't involve a lot of unnecessary work and gives you a good introduction to "coding". Finally, impressed with the activity on the website, you decide to contribute by reviewing 8.02X (the MOOC you recently enrolled in) as well.

Scenario Tasks

Task 1:
Search for relevant computer science courses. You find too many results and decide to refine your search to computer science courses offered only by MIT, Stanford, or Harvard. Find the three highest-rated courses (CS 101, 6.00X, and Intro to CS) and compare them to decide which course you like best.

Task 2:
Your friend told you about a course he's enrolled in and really likes, "Startup Engineering". You want to find out more about the course in case it's something you might find interesting. Search for "Startup Engineering" to learn more about it.

Task 3:
You are extremely happy with rethinkED and decide to contribute to the reviews. Review 8.02X and submit a comment describing your experience.

Observations and Changes

Each user observation below is followed by the corresponding change for the next iteration.

Users had some trouble understanding the difference between the "Upcoming" and "Announced" filters.

Learnability issue -- to fix this, we changed the options in the Date filter to "Ongoing", "Starting Soon", and "Just Announced". This made the distinction clear: "Upcoming" courses are those starting soon, while "Announced" courses are those starting later on, or "not soon".

Another user tried to find classes scheduled in May, two months from now. He wasn't sure whether those counted as "Starting Soon", and clicked on both "Announced" and "Starting Soon".

Our quick fix failed soon after, so we decided to rework "Starting Soon". We relabeled the "Dates" filter as "Start Date" and changed the options to "Ongoing", "This Month", "Next 3 Months", and "Just Announced". These options map directly onto the calendar and remove the ambiguity about what "soon" means.

The user encountered a problem while trying to close the filters menu: we had forgotten to include a close button, so he had to click "Apply" even though he didn't want to.

As a simple fix, we added a close button to the filter menu. A side note: at this point, our observers discussed changing the filter menu interface for round 2 (user feedback suggested that category is probably the most important filter). We also noticed an efficiency issue: users had to go through several unnecessary clicks to reach filter categories, clicks that could be eliminated.

A user had difficulty understanding what the "Initiative" filter did and clicked on it instead of "Institute" by mistake.

We changed "Initiative" to "MOOC Provider". "MOOC Provider" was in our initial design, but we were unsure which label was better, so we used "Initiative" during user testing to observe the users' response. It turns out they preferred "MOOC Provider".

Users 2 and 3 tried typing into the filter's results, because our design said "Search for" and then presented search results.

We removed "Search for" and replaced it with "Filter by".

Once the results were listed, the user had trouble realizing that he had to select several courses and then click "Compare Selected". He kept trying to click "Compare" next to the checkbox for each course, and said he was expecting a new page/interface for the comparison.

Differentiating a label from hyperlinked text will be much easier in the actual design. As an added safety measure, we decided to add a dialog box that informs the user of the necessary steps in case he makes a mistake.

On the "Compare" page, after reading the review of Intro to CS, the user decided against it and looked around for a button to remove that course from the comparison. He finally went back to the previous page and re-selected the two remaining options.

A simple fix that we had overlooked: placing a close button (again!) next to each review. Additionally, we realized that we had forgotten a "Reset Filter" button to begin with. This was also fixed.

One of the users searched for the keywords "Computer Science MIT Upcoming".

We still presented the current results, but observers noted that all courses need tags with as many synonyms as possible, including tags like "upcoming". Observers flagged this as a safety issue and noted: remember to include tags for start dates during implementation.

User 2 had some trouble navigating back to the main page after comparing the results.

We added a menu option that navigates to the home page.

User 1 had some trouble knowing where to click to find more information about the course.

Differentiating a label from hyperlinked text will be much easier in the actual design. We noted that there must be sufficient feedback and affordances to indicate clickability.

User 2 clicked on the "Write Review" option and proceeded to the interface with a search box for finding courses to review. The user had initially found 8.02x on the main page (under popular courses) and navigated to that course's review page to write a review.

We added a "Write Review" button to each individual course page; initially, this option was present only on the main page.

One user pointed out that he was unsure what "Score" meant -- how was it calculated?

Learnability issue -- we added a "How is the Score Calculated?" option on the main page.
