online learning tool
Client: IXL Learning
Teammates: Jeremy Murphy, Fang Chang, Mark Ritterhoff, Kate Matisson, Ian Malave
My role: web design | curriculum design | feature & microinteraction design
I was tasked with creating an online learning tool that addresses the following NGSS standard:
Construct an argument with evidence that in a particular habitat some organisms can survive well, some survive less well, and some can't survive at all.
ADAPT-1 is an online learning tool designed for Grades 3-5. It tests and augments students' understanding of animal adaptations. It creates questions from a near-unlimited pool of options, provides instant feedback, and adapts to student proficiency.
Students must learn new things. They must also have fun.
I used the above standard to identify learning goals for each student:
Then, I outlined the pathway of student experience. The easiest questions would focus on refining students’ observation skills. Harder questions would introduce more complicated anatomical concepts. The hardest questions would ask students to apply their observation skills and conceptual understanding to novel scenarios.
Questions should bridge the gap between familiar and strange. No student should see the same question twice.
I designed an algorithm that would use our database to generate questions. I also added logic that would provide specific feedback for each question that a student gets wrong. Then I created a variety of question types, ranked by difficulty, so that students experience an incremental challenge as they progress through the tool. I received input and feedback from content reviewers and software engineers at regular intervals throughout the process.
Here are some of the early question sketches that I developed. We eventually decided to proceed with the types numbered 1 and 3.
Multiple Choice: Out of a number of options, only one answer is correct.
Multiple Select: Out of a number of options, one or more may be correct.
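The generation idea described above can be sketched roughly as follows. This is a hypothetical illustration, not IXL's actual schema or code: the animal data, field names, and question wording are all invented for the example. It samples an animal from a small database, takes its adaptation as the correct answer, and draws distractors from the other animals.

```python
import random

# Illustrative database; the real pool was far larger and reviewed by
# content specialists.
ANIMALS = {
    "polar bear": {"habitat": "arctic", "adaptation": "thick fur"},
    "camel": {"habitat": "desert", "adaptation": "fat storage in humps"},
    "frog": {"habitat": "wetland", "adaptation": "permeable skin"},
    "cactus wren": {"habitat": "desert", "adaptation": "heat tolerance"},
}

def make_question(rng=random):
    """Build one multiple-choice question: one correct answer, two distractors."""
    animal = rng.choice(list(ANIMALS))
    correct = ANIMALS[animal]["adaptation"]
    distractors = [v["adaptation"] for k, v in ANIMALS.items() if k != animal]
    options = rng.sample(distractors, 2) + [correct]
    rng.shuffle(options)
    return {
        "prompt": f"Which adaptation helps the {animal} survive in its habitat?",
        "options": options,
        "answer": correct,
    }

q = make_question()
```

Because each question is assembled from combinations of database entries rather than stored whole, even a modest database yields a near-unlimited pool, which is what makes the "no student sees the same question twice" goal feasible.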
Technology is great as a tool, not as a master.
There are a number of things that teachers are excellent at doing. Good teachers can inspire passion among students, and they can serve as effective mentors when students experience uncertainty or stress. But there are a number of things that teachers are not good at, by virtue of the fact that they are human. Here are three examples:
I view these issues as opportunities where technology might be able to augment a teacher's effectiveness. This learning tool attempts to do exactly that.
Students may answer hundreds of questions a week. So, the interactions need to be as efficient and intuitive as possible. There is no place for gimmicks.
We used my designs to prototype different aspects of the student experience, such as questions, feedback, and the difficulty progression of the learning tool. Here is an example of a question. The labels demonstrate how my design decisions informed the prototyping process.
The biggest challenge was ensuring that the questions generated by my algorithms were of consistent quality. Fine-tuning the algorithms took twice as long as we expected.
We fine-tuned the algorithm and added content to the database based on the results from stakeholder and user testing. Given the massive scale of the database, we involved five different team members in the iteration and testing process. At the end, we were left with a tool we could all be proud of. ADAPT-1 launched successfully in December 2015.
ADAPT-1 is part of IXL Science, a product I designed as part of a 9-member team. IXL Science has over 100,000 users and generated over $1,000,000 in revenue in its first year.
When a student gets a question right, they progress to a harder question. A streak of 5 advances them to the next level.
When a student gets a question wrong, they receive immediate and specific feedback. This information is also passed on to their teacher.
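The progression rules above can be sketched as a small state machine. This is a simplified illustration under stated assumptions (the class name, the reset-on-wrong behavior, and the streak threshold of 5 drawn from the description above), not the production logic:

```python
class Progression:
    """Track a student's level: 5 correct answers in a row advances them."""

    STREAK_TO_ADVANCE = 5

    def __init__(self):
        self.level = 1
        self.streak = 0

    def record(self, correct: bool) -> None:
        if correct:
            self.streak += 1
            if self.streak == self.STREAK_TO_ADVANCE:
                self.level += 1   # harder questions from here on
                self.streak = 0
        else:
            # A wrong answer resets the streak; in the real tool this is
            # also where specific feedback is shown and reported to teachers.
            self.streak = 0

p = Progression()
for answer in [True, True, False, True, True, True, True, True]:
    p.record(answer)
# One completed streak of five after the miss, so the student is now on level 2.
```

Keeping the rule this simple matters for the "no gimmicks" principle: students answering hundreds of questions a week can internalize "five in a row moves me up" without any explanation.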
During my time at IXL, I designed over 30 such learning tools for elementary school students.
If I had more time...
...I would have tested our products with real users, i.e., students using IXL at home and in school. This is not something that IXL Learning did in a systematic manner, and there was little incentive to do it because our product was already making money. But I believe it's essential to do in order to remain competitive in the long run. Observing users in context provides valuable insight into a product's capabilities and potential evolution over time.