How do we at Macmillan Learning decide what to work on next in Achieve?

becky_anderson
Macmillan Employee

We talk to customers. Our sales staff talks to customers every day, as do our marketing and editorial teams, our trainers, our learning science team, our product teams, our support teams, and more. We get lots and lots of one-on-one feedback from customers that we track, and we can 'vote up' an issue that lots of people are hearing about or 'vote down' an issue that is less common.


We look at in-product data. With Achieve, we have much more data on the back end, which is awesome! We can see the volume of people (overall, not the individual list of people) who used a particular feature. Just as importantly, this means we can see when a certain feature is not being used. All of this data is considered directional, as it helps us figure out what's important to our users.


We survey users. We ask users who are in Achieve what they think of a particular page or feature, and we track those results. We have internal goals for how each page should score on the SEQ (Single Ease Question), and pages that fall below that goal are flagged for more analysis and potential refinement. In addition, we send a longer survey to both students and instructors at the end of the term so we can ask more questions and track results over time. This is another way of checking which parts of Achieve are working well and where we need improvement.



We track the issues that come into our support team. You as a customer (both students and instructors) can submit requests for new features or changes via our support form: https://macmillan.force.com/macmillanlearning/s/contactsupport . We also track the calls, emails, and chats to our support team to identify the topics that cause the most problems, and we look at what users are searching for in the Knowledge Base. These contacts to support usually point to areas where the product needs refinement, because customers aren't finding it intuitive to use on their own.


We do usability testing. We give students and instructors (not always users of our products, and not even always people who are familiar with online learning tools) a goal to accomplish in Achieve, like "Assign this homework" or "Check your grade," and then we ask them to complete that task with no additional instructions while vocalizing their thought process. This gives us great insight into things like, "Well, I assumed it would be here" or "Because you used this word," and tells us how customers are using the product.


We look outward. In addition to talking to users, we also talk to people who don't use our product to find out why not. We follow trends in tech education so we can see the big issues under discussion and how we can help. And we look at other products to see what great ideas are out in the world.


So is the formula as simple as A+B=C? No, but we do look at all sorts of data and take your feedback into account as we continue to improve Achieve. Please keep letting us know what you like and where we can improve.

About the Author
I've been working in publishing since 1997, doing everything from the front desk to marketing and sales, and a few things in between. And I love working with media and helping students succeed.