While we attained our tentative goal for registered players, overall participation in the game was low. However, those who did participate, even a little, reported a good experience. Answers to questions about learning and retention indicate that when students played the game, even briefly, they learned things and came away feeling more comfortable and knowledgeable about the library. In other words, the game does what it was designed to do; the challenge is getting more people to play.
The final tally of registered players is 397, meaning 397 people downloaded the app and logged in at least once. I haven’t removed all the non-students from this tally, because additional players have signed up since the last time I weeded, but I’m confident from spot-checking that the number of non-student players is small. Of the registered players, 173 earned points, meaning they completed at least one quest. The maximum possible score was 625 points, attained by one person, although a few others came close. We published almost 30 quests over the course of the game. Our most-completed quest was “Home Sweet Homepage,” a simple quest designed to introduce students to the library homepage; 109 people completed it. In general, online-only quests were taken and completed more often than quests involving physical space: of the top five most-completed quests, four were online-only. There are a number of possible explanations for this, including the observation offered by one survey respondent that many players may have been stationed downtown and didn’t want to travel to Allendale to do the physical quests.
In addition to data collected from the game itself, I put out two surveys over the course of the game. The first was a mid-game survey that asked questions about quest design (what quests students liked or didn’t like, and why). Responses to this survey were contradictory: some students would cite a quest as their favorite, while others cited the exact same quest as their least favorite, often for the same reasons. The qualitative evaluation we did provides some possible explanation for this (see below). The second survey was a simple post-game evaluation that asked whether students had enjoyed the game, whether they’d learned something, and whether this was something we should continue doing. 90% of the respondents to this survey indicated that they had learned something about the library, that they thought this was a good idea, and that it was something we should do again.
Things respondents said they learned, according to the survey:
- How to request a document
- How to find books, catalogs, maps, etc.
- How to use the online databases
- What call numbers actually mean
- How to use the online catalog
- Names of furniture
- Where things were, as well as the functions of some of the spaces
At the conclusion of the game, we offered players points and free coffee to come in to the library and spend 15-20 minutes talking to us about their experience playing the game. We had about 12 people show up for these interviews. I kept questions short and general to fit the limited time, but we still got some very interesting feedback.
The general tone of the feedback was very positive. Students seemed intrigued by the idea and appreciated that the library was trying to teach in nontraditional, self-directed ways. When asked to sum up their overall impressions of the game, students said things like “Very well done, but could be improved upon” or “good but needs polish,” or my personal favorite: “an effective use of bribery to learn about the library.”
When we asked people whether the game had changed how they thought about the library, they typically answered that it wasn’t so much that the game had changed how they thought about the library as that it had changed how they thought about themselves in relation to it. They used words like “aware,” “confident,” and “knowledgeable.” They felt like they knew more about what they could do here and what we could do for them. Their retention of some of the quest content was remarkable, including library-specific lingo and knowledge of specific procedures (like how to use the retrieval system and how document delivery worked).
Players noted a variety of problems with the game. Some were technical in nature. The game app takes a long time to load, likely because of the way the back end is designed. Some players didn’t like the Facebook login. Stability on Android devices was problematic (this is no surprise). Other problems were nontechnical, including quest content that didn’t work or took too long (my own inexperience designing quests is to blame), communication issues (players had no way to let me know when quest content didn’t work), and the flow and pacing of new quests (players wanted more content, faster).
People had a variety of reasons for playing. While most cited the iPad grand prize as the major motivator, several said they wanted to learn about the library or were curious about the game, and that they thought it might be fun. This may explain the contradictory reactions to quest content that so puzzled me in the mid-game survey. People who just wanted to have fun were irked by quests with an overt educational goal. Students who just wanted the iPad didn’t want to do lengthy or complex quests. Students who loved games for their own sake wanted very hard quests that challenged them. This diversity of motivations is something all game developers face, and it makes designing a popular game that appeals to a wide variety of people a challenge.
Marketing is something almost everyone noted as a problem. Students felt that not enough people knew about the game. This is partly a function of where and how we market and partly a function of the game itself not incentivizing sharing. As noted above, people wanted different things out of the game; crafting a game that speaks to multiple kinds of motivation, and communicating what the game is about to people with different desires, is going to be hard. We may need to aim the experience at a specific kind of user, say incoming freshmen or nontraditional students, and then gear the content and the message to those students. Because the game is entirely elective, creating that experience and message will be crucial to getting people to play. This may also require rethinking how we define success: 400 players isn’t much measured against the general population, but if our goal is a smaller subset, then numbers like that are much more impressive.
Finally, we asked players for suggestions for improvement. Most centered on fixing technical problems and introducing new features:
- Speed up load times
- Provide an alternative to Facebook login
- Provide a way to communicate about faulty quests
- More quests and shorter times between quest releases
- More and more varied marketing
- Provide an overall game leaderboard in addition to personalized leaderboards
- Label game decals with the quests they relate to
- Use in conjunction with classroom instruction
- Offer multiple reward structures
- Have the game itself incentivize recruitment
Where to go from here:
I think the data I have supports the contention that games can be used to teach. Even students who played only a short time learned things from the game. The challenge is creating engaging games that appeal to a large number of students while remaining economical in terms of staff time. Next year I’d like to try running the game as a shorter, more intense experience targeted at a specific audience, possibly incoming freshmen.