Elevating the Educational Experience


Elevate is a recent addition to the genre of brain fitness games. Its developer markets it as a cognitive training tool designed to build communication and analytical skills, providing members with a personalized, game-based training program that adjusts over time based on performance. It features a free-to-play mode with limited features and a subscription that unlocks additional exercises and modes for $4.99/month. Since launching in May 2014, Elevate has been a commercial success, with more than 5 million downloads on the App Store and Google Play, and was selected by Apple as ‘App of the Year’ for 2014.

After installation, Elevate asks the user for preferences in training goals (e.g., ‘Articulate your thoughts more clearly’), and more recently has added a formative assessment that tests key areas to gauge the user’s initial skill level. With its emphasis on communication and analytical skills, the application tracks users’ speaking, writing, reading, listening, and math abilities. Each of these categories has three to four exercises that users can complete to gain proficiency points for that ability. The difficulty of these exercises increases with performance up to a maximum level. Elevate unlocks three semi-random exercises every day, depending on personal training goals, and tracks the user’s performance history. The exercises are all unique in design and execution, each with its own graphics, sounds, and user interaction.

In this article, we will delve into the inner workings of Elevate, because it exemplifies the modern learning experience: design elements combined with a data-driven approach to learning. We will consider its engaging user experience and explicit tracking of proficiency levels, comparing them to their counterparts in public education. While Elevate is targeted at a different audience with its own goals, it’s possible to extract tangible elements from its structure and apply them to the learning experience in traditional schools.

The 'Diversifying Design' section below draws on case studies of the five Elevate subjects, each with a featured game evaluated for content, engagement, and validity.


Diversifying Design


For Learning

On average, students’ learning experiences are extremely structured, often at the expense of engagement. Especially in secondary schools, classroom layout follows a pattern, there is little variation in teaching style throughout the year, and students spend the entire day sitting at desks, taking notes, listening to the instructor, participating in some discussion, or filling out assessments. The most common complaint from students not engaged in learning is that school is ‘boring’. Most adults assume that this comment is directed at the content, but perhaps it’s equally important to consider the environment and how the content is presented.
One of the most remarkable attributes of Elevate, and one that distinguishes it from traditional learning, is the number of different ways players can interact with the games. Even in the confines of the application, you can tap the screen, swipe left or right, choose between responses, drag facts to appropriate markers, build a tower, and more. Let’s consider the Mathematics Case Study. The featured game teaches players how to organize units in different systems of measurement by having them build a tower of length or weight units in order. In school, a student learns the ratios between units and converts them on a test, perhaps with a few examples. Especially as we progress to middle and high school, concepts and ideas are taught, practiced, and assessed only in abstract terms. This game design lets students visualize explicit and relevant numbers. To succeed, you must still internalize the proper ratios, but the application of this knowledge is more meaningful to a range of learners, allowing abstract, visual, and tactile learners alike to absorb the information.

Singapore math is one traditional approach to mathematics education that uses some of the principles outlined above to teach standardized curricula. The method teaches each concept in three steps: first, students engage in hands-on learning with concrete objects; then they draw pictorial representations of the concept; and finally they solve problems using numbers and symbols. Variations of this methodology are still employed in elementary school, but the concrete and pictorial exercises don’t mature well with students and are therefore discarded in secondary education. Digital tools offer a clean solution to this problem: well-designed applications or games can supplement in-class instruction to provide diverse and interactive learning opportunities, not just in mathematics but in other subjects as well.


For Testing

Assessments present an interesting design issue, because the process itself is so detached from learning. Standardized tests use Scantron paper forms, which allow machines to quickly translate responses into scores. Widely used to save time and resources, these forms limit tests to questions with discrete answers, where the student must choose among fixed options. Many secondary schools have followed this trend, partly for the benefits, but also because it helps prepare students for the standardized tests to come. Even schools and teachers that don’t use Scantron directly employ the same elements in their tests, like multiple-choice, true/false, and matching questions. This simplification comes at the expense of student learning by introducing discrepancies between the goals of the knowledge or skill and how it’s being tested.

Alternatively, let’s consider Elevate’s approach, where assessment and learning are intertwined and game design reflects the development goals. The games offer instant feedback and corrections to mistakes, prioritize accuracy and speed differently, and reward streaks and specialization over comprehensive knowledge gain. 

For example, the Syntax game in the Writing Case Study helps you identify grammatical errors. It displays a highlighted word or phrase within a sentence, and you must choose whether its usage is correct. If the usage is incorrect, regardless of whether you were able to identify it, the game displays the proper replacement in its place. This adds a dimension to the otherwise simple true-false structure: the player gains more information than just whether he was right or wrong. Both the Conversion game in the Math Case Study and the Precision game in the Speaking Case Study have similar features, providing immediate feedback or additional knowledge to the player after his response.

The game designs also reflect the values of the goals. The Syntax game features a boat trying to reach the harbor before the sun sets, personifying progress and time respectively. The boat moves forward with every correct answer but back with every mistake, and the game is scored on the time left. Therefore, a player who is quick but makes an error may earn more points than one who is careful. It’s an assessment designed to prioritize the speed and efficiency of the skill. Conversely, the ‘Processing’ game from the Reading Case Study uses its design to control reading speed but values accuracy by testing comprehension—you lose the game if you make two mistakes in the session. These are but a few ways that design in assessments can help cater to specific goals and skill objectives.
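The speed-versus-accuracy trade-off can be made concrete with a small sketch. The mechanics here (answer times, a fixed time penalty per mistake) are hypothetical, not Elevate's actual formula; the point is that when the score is the time remaining, a fast player who makes one mistake can still outscore a careful, error-free player.

```python
def boat_score(answers, seconds_per_answer, time_limit=60.0, mistake_penalty=5.0):
    """Score a session of a speed-prioritized game.

    Hypothetical model of a 'boat race' assessment: each answer costs
    time, each mistake costs extra time, and the score is the time
    left on the clock when all answers are in.
    """
    time_spent = len(answers) * seconds_per_answer
    time_spent += sum(1 for correct in answers if not correct) * mistake_penalty
    return max(0.0, time_limit - time_spent)

# A quick player with one error out of ten answers...
quick = boat_score([True] * 9 + [False], seconds_per_answer=3.0)  # 60 - 30 - 5 = 25.0
# ...outscores a careful but slow player with no errors.
careful = boat_score([True] * 10, seconds_per_answer=4.5)         # 60 - 45 = 15.0
```

Under these invented constants, the penalty for a mistake (5 seconds) is cheaper than the time saved by answering quickly, which is exactly the value judgment the game's design encodes.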

Over the last few years, teachers have begun to use an assortment of formative analysis and digital tools to gauge student learning and provide feedback. One widespread example of this innovation is the clicker, which lets teachers gauge the proficiency of students and determine whether more time is required to cover particular concepts. But a majority of these tools are either diagnostic in nature, directed towards teachers, or derivatives of simplistic evaluation metrics. A combination of good design and digital tools can transform assessments into powerful and engaging learning tools that empower students.


For Engaging

The final argument for diversifying design in education is engagement. Our culture has evolved considerably within the past few decades, especially with regard to technology and how we interact and communicate with the world. The current generation of students has grown up with unfettered access to the expansive world of the internet. They have always known the accessibility of smartphones, computers, and tablets. Their videogames and movies feature incredible graphics and unprecedented interactivity. This consumer culture has escalated expectations for engagement that our current system of education has so far been unable to match. It’s only natural that students feel uninterested reading from textbooks and copying notes from a chalkboard.
Much of Elevate’s novelty comes from its visual design and interactive games. Consider Focus from the Listening Case Study, where users listen to a conversation between two people on a theme with multiple subtopics. These subjects are visually represented by three different circles on the screen. As facts about these subjects surface in the conversation, they are added to the screen as hollow rings. Players must drag each ring to its respective topic circle, after which the game visually and audibly rewards them with a flash and sound effect. These fact rings then rotate around the circle until the topic set is completed, and the user is left with a list of facts on a topic. While the purpose of the game is actually to improve concentration and memory, it can just as easily be seen as a more modern approach to note-taking, where audio and visual cues reward adding information.

The process of increasing engagement through digital tools is met with resistance from some educators, who offer some variation of the reasoning, ‘I learned things this way, why can’t they?’ The problem with this line of thinking is that the goal of education is to prepare students for the future. Teaching students with tools outside their comfort zone not only takes more resources, but also isolates their learning from their interests and peer interactions. If education is to cater to the needs of children, educators need to adapt and teach in a language that’s familiar to students.





Measuring Proficiency

Students in the current public system maintain an unhealthy relationship with the numbers used to evaluate them and their peers. Assignments, quizzes, participation, reports, tests, projects, and exams all produce a number that is supposed to measure students and hold them accountable for their learning. In this section, we will examine the philosophy behind how Elevate measures and uses players’ proficiency levels and compare it to the approach in schools.

Compared to the escalating linear approach in schools, Elevate employs a more gradual, cyclic approach. Each subject category is divided into core competencies rather than units. Players begin at a low difficulty in each game, and their performance determines both the subsequent changes in difficulty and proficiency levels. This creates a system that assesses students, reveals the explicit metrics used in the evaluation, tracks trends in student performance, and refines itself for a personalized model of achievement that values constant development.


Transparency in Numbers

Elevate shares with its players all the explicit numbers behind their performance. In the same way that the games’ design highlights certain objectives, the scoring system explicitly states the different ways students can improve.

At the end of each session, the game displays the base score for completing the game as well as additional bonuses for speed, accuracy, and difficulty. This has several implications. Notably, the game explicitly recognizes the value of participation by having a base score: it assigns value to the time a player puts into practicing a skill, and the learning that takes place because of it. Next, the speed and accuracy bonuses give players points for finishing faster and making fewer mistakes, respectively, than what is required to complete the session. Finally, the difficulty bonus varies with level; it is negligible at lower levels but substantial (20%) at higher levels.
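The structure just described—a base score for participation plus bonuses for speed, accuracy, and difficulty—can be sketched as a simple composition. The specific weights below are invented for illustration; only the overall shape (and the roughly 20% difficulty bonus at the top level) comes from the description above.

```python
def session_score(base=1000, seconds_left=0, mistakes_allowed=0,
                  mistakes_made=0, level=1, max_level=10):
    """Compose a session score from a participation base plus three bonuses.

    The base score rewards simply completing the session; the bonus
    weights are illustrative assumptions, not Elevate's actual values.
    """
    speed_bonus = 10 * seconds_left                             # finishing early
    accuracy_bonus = 50 * max(0, mistakes_allowed - mistakes_made)
    # Difficulty bonus scales with level: ~0% of base at level 1,
    # ~20% of base at the maximum level.
    difficulty_bonus = base * 0.20 * (level - 1) / (max_level - 1)
    return base + speed_bonus + accuracy_bonus + difficulty_bonus

session_score(seconds_left=12, mistakes_allowed=3, mistakes_made=1, level=10)
# 1000 + 120 + 100 + 200 = 1420.0
```

Even in this toy version, a player who merely completes the session still walks away with the base 1000 points, which is the design's way of valuing practice itself.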

The numbers also help target specific skills for each subject. For example, the Proportion game asks players to scroll through and match various fractions to their respective decimal and pictorial representations. As players improve, they become more familiar with the ratios and can complete the exercise faster. Conversely, Retention asks the player to listen to a list of things and answer questions about them from memory. The speed at which the player responds counts for little compared to his accuracy, because the exercise is intuitively about developing memory, not response speed.

When you consider assessments in schools, students can’t easily discuss their proficiencies, because they are never explicitly stated. In the course of reviewing homework, or receiving tests and quizzes, both students and educators are made aware of how many mistakes were made, but not necessarily what they specifically were or whether they are relevant. For students to improve, there needs to be motivation, but also clarity regarding the direction. For example, if a test is organized by question structure (multiple choice, matching) instead of subject category (fractions, graphs), it’s harder for students to recognize where they performed well and where they faltered. A student who can see that he missed four graph-based problems is better informed than one who sees he missed two true-false and two matching problems. There is a host of similar initiatives that schools and teachers can pursue by finding synergy between transparent assessment structure and course design.
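The difference between structural and categorical feedback amounts to regrouping the same results under a different key. The test data below is invented for illustration: the same four mistakes look directionless when tallied by question format, but point straight at a weak area when tallied by topic.

```python
from collections import Counter

def missed_by(questions, key):
    """Tally missed questions under a chosen grouping key."""
    return Counter(q[key] for q in questions if not q["correct"])

# Hypothetical test results: one test, two ways to report it.
results = [
    {"format": "true-false",      "topic": "fractions", "correct": True},
    {"format": "true-false",      "topic": "graphs",    "correct": False},
    {"format": "true-false",      "topic": "graphs",    "correct": False},
    {"format": "matching",        "topic": "graphs",    "correct": False},
    {"format": "matching",        "topic": "graphs",    "correct": False},
    {"format": "multiple-choice", "topic": "fractions", "correct": True},
]

missed_by(results, "format")  # Counter({'true-false': 2, 'matching': 2})
missed_by(results, "topic")   # Counter({'graphs': 4})
```

Grouping by format tells the student nothing actionable; grouping by topic says "practice graphs."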


Tracking Scores and Recognizing Trends

Elevate’s loop structure of Assess > Reveal > Refine > Assess… has another enormous advantage: it allows players to track their scores and recognize trends, a sort of meta-learning.

At the end of each session, Elevate displays a graph of your recent scores in the game and what your high score for the game is. The graph above shows a player slowly developing in two subjects, Reading (Connotation and Visualization) and Math (Proportion and Tipping). If we assume that the design of progression is consistent, several important points emerge. First, despite Proportion and Tipping both being subcategories of Math, both the starting skill and its development differ. For example, as the difficulty in Proportion increases, the player goes through phases of equilibration, adjusting to the new standards, which results in performance dips. Meanwhile, Tipping shows slower but more consistent growth.
This information is valuable to players because it allows them to gauge not only their current status, but also how far they have progressed in a given period of time. Next, as players achieve the highest level of mastery in a game, the scores normalize for difficulty, but players can still improve in other metrics such as speed and accuracy, encouraging them to continue playing to maintain their skill. Elevate’s goals are much more grounded in application and place less weight on the consistency of objective metrics; however, the model presented is extremely relevant when applied to schools.
Students often don’t have an accurate gauge of their skill level in subjects, and their perception depends on their test and exam scores, owing to those scores’ weight in final grades. Similarly, parents have to rely on teacher conferences to glean their child’s growth, and even these conversations turn out to be subjective, depending on the parents’ expectations. Finally, while teachers have a general idea of each student’s proficiency, their understanding is based on in-class interactions and discrete scores.

Valerie Shute and Matthew Ventura offer the following metaphor. Retail outlets in the past had to close down once or twice a year to take inventory of their stock. But with the advent of automated check-out and bar codes, these businesses have access to a continuous stream of information that can be used to monitor inventory and flow of items. Not only can a business continue without interruption: the information obtained is also far richer than before, enabling stores to monitor trends and aggregate data into various kinds of summaries as well as to support real-time inventory management.

Similarly, Elevate’s model applied to schools can help both students and parents recognize development in various categories throughout the year. Students will have a solid basis for introspection regarding their strengths and deficiencies. For example, suppose a student completes daily homework assignments in a digital or web-based application. Over the course of two weeks, the application points out that while the student has kept up with expected math skills, his Algebra scores are lower than average. As the student starts to make a conscious effort in that category, the Algebra scores begin to improve.

Teachers benefit enormously as well. The past decade has seen an increase in initiatives to measure teacher effectiveness in the classroom. However, one major criticism of this system that teachers cite is that these evaluations don’t account for the starting level of the students. If a student is too far behind at the onset of the class, teachers point out that it’s impossible to get him to the appropriate standard. Tracking performance data helps alleviate a lot of these issues, by demonstrating student levels at the beginning, as well as their development throughout the year. Additionally, teachers can compare growth rates and proficiencies among clusters of students, helping them tailor content and pace in the class.

This process of data tracking is not simple; it requires both a proper infrastructure and a class designed around its usage. Teachers need professional development to be able to interpret and discern relevance from this information, and students need to be able to use the tools effectively. But the potential benefits after the capital investment are recurring, both for the teachers and the students.

Adjusting Pace and Tempo

In order to develop the player skills in its games, Elevate tries to achieve a difficulty range between comfortable and challenging. It does this through an iterative process responding to player performances, and uses the same methodology to adjust player proficiency. 

At the end of each session, you receive a score composed of a base score along with bonuses for difficulty level, accuracy, and speed. The game evaluates your score and adjusts your difficulty and proficiency level accordingly. The graph above shows normalized scores compared to the subsequent difficulty increases. Additionally, the player’s performance in each game also affects their proficiency in the subject. For example, the graph above compares the scores from the game ‘Retention’ to the resulting increases in the ‘Listening’ subject proficiency. In this scenario, the player receives 6-8 proficiency points per session because his current level in Listening is quite high. If the player had a higher or lower proficiency level, the points per session would fall or rise respectively—a novice-level player would gain more for the same performance than someone at an advanced or expert level.
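The inverse relationship between current rating and points earned can be sketched with a simple decay. Only the 5000-point ceiling comes from Elevate's published scale; the linear formula and constants below are assumptions made for illustration.

```python
def proficiency_gain(session_performance, current_proficiency,
                     max_proficiency=5000, base_gain=20):
    """Scale proficiency points inversely with the player's current rating.

    session_performance is a 0.0-1.0 quality measure for the session.
    The linear decay toward the ceiling is an illustrative assumption,
    not Elevate's actual formula.
    """
    headroom = 1.0 - current_proficiency / max_proficiency
    return round(base_gain * session_performance * headroom)

proficiency_gain(0.9, 1000)   # novice: 20 * 0.9 * 0.8 -> 14 points
proficiency_gain(0.9, 4000)   # near-expert: 20 * 0.9 * 0.2 -> 4 points
```

The same 0.9 performance is worth far less at a high rating, which is what makes the scale mastery-like: advanced players must keep performing just to hold their position.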

This concept of proficiency is a mastery-based approach to learning rather than a score-based one. Players have a proficiency rating for each subject: 0-1250 for novice, 1250-2500 for intermediate, 2500-3750 for advanced, and 3750-5000 for expert. The formative assessment at the beginning of the game places you at a proficiency level. You then have to do several things, the first of which is to play different Listening games, because mastering and repeating one game leads to marginal returns in proficiency points. Next, you have to maintain or increase your level of performance in games; otherwise you will begin to lose points. Elevate has quantified breadth and depth of skill or knowledge into a subject proficiency that motivates novice players to improve and expert players to maintain their performance. More importantly, this happens at an individually appropriate pace, set by difficulty levels that iterate in response to your sessions.
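The four bands quoted above map cleanly onto a small lookup. Only the band boundaries come from the text; the function (including the choice to assign boundary values like 1250 to the lower band) is a sketch.

```python
def proficiency_tier(points):
    """Map a 0-5000 proficiency rating to its named band."""
    if not 0 <= points <= 5000:
        raise ValueError("proficiency must be between 0 and 5000")
    for ceiling, tier in [(1250, "novice"), (2500, "intermediate"),
                          (3750, "advanced"), (5000, "expert")]:
        if points <= ceiling:
            return tier

proficiency_tier(800)    # 'novice'
proficiency_tier(2600)   # 'advanced'
proficiency_tier(5000)   # 'expert'
```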

The application of this flexibility already exists in the classroom in the form of differentiated instruction, where teachers respond to the needs of their classroom to cater to specific groups of students or results of recent assessments. Most classrooms contain a diverse group of students with regards to skill level, learning abilities, and interests. Teachers can differentiate content or process to best suit the needs of their class. For example, if the recent assessment shows that the class is behind the curve, the teacher can reiterate concepts or adjust her pace. If current events are of interest to students, the teacher can leverage that to teach concepts, or if the students in the class are technologically inclined, the teacher can use more digital tools to teach. 

The biggest issue with differentiation is that no technique, direction, or process is completely inclusive. Slowing the pace of the classroom inhibits the growth of students who have the potential to learn at greater depth through a rigorous, challenging pace. The same applies to adjusting process and focus. Digital media, particularly video games, have refined the ability to engage players and develop relevant skills at a pace that challenges them. If we can extract these elements and apply them to our assessment philosophy, it will help relieve teachers of the stress of having to differentiate.



Integrating technology in the classroom allows schools to establish some equity in terms of learning and opportunities. The problem is that the technologies and digital tools used in schools are primitive in comparison to those used in every other aspect of current culture. Elevate shows how the proper integration of design can create an engaging and personalized learning experience that also highlights focus and objectives. The same elements, applied properly to standardized curricula in mathematics, literacy, the sciences, and so on, have the potential to create a more engaging and relevant environment for students.

The theory of Connected Learning proposes that the most meaningful and sustained learning comes when students are motivated by understanding how a particular piece of knowledge or skill applies to their own lives. However, the current system prioritizes accountability through scores and assessments that determine students’ level of academic achievement. The Elevate Case Study demonstrates that there is a compromise between the two: making the assessments themselves engaging and relevant enough that they produce learning. In practice, this entails realigning the philosophy of assessments from stigmatizing mistakes to rewarding development, and redesigning their structure to diversify goals and offer feedback.

Formative assessments, especially short-cycle ones, can help bring about this philosophical shift. Elevate shows how a data-driven approach using a series of short game sessions can provide meaningful information about learners and personalize their learning. This information can help students focus their efforts and connect parents to the full details of their children’s learning. When this assessment approach is coupled with professional development, teachers can leverage data to compare and track group performance and employ more efficient differentiation in the classroom.

Our current system of public education is a protracted process of university entrance (and later job placement), and as a result, the deeply rooted culture of comparative assessment is not likely to change dramatically in the next decade. However, we can shift the surrounding infrastructure so that these assessments are meaningful to students, teachers, and parents in terms of development, rather than reductionist measures of judgment.