Data Archives - 91 (/blog/category/data/)

The Impact of Test Practice: A Data Study
/blog/impact-test-practice-data-study/ (Thu, 12 Oct 2023)

The Educational Records Bureau (ERB) provides products and services to assess and enhance the educational experience of students in grades 1-12 with the goal of increasing every student’s potential. ERB is the publisher of the Independent School Entrance Exam (ISEE), an admissions assessment taken by students applying to independent schools.

91 offers high-quality test practice for admissions exams with the goal of providing students with a positive educational tool that helps them perform their best and unlock opportunities for their future. 91 is the only ERB-endorsed practice for the ISEE.

In 2020, ERB and 91 partnered with the goal of leveling the playing field for all ISEE test takers. The partnership provides every student with access to high-quality test practice, which reduces anxiety and increases confidence through familiarization and learning.

Access to practice supports the shared mission of helping students fully demonstrate their knowledge, and thus unlock educational opportunities.

Furthermore, in an effort to create equitable opportunities, students with an ISEE fee waiver receive free access to all of the 91 ISEE practice materials, including full-length practice tests with immediate scoring, time management feedback, a personalized prep plan, targeted practice exercises, skill-building videos, and more.

In order to assess the impact the partnership has on test day performance, particularly for fee waiver students, a data study was conducted. The study included 31,926 students with 40,073 ISEE tests and 21 million data points.

Students in all grade levels were included. There were concentrations of students applying to grade 6 and grade 9 because those are the most common entry points.

The scores used in the study were normally distributed.

Density curves for each section (Quantitative Reasoning, Verbal Reasoning, Mathematics Achievement, and Reading Comprehension) show normally distributed scores.

However, the average scores for students who used a fee waiver were lower than the average scores for students who did not use a fee waiver.

Density curves for each section (Quantitative Reasoning, Verbal Reasoning, Mathematics Achievement, and Reading Comprehension) show lower average scores for students who used a fee waiver than for students who did not.

To measure the impact practice had on student performance, students were placed into three categories based on the amount of time they spent practicing: None (0 – ¾ hour), Familiarity (¾ – 2 ½ hours), and Learning (2 ½ hours or more).
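The three buckets can be sketched as a small helper (a hypothetical function for illustration; the study does not publish its categorization code, and how the exact ¾-hour and 2½-hour boundary values are assigned is an assumption):

```python
def practice_category(hours: float) -> str:
    """Bucket practice time (in hours) into the study's three categories.

    The 0.75 h and 2.5 h thresholds come from the category definitions in
    the text; which category the boundary values fall into is assumed here.
    """
    if hours < 0.75:
        return "None"         # 0 - 3/4 hour
    if hours < 2.5:
        return "Familiarity"  # 3/4 - 2 1/2 hours
    return "Learning"         # 2 1/2 hours or more
```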

The results were clear: the more a student practices, the better they perform.

Graphs for each section (Quantitative Reasoning, Verbal Reasoning, Mathematics Achievement, and Reading Comprehension) show that the more a student practices, the better they perform.

The impact of practice for students with a fee waiver was the same. The more students practice, the better they are able to demonstrate their knowledge.

Graphs for each section (Quantitative Reasoning, Verbal Reasoning, Mathematics Achievement, and Reading Comprehension) show the same pattern for students with a fee waiver: the more a student practices, the better they perform.

This analysis indicates that dedicated study for the ISEE on 91 is associated with an average increase of 13 percentile points. What’s more, simply gaining familiarity with the ISEE is associated with a score increase of 5.5 percentile points on average.

ERB has also shared its own take on this analysis.

Practicing for the ISEE positively impacts students’ experience on test day, particularly students who receive a fee waiver. Practice allows students to fully demonstrate their knowledge, which unlocks educational opportunities.

If you’d like to learn more about practice for the ISEE, including the fee waiver practice program, please contact us.

Are you an educator?

Contact us to learn about our educator partnerships.

Are you a student?

Start practicing for the ISEE!

Student Growth Percentiles – FAQ
/blog/sgp-faq/ (Wed, 10 Feb 2021)


What are student growth percentiles?

A student growth percentile (SGP) describes a student’s growth compared to other students with the same initial test score. SGP is similar to a standard percentile, but instead of measuring student achievement compared to peers, SGP measures comparative growth. The student growth percentile allows us to compare the growth of students at different achievement levels.

Like all percentiles, SGP is a number between 1 and 99. Receiving an SGP score of 70 indicates that you demonstrated more growth than 70 percent of your academic peers. A student with a low raw score can show high growth, and a student with a high raw score can demonstrate low growth. Similarly, two students with very different test scores can have the same SGP.

SGP is commonly used at the school, district, and state level as the primary growth metric or a key performance indicator. Traditionally, SGP is used to calculate growth year-over-year on a given exam. However, here at 91, we take advantage of our large pool of historical data to calculate SGP test-to-test on much smaller time scales.

How are student growth percentiles calculated?

Student growth percentiles are measured by calculating the raw score change between test sections of the same subject (e.g. Reading) for a given student. This difference is compared to the distribution of score changes for past students with initial section scores similar to that of the student of interest. By observing where the student’s score change falls on the distribution, we can estimate the student growth percentile based on the expected growth for the student’s previous test scores and the number of practice exams the student has completed.
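In simplified form, the calculation amounts to ranking a student's score change within the distribution of peer score changes. The sketch below uses hypothetical helper names and omits detail the text mentions (the production calculation also conditions on the number of practice exams completed):

```python
def student_growth_percentile(score_change: float,
                              peer_changes: list[float]) -> int:
    """Estimate an SGP: the share of academic peers (students with a similar
    initial section score) whose raw-score change fell below this student's.

    Simplified sketch -- the real calculation also accounts for the number
    of practice exams the student has completed.
    """
    below = sum(1 for change in peer_changes if change < score_change)
    percentile = round(100 * below / len(peer_changes))
    return min(max(percentile, 1), 99)  # SGP is reported between 1 and 99
```

For example, a score change that exceeds 70 of 100 peer changes yields an SGP of 70.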

To whom are students being compared? What is an academic peer?

When calculating the SGP, students are compared to their peers with a similar achievement level on the previous exam. Peers include all 91 students who completed that particular test section in the past and have practiced a similar amount. Raw section scores and the number of tests a student has taken are the only factors 91 takes into account when comparing peers.

Can high-scoring students still demonstrate growth?

Yes. Students with high raw scores on previous test sections will be compared to all other students who also achieved high scores in the past. Even high-performing students have varied performance; for this reason, students with high test scores may notice that simply maintaining their high achievement is indicative of growth, given the relativity of the SGP metric.

Which students receive growth percentiles?

All students (excepting those with trial accounts) will receive student growth percentiles. However, due to the nature of SGPs, students must complete at least two of the same test section in order to receive an SGP score.

What can student growth percentiles tell us?

Student growth percentiles are a descriptive metric, like test percentiles, and describe the amount of growth a student has made since the previous test. SGPs can help answer questions like the following:

For individual students:

      • Is the student growing at an appropriate rate?
      • Is the student demonstrating more growth in some test sections than in others?

For classes or groups of students:

      • Are there students with low growth who may need additional support?

How will the student growth percentile data be used?

SGPs are used alongside other student performance metrics to identify student needs and to further personalize our platform to meet them. Students can access their SGP data from the Analysis tab of their accounts by scrolling down to the “Your Growth” graph, which can be toggled to display different metrics, including SGP.

What is Average SGP and how should I interpret it?

Average Student Growth Percentile, or Average SGP, is the cumulative mean of SGP scores achieved by a student at a given time. For example, if a student completed their fourth practice exam and received a score of 40 for their third SGP, this SGP will be averaged with the student’s previous SGP scores of 50 (test 1) and 60 (test 2) and will result in an Average SGP of 50 [(40+50+60)/3]. Prior to this fourth test, the student’s Average SGP would have been 55 [(50+60)/2]. An Average SGP of 50 reflects that the student has attained the expected average growth over time. Any Average SGP above 50 indicates above-average growth from test 1 to the current period. Conversely, any Average SGP below 50 indicates below average growth over this same time period. Individual SGP scores measure student growth from test to test compared to peers with similar test scores. Average SGP is used to observe a student’s average growth over time.

Published on: November 11, 2019
Updated on: February 10, 2021

2019 Year in Review
/blog/2019-year-in-review/ (Thu, 23 Jan 2020)


2019 was a major year of growth for 91. To date, we have helped 135,000+ students, and those students have answered more than 20 MILLION questions on our platform. To support this growth, our team nearly doubled in size, so we can continue to enhance student learning.

Having trouble viewing the infographic? Click here to view the PDF version.

As you can see, 2019 was a year of student perseverance and pushing boundaries. 28,786 students from over 100 different countries answered over 8.5 MILLION practice questions. Impressively, they spent 84,638 hours taking practice tests, studying vocabulary, watching instructional videos, and working through practice exercises.

For the first time ever, 91 started to measure Student Growth Percentile (SGP). SGP quantifies an individual’s growth compared to peers with a similar starting point. To learn more about SGP, see this post.

2020 has just begun, but our team is already enthusiastically working on new features, new content, and new ways to learn. Stay tuned to see what 2020 will bring!

Data Fueled Practice Exercises – SSAT
/blog/data-fueled-practice-exercises-ssat/ (Wed, 18 Dec 2019)


A Note from 91’s CEO, Edan Shahar

To date, students have answered over 20 million questions on the 91 platform. Now, we are diving into this data to understand how students learn and to help them achieve their academic and test-taking goals.

After exploring how our students improve, we investigated further to learn how our various tools help students. We know our students improve by 36% after six practice tests, but we wanted to know which tools help them improve the most and how we can better adapt our platform to maximize its impact.

What Are Practice Exercises?

After a student completes a practice test, the 91 system will generate specific, targeted recommendations for additional practice based on those test results. Practice exercises help students fill content gaps and improve their scores.

Practice Exercises Improve Scores

We initially analyzed the effect of practice exercises on overall student performance. We saw a five-percentile-point increase for students who complete at least one practice exercise compared to students who do not complete any practice exercises. We then examined the effect practice exercises have on student performance for each section, and we saw improvement across the board in every section; the math practice exercises for the Quantitative section have the biggest impact on improvement.

We discovered that students who complete math practice exercises not only increase their probability of correctly answering similar questions on the next test, but they also decrease the amount of time spent answering those questions.

Figure 1 depicts the percentage increase in the proportion of correct answers for section and difficulty matched questions from Test 1 to Test 2 when students completed or did not complete practice exercises between the tests. This shows a much greater increase in the proportion of correct answers after practice exercise completion. Note: The baseline proportion was higher for ‘Exercises Not Completed,’ which contributes to the difference in increased proportion.

Students who complete recommended practice exercises observe a much larger increase in the number of correct answers for questions of the same type and difficulty compared to students who do not complete practice exercises. It is important to note that students who do not complete exercises score higher on average on Test 1, which contributes to a smaller increase in correct answers on Test 2. However, the significant impact of practice exercises is evident, especially for students with lower percentile scores on Test 1.

Chart groups: Students Who Complete Practice Exercises vs. Students Who Do Not Complete Practice Exercises.

Figure 2 displays the average time taken to answer an SSAT math question. Error bars represent the 95% confidence interval of the mean. We see a significant decrease in time taken to answer for students who completed practice exercises (light blue) and a non-significant increase in time for students who did not (black).

Students who complete practice exercises don’t just improve their performance on similar questions, they also answer questions of the same type and difficulty on average 11 seconds faster on the second test than on the first test. Students who don’t complete practice exercises see little difference in time spent answering questions from Test 1 to Test 2. This suggests that students who spend more time answering questions and students who perform poorly on Test 1 may benefit the most from practice exercises.

Through our analyses, we learned that practice exercises can increase your performance and decrease the time spent answering questions. Additionally, we discovered that more practice is associated with higher achievement. But what does completing practice exercises do for long-term growth?

By looking at average student growth percentiles before and after a completed practice exercise, we can estimate increases in growth over time for students who complete practice exercises.

Figure 4 depicts Average Student Growth Percentiles before and after the completion of practice exercises. Error bars represent the 95% CI.

On average, students who complete practice exercises see a three-point increase in Student Growth Percentile (SGP), increasing their math growth score from average to slightly above average.

More information about SGP can be found here.

A Note on Methodology from 91’s Data Scientist, Sean Coffinger

Analysis was conducted with paired pre-practice-exercise and post-practice-exercise questions of similar difficulty for individual 9th-grade SSAT students in 2018. The performance increase analysis observed the proportion of students answering correctly prior to or after a practice exercise completion timestamp. We then looked at the percent increase in the proportion relative to the proportion of students answering a question correctly on the first attempt (Test 1). Pairs were matched by question type and difficulty. For the question duration analysis, students who missed their first question, completed practice exercises, and then answered a corresponding question correctly on the following test showed a statistically significant decrease in question duration (paired t(540) = 4, p < .001).
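For illustration, a paired t-statistic on matched before/after timings can be computed as follows (a generic sketch, not the study's actual code; the pairing of questions by type and difficulty is assumed to have been done upstream):

```python
from math import sqrt

def paired_t_statistic(before: list[float], after: list[float]) -> float:
    """Paired t-statistic for matched observations, e.g. seconds spent on a
    question before vs. after a student completed practice exercises.

    The statistic is the mean of the per-pair differences divided by its
    standard error; compare against a t distribution with n - 1 df.
    """
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean_diff = sum(diffs) / n
    sample_var = sum((d - mean_diff) ** 2 for d in diffs) / (n - 1)
    return mean_diff / sqrt(sample_var / n)
```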

To see data pertaining to ISEE students, please visit our Data Fueled Practice Exercises – ISEE blog post.

Data Fueled Practice Exercises – ISEE
/blog/data-fueled-practice-exercises-isee/ (Wed, 18 Dec 2019)


A Note from 91’s CEO, Edan Shahar

To date, students have answered over 20 million questions on the 91 platform. Now, we are diving into this data to understand how students learn and to help them achieve their academic and test-taking goals.

After exploring how our students improve, we investigated further to learn how our various tools help students. We know our students improve by 45% after six practice tests, but we wanted to know which tools help them improve the most and how we can better adapt our platform to maximize its impact.

What Are Practice Exercises?

After a student completes a practice test, the 91 system will generate specific, targeted recommendations for additional practice based on those test results. Practice exercises help students fill content gaps and improve their scores.

Practice Exercises Improve Scores

We initially analyzed the effect of practice exercises on overall student performance. We saw a five-percentile-point increase for students who complete at least one practice exercise compared to students who do not complete any practice exercises. We then examined the effect practice exercises have on student performance for each section, and we saw improvement across the board in every section; the math practice exercises for the Quantitative and Mathematics Achievement sections have the biggest impact on improvement.

We discovered that students who complete math practice exercises not only increase their probability of correctly answering similar questions on the next test, but they also decrease the amount of time spent answering those questions.

Figure 1 depicts the percentage increase in the proportion of correct answers for section and difficulty matched questions from Test 1 to Test 2 when students completed or did not complete practice exercises between the tests. This shows a much greater increase in the proportion of correct answers after practice exercise completion. Note: The baseline proportion was higher for ‘Exercises Not Completed,’ which contributes to the difference in increased proportion.

Students who complete recommended practice exercises observe a much larger increase in the number of correct answers. It is important to note that students who do not complete exercises score higher on average on Test 1, which contributes to a smaller increase in correct answers on Test 2. However, the significant impact of practice exercises is evident, especially for students with lower percentile scores on Test 1.

Chart groups: Students Who Complete Practice Exercises vs. Students Who Do Not Complete Practice Exercises.

Figure 2 displays the average time taken to answer an ISEE math question. Error bars represent the 95% confidence interval of the mean. We see a significant decrease in time taken to answer for students who completed practice exercises (light blue) and a non-significant decrease in time for students who did not (black).

Students who complete practice exercises don’t just improve their performance on similar questions, they also answer questions of the same type and difficulty on average 7.6 seconds faster on the second test compared to the first test. Students who don’t complete practice exercises see little difference in time spent answering questions from Test 1 to Test 2. This suggests that students who spend more time answering questions and students who perform poorly on Test 1 may benefit the most from practice exercises.

Through our analyses, we learned that practice exercises can increase your performance and decrease the time spent answering questions. Additionally, we discovered that more practice is associated with higher achievement. But what does completing practice exercises do for long-term growth?

By looking at average student growth percentiles before and after a completed practice exercise, we can estimate increases in growth over time for students who complete practice exercises.

Figure 4 depicts Average Student Growth Percentiles before and after the completion of practice exercises. Error bars represent the 95% CI.

On average, students who complete practice exercises see a three-point increase in Student Growth Percentile (SGP), increasing their math growth scores from average to slightly above average.

More information about SGP can be found here.

A Note on Methodology from 91’s Data Scientist, Sean Coffinger

Analysis was conducted with paired pre-practice-exercise and post-practice-exercise questions of similar difficulty for individual 9th-grade ISEE students in 2018. The performance increase analysis observed the proportion of students answering correctly prior to or after a practice exercise completion timestamp. We then looked at the percent increase in the proportion relative to the proportion of students answering a question correctly on the first attempt (Test 1). Pairs were matched by question type and difficulty. For the question duration analysis, students who missed their first question, completed practice exercises, and then answered a corresponding question correctly on the following test showed a statistically significant decrease in question duration (paired t(954) = 3.82, p < .001).

To see data pertaining to SSAT students, please visit our Data Fueled Practice Exercises – SSAT blog post.

Trial User Student Growth
/blog/trial-user-student-growth-percentiles/ (Wed, 04 Dec 2019)


A Note from 91’s CEO, Edan Shahar

To date, students have answered over 20 million questions on the 91 platform. Now, we are diving into this data to understand how students learn and to help them achieve their academic and test-taking goals.

Our new Data Science team will be providing actionable analysis and creating cutting-edge features and tools to help our students, our schools, and our tutoring partners better understand the learning process.

We’re thrilled for what’s coming down the pipe… stay tuned.

Trial User Improvements

We know that students use our platform to improve their test scores, but we wanted to know exactly how much they improve. We started by looking at student percentiles from test to test to capture the trajectory of student performance.

Figure 1 depicts the average increase in student percentile compared to the percentile score achieved on the first practice test. All students in this analysis were prior trial users. The light blue bar represents the increase in percentile after the second practice test and the dark blue bar represents the increase in percentile after the sixth practice test. Error bars represent the 95% confidence interval.

We offer a free trial to show users the features and functionality of the platform and to give them an insight into the types of questions on the test. Students who use the 91 platform after completing the trial see a 13% percentile increase after just the second practice test. Students who complete our Valedictorian package see a 28% percentile increase.

Students who complete the trial not only observe statistically significant growth from test-to-test, they also have higher average percentiles compared to students who do not take advantage of the trial. This suggests the trial gives students a ‘head-start’ by acclimating them to online tests, potentially reducing the learning curve for the first practice test.

Figure 2 shows average achieved percentiles on each test (1-6). For students who did not complete trials, we observe a statistically significant increase after each test iteration; for trial students, the increases from Test 1 to 2 and from Test 2 to 6 were statistically significant. The values in each bar correspond to the percent increase in percentile relative to the first exam (far-left bar). Note: The higher percentage increase for non-trial students is attributable to their lower initial scores.

Across our platform, students exhibit significant increases in test percentile after every practice test. We see that our students don’t just improve by using our platform, they improve consistently over time, further reinforcing that practice makes perfect. Based on this information, we stand by our recommendation for students to complete all of the practice tests available to them. However, if a student does not have time to take all of the practice tests, completing just two practice tests has a significantly positive impact on scores.

A Note from 91’s Data Scientist, Sean Coffinger

Average percent increases in student percentiles were derived by comparing quantitative SSAT student percentiles from practice test 1 to practice tests 2 and 6, respectively, and observing percent increases in the paired scores. Error bars represent 95% CI. All percentile increases between test iterations were statistically significant (alpha = .01). Percent increase is relative to initial practice test percentile and describes the average percent increase in section percentile. Percentiles are initially derived from raw scores.
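The "percent increase relative to the first test" metric described here is simple to make concrete (a minimal sketch with an illustrative function name):

```python
def percent_increase(first_percentile: float, later_percentile: float) -> float:
    """Percent increase in section percentile relative to the first
    practice test, as reported in the figures.
    """
    return 100 * (later_percentile - first_percentile) / first_percentile

# For example, moving from the 40th to the 48th percentile is a 20% increase:
print(percent_increase(40, 48))  # 20.0
```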

To see our data pertaining to SSAT students, visit our SSAT Student Growth Percentiles page.

To see our data pertaining to ISEE students, visit our ISEE Student Growth Percentiles page.

Get started with your test prep today!


ISEE Student Growth Percentiles
/blog/isee-student-growth-percentiles/ (Tue, 26 Nov 2019)


A Note from 91’s CEO, Edan Shahar

To date, students have answered over 20 million questions on the 91 platform. Now, we are diving into this data to understand how students learn and to help them achieve their academic and test-taking goals.

Our new Data Science team will be providing actionable analysis and creating cutting-edge features and tools to help our students, our schools, and our tutoring partners better understand the learning process.

We’re thrilled for what’s coming down the pipe… stay tuned.

Improvement from Test to Test

We know that students use our platform to improve their test scores, but we wanted to know exactly how much they improve. We started by looking at student percentiles from test to test to capture the trajectory of student performance.

Figure 1 depicts the average increase in student percentile compared to the student’s percentile score achieved on the first practice test. The light blue bar represents the increase in percentile after the second practice test and the dark blue bar represents the increase in percentile after the sixth practice test. Error bars represent the 95% confidence interval.

Across our platform, students exhibit statistically significant increases in test percentile after every practice test. We see that our students don’t just improve by using our platform, but they improve consistently over time, further reinforcing that practice makes perfect. Students who use the 91 platform see a 20% increase in percentile after just the second practice test. Students who complete our Valedictorian package and take all six tests see a 45% increase in percentile.


Figure 2 shows average achieved percentiles on each test (1-6). We observe a statistically significant increase after each test iteration. The values in each bar correspond to the percent increase in percentile relative to the first exam (far-left bar).

Based on this information, we stand by our recommendation for students to complete all of the practice tests available to them. However, if you don’t have time for all of the practice tests, completing just two practice tests has a significant positive impact on scores.

Improvement by Section

Figure 3 displays the average increase in student percentile compared to the student’s percentile score achieved on the first practice test, disaggregated by test section. The light-blue bar represents the increase in percentile after the second practice test and the dark-blue bar represents the increase in percentile after the sixth practice test. Error bars represent the 95% confidence interval.

Seeing how much students improve from test to test wasn’t enough for us. We wanted to learn more about how our students improve in each section of the test. When we analyzed improvement by section, we discovered that after the second practice test our students, on average, see a 14% increase in Verbal, a 20% increase in Quantitative, a 16% increase in Reading, and a 29% increase in Math.

By the time our students complete their sixth practice test, they see a 31% increase in Verbal, a 53% increase in Quantitative, a 40% increase in Reading, and a 66% increase in Math.

Improvement by Level

Figure 4 displays the average increase in student percentile compared to the student’s percentile score achieved on the first practice test, disaggregated by student level. The light-blue bar represents the increase in percentile after the second practice test and the dark-blue bar represents the increase in percentile after the sixth practice test. Error bars represent the 95% confidence interval.

Finally, we studied how students within each level of the test improved. We discovered that students at the Primary Level see a 12% increase after the second test and a 45% increase after the sixth test; students at the Lower Level see a 17% increase after the second test and a 38% increase after the sixth practice test; students at the Middle Level see an 18% increase after the second test and a 42% increase after the sixth test; and students at the Upper Level see a 24% increase after the second test and a 55% increase after the sixth test.

Student Growth Percentile

At 91, we help students improve their scores through personalized practice plans. These individualized plans require a holistic approach and a comprehensive picture of each student’s ability.

Traditionally, test prep companies have viewed performance as the single most important metric, but we are also interested in measuring individual student growth. Year-to-year growth measures, like student growth percentiles (SGPs), are routinely utilized at the state, district, and school levels as a key performance indicator alongside student academic performance. Student growth percentiles measure a student’s level of growth compared to other students of similar academic performance.

To learn more about student growth percentiles, visit our Student Growth Percentiles FAQ.

We have modified the SGP metric to quantify an individual student’s test-to-test growth relative to other students on our platform. By combining performance and growth metrics, we can better determine what students need to practice in order to succeed on test day.

Figure 5 depicts multiple student growth percentile curves. The y-axis displays the student growth percentile value (ranging from 1st-99th). The x-axis represents the raw score achieved on a student’s second test. Each colored line corresponds to a possible raw score on a previous test (here, Test 1). The curves are arranged from low- (left) to high- (right) performers. For example, the yellow curve (low-performer) will need to score lower on Test 2 than the red curve (high-performer) in order to receive the same student growth percentile score. This is because we have different score expectations for high-performers and low-performers. The grey dashed line marks a raw score of 50% on Test 2.

The above graph depicts how student growth percentiles work. In this example, a student’s score on Test 1 will determine his/her curve color (yellow represents the lowest score possible and red represents the highest score possible). Next, the student’s performance on Test 2 (the x-axis) will determine his/her student growth percentile. For example, if a ‘yellow-curve’ student scores a 50% on Test 2 (vertical dashed grey line), he/she would achieve a high student growth percentile, indicating above-average growth compared to other ‘yellow-curve’ students. On the other hand, if a ‘red-curve’ student scores a 50%, he/she would show below average growth because he/she scored significantly lower than other ‘red-curve’ students.
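The percentile lookup described above can be sketched in a few lines of Python. This is a hypothetical illustration, not 91’s production code: the `history` data and the function name are invented, and it simply ranks a student’s Test 2 score against other students who earned the same raw score on Test 1.

```python
from bisect import bisect_left

def student_growth_percentile(test1_score, test2_score, history):
    """Percentile rank (1-99) of test2_score among students who earned
    the same raw score on Test 1; `history` maps a Test 1 raw score to
    the list of Test 2 raw scores achieved by those students."""
    peers = sorted(history[test1_score])
    # fraction of same-prior-score peers scoring strictly below this student
    rank = 100.0 * bisect_left(peers, test2_score) / len(peers)
    return min(max(round(rank), 1), 99)

# Hypothetical data: ten students who scored 30 on Test 1, and their Test 2 scores
history = {30: [28, 31, 33, 35, 36, 38, 40, 42, 44, 47]}
print(student_growth_percentile(30, 44, history))  # → 80
```

A real implementation would smooth these empirical distributions rather than rank against raw counts, since many prior-score groups are small.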

Student growth percentiles will play a large role in advancing our future analyses, both in terms of richness and complexity. For example, we will incorporate student growth percentiles alongside practice test percentiles to measure the effectiveness of various tools that our platform offers. Using this combination of metrics, we can also identify groups of similar students that can be targeted with specific interventions. By gaining a richer view of how students are learning, we can better gauge what they need to succeed.

A Note from 91’s Data Scientist, Sean Coffinger

Average percent increases in student percentiles were derived by comparing ISEE student percentiles from practice test 1 to practice tests 2 and 6, respectively, and by observing percent increases in the paired scores. Error bars represent 95% CI. All percentile increases between test iterations were statistically significant (alpha = .01). Percent increase is relative to initial practice test percentile and describes the average percent increase in section percentile. Percentiles are initially derived from raw scores.
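The paired-comparison computation described here can be sketched as follows. The data is invented for illustration, and the confidence interval uses a normal approximation, which is one common choice; it is not necessarily the exact procedure used in the study.

```python
import statistics as stats

def mean_percent_increase(test1_pcts, test2_pcts):
    """Average percent increase in percentile across paired scores,
    with a 95% confidence interval (normal approximation)."""
    gains = [100.0 * (b - a) / a for a, b in zip(test1_pcts, test2_pcts)]
    mean = stats.mean(gains)
    sem = stats.stdev(gains) / len(gains) ** 0.5  # standard error of the mean
    return mean, (mean - 1.96 * sem, mean + 1.96 * sem)

# Hypothetical paired percentiles for five students (Test 1 vs. Test 2)
t1 = [40, 55, 62, 70, 48]
t2 = [50, 63, 70, 80, 60]
mean, ci = mean_percent_increase(t1, t2)  # mean ≈ 18.3
```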

Student growth percentiles were built using ISEE raw score data from test to test to construct a distribution of scores achieved for each previous score possibility. “Test to test” is defined as Test(n-1) to Test(n) and is not specific to the first and second test as exemplified in Figure 5. Ground truth distributions were then interpolated to fully model all potential outcomes. A second interpolation between previous score possibilities ensured the preservation of ordinality. Both interpolations were conducted using local regression (locally estimated scatterplot smoothing, or LOESS). SGP curves were built for each grade:subject:raw score(test n-1) combination. By definition, students must complete the same section at least twice at the same grade level to receive an SGP measure.
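The LOESS interpolation step can be illustrated with a minimal locally weighted linear regression written from scratch. This is a sketch of the general technique, not the study’s actual pipeline; the raw-score/percentile pairs are invented.

```python
def loess_point(x0, xs, ys, frac=0.5):
    """Estimate the smoothed value at x0 with locally weighted linear
    regression (a minimal LOESS): fit y = a + b*x to the nearest
    `frac` share of the points, weighted by the tricube kernel."""
    n = len(xs)
    k = max(2, int(frac * n))
    # indices of the k nearest neighbours of x0, and the local bandwidth
    idx = sorted(range(n), key=lambda i: abs(xs[i] - x0))[:k]
    h = max(abs(xs[i] - x0) for i in idx) or 1.0
    w = [(1 - (abs(xs[i] - x0) / h) ** 3) ** 3 for i in idx]
    # weighted least-squares fit of y = a + b*x on the neighbourhood
    sw = sum(w)
    mx = sum(wi * xs[i] for wi, i in zip(w, idx)) / sw
    my = sum(wi * ys[i] for wi, i in zip(w, idx)) / sw
    num = sum(wi * (xs[i] - mx) * (ys[i] - my) for wi, i in zip(w, idx))
    den = sum(wi * (xs[i] - mx) ** 2 for wi, i in zip(w, idx)) or 1.0
    return my + (num / den) * (x0 - mx)

# Hypothetical noisy SGP curve: Test 2 raw scores vs. growth percentile
xs = [10, 12, 14, 16, 18, 20, 22, 24]
ys = [12, 20, 25, 41, 48, 60, 71, 80]
smooth = [loess_point(x, xs, ys) for x in xs]
```

Smoothing each prior-score curve this way irons out noise from small score groups while preserving the ordering of the curves.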

To see our data pertaining to SSAT students, visit our SSAT Student Growth Percentiles page.


SSAT Student Growth Percentiles /blog/ssat-student-growth-percentiles/ Tue, 26 Nov 2019 13:12:00 +0000 /?p=2928 Across our platform, students exhibit statistically significant increases in test percentile after every practice test. We see that our students don’t just improve by using our platform, but they improve consistently over time, further reinforcing that practice makes perfect.


A Note from 91’s CEO, Edan Shahar

To date, students have answered over 20 million questions on the 91 platform. Now, we are diving into this data to understand how students learn and to help them achieve their academic and test-taking goals.

Our new Data Science team will be providing actionable analysis and creating cutting-edge features and tools to help our students, our schools, and our tutoring partners better understand the learning process.

We’re thrilled for what’s coming down the pipe… stay tuned.

Improvement from Test to Test

We know that students use our platform to improve their test scores, but we wanted to know exactly how much they improve. We started by looking at student percentiles from test to test to capture the trajectory of student performance.

Figure 1 depicts the average increase in student percentile compared to the student’s percentile score achieved on the first practice test. The light-blue bar represents the increase in percentile after the second practice test and the dark-blue bar represents the increase in percentile after the sixth practice test. Error bars represent the 95% confidence interval.

Across our platform, students exhibit statistically significant increases in test percentile after every practice test. We see that our students don’t just improve by using our platform, but they improve consistently over time, further reinforcing that practice makes perfect. Students who use the 91 platform see a 15% increase in percentile after just the second practice test. Students who complete our Valedictorian package and take all six tests see a 36% increase in percentile.

Figure 2 shows average achieved percentiles on each test (1-6). We observe a statistically significant increase after each test iteration. The values in each bar correspond to the percent increase in percentile relative to the first exam (far-left bar).

Based on this information, we stand by our recommendation for students to complete all of the practice tests available to them. However, if you don’t have time for all of the practice tests, completing just two practice tests has a significant positive impact on scores.

Improvement by Section

Figure 3 displays the average increase in student percentile compared to the student’s percentile score on the first practice test, disaggregated by test section. The light-blue bar represents the increase in percentile after the second practice test and the dark-blue bar represents the increase in percentile after the sixth practice test. Error bars represent the 95% confidence interval.

Seeing how much students improve from test to test wasn’t enough for us. We wanted to learn more about how our students improve in each section of the test. When we analyzed improvement by section, we discovered that after the second practice test our students, on average, see a 17% increase in Verbal, a 10% increase in Quantitative, and a 20% increase in Reading.

By the time our students complete their sixth practice test, they see a 42% increase in Verbal, a 44% increase in Quantitative, and a 37% increase in Reading.

Improvement by Level

Figure 4 displays the average increase in student percentile compared to the student’s percentile score achieved on the first practice test, disaggregated by student level. The light-blue bar represents the increase in percentile after the second practice test and the dark-blue bar represents the increase in percentile after the sixth practice test. Error bars represent the 95% confidence interval.

Finally, we studied how students within each level of the test improved. We discovered that students at the Elementary Level see an 8% increase after the second test and a 38% increase after the sixth test; students at the Middle Level see a 16% increase after the second test and a 38% increase after the sixth test; and students at the Upper Level see a 16% increase after the second test and a 41% increase after the sixth test.

Student Growth Percentile

At 91, we help students improve their scores through personalized practice plans. These individualized plans require a holistic approach and a comprehensive picture of each student’s ability.

Traditionally, test prep companies have viewed performance as the single most important metric, but we are also interested in measuring individual student growth. Year-to-year growth measures, like student growth percentiles (SGPs), are routinely utilized at the state, district, and school levels as a key performance indicator alongside student academic performance. Student growth percentiles measure a student’s level of growth compared to other students of similar academic performance.

To learn more about student growth percentiles, visit our Student Growth Percentile FAQ.

We have modified the SGP metric to quantify an individual student’s test-to-test growth relative to other students on our platform. By combining performance and growth metrics, we can better determine what students need to practice in order to succeed on test day.

Figure 5 depicts multiple student growth percentile curves. The y-axis displays the student growth percentile value (ranging from 1st-99th). The x-axis represents the raw score achieved on a student’s second test. Each colored line corresponds to a possible raw score on a previous test (here, Test 1). The curves are arranged from low- (left) to high- (right) performers. For example, the yellow curve (low-performer) will need to score lower on Test 2 than the red curve (high-performer) in order to receive the same student growth percentile score. This is because we have different score expectations for high-performers and low-performers. The grey dashed line marks a raw score of 50% on Test 2.

The above graph depicts how student growth percentiles work. In this example, a student’s score on Test 1 will determine his/her curve color (yellow represents the lowest score possible and red represents the highest score possible). Next, the student’s performance on Test 2 (the x-axis) will determine his/her student growth percentile. For example, if a ‘yellow-curve’ student scores a 50% on Test 2 (vertical dashed grey line), he/she would achieve a high student growth percentile, indicating above-average growth compared to other ‘yellow-curve’ students. On the other hand, if a ‘red-curve’ student scores a 50%, he/she would show below average growth because he/she scored significantly lower than other ‘red-curve’ students.

Student growth percentiles will play a large role in advancing our future analyses, both in terms of richness and complexity. For example, we will incorporate student growth percentiles alongside practice test percentiles to measure the effectiveness of various tools that our platform offers. Using this combination of metrics, we can also identify groups of similar students that can be targeted with specific interventions. By gaining a richer view of how students are learning, we can better gauge what they need to succeed.

A Note from 91’s Data Scientist, Sean Coffinger

Average percent increases in student percentiles were derived by comparing SSAT student percentiles from practice test 1 to practice tests 2 and 6, respectively, and by observing percent increases in the paired scores. Error bars represent 95% CI. All percentile increases between test iterations were statistically significant (alpha = .01). Percent increase is relative to initial practice test percentile and describes the average percent increase in section percentile. Percentiles are initially derived from raw scores.

Student growth percentiles were built using SSAT raw score data from test to test to construct a distribution of scores achieved for each previous score possibility. “Test to test” is defined as Test(n-1) to Test(n) and is not specific to the first and second test as exemplified in Figure 5. Ground truth distributions were then interpolated to fully model all potential outcomes. A second interpolation between previous score possibilities ensured the preservation of ordinality. Both interpolations were conducted using local regression (locally estimated scatterplot smoothing, or LOESS). SGP curves were built for each grade:subject:raw score(test n-1) combination. By definition, students must complete the same section at least twice at the same grade level to receive an SGP measure.

To see our data pertaining to ISEE students, visit our ISEE Student Growth Percentiles page.


Where do top-performing SSAT practice test takers come from? /blog/where-do-top-performing-ssat-practice-test-takers-come-from/ Wed, 19 Aug 2015 13:58:00 +0000 /?p=4126 The SSAT has three sections: Verbal, Reading, and Quantitative. When we looked at our practice test data from this past year, we noticed that five states—Massachusetts, New Jersey, California, Florida, and Washington—attained the highest scores on these sections, but there were variations between each state's particular strengths.


This is the first in a series of posts where we explore the data trends from our 2014-15 SSAT practice tests. In this installment, we’ll see which US states’ test takers had the strongest performances.

Performance by SSAT Section

The SSAT has three sections: Verbal, Reading, and Quantitative. When we looked at our practice test data from this past year, we noticed that five states—Massachusetts, New Jersey, California, Florida, and Washington—attained the highest scores on these sections, but there were variations between each state’s particular strengths. Here is a map of the top-performing states by section:

New Jersey and Massachusetts tended to outperform other states on all three sections. Florida came in with a relatively strong performance on Reading, and California’s students were second only to Massachusetts when it came to math. Those are the general trends. We also collected data on performance when it came to specific math question types. Let’s delve deeper into which precise subjects were the strongest for these top-five states.

California SSAT Math Performance

When Californian students aren’t out enjoying the sun and surf, they are dominating the competition in determining combinations of items like coins (12.34% above average), evaluating fraction operations (14.42% above average), and figuring out word problems involving proportions (24.56% above average).

Florida SSAT Math Performance

Florida’s test takers outperformed others in multiplying and dividing with decimals (14.40% above average), determining the nth term when faced with a pattern (15.28% above average), and finding the slope of a line when given a perpendicular slope.

Massachusetts SSAT Math Performance

Massachusetts’ students powered through questions that involved multiplying and dividing integers (13.64% above average), shape construction of 2-D figures (20.26% above average), and solving problems given unit rates and costs (6.98% above average).

New Jersey SSAT Math Performance

Denizens of the Garden State edged out other contenders when it came to metric unit conversions (7.57% above average), predicting the next number when given a pattern (16.46% above average), and solving for missing digits (5.13% above average). Jersey seems to be training some top-notch detectives.

Washington SSAT Math Performance

Test takers from Washington won out on questions involving total costs (7.55% above average), addition and subtraction estimates (5.61% above average), and division problems with no remainder (12.07% above average).

Conclusions

We have a few untested hypotheses about these marked differences between states. They might be due to specific curriculum disparities in certain parts of the country. There is also a chance that more stringent competition affects these numbers: perhaps students applying to prestigious boarding schools in New Jersey are already more prepared for the SSAT than students in Seattle applying to slightly less competitive day schools. And lastly, as statisticians we do need to point out that sample sizes varied between some data sets—there were more Californian test-takers, for example, than Washingtonian ones. In any case, we found these disparities very interesting to look at, and we hope you did too!

Data analyzed by Sam C. Written by Geoff D.
