Ahead of the Curve: How PEG™ Has Led Automated Scoring for Years


What is PEG™?

PEG, or Project Essay Grade, is the automated scoring system at the core of ERB Writing Practice. It was invented in the 1960s by Ellis Batten Page, a former high school English teacher who spent “many long weekends sifting through stacks of papers wishing for some help.” His guiding principles? 1) the more we write, the better writers we become, and 2) computers can grade as reliably as their human counterparts (Page, 2003). The computers of Page’s era left little room for automation, so PEG lay dormant until the mid-1980s. When large-scale computerization became feasible in the 1990s, PEG was given new life scoring essays for the NAEP, Praxis, and GRE testing programs; Page’s two principles remain as relevant today as they were then. PEG was eventually acquired by ERB’s longtime partner, Measurement Inc., and continues to evolve and find new uses today.

The foundational concept of automated scoring is that the quality of writing can be predicted. PEG and other systems are trained on essays that have already been scored by humans, and they use those essays to create scoring (or prediction) models. A model typically includes 30-40 features, or variables, that predict human ratings; typical examples include sentence length, use of higher-level vocabulary, and grammar. In most instances, the combination of these variables yields correlations with human raters in the mid .80s (on a scale where 1.0 is perfect agreement), a high level of prediction accuracy that is typically higher than the agreement between two independent human raters. Once the model is trained, the automated scoring system “reads” each subsequent essay, quantifies a value for every variable in the model, and applies the prediction model to produce a score.
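To make that train-then-predict loop concrete, here is a minimal sketch of how any feature-based scoring model can be fit. The three features, the tiny training set, and the scores are illustrative stand-ins, not PEG’s actual (proprietary) feature set or code:

```python
from sklearn.linear_model import LinearRegression

# Hypothetical feature extractor. PEG's real models use 30-40 proprietary
# variables; these three stand in for features such as sentence length,
# vocabulary level, and lexical variety.
def extract_features(essay: str) -> list[float]:
    words = essay.split()
    sentences = [s for s in essay.split(".") if s.strip()]
    avg_sentence_len = len(words) / max(len(sentences), 1)
    avg_word_len = sum(len(w) for w in words) / max(len(words), 1)
    type_token_ratio = len({w.lower() for w in words}) / max(len(words), 1)
    return [avg_sentence_len, avg_word_len, type_token_ratio]

# Training set: essays paired with human-assigned scores (synthetic here;
# real training sets contain many human-scored essays per prompt).
train_essays = [
    "The dog ran. It was fast. I liked it.",
    "Canine locomotion, observed dispassionately, reveals remarkable athleticism.",
    "Dogs run quickly when they are excited or playful outdoors.",
]
human_scores = [2.0, 5.0, 3.5]

X_train = [extract_features(e) for e in train_essays]
model = LinearRegression().fit(X_train, human_scores)

# Scoring a new essay: quantify each variable, then apply the model.
new_essay = "A dog sprinted across the meadow, ears back, utterly joyful."
predicted = model.predict([extract_features(new_essay)])[0]
print(f"Predicted score: {predicted:.2f}")
```

In production, a model like this would be validated by correlating its predictions with human scores on a held-out set of essays; the mid-.80s correlations cited above come from exactly that kind of comparison.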

Despite the proven accuracy of automated scoring systems, a common criticism is that the scores such systems produce lack an understanding of the meaning of a student-written essay. Humans can rate the quality of an idea or the strength of an argument in ways that computers cannot, even if such ratings can be idiosyncratic and inconsistent at times. While that criticism is valid, the 30-40 variables used by PEG represent the traits and skills of good writing, and are thus highly relevant to budding writers who need feedback on how to improve as they practice. To complement the automated PEG feedback, ERB Writing Practice also includes options for users to collect feedback from peers and/or teachers. Teachers can give quick, quantitative ratings on how effectively students used textual evidence and how accurate the content of their writing is in relation to a given prompt topic.

When PEG was first used operationally, its focus was on predicting scores holistically; that is, recovering the overall writing score a human rater would assign an essay. Over time, scoring evolved to provide feedback on distinct traits of effective writing, and separate scoring algorithms were developed for different genres. Today, PEG provides scores on six characteristics of writing and uses separate models for three genres: argumentative, informational/explanatory, and narrative. The six characteristics are outlined below (learn more at support.wpponline.com), followed by a sketch of what a trait-level report might look like.

  1. Development of Ideas — The writer’s presentation of supporting details and information pertinent to the central idea.
  2. Organization — The writer’s overall plan (coherence) and internal weaving together of ideas (cohesion).
  3. Style — The use of strong word choices and varied sentence constructions to establish a unique voice that connects with the audience.
  4. Word Choice — The precise, appropriate application of advanced vocabulary to the essay.
  5. Sentence Fluency — The use of complex and varied sentences to skillfully create a smooth flow of ideas.
  6. Conventions — Grammar, usage, pronoun reference, consistency in number and person, and mechanics (spelling, capitalization, punctuation, and paragraphing).
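Conceptually, a trait-scored submission comes back as one score per characteristic, produced by the model trained for the prompt’s genre. The sketch below illustrates that shape; the class names, placeholder models, and numeric scale are hypothetical stand-ins, not PEG’s actual interface:

```python
from dataclasses import dataclass
from enum import Enum

# The three genres PEG models separately.
class Genre(Enum):
    ARGUMENTATIVE = "argumentative"
    INFORMATIONAL = "informational/explanatory"
    NARRATIVE = "narrative"

# Hypothetical trait report: one score per characteristic of effective
# writing. Field names mirror the six traits; the scale is an assumption.
@dataclass
class TraitScores:
    development_of_ideas: float
    organization: float
    style: float
    word_choice: float
    sentence_fluency: float
    conventions: float

# Stand-in for genre-specific prediction models; a real system would load
# a model trained on human-scored essays of that genre.
def _placeholder_model(essay: str) -> TraitScores:
    return TraitScores(3.0, 3.0, 3.0, 3.0, 3.0, 3.0)

GENRE_MODELS = {genre: _placeholder_model for genre in Genre}

def score_essay(essay: str, genre: Genre) -> TraitScores:
    """Route the essay to the model trained for the prompt's genre."""
    return GENRE_MODELS[genre](essay)

report = score_essay("Once upon a time ...", Genre.NARRATIVE)
print(report.conventions)
```

This genre routing is also why the genre selection described next matters: it determines which trained model scores the essay.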

The strong reliability of PEG scoring across genres has also enabled teachers to introduce their own prompts for automated essay scoring. When teachers do so, they select the PEG model that aligns with the genre of their writing prompt, ensuring more nuanced automated scoring.

Since the advent of PEG, other automated essay scoring systems have been launched, and a substantial body of research has compared them. In a recent study conducted by the National Center for Education Statistics, PEG was shown to be the most accurate of the automated scoring alternatives at scoring prompts developed for The Nation’s Report Card (NCES, 2022). Research has also focused on the efficacy of writing practice with PEG scoring. An important study found that after “controlling for students’ initial writing quality and the amount they used PEG writing, students who used PEG produced higher quality essays at the end of the intervention … 22% higher than those who didn’t” (Palermo, 2018).

So what does this all mean for ERB members?

Our purpose at ERB is to provide member schools with scientifically developed measures they can use to understand gaps in curriculum and instruction, as well as specific areas where students can improve. ERB Writing Practice is a new program that provides students and educators with a steady stream of reliable data they can use to target improvements in individual writing. It also relieves teachers of much of the enormous time commitment of grading papers by hand, and it has the research evidence to support its efficacy in improving student writing. These benefits open many opportunities for students to write more, and in doing so, become better writers.

  

References

Page, E. B. (2003). Project Essay Grade: PEG. In M. D. Shermis & J. Burstein (Eds.), Automated essay scoring: A cross-disciplinary perspective. Mahwah, NJ: Erlbaum.

Palermo, C. (2018). Research study finds using PEG Writing helps students write higher quality essays. Retrieved August 12, 2022, from https://measurementinc.com/news/research-study-finds-using-peg-writing-helps-students-write-higher-quality-essays

NCES. (2022, January 21). Four Teams Win Top Prize in Automated Scoring Challenge for The Nation’s Report Card. Retrieved August 12, 2022, from https://nces.ed.gov/whatsnew/press_releases/1_21_2022.asp


Contact your Member Services Director or submit a request form if you have questions about ERB Writing Practice.
