Explaining computerized English testing in plain English

app Languages

Research has shown that automated scoring can give more reliable and objective results than human examiners when evaluating a person's mastery of English. This is because an automated scoring system is impartial, unlike humans, who can be influenced by irrelevant factors such as a test taker's appearance or body language. Additionally, automated scoring treats regional accents equally, unlike human examiners, who may favor accents they are more familiar with. Automated scoring also allows individual features of a spoken or written response to be analyzed independently of one another, so that a weakness in one area of language does not affect the scoring of other areas.

PTE Academic was created in response to the demand for a more accurate, objective, secure and relevant test of English. Our automated scoring system is a central feature of the test, and vital to ensuring the delivery of accurate, objective and relevant results – no matter who the test-taker is or where the test is taken.

Development and validation of the scoring system to ensure accuracy

PTE Academic’s automated scoring system was developed after extensive research and field testing. A prototype test was developed and administered to a sample of more than 10,000 test takers from 158 different countries, speaking 126 different native languages. This data was collected and used to train the automated scoring engines for both the written and spoken PTE Academic items.

To do this, multiple trained human markers assess each answer. Those results are used as the training material for machine learning algorithms, similar to those used by systems like Google Search or Apple's Siri. The model makes initial guesses at the scores each response should get, consults the actual human scores to see how well it did, and adjusts its parameters accordingly. It then cycles through the training set over and over again, adjusting and improving until it converges on a solution – one that ideally comes very close to predicting the full set of human ratings.
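The guess-compare-adjust loop described above can be sketched with a generic regression learner on simulated data. Everything below is an illustrative assumption – the features, numbers and model are invented, not PTE Academic's actual system.

```python
# Illustrative sketch of training a scoring model against human ratings.
# The features and data are simulated; the real model is proprietary.
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)

# Toy data: 200 responses, 3 numeric features (imagine response length,
# vocabulary range, error rate). Human scores follow a hidden linear rule.
X = rng.normal(size=(200, 3))
hidden_weights = np.array([1.5, -0.5, 2.0])
human_scores = X @ hidden_weights + rng.normal(scale=0.1, size=200)

# The learner guesses scores, checks them against the human ratings,
# and adjusts its weights over many passes through the training set.
model = SGDRegressor(max_iter=1000, tol=1e-6, random_state=0)
model.fit(X, human_scores)

predicted = model.predict(X)
print(np.corrcoef(predicted, human_scores)[0, 1])  # typically close to 1.0
```

The stochastic gradient learner here stands in for whatever family of algorithms the real system uses; the point is only the repeated adjust-and-improve cycle against human-provided targets.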

Once trained and performing at a high level, this model is used as a marking algorithm, able to score new responses just as human markers would. Correlations between scores given by this system and those given by trained human markers are high. The standard error of measurement between app's system and a human rater is smaller than that between one human rater and another – in other words, the machine scores are more accurate than those given by individual human raters, because much of the bias and unreliability has been removed. In general, you can think of a machine scoring system as one that distills the most consistent information from human ratings, then applies it like an idealized human marker.
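The standard-error-of-measurement comparison can be illustrated with simulated raters. The noise levels below are assumptions chosen only to show how the comparison works, not measured values from any real study.

```python
# Simulated comparison of machine-human vs human-human agreement.
# Rater noise levels are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
true_ability = rng.normal(70, 10, size=500)          # hidden "true" skill

human_a = true_ability + rng.normal(0, 4, size=500)  # one noisy human rater
human_b = true_ability + rng.normal(0, 4, size=500)  # a second human rater
machine = true_ability + rng.normal(0, 2, size=500)  # a more consistent scorer

def sem(x, y):
    """Standard error of measurement between two sets of ratings."""
    r = np.corrcoef(x, y)[0, 1]
    return np.std(x, ddof=1) * np.sqrt(1 - r)

print(sem(human_a, human_b))  # human vs human: larger error
print(sem(machine, human_a))  # machine vs human: smaller error
```

Because the simulated machine rater carries less random noise, its SEM against a human rater comes out smaller than the SEM between the two human raters – the same pattern the paragraph above describes.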

app conducts scoring validation studies to ensure that the machine scores are consistently comparable to ratings given by skilled human raters. Here, a new set of test-taker responses (never seen by the machine) is scored by both human raters and the automated scoring system. Research has demonstrated that the automated scoring technology underlying PTE Academic produces scores comparable to those obtained from careful human experts. This means that the automated system "acts" like a human rater when assessing test takers' language skills, but does so with a machine's precision, consistency and objectivity.

Scoring speaking responses with app’s Ordinate technology

The spoken portion of PTE Academic is automatically scored using app's Ordinate technology. Ordinate technology is the result of years of research in speech recognition, statistical modeling, linguistics and testing theory. It uses a proprietary speech processing system specifically designed to analyze and automatically score speech from both fluent and second-language English speakers. The Ordinate scoring system collects hundreds of pieces of information from a test taker's spoken response beyond just the words, such as pace, timing and rhythm, as well as vocal power, emphasis, intonation and accuracy of pronunciation. It is trained to recognize even somewhat mispronounced words, and it quickly evaluates the content, relevance and coherence of the response. In particular, the meaning of the spoken response is evaluated, making it possible for these models to assess whether what was said deserves a high score.

Scoring writing responses with Intelligent Essay Assessor™ (IEA)

The written portion of PTE Academic is scored using the Intelligent Essay Assessor™ (IEA), an automated scoring tool powered by app's state-of-the-art Knowledge Analysis Technologies™ (KAT) engine. Based on more than 20 years of research and development, the KAT engine automatically evaluates the meaning of text, such as an essay written by a student in response to a particular prompt. Using a proprietary application of the mathematical approach known as Latent Semantic Analysis (LSA), the KAT engine evaluates writing as accurately as skilled human raters. LSA models the meaning of language by statistically analyzing large bodies of relevant text, so the KAT engine can assess the meaning of new text much as a human reader would.
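A minimal, generic LSA sketch – TF-IDF vectors reduced with truncated SVD – shows the general idea behind the technique. The documents below are invented examples, and this is the textbook form of LSA, not the proprietary KAT engine.

```python
# Generic Latent Semantic Analysis sketch: TF-IDF vectors reduced with
# truncated SVD, so documents with related vocabulary land close together.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "The cat sat on the mat",
    "A feline rested on the rug",
    "Stock markets rose sharply today",
    "Markets rallied as stock shares climbed",
]

tfidf = TfidfVectorizer().fit_transform(docs)
latent = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Cosine similarity in the latent space: the two household documents should
# be more similar to each other than to the finance documents.
sims = cosine_similarity(latent)
print(round(float(sims[0, 1]), 2), round(float(sims[0, 2]), 2))
```

Reducing the term space to a few latent dimensions is what lets LSA group texts by topic even when they share few exact words – the property the KAT engine's scoring relies on at much larger scale.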

What aspects of English does PTE Academic assess?

Written scoring

  • Word choice
  • Grammar and mechanics
  • Progression of ideas
  • Organization
  • Style, tone
  • Paragraph structure
  • Development, coherence
  • Point of view
  • Task completion

Spoken scoring

  • Sentence mastery
  • Content
  • Vocabulary
  • Accuracy
  • Pronunciation
  • Intonation
  • Fluency
  • Expressiveness
  • Pragmatics

More blogs from app

  • 11 Offline English learning ideas

    By app Languages

    In today's fast-paced digital era, online resources and language learning apps have become the popular means for mastering English. However, offline language learning has its own unique charm and benefits. Engaging our senses and connecting with the physical world around us can enhance our language skills in ways that no app can match. In today's language learning blog, we discuss offline language learning activities that can help you become a confident English speaker, even without an internet connection.

  • The Global Scale of English and planning: A perfect partnership

    By

    As a teacher, I realized that planning had become an 'automatic pilot' routine from which I did not learn much. Like many others, I thought scales such as the Global Scale of English (GSE) or the Common European Framework of Reference were just that: references, beyond the realities of our lessons.

    However, I've seen that the GSE is a very powerful resource to help us plan.

    If you're using a coursebook you may have noticed that, after completing one of the books in the series, students move up one level, such as from elementary to pre-intermediate or from intermediate to upper-intermediate.

    We all understand what it means to be an elementary or intermediate student. These levels are usually defined in terms of structures – conditional sentences, passive voice, and tenses – Simple Past, Future Continuous, etc.

    But why do students want to learn English? Using it means being able to listen or read and understand, interact with others, and communicate in writing. Even if it is parents who enroll their children in language institutes, what they want is for them to use the language. We can see a mismatch between how levels are defined and students' aims to study English.

    Here's how the GSE can help English language teachers

    First, you need the right scale for your group – Pre-primary, Young Learners, Adults, Professionals or Academic, which can be downloaded at:

    /languages/why-pearson/the-global-scale-of-english/resources.html

    Focus on your students' level. There you will see all the learning objectives students need to achieve to complete their current level and move on in their learning journey.

    What are learning objectives? They are can-do statements that clearly describe what students are expected to achieve as a result of instruction. In other words, these objectives guide us as teachers in our planning to help students learn.

    When we plan our lessons, rather than working at lesson level only, we should reflect on how the proposed activities are referenced against the learning objectives of the level. We may see that some activities need adapting in order to focus on the selected learning outcomes.

    At the planning stage, I also use the GSE to analyze the activities proposed in the materials I am using. Let me tell you what I do. Let's take listening, for instance. You may use the downloaded scales or the Teacher Toolkit that the GSE provides. Let's run through how this works.

  • 9 steps to teaching advanced business English

    By Margaret O'Keeffe

    The challenge of teaching business English to C1 level students

    Once your English students reach a B2 level of English, they’re fairly competent communicators. For many learners, their motivation to improve starts to suffer when they reach this intermediate plateau. They understand almost everything and can express themselves clearly enough - so why would they want to continue learning English and achieve a C1 level of English?

    The CEFR describes C1-level learners as proficient users of a language. C1-level students have a high proficiency in English and perform well in an international work environment.

    How can we help our upper intermediate students reach this level and see the benefits in their own lives and careers? Here are nine steps you can take as an English language teacher to help your students achieve language proficiency.