Get to know: Language

Language is one of the most important tools we use to learn. While English is a common language of instruction in Pacific schools, there are hundreds of Pacific languages, and the language students use at home is not always the language they use at school. Similarly, the language in which students took the PILNA assessments may not have been the language most familiar to them.

Taking an assessment in a language that is not a student’s most familiar language may affect their performance. This matters because the PILNA assessments aim to measure student performance independently of any one language: literacy and numeracy can be demonstrated using any Pacific language. The PILNA programme therefore takes into account the language students used to complete their assessments relative to the languages they are most familiar with.

Students were asked which language they mostly used to converse with family, with friends, with teachers, and in other settings. This was recorded alongside the language in which the students sat the PILNA assessments. From this information, a regional scale was formed to describe how widely a student used the language in which they completed the PILNA assessments.

Higher scores on this language scale indicate that the student converses in the language of their PILNA assessment across a greater range of settings; lower scores indicate that the student uses that language in fewer settings.
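As a rough illustration only, such a scale can be thought of as counting the settings in which a student reports mainly using the assessment language. The actual PILNA scale construction is not described in this report, so the setting names and scoring below are assumptions for the sketch, not the real methodology:

```python
# Illustrative sketch only: a simple count-based language-use scale.
# The real PILNA scale construction is not specified in this report;
# the setting names and scoring here are assumptions.

SETTINGS = ["family", "friends", "teachers", "other"]

def language_scale(responses: dict, assessment_language: str) -> int:
    """Count the settings in which the student mainly converses in the
    language they sat the assessment in."""
    return sum(
        1 for setting in SETTINGS
        if responses.get(setting) == assessment_language
    )

# A hypothetical student who converses in Chuukese at home and with
# friends, and in English with teachers and elsewhere:
student = {"family": "Chuukese", "friends": "Chuukese",
           "teachers": "English", "other": "English"}
print(language_scale(student, "English"))  # → 2 (uses it in 2 of 4 settings)
```

A higher count would correspond to a higher position on a scale of this kind; the published PILNA scores are on a different (regionally scaled) metric.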

This scale allows comparisons between student performance and the number of settings in which students use the language of their PILNA assessments. For students in the Federated States of Micronesia, Figure STF1.1 shows differences in average language scale scores by year level and domain (numeracy and reading), and by whether students met the expected (benchmarked) performance level in each domain. Comparisons with the writing domain are unavailable because the proficiency scale for writing has not yet been established.

Figure STF1.1: PILNA LANGUAGE SCALE
Average scores of students on the language scale, by year level and proficiency

Numeracy
  • Grade 5: below expected proficiency level 167 (5.2); at or above expected level 178 (4)
  • Grade 7: below expected proficiency level 164 (4.1); at or above expected level 176 (3.6)

Reading
  • Grade 5: below expected proficiency level 167 (3.6); at or above expected level 184 (4.5)
  • Grade 7: below expected proficiency level 164 (3.7); at or above expected level 177 (4)

Notes: Standard errors appear in parentheses. Statistically significant correlation (p < 0.05).

Language and numeracy performance

When comparing language scale scores by numeracy performance, students whose main language of conversation was the same as their PILNA assessment language tended to perform better than students whose main language differed. This pattern was observed at both year levels.

Grade five students who met the expected level of numeracy performance had a higher average language scale score (178) than grade five students who did not (167). That is, grade five students who met the expected level tended to converse more commonly in the language they completed PILNA in.

Regionally, the same positive association was observed for students at the equivalent of grade five, but not for students at the equivalent of grade seven.

Language and reading performance

For language scores by reading performance, both year levels showed similar results. Grade five students who met the expected level of reading performance had higher average language scores (184) than those who did not (167). Similarly, grade seven students who met the expected level of reading performance had slightly higher average language scores (177) than those who did not (164).
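The figure's note on statistical significance can be illustrated, very roughly, with a two-sample z-test built from the published group means and standard errors. This is a sketch only: PILNA's actual inference method is not described here, and a simple z-test on summary statistics ignores, for example, the survey's sampling design:

```python
import math

def z_test(mean1: float, se1: float, mean2: float, se2: float) -> float:
    """Approximate two-sample z statistic from group means and standard
    errors. A sketch only; it ignores sampling-design effects and is not
    necessarily the method PILNA uses."""
    se_diff = math.sqrt(se1 ** 2 + se2 ** 2)
    return (mean1 - mean2) / se_diff

# Grade 5 reading (Figure STF1.1): at or above expected level 184 (SE 4.5),
# below expected level 167 (SE 3.6).
z = z_test(184, 4.5, 167, 3.6)
print(round(z, 2))  # → 2.95; |z| > 1.96 corresponds to p < 0.05
```

Under these simplifying assumptions, the grade five reading gap of 17 scale points is large relative to its combined standard error, consistent with the figure's significance note.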

The same association was observed regionally. For both grade five and grade seven, students who met the expected levels of performance in reading conversed in the language of their PILNA assessment across more settings than those who did not.

What does this mean?

Interestingly, though perhaps intuitively, students who performed better in reading tended to use the language they were assessed in across more settings. This suggests that using a language in everyday conversation may support students’ reading ability in that language.

For the Federated States of Micronesia, the same effect was observed for numeracy. It may be that students who use a language in more settings are better able to understand the language components of numeracy questions, although other factors, such as demographics, may also play a role. This may be an area that researchers in the Federated States of Micronesia wish to explore further.