Volume 17, Issue 1 (1994)
  • ISSN: 0155-0640
  • E-ISSN: 1833-7139

Abstract

There has been surprisingly little research comparing the direct assessment of writing by teachers of English as a mother tongue and teachers of English as a second language (abbreviated here as English and ESL respectively), given the common ground they share as teachers of writing. This study investigates whether these two groups of teachers rate writing samples differently under both holistic (global) and analytical (multiple-trait) scoring methods. It compares the assessments made by four experienced teachers from each rater group of the same set of 20 native-speaker (English) and 20 non-native-speaker (ESL) essays written by final-year secondary students. While no significant difference was found between the single global essay ratings of the two groups of teachers, this was not the case for the essay totals obtained by combining the global and analytical scores. The comparison based on these essay totals indicated that, overall, English teachers rated the essays significantly more harshly than ESL teachers did. These findings suggest that the analytical scoring method may be more faithful than the holistic method to real differences between raters of different backgrounds and professional experience in the assessment of writing. When both types of raters are used, the choice of scoring procedure is therefore likely to determine whether or not these differences are highlighted, and thus the overall level of inter-rater reliability.
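
The sketch below is purely illustrative of the statistics involved in this kind of comparison, not a reproduction of the study's procedure: it generates invented holistic and analytic scores for two hypothetical rater panels, estimates inter-rater reliability within each panel with a one-way intra-class correlation, and compares the panels' combined essay totals with a paired t-test. All data, scales, and the helper name icc_oneway are assumptions made for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_essays, n_raters = 40, 4        # 20 NS + 20 NNS essays, 4 raters per panel
scale_max = 10                    # hypothetical holistic scale (0-10)

# Invented holistic scores: one essays-by-raters matrix per rater panel.
english = rng.integers(4, scale_max + 1, size=(n_essays, n_raters)).astype(float)
esl = np.clip(english + rng.integers(0, 2, size=english.shape), 0, scale_max)

def icc_oneway(scores):
    """ICC(1,1) from a one-way ANOVA on an essays-by-raters score matrix."""
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    msb = k * ((row_means - grand) ** 2).sum() / (n - 1)              # between essays
    msw = ((scores - row_means[:, None]) ** 2).sum() / (n * (k - 1))  # within essays
    return (msb - msw) / (msb + (k - 1) * msw)

print("English panel ICC:", round(icc_oneway(english), 2))
print("ESL panel ICC:    ", round(icc_oneway(esl), 2))

# Essay totals = mean holistic score plus an invented analytic subscore total.
english_totals = english.mean(axis=1) + rng.normal(20, 3, n_essays)
esl_totals = esl.mean(axis=1) + rng.normal(22, 3, n_essays)
t, p = stats.ttest_rel(english_totals, esl_totals)   # same essays, so a paired test
print(f"paired t = {t:.2f}, p = {p:.3f}")
```

On this invented data, the paired test on totals can detect a between-panel difference even when the within-panel reliability coefficients look similar, which mirrors the abstract's point that the choice of scoring procedure affects whether rater-group differences are visible.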
