Dutch Journal of Applied Linguistics - Volume 2, Issue 1, 2013
Writing assessment in higher education: Making the framework work
Author(s): Marcus Callies, Ekaterina Zaytseva and Rebecca L. Present-Thomas, pp. 1–15
The importance of appropriate assessment methods for academic writing skills in higher education has received increasing attention in SLA research in recent years. Despite this, there is still relatively little understanding of how academic writing skills develop at the most advanced levels of proficiency. Use of the Common European Framework of Reference for Languages (CEFR) is one way to ensure the comparability of findings across research efforts and continue to move the field forward. This paper presents some key concepts and definitions from the fields of SLA and advancedness research, language assessment and corpus linguistics and introduces several papers that address writing assessment within the context of higher education.
Assessing the use of sophisticated EFL writing: A longitudinal study
Author(s): Pieter de Haan and Monique van der Haagen, pp. 16–27
Even very advanced EFL writing tends to be less sophisticated than native writing. One of the problems seems to be finding the right collocations and the correct register. The aim of this article is to pinpoint what characterizes the development in very advanced Dutch EFL students’ written language production. We discuss the development of students’ ability to use appropriate intensifiers. Compared to their native English speaking contemporaries, the Dutch students initially tend to use intensifiers that are found typically in spoken English, such as really and a bit, but they gradually replace them with modifiers more suitable to academic writing. It is argued that the use of appropriate intensifiers can be seen as a measure of advancedness and hence be used as a criterion in the assessment of advanced EFL writing quality.
UniTIE: Towards a transparent description of learning outcomes for academic writing
Author(s): Paula Haapanen, Suzy McAnsh, Eva Braidwood and Robert Hollingsworth, pp. 28–42
In its aspirations to enhance mobility within higher education and focus on the comparability and quality of university degrees, the Bologna Process has emphasised the need for transparency in describing students’ knowledge and abilities. In response, the project “UniTIE: Curriculum planning in the Language Centres” was initiated to improve the transparency with which English courses are described and assessed at university language centres in Finland. This article provides a brief overview of the project, discusses the work carried out within a community of practice to develop descriptors for university English courses, and reports on the outcomes of the project.
Fleshing out CEFR descriptors at C1 and above for the assessment of academic writing in departments of English at Austrian universities
Author(s): Helen Heaney, pp. 43–56
Austrian education is currently undergoing long overdue standardization procedures, with nationwide school-leaving exams still not available for most subjects. A similar standardization procedure was introduced at Graz, Klagenfurt, Salzburg and Vienna universities in 2006 when linguists and language teachers started professionalizing assessment practices for exit-level English exams in their BA programmes by developing analytic rating scales for writing. This paper describes which dimensions were selected, and how the scales were developed. Anchor descriptors were extracted from the CEFR for a bare pass (C1) and the top grade (C2) in Grammar, Vocabulary, Textual competence and Pragmatic competence (later Task achievement), and exam performances on typical writing prompts were rated, including brief justifications. Subsequently, extended descriptors were formulated for C1/C2 based on the teams’ justifications, and a mid-way level (C1.2) was inserted. Finally, the scales were modified during benchmarking procedures.
Defining proficiency: A comparative analysis of CEF level classification methods in a written learner corpus
Author(s): Rebecca L. Present-Thomas, Bert Weltens and John H.A.L. de Jong, pp. 57–76
In this study, various proficiency classification methods are explored in order to describe the relevant levels on the Common European Framework of Reference for Languages (CEF) that are represented by a group of 127 incoming English students at a Dutch university with respect to academic writing. The weakness of the widely-used group-based institutional status approach is demonstrated with two distinct student-centered approaches, self-assessment and test scores, both of which highlight the within-groups variation that is hidden in group-based approaches. Between-texts variation is further explored through the comparison of self-assessment and text-centered approaches to classification such as test item (response) scores, and widely used measures of lexical variation and syntactic complexity. Findings demonstrate the potential variation in the understanding of academic writing development depending on the methods of proficiency classification used.
EMBEDding the CEFR in academic writing assessment: A case study in training and standardization
Author(s): Kevin Haines, Nicole Schmidt, Petra Jansma and Wander Lowie, pp. 77–91
The CEFR is increasingly being used as the framework of choice for the assessment of language proficiency at universities across Europe. However, to attain consistent assessment, familiarization and standardization are essential. In this paper we report a case study of embedding a standardization procedure in writing assessment activities at the University of Groningen. The project shows the value of standardization procedures within the CEFR and reports on the difficulty of finding consistently assessed ‘flat’ samples. Moreover, it reports on the desirability of scaffolding teacher training sessions using flat samples with less ‘even’ samples.
Information structure: The final hurdle? The development of syntactic structures in (very) advanced Dutch EFL writing
Author(s): Lieke Verheijen, Bettelou Los and Pieter de Haan, pp. 92–107
Although texts produced by (very) advanced Dutch learners of English as a foreign language (EFL) may be perfectly grammatical, they often feel distinctly non-native. Dutch, as a verb-second language, makes separate positions available for discourse linking and aboutness-topics. Although the English sentences of these advanced learners conform to the subject-verb-object order of English, the pre-subject adverbial position in English is made to perform the information-structural function of the verb-second discourse-linking position, producing texts that are perceived as non-native, without being ungrammatical. A side-effect of this L1 interference is the underuse of special focusing constructions in English, like the stressed-focus it-cleft. This paper investigates the progress of Dutch writers towards a more native-like use of the pre-subject position and the it-cleft in a longitudinal corpus of 137 writings of Dutch university students of English. We conclude that information-structural differences present the final hurdle for advanced Dutch EFL writers.
An investigation into the writing construct(s) measured in Pearson Test of English Academic
Author(s): Ying Zheng and Shaida Mohammadi, pp. 108–125
Pearson Test of English Academic (PTE Academic) has six item types that assess academic writing either independently or integratively. This research focuses on evaluating the construct validity and effectiveness of the six writing item types. Exploratory Factor Analysis was performed to examine the underlying writing constructs as measured by the six item types. Item scores for different writing skills were subjected to Rasch IRT analysis. The difficulty of the item types was estimated, and the effectiveness of each item type was evaluated by calculating its information function. The results identified two writing constructs: an Analytical/Local Writing construct and a Synthetic/Global Writing construct. The study has implications for test developers on the use of multiple item types and their effectiveness, and for test users on how they can improve their writing skills.
The Corpus of Academic Learner English (CALE): A new resource for the assessment of writing proficiency in the academic register
Author(s): Marcus Callies and Ekaterina Zaytseva, pp. 126–132
Learner corpora present an option to inform, supplement and advance the way language proficiency is operationalized and assessed, and may also be used in data-driven approaches to the assessment of writing proficiency that are largely independent of human rating. The aim of this contribution is twofold: first, to introduce a new Language-for-Specific-Purposes learner corpus, the Corpus of Academic Learner English (CALE), currently being compiled for the study of academic learner writing; and second, to illustrate how the CALE is useful in a text-centered, corpus-driven approach to the assessment of academic writing to achieve a higher degree of reliability in assessing language proficiency.