Frontiers in Educational Research, 2020, 3(5); doi: 10.25236/FER.2020.030517.

Analysing students’ written products in response to a TEM4 integrated task

Author(s)

Weilie Lu

Corresponding Author:
Weilie Lu
Affiliation(s)

Guangdong University of Foreign Studies, Guangzhou 510420, China
Email: [email protected]

Abstract

As integrated tasks are used in more and more testing contexts, it is necessary to investigate how test-takers approach such tasks. The present study aims to explore how sophomore English majors perform in their written products in response to an integrated task in TEM4. Analyses were conducted from the perspective of discourse features, idea coverage in the summary, and the origin of ideas in the argument. Results showed that the two proficiency groups differed significantly only in grammatical accuracy; there was no significant difference between the two groups in the other features, fluency and lexical sophistication. Besides, the two groups did not differ significantly in idea coverage in the summary, in ideas used in the argument, or in ideas borrowed versus ideas generated. The current study has implications for classroom teaching and test score interpretation. Directions for future research are also recommended.

Keywords

integrated task, written products, discourse features, summary, argument

Cite This Paper

Weilie Lu. Analysing students’ written products in response to a TEM4 integrated task. Frontiers in Educational Research (2020) Vol. 3 Issue 5: 80-89. https://doi.org/10.25236/FER.2020.030517.
