Southern Illinois University Carbondale
OpenSIUC
Theses and Dissertations
8-1-2016
MACRO AND MICRO SKILLS IN SECOND
LANGUAGE ACADEMIC WRITING: A
STUDY OF VIETNAMESE LEARNERS OF
ENGLISH
Ha i anh Nguyen
Southern Illinois University Carbondale, ha.nguyen@siu.edu
This Open Access Thesis is brought to you for free and open access by the Theses and Dissertations at OpenSIUC. It has been accepted for inclusion in
Theses by an authorized administrator of OpenSIUC. For more information, please contact opensiuc@lib.siu.edu.
Recommended Citation
Nguyen, Ha Thi Thanh, "MACRO AND MICRO SKILLS IN SECOND LANGUAGE ACADEMIC WRITING: A STUDY OF
VIETNAMESE LEARNERS OF ENGLISH" (2016). Theses. Paper 2008.
MACRO AND MICRO SKILLS IN SECOND LANGUAGE ACADEMIC WRITING: A
STUDY OF VIETNAMESE LEARNERS OF ENGLISH
by
Ha Thi Thanh Nguyen
B.A., College of Foreign Languages, 2009
A Thesis
Submitted in Partial Fulfillment of the Requirements for the
Master of Arts Degree in TESOL
Department of Linguistics
in the Graduate School
Southern Illinois University Carbondale
August 2016
Copyright © by Ha Thi Thanh Nguyen, 2016
All Rights Reserved
THESIS APPROVAL
MACRO AND MICRO SKILLS IN SECOND LANGUAGE ACADEMIC WRITING: A
STUDY OF VIETNAMESE LEARNERS OF ENGLISH
by
Ha Thi Thanh Nguyen
A Thesis Submitted in Partial
Fulfillment of the Requirements
for the Degree of
Master of Arts
In the Field of TESOL
Approved by:
Dr. Krassimira Charkova, Chair
Dr. Katherine I. Martin
Dr. Laura Halliday
Graduate School
Southern Illinois University Carbondale
June 23, 2016
AN ABSTRACT OF THE THESIS OF
HA THI THANH NGUYEN, for the Master of Arts degree in Teaching English to Speakers of
Other Languages, June 23, 2016, Southern Illinois University at Carbondale, IL, USA.
TITLE: MACRO AND MICRO SKILLS IN SECOND LANGUAGE ACADEMIC WRITING:
A STUDY OF VIETNAMESE LEARNERS OF ENGLISH
MAJOR PROFESSOR: Dr. Krassimira Charkova
The ability to write in a second language is one of the major skills required in academic
settings. However, research about the effectiveness of academic programs on second language writing from a long-term perspective is rather scarce, and the findings are mixed (e.g., Archibald, 2001; Elder & O’Loughlin, 2003; Hu, 2007; Knoch et al., 2014, 2015; Storch, 2007).
The present study aimed to contribute further empirical evidence about the effectiveness
of academic training on the development of the writing skills of Vietnamese second language
learners enrolled in an undergraduate English program. The investigation was designed in view
of the second language (L2) writing standards set by the Common European Framework of
Reference (CEFR) and in reference to the specifications of the Vietnamese English language
educational system.
The sample involved a total of 90 participants, 30 from each of the following CEFR
English language proficiency levels: B1, B2, and C1. The instrument was modeled after the
IELTS Academic Module Writing Task 2 which requires test-takers to write a minimum 250-
word essay on a given prompt. The participants’ essays were scored by two independent raters
following the IELTS Writing Task 2 Band Descriptors.
The data were analyzed through five one-way ANOVAs, which compared the three levels of proficiency, B1, B2, and C1, on their overall writing scores and on each of the two macro skills (Task Response, and Coherence and Cohesion) and two micro skills (Lexical Resource, and Grammatical Range and Accuracy).
The results revealed two main trends. First, the writing skills of Vietnamese L2 learners
of English showed a significant improvement in the course of their study in all proficiency
levels. Second, the development was of a larger magnitude between levels B1 and B2 and of a
smaller magnitude between levels B2 and C1. The latter trend appears more meaningful when
juxtaposed with the expected IELTS writing band score ranges for each of the three CEFR levels
investigated in the present study. Specifically, the obtained scores matched the CEFR standards
at level B2, but were above the expected minimum score for level B1 and below the minimum
expected score for level C1.
These findings carry valuable implications for the specific Vietnamese educational
context, highlighting both the strengths and weaknesses of the English language writing
curriculum. They pinpoint issues related to the placement of students in CEFR levels without
specific empirical data as well as raise questions about the time, effort, and teaching practices
necessary to secure learners’ progress from lower to higher proficiency, particularly after level
B1.
Another contribution of the study is that it examined developments in L2 academic
writing on both the macro and micro levels, and thus offers a more comprehensive picture of
the different components of the writing skill and their development through a course of study. In
contrast, existing research has either looked at the writing skill in a holistic way or focused on
one or some of its elements, but has rarely approached writing as a balanced composite of macro
and micro skills.
Key words: L2 writing skills, macro-skills, micro-skills, developmental trends, CEFR, IELTS
DEDICATION
This thesis is dedicated to my wonderful parents, Nguyen Ngoc Le and Nguyen Thi
Chau, who have given me this precious life and have shaped me into who I am; my foster parents, Gary and Joyce Mosimann, and their amazing family, for making the United States my second home; and my mentor, Gus Vrolyk, for generously encouraging me to always dream
bigger.
You are present in every success I have made.
Because of that, I just want to say: “Thank you! I love you!”
ACKNOWLEDGMENTS
I would love to thank wholeheartedly all the individuals who have greatly supported me
during my Master’s program and the thesis writing. I owe my deep gratitude to Dr. Charkova,
my beloved professor and my Thesis Chair, for her enthusiasm, patience, encouragement,
inspiration, responsibility, professionalism, and immense expertise; Dr. Halliday and Dr. Martin
whose valuable suggestions and encouragement are highly appreciated; Ms. Diane Korando, the secretary of the Linguistics Department, whose great assistance in many forms has helped smooth my dual-major graduate study; my teachers and colleagues in Vietnam who enthusiastically encouraged their students to participate in the study; the participating Vietnamese undergraduate students for their priceless time and responses; the students close to me who assisted me with various aspects of data collection; the Fulbright Program and Alpha Delta Kappa (International Organization of Women Educators) for sponsoring my Master’s program at SIUC; my dear parents and my foster parents for their loving support and encouragement; and Gus Vrolyk, my friend and mentor, for his belief in my future.
Thank You! For everything you have done for me!
TABLE OF CONTENTS
CHAPTER PAGE
ABSTRACT .................................................................................................................................... i
DEDICATION .............................................................................................................................. iv
ACKNOWLEDGMENTS ..............................................................................................................v
LIST OF TABLES ........................................................................................................................ vii
LIST OF FIGURES ..................................................................................................................... viii
CHAPTERS
CHAPTER I – Theoretical Background ...........................................................................1
CHAPTER II – Empirical Background .............................................................................15
CHAPTER III – Methodology ..........................................................................................25
CHAPTER IV – Results ...................................................................................................33
CHAPTER V – Discussion and Conclusions .................................................................. 47
REFERENCES .............................................................................................................................66
APPENDICES
Appendix A – Consent Form .............................................................................................74
Appendix B – Research Instrument ...................................................................................76
Appendix C – IELTS Writing Task 2 Descriptors.............................................................79
VITA ............................................................................................................................................80
LIST OF TABLES
TABLE PAGE
Table 1 ...........................................................................................................................................11
Table 2 ...........................................................................................................................................28
Table 3 ...........................................................................................................................................31
Table 4 ...........................................................................................................................................34
Table 5 ...........................................................................................................................................37
Table 6 ...........................................................................................................................................39
Table 7 ...........................................................................................................................................41
Table 8 ...........................................................................................................................................43
Table 9 ...........................................................................................................................................59
LIST OF FIGURES
FIGURE PAGE
Figure 1 ..........................................................................................................................................02
Figure 2 ..........................................................................................................................................34
Figure 3 ..........................................................................................................................................36
Figure 4 ..........................................................................................................................................37
Figure 5 ..........................................................................................................................................38
Figure 6 ..........................................................................................................................................39
Figure 7 ..........................................................................................................................................41
Figure 8 ..........................................................................................................................................42
Figure 9 ..........................................................................................................................................43
Figure 10 ........................................................................................................................................44
Figure 11 ........................................................................................................................................45
Figure 12 ........................................................................................................................................51
Figure 13 ........................................................................................................................................56
Figure 14 ........................................................................................................................................56
Figure 15 ........................................................................................................................................58
Figure 16 ........................................................................................................................................58
CHAPTER I
THEORETICAL BACKGROUND
The Importance of English for Academic Purposes
In the 21st century, English has unquestionably acquired the status of a Global Language and the world’s biggest Lingua Franca (Crystal, 2003; Goodal & Roberts, 2003; Jenkins, 2009;
Seidlhofer, 2004, 2011; Sung, 2014). Thus, it has established itself as the primary language of
academic studies and research, of political and cultural communication, business, and travel. The
English language has become a prerequisite and a top priority for employers, especially international companies, in their hiring preferences (Tini, 1998). Aware of the growing importance of the English language and the necessity of proficient English skills for securing a well-paid job, more and more people invest time, money, and effort in English language education.
The importance of English skills in the market economy has led to a dramatic increase in
the number of English language students. For example, the number of Vietnamese test-takers of
the two most popular standardized exams, the International English Language Testing System
(IELTS) and the Test of English as a Foreign Language (TOEFL), has increased remarkably in
recent years. Reports from ETS, the organization that administers the TOEFL test, show an increase in the number of Vietnamese TOEFL test takers from 1,699 in 2001-2002 to 1,963 in 2002-2003, and a sharp rise to 5,194 in 2005-2006 (Educational Testing Service, 2001, 2002, 2006).
The growing importance of the English language can also be seen in the total number of
Vietnamese students studying abroad at universities where English is the main instructional
language. The “Open Doors Fact Sheet Country Report 2014” by the Institute of International
Education (IIE) shows that the number of Vietnamese students studying in English speaking
countries, such as Australia and the US, has been on the rise. Figure 1 contains some of these
statistics as published on the IIE website (2015, September 10). This trend provides support for
Hyland’s (2013, p. 54) observation that being proficient in English is nowadays “less a language
than a basic academic requirement for many users around the world.”
Figure 1. Data from the IIE website
The Controversy of Teaching ESL Writing Skills in Academic Contexts
The Dynamism of Language and Language Development
Among the four language skills, reading and writing are the most important for academic
performance (Saville-Troike, 1984). In order to become effective L2 readers and writers,
students go through different developmental stages. Some researchers are in favor of the view
that L2 learners do not consistently make progress through a series of stages (Larsen-Freeman,
2006) and fluctuation is an inseparable part of a dynamic system (Thelen & Smith, 1996; Van
Geert & Van Dijk, 2002). Researchers have attempted to capture the dynamism of language and
language development by the use of different terms, such as “motors of change” (e.g. Thelen &
Smith, 1996), “the developmental ladder metaphor” (Fischer, Yan, & Stewart, 2003), and “a
make-do solution” (Larsen-Freeman, 2006). Larsen-Freeman claims that “its [learner language]
development is not discrete and stage-like but more like the waxing and waning of patterns; that,
from a target-language perspective, certain aspects of the behavior are progressive, others,
regressive; that change can be gradual and it can be sudden” (p. 1).
Similarly, Fischer et al. (2003) claim that “[language] development is seen as a complex
process of dynamic construction within multiple ranges in multiple directions” (p. 492).
Learners’ progress can be made at different levels, at different times (Marchman, Thal,
Tomasello, & Slobin, 2005). “The sudden discontinuity of the phase shifts illustrates the non-
linearity of complex systems” and these fluctuations and variations should not be considered as
data outliers (Larsen-Freeman, 2006, p. 3-4). Therefore, Bley-Vroman (1983) refutes the position
that language learning is target-centric and he argues that learners’ language does not necessarily
improve through stages in a linear model, each of which gets closer to the target (L2). In several
texts (e.g. Humphreys, Haugh, Fenton-Smith, Lobo, Michael, & Walkinshaw, 2012; Shaw &
Liu, 1998; Storch, 2007, 2009), it has been noted that although English as a second language (ESL) learners invest substantial time and money, their writing skills do not show considerable changes over time.
The theory of language dynamism has empirical implications for teachers and policy-
makers as individual learner characteristics should be taken into consideration in order to
facilitate the learning progress. According to Selinker (1972, p. 213), “a theory of second
language learning that does not provide a central place for individual differences among learners
cannot be considered acceptable.” However, the implementation of this idea into practice is not
that easy because most English language programs follow rigid curricula within fixed timelines,
which leave little space for flexibility and changes that would accommodate individual learning
styles. Consequently, the issue of whether English training programs always lead to positive
developments in the students’ language skills is controversial.
The Roles of Micro and Macro Skills in Writing Development
Another debatable issue among L2 composition researchers concerns the role that micro
and macro skills play in becoming effective writers. Brown (2007, p. 399) identifies the
following micro and macro skills:
Micro-skills
Produce graphemes and orthographic patterns of English.
Produce writing at an efficient rate of speed to suit the purpose.
Produce an acceptable core of words and use appropriate word order patterns.
Use acceptable grammatical systems (e.g., tense, agreement, pluralization),
patterns, and rules.
Express a particular meaning in different grammatical forms.
Macro-skills
Use cohesive devices in written discourse.
Use the rhetorical forms and conventions of written discourse.
Appropriately accomplish the communicative functions of written texts according
to form and purpose.
Convey links and connections between events and communicate such relations as
main idea, supporting idea, new information, given information, generalization,
and exemplification.
Distinguish between literal and implied meanings when writing.
Correctly convey culturally specific references in the context of the written text.
Develop and use a battery of writing strategies, such as accurately assessing the
audience’s interpretation, using prewriting devices, writing with fluency in the
first drafts, using paraphrases and synonyms, soliciting peer and instructor
feedback, and using feedback for revising and editing.
The macro and micro skills of writing are also known as the sub-constructs of writing
(Brown, 2007). They constitute the emphasis of writing rubrics which are used in standardized
and classroom based assessment. A controversial issue in ESL writing pedagogy concerns the
question of which is more important in the development of the L2 writing skill, the micro or
macro skills. According to Robinson (2001) and Van Geert and Steenbeek (2005), there are
many dimensions of language (macro versus micro skills, and/ or sub-elements within each
macro and micro level) that interact and influence each other, depending on the goals and
priorities of ESL learners at a certain period of time. These dimensions are mutually supportive
in a way that an improvement in one subset may lead to an improvement in another subset.
For example, this interconnectedness can be seen clearly in the close and positive
relationship between lexical and syntactic development in second language acquisition. As L2
learners develop their vocabulary, grammatical structures are shown to become more complex
(Robinson & Mervis, 1998). There is also competition among the dimensions, subsystems, or
elements, i.e., macro versus micro skills. This competitive relationship occurs because the human
brain cannot perform more than one task at a time, i.e., it cannot truly multitask (Van Geert, 2003). For
instance, when ESL learners focus their attention on the micro-elements of their writing,
involving grammatical accuracy and lexical sophistication, they may ignore issues on the macro-
level concerning the coherence and cohesion of their writing.
The dynamic, yet complex, nature of the language system and language learning has
caused a significant debate about what should be taught in an L2 composition classroom and
how it should be taught. It is an idealized goal that the “college-level L2 writing program […]
prepares students to become better academic writers” (Spack, 1988, p. 29). As Sizer (2004) notes,
these are high standards that teachers and learners may find difficult to measure up to because of
circumstances beyond their control, such as a vast number of students in one class, mixed ability
levels, and a variety of other hurdles.
Unquestionably, there is a need for bridging the gap between L2 writing theory and
practice (Null, 2011). However, there is little empirical research that could help provide specific
guidelines for English writing teachers with regard to what specific aspects of the writing skill
should be emphasized at different levels of the developmental process. Moreover, there is little
empirical evidence about possible developmental trends in the writing ability of English
language learners and whether such trends could be associated with the type of instruction and
the level of proficiency.
Grammatical Feedback in Writing Development
Within the existing research, another issue is that the focus has been mostly on
grammatical errors and the pros and cons of giving corrective feedback. Thus, the other micro
and macro skills of writing have been left out of the picture. Among the studies that have
addressed the question of corrective feedback on students’ grammatical errors, two opposing
views have been expressed. For example, Truscott (1996) argues against the effectiveness of
second language composition correction. He states that grammar error feedback is not only
ineffective but also even harmful to ESL student writers’ development in writing skills, and
hence should be abolished. A completely different view is expressed by Ferris (1999) in her
response to Truscott (1996), in which she makes compelling points for grammar correction in L2
writing. She recommends error feedback from ESL/ EFL teachers as they help their learners to
make progress in the L2 composition classroom.
Expanding on the issue of error correction, Bitchener and Cameron (2005) investigated
whether different types of error correction could lead to better accuracy in L2 writing. The three
correction strategies that they examined included: 1) written feedback and student-researcher
conference; 2) written feedback only; and 3) no corrective feedback. The study focused on three
common types of grammatical errors concerning the employment of the simple past tense, the
definite article, and prepositions. The study affirmed that the combination of written feedback
and conference was more helpful than written feedback alone. It also found that combined
feedback made a significant impact on the accuracy improvement in “treatable” errors (the past
simple tense and definite article), whereas no considerable accuracy progress was made for the
“untreatable” errors (prepositions). These results suggest that not all types of errors are the
same. It appears that the ones that can benefit the most from corrective feedback are those that
are rule-based and can be explained in an analytical way; those that do not seem to benefit from
direct feedback are related to language aspects that need to be memorized, such as prepositions.
As mentioned earlier, most of the existing research has focused on the role of grammar
correction, however, Ferris and Hedgcock (2013, p. 310) aptly observe that “successful writing,
by definition, includes and requires the effective deployment of a range of linguistic and extra-
linguistic features, including vocabulary, syntax, punctuation, capitalization, paragraphing, and
spacing.” The researchers also mention rhetorical grammar, genre awareness, and lexical
variation as additional requirements for a proficient ESL writer. Ultimately, writing gives writers
opportunities to show their ideas and engage the reader in the text (Kreidler, 1971). Therefore, in
order to become proficient writers, second language students need to know and be able to appropriately utilize different aspects of the writing skill, which are usually categorized as
micro and macro skills of writing (Brown, 2007).
Context of the Study
The English language has become crucial in the modernization and industrialization of
Vietnam in recent years. Understanding its role in bridging the socio-economic and educational
gaps between Vietnam and other countries, the Vietnam Ministry of Education and Training has
been adopting various international standardized proficiency language tests as standards for the
English language education. These tests serve as entry and exit requirements for English
language learners at all educational levels, as models for curriculum design and as standards for
the teaching practice and assessment.
After many years of using IELTS as the standard for curriculum design and language
assessment, under Decision 1400/QD-TTG in 2008, the Vietnam Ministry of Education and
Training (MOET) officially adopted the Common European Framework of Reference (CEFR) as
the national framework of reference for English education in Vietnam. This has substantially
changed the national entry and exit requirements for English proficiency. It has also led to major
changes in the curriculum and the criteria for linking standardized international English language
tests such as TOEFL and IELTS to current various English programs in Vietnam. The University
of Foreign Languages in Danang, Vietnam, where the study took place, officially adopted CEFR
as the framework of curriculum development in 2014.
What is CEFR?
CEFR was first published in 2001 by the Council of Europe after more than twenty years’
discussion and research dating back to the 1970s. CEFR was designed to be a “transparent,
coherent, and comprehensive” (Cambridge, 2011, p. 8) framework of reference for language
learning and teaching across languages. It provides criteria for language curriculum elaboration,
materials design, and foreign language proficiency assessment (Cambridge, 2011).
Language learners’ skills, including listening, speaking, reading and writing, are
categorized into six reference levels in CEFR, from levels A1 to C2. A1 is the lowest level and
C2 the highest in the framework. Each level has a list of the expected competences and skills that
learners at each level should have and be able to demonstrate in performance. More specifically,
A1 and A2 represent the lowest proficiency levels under the general name Basic User. At these
very first levels, language learners just start their foreign language learning, and therefore, they
can only “understand and use familiar everyday expressions” and phrases that are “related to
areas of most immediate relevance”, e.g., family and routine tasks (Cambridge, 2011).
The next two levels, B1 and B2, are called Independent User. At these two levels,
learners feel freer to express their opinions about a regular topic. Specifically, B1 learners can
comprehend clear standard text that appears regularly in their immediate environment, such as at
work and school. They are able to use language to survive in new places where the target
language is spoken. These learners are able to produce simple texts and they can connect ideas
on familiar topics or topics of their interest. They are further able to describe their own
experiences, dreams, and other aspects and can “briefly give reasons and explanations or
opinions and plans” (Cambridge, 2011, p. 8). At level B2, learners should be able to handle
topics on more abstract issues, and understand complex texts on both concrete and abstract
topics, especially related to their field of specialization. They are able to interact fluently and
spontaneously with native speakers, with no difficulties in regular communication. They can
write comprehensive texts on a variety of topics and clearly express their perspectives on specific
topics by presenting the pros and cons of each option.
The highest levels of proficiency, C1 and C2, are called the Proficient User. At this point,
learners master the use of the language and get closer to native-like skills. C1 level learners can
understand longer complicated texts and their implicit meaning. They are able to use language
without difficulties in expressing themselves fluently and effectively in social, academic, and
professional settings. They have a significant ability to develop clear, well-organized, and
cohesive texts on complex topics. Being the most fluent in language usage, C2 level learners are
able to comprehend everything they hear or read. They can easily summarize and organize their
arguments for a cohesive presentation. C2 learners are able to differentiate connotations of words
and express themselves fluently and spontaneously in an unrehearsed situation. Apart from the
above described six levels, CEFR also defines three plus levels (A2+, B1+, and B2+) in order to
supplement gaps in the scales.
Matching CEFR Levels with Standardized Language Proficiency Scores
For the sake of generalizability, CEFR is used to compare proficiency levels among
language learners and to offer a “means to map the progress” of learners (de Europa & de
Cooperación Cultural, 2002). CEFR was conceived as a framework of reference that would be
used to compare language tests across national boundaries, providing a foundation for language
qualifications recognition and, hence, facilitating educational and professional mobility.
At present, specific CEFR levels have been linked to particular ranges of scores on
various international standardized proficiency tests such as the Cambridge English Test, IELTS,
and TOEFL as shown in Table 1. The IELTS and TOEFL tests, for example, are set side by side
on the scales of CEFR. While IELTS band scores are linked to levels B1 through C2, TOEFL
test scores are linked from levels B1 through C1, as the highest TOEFL score range of 110 to
120 corresponds to level C1. Another aspect that is worth noticing is that no A levels in the
CEFR correspond to these two international tests.
Table 1
Alignment of CEFR levels with the major English language proficiency tests
CEFR level    Cambridge English exam     IELTS      TOEFL iBT
A2            Key (KET)                  -          -
B1            Preliminary (PET)          4-4.5      57-86
B2            First (FCE)                5-6.5      87-109
C1            Advanced (CAE)             7-8        110-120
C2            Proficiency (CPE)          8.5-9      -
Since this study examined the alignment of the actual obtained writing scores of 1st, 2nd, and 3rd year students with their expected CEFR levels, an emphasis was given to the IELTS
score ranges which correspond to levels B1 (IELTS 4 to 4.5), B2 (IELTS 5 – 6.5), and C1
(IELTS 7-8). It should be noted that the B1 range is very narrow, allowing only half a band
difference between the lowest and highest cut-off level scores, whereas level B2 allows a 1.5
band difference between the lowest and highest scores within the level. Finally, level C1 includes
a difference of 1 band between the lowest and highest scores that are identified with the level.
These observations are important and will be used in the discussion of the results of this study.
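To make this alignment concrete, the sketch below shows, purely as an illustration and not as part of the study’s procedures, how an obtained mean writing band score for a given CEFR level could be checked against the IELTS ranges just described. The range values reproduce Table 1; the score used in the example call is hypothetical.

```python
# Illustrative sketch only (not part of the thesis procedures): checking an obtained
# mean IELTS writing band score against the CEFR-IELTS ranges summarized in Table 1.

EXPECTED_IELTS_RANGE = {
    "B1": (4.0, 4.5),  # half-band span
    "B2": (5.0, 6.5),  # 1.5-band span
    "C1": (7.0, 8.0),  # 1-band span
}

def compare_to_cefr(level: str, obtained_mean: float) -> str:
    """Report whether a group's mean band score falls below, within, or above
    the IELTS range expected for its assigned CEFR level."""
    low, high = EXPECTED_IELTS_RANGE[level]
    if obtained_mean < low:
        return "below the expected range"
    if obtained_mean > high:
        return "above the expected range"
    return "within the expected range"

# Hypothetical example: a B1 group mean of 5.0 would fall above the expected range.
print(compare_to_cefr("B1", 5.0))
```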
Purpose of the Study
As discussed above, a relatively small number of studies have been carried out to
investigate L2 writing skills as a complex system in which the development of each element can
affect the others. Empirical research about the effect of English writing courses on the
development of writing skills is still not sufficient and the findings tend to be mixed (DeKeyser,
2007; Humphreys et al., 2012; Storch, 2007, 2009). DeKeyser (2007) observes that there is
insufficient research on the effects of studying in an English-language-medium university on
ESL learners’ reading, listening, and writing skills.
Most importantly, research about the English writing skills of Vietnamese learners of
English is non-existent (at least to the best of this researcher’s knowledge). As English gains
greater popularity in Vietnam, there is an urgent need for cooperation among teachers,
researchers, and educators. It is difficult for English as a Foreign Language (EFL) teachers who
teach Vietnamese-speaking ESL learners to come up with appropriate teaching methods and
assessment strategies when there is still insufficient data and research about Vietnamese ESL
learners’ writing skills.
Overall, this study hopes to offer some insight into the effect of formal instruction
within an academic program of study on the development of L2 writing skills. Existing studies
have investigated mechanical strategies for writing used by ESL learners in various types of
programs and at various levels of proficiency, but have focused primarily on writing by learners who were not English majors (e.g., Knoch et al., 2014, 2015; Serrano et al., 2012; Storch, 2009). This study focuses on English-major students in an undergraduate program in Vietnam at three
proficiency levels as described in CEFR, specifically B1, B2, and C1, and thus provides cross-
sectional data from lower and higher proficiency levels. It also employs a comprehensive
operationalization of the writing skill as it looks at both the macro- and micro skills that partake
in the shaping of L2 learners’ writing ability.
Another question of interest is related to the fact that even though the English language
education in Vietnam follows the CEFR benchmarks and is aligned with the CEFR proficiency
levels, this alignment is rather problematic because it is not based on empirical evidence. For
example, the first year students at the University of Foreign Languages in Danang are placed in
level B1, the second year students are placed in level B2, and the third year students are placed in
level C1. This placement is made on the assumption that with each year in the program, students’
proficiency level improves and that the improvement is valid for all students in the same year of
study. However, the assignment of CEFR levels is not based on specific assessment data that
shows that the students have actually progressed from level B1 to B2 to C1.
Considering these issues in the Vietnamese English language system, the present study
has set out to address the following topics:
1) To provide empirical evidence that shows whether, and to what extent, Vietnamese L2
learners of English develop their L2 writing skills on the macro and micro levels as a result of
their academic English curriculum.
2) To juxtapose the obtained performance data from three CEFR proficiency levels, B1, B2, and
C1, with the expected outcomes as specified in CEFR, and thus determine whether the
Vietnamese alignment of CEFR levels with years in the program is valid.
Both topics will provide theoretical and practical implications for curriculum developers,
teachers, theorists, and policy-makers in the field of English language teaching in Vietnam.
Chapter I Summary
This chapter has provided an overview of the theoretical background of the present study.
It attempted to highlight the following interrelated issues:
1) The L2 writing skill is a composite of macro and micro skills.
2) There is a need for specific empirical research into the effect of academic writing classes and
programs on the development of both the macro- and micro skills of L2 writing.
3) The teaching of academic writing in the Vietnamese English language context follows the
CEFR benchmarks and is thus aligned with the CEFR proficiency levels; however, this
alignment is rather questionable because it is not based on empirical evidence.
The present study aims to bring together the three issues outlined above by examining
possible developmental trends in the writing skills of Vietnamese learners of English on the
macro and micro levels, and also to establish whether these developmental trends correspond to
the expected levels of performance at levels B1, B2, and C1 as described in CEFR.
The next chapter provides a more detailed review of the empirical body of research that is
related to the present study.
CHAPTER II
EMPIRICAL BACKGROUND
Currently, there is a relatively small but growing body of research about the development
of academic L2 writing skills. The existing studies range in terms of the types of courses
participants are enrolled in, the length and intensity of the programs, and the age and proficiency
levels of the ESL learners. Some studies (e.g., Brown, 1998; Craven, 2012; Green, 2005) focus
on learners’ preparation for English proficiency tests, such as IELTS, whereas others examine
advanced ESL learners in immersion environments (e.g., Astin, 1993; Elder & O’Loughlin,
2003; Hill & Storch, 2008; Knoch, Rouhshad, Oon, & Storch, 2015; Knoch, Rouhshad, &
Storch, 2014; Larsen-Freeman, 2006; Serrano, Tragants, & Llanes, 2012). Overall, the focus of
the existing research is on intermediate ESL learners in English-medium institutions.
The review of literature is divided into three sections. The first two sections introduce
and differentiate existing studies in L2 writing development by the length of the examined
courses. The first section focuses on short duration courses, whereas the second section focuses
on longer duration courses. The final section examines and reviews influences of English
language training on the improvement of ESL writing skills.
The Effect of Short Duration Instruction on ESL Writing Skills
Research on the effect of short duration training has focused on the L2 composition skills
of students at universities where English is used as the language of instruction (e.g., Brown,
1998; Elder & O’Loughlin, 2003; Green, 2005; Green & Weir, 2003; Humphreys et al., 2012;
Larsen-Freeman, 2006; Storch & Tapper, 2009). These studies have typically employed different
types of measures and a test-retest design to examine a variety of micro and macro skills of
writing in English-medium courses. The empirical findings are mixed, depending on how the
data were collected and measured. The two typical types of measures are writing band scores and
discourse analysis. Studies using writing band scores assess student writing by assigning a
holistic score and/ or analytic scores for each macro and micro skill (e.g., Archibald, 2001;
Green, 2005; Sasaki, 2007, 2009). Discourse analysis, in contrast, does not assign a certain score
to an essay. Instead, it examines the development of L2 writing through various discourse
analytic measures, e.g., word count and the ratio of words to T-units (e.g., Larsen-Freeman,
2006) and the ratio of error-free T-units (EFT/T) (e.g., Larsen-Freeman, 2006; Tsang & Wong,
2000). The length of the investigated courses varies between approximately three and six months
(e.g., Brown, 1998; Green, 2005; Hu, 2007).
Released in 2002 by the IELTS testing agency, the IELTS guidelines indicated that learners could improve their IELTS score by one band by taking up to three months of an intensive English course (IELTS, 2002). In order to test this claim, Green (2005) conducted a large-scale quantitative research project in cooperation with IELTS partners. The
research involved two linked studies that are called Phase 1 and Phase 2. In Phase 1, Green
(2005) surveyed all the data on those who took the IELTS test more than once between January
1998 and June 2001. In Phase 2, 476 learners from 15 institutions were asked to take the official
IELTS test at the beginning and end of their intensive English courses. These courses were either
IELTS Preparation or EAP, or a combination of both. A large portion of the learners taking part
in Phase 2 were from China or Taiwan (52.5%), whereas the rest were from other areas of East
Asia (13.9%) and Europe (15.8%) (p. 51).
In Phase 1, the research used a total of 15,380 collected records to examine the score
change as a whole. The interval between the two IELTS tests was at least 12 weeks, i.e., the test-
takers in this phase were not allowed to retake the exam until at least three months after their first attempt. The results from both Phase 1 and Phase 2 showed many similarities in L2 writing
development (Green, 2005). Interestingly, the findings did not support the recommendation in
the British Association of Lecturers in English for Academic Purposes (BALEAP) Guidelines on
English Language Proficiency Levels for International Applicants to UK Universities. The
Guidelines affirmed that three-month intensive courses should be able to help ESL students improve their IELTS score by 1.5 bands to meet university entrance language requirements. Green (2005) found that those whose writing skills were at a lower band score (5.0 or lower) were more likely to improve their scores on a subsequent IELTS test. In contrast, the band scores of those who scored 7.0 or above on the first test were shown to decrease; for those with a band score of 6.0, the composition score tended to remain the same. The
period of time between the two successive tests did not affect the test scores (Green, 2005).
Green observes that progress was only found for lower IELTS composition band scores, and the
length of the English courses did not account for any changes in ESL learner writing.
Similar findings were also reported by Brown (1998), who carried out an experimental
study on a smaller scale in Melbourne, comparing the effectiveness of two English courses on
developing ESL learners’ writing skills. One course was an IELTS preparation course and the
other was an English for Academic Purposes (EAP) course without an IELTS focus. Both courses
were only 10 weeks long. The participants were administered an entry and exit exam. Between
the pretest and the posttest, the students were also asked to take an interview regarding their
motivation and their perspectives on their development. Classroom observation was also conducted by the researcher throughout the ten-week period. The learners were encouraged to do a
number of writing assignments and these assignments were corrected by the teachers of each
course.
There were differences in motivation between the ELLs in the IELTS Preparation class and those in the EAP class. While the IELTS students were reportedly motivated to complete all the assigned essays (some even did extra practice), those in the EAP class were reluctant to do the homework (Brown, 1998). The pretest-posttest comparisons showed that the students in the IELTS
Preparation class had an improvement of 0.9 after the 10-week course (final score between 4.3
and 5.2), whereas the EAP students lost 0.3 of a band in the posttest (final score between 5.3 and
5.0) (Brown, 1998). The data suggest that a course in IELTS preparation may contribute to the
success of writing improvement (Brown, 1998, p. 36). However, because the study was
conducted on a relatively small scale and the dropout rate was nearly 50%, the findings were
hardly conclusive.
Another study by Larsen-Freeman (2006) examined the effectiveness of English
instruction on the fluency, complexity, and accuracy of writing skills. Larsen-Freeman
investigated five Chinese-speaking high-intermediate English learners attending a ten-month
English language course at a university in the United States. Apart from the academic hours the
learners spent in the class, they were also instructed to fulfill varied learning activities outside
their classroom, focusing on grammar, pronunciation, reading, listening, and unfamiliar
vocabulary. Using a repeated-task and time-series design, the researcher asked the five
participants to do four identical tasks during the course. They had to write and tell the same
stories every six weeks - three days apart for both renditions - about an experience in the past
that they remembered the most, without worrying about making errors. Feedback was not
given to the participants.
The quantitative results suggest an overall improvement in all aspects of L2 writing skill
under consideration. Learners’ writing became more fluent and accurate as the lexical and
syntactic complexity increased over the ten-month period. However, there was a great deal of
inter-individual and intra-individual variation over the research time, especially for accuracy.
These findings are consistent with the theory of dynamism of language and language learning
and with the view that each learner carries unique characteristics.
Taken together, the findings of the above studies provide mixed evidence about the
effectiveness of short duration training on the development of ESL writing skills. On the one
hand, there is a positive trend especially regarding the improvement observed among lower level
ESL writers. On the other hand, no improvement or even a slight decline is reported regarding
higher level ESL writers. Also, a positive effect is observed when various tasks, both in class and
outside of class, are fulfilled by learners. Because these findings were observed in relation to
short duration training courses, comparing them to the effect of longer duration training is also
important.
The Effect of Longer Duration Instruction on ESL Writing Skills
This section reviews studies that have examined the effect of longer duration training, of
one year and longer, on the development of ESL writing skills (e.g. Craven, 2012; Elder & O’
Loughlin, 2003; Knoch, Rouhshad, Oon & Storch, 2015; Knoch, Rouhshad, & Storch, 2014;
O’Loughlin & Arkoudis, 2009; Serrano, Tragants, & Llanes, 2012).
Craven (2012) conducted a study using test-retest design at a large university in Sydney
to investigate whether undergraduate students are able to increase their overall IELTS band score
from an average of 6.0 to at least 7.0 after one year of study abroad. The participants included 40
ESL students from three different majors, who took the IELTS exam at the start of the program
and at the end. The program duration varied between 18 and 36 months. The findings showed
that only a minority of the participants improved their band score from 6 to 7. More importantly,
the improvement was found in the listening and reading skills, whereas production skills like
writing and speaking underwent little or no change over the same period of time. In fact, the
writing skills were the least improved with an overall score increment of 0.11.
On the first IELTS Academic Module writing task, a higher improvement was observed in the micro-skills, Lexical Resource followed by Grammatical Range and Accuracy. A smaller
improvement was observed in relation to the macro-skill of Coherence and Cohesion. However,
in the retest, the macro skill of Coherence and Cohesion showed the largest improvement,
followed by Lexical Resource and Grammatical Range and Accuracy. Overall, none of these
differences were statistically significant. Interestingly, those who had the lowest IELTS score at the outset tended to make the greatest improvement (Craven, 2012), which is similar to
the findings in Elder and O’ Loughlin’s (2003) study discussed previously.
Serrano, Tragant, and Llanes (2012) examined the effect of one-year immersion writing
programs on the writing skills of 14 Spanish ESL learners with respect to fluency, accuracy, and
syntactic and lexical complexity. The results showed no effect on accuracy, but a significant
effect on syntactic complexity. However, another study by Knoch, Rouhshad, and Storch (2014),
conducted at the University of Melbourne, reported few improvements in the writing skills of
101 international students after a year-long English writing course. The only aspect of writing skill that showed a slight improvement was fluency, whereas accuracy, vocabulary, and grammatical complexity did not improve over the same period.
In conclusion, only a few studies have been conducted on the effect of longer duration
instruction on L2 writing skills. The findings from these studies are mixed and inconclusive.
Further investigations are therefore necessary in order to validate the existing results and
establish whether there are any developmental trends.
Influences on ESL Writing Skill
A review of ESL writing research shows that quite a number of studies have attempted to
examine the influence of different factors, such as immersion in an English-speaking environment, course design, teaching pedagogy, and course materials, on ESL writing skill (Astin, 1993;
Craven, 2012; Elder & O’Loughlin, 2003; Hill & Storch, 2008; Knoch, Rouhshad, Oon, &
Storch, 2015; Knoch, Rouhshad, & Storch, 2014; Sasaki, 2007; Serrano, Tragant, & Llanes,
2012). The factors were identified by using questionnaires and/or semi-structured interviews
with some of the participants and/or teachers. In spite of different perspectives on the predictors
for learners’ chance of success, most of the studies found similarities in the importance of
feedback in L2 composition development. Participants in Knoch et al.’s (2014, p. 211) study
complained that “they did not receive any feedback on their writing from their content lecturers”.
This agrees completely with another finding by Knoch et al. (2015), in which 90.32% of the
participants reported that the instructors did not provide input regarding writing skill and they
considered the lack of error correction as the main reason for their failure to improve their L2
writing skills. Hill and Storch (2008) also observed that lack of opportunities to produce
extended writing and to receive feedback on writing during the course of study could be possible
reasons for the lack of improvement in L2 writing skills despite immersion in an L2 university
environment. Likewise, Astin (1993) measured self-reported gains in writing and other cognitive
skills across the undergraduate years in college. Aside from GPA and hours spent studying, the
two strongest partial correlations with writing skill improvement were the number of writing-
skills classes taken and the amount of feedback given by instructors.
English-speaking environment immersion is another important deciding factor in L2
writing development. Sasaki (2007) showed that study abroad has a significant effect on the
improvement of L2 writing skills when combined with L2 writing instruction. Furthermore,
Serrano et al.’s (2012) study shows that students who had a richer pool of vocabulary believed
the reason to be their frequent exposure to an English-speaking environment in their daily life. A
similar finding was reported in Hill and Storch (2008), which found a relationship between an
active involvement with native speakers and improvements in the writing skills of international
students enrolled at an Australian university. Over half of the participants (52.8%) in Knoch et
al.’s (2015) study shared the same perspectives as they reported that through reading books,
articles, magazines, and listening to music they improved their writing skills over time.
Course design, teaching pedagogy, and course materials also play an important role in the
development of ESL writing skills. Hu (2007) discusses the development of EAP writing courses
for Chinese students. The author emphasizes the importance of teaching objectives, teaching
methodology, resources and materials, and feedback for the learning benefits and academic
progress of the students. Further, Hu (2007) argues that developing a competency level in L2
writing is a challenging task because learners are faced with multiple hurdles at the micro and
macro level. At the micro level, they need to develop their overall language proficiency as they
expand their vocabulary knowledge and the sophistication of their grammar. At the macro-level
they need to master the skill of writing in terms of organization, coherence, cohesion, and
rhetorical styles. In addition, it is necessary to recognize that learners’ L1 writing skills and
experiences affect their L2 writing skills, often in a negative way, especially for lower-level
learners. This means that at lower levels, for example, learners tend to apply their L1
grammatical structures in their L2.
Finally, Craven (2012) observes that in order to develop good writing skills L2 learners
need time and practice. Sharing the same viewpoint, Kellogg and Raulerson (2007, pp. 240-241)
state that as the writing process is a complex process of cognitive thinking, “an expertise in the
writing of extended texts … takes many years of deliberate practice. Such practice helps writers
to gain cognitive control over text production by reducing the individual working memory
demands of planning ideas, text generation, and reviewing ideas and texts.” This means that
short duration training may not lead to a significant improvement in ESL writing skills simply
because learners need more time to internalize the process.
Chapter II Summary
The review of literature shows that L2 writing research has examined factors that may
influence the development of ESL writing skills in what appears a rather isolated way. Therefore,
there is a need for more systematic research that can contribute to a better understanding of the
factors that contribute to the development of the macro and micro skills of ESL writers. In fact,
of the reviewed studies only two, Brown (1998) and Green (2005), have attempted to look at the
effect of writing instruction specifically on the development of the macro and micro skills of
writing. All the other studies have either approached writing in a holistic way or have relied on
participants’ self-reported perceptions of the factors that have influenced their writing skills.
Also, research about quantitative gains in L2 writing skills as a result of training and
instruction is rather limited and inconclusive. Some studies have reported significant
improvements (e.g. Craven, 2012; Elder & O’ Loughlin, 2003), but usually with lower
proficiency learners, whereas others (e.g., Knoch, Rouhshad, & Storch, 2014) have not observed
improvements.
Considering the gaps in the body of related research and the mixed findings of the studies
reviewed in this chapter, the current study adopted a more comprehensive approach to examining
possible developmental trends in the writing skills of Vietnamese college students enrolled in an
academic English program. It looks both at the macro and micro skills of three proficiency levels
of ESL learners and offers a cross-sectional comparison of quantitative gains against the CEFR-IELTS expected band scores. The methodology of the study is explained in the next chapter.
CHAPTER III
METHODOLOGY
The present study used a cross-sectional research design in a quantitative framework.
This chapter outlines the different components of the research methodology of the study. It
includes the following sections: Research Problem, Research Questions, Participants, Research
Tools, and Methods of Data Analyses.
Research Problem
The main purpose of this research was to investigate possible developmental trends in the
writing skills of Vietnamese ESL learners and to juxtapose them with the expected levels of
proficiency specified by the CEFR. Specifically, the study aimed to examine whether ESL
learners’ writing skills improved in the course of their undergraduate program, and if so, whether
the improvement could be observed across all levels of proficiency both on the macro and micro
level. Another goal of the study was to establish whether the obtained writing band scores for
each proficiency level correspond to the expected scores in the CEFR-IELTS paradigm. The
ultimate purpose of the study was to provide empirical data that would inform the pedagogical
methods of teaching English composition at university levels in Vietnam.
Research Questions
In view of the research purpose, the study addressed the following research questions:
1) Does the writing skill of English language learners in a Vietnamese undergraduate
program improve over the course of their studies?
2) If improvement is observed, is it observed both on the macro level (Task Response;
Coherence and Cohesion) and on the micro level (Lexical Resource; Grammatical
Range and Accuracy)?
3) How do the obtained scores for each proficiency level correspond to the CEFR-
IELTS expected score ranges?
Variables
Independent Variable
The independent variable in this study was participants’ level at the university.
Specifically, it included three levels, B1, B2, and C1, which are equated with first year, second
year, and third year students in the context of Vietnam. The inclusion of the three different
levels was made in order to see if there was development in students’ English writing skills as
they moved through the program and advanced to higher levels.
Dependent Variables
The dependent variables were calculated based on the participants’ performance on an
essay writing task and in view of the research questions:
a) total writing band score
b) band score on Task Response
c) band score on Coherence and Cohesion
d) band score on Lexical Resource
e) band score on Grammatical Range and Accuracy
Participants and Instructional Context
The current study focused on ELLs who were studying English at the University of
Foreign Languages in Danang, Vietnam. The students who are admitted to the university are
required to pass the high school national exit exams and a university entrance exam. The
entrance exam is a norm-referenced national exam designed specifically for the purpose of
university admissions. The exam is scored on a 0 to 10 band range. The higher the band, the
higher a student’s chance is for admission.
In view of the CEFR levels, the first year students are identified with the B1 level of
proficiency. During the years in the program, students follow the same curriculum and receive
the same instruction, but as they advance to the next level, the complexity of content and skills
increases. As part of their curriculum, they are required to take integrated-skills classes,
including grammar and composition, and have to achieve a certain level to advance to the next
class. Likewise, the second year students are aligned with the CEFR B2 proficiency level,
whereas the third year students are considered at C1 level. Attaining a C1 level proficiency is
one of the prerequisites for graduation from the program.
For the purpose of the study, six classes (two for each grade level) were randomly chosen
from the pool of the first, second, and third year students. A total of 250 students provided data.
For analysis, 30 essays were randomly selected from each level, B1, B2, and C1, amounting to a
total of 90 essays which were assessed and scored.
The participants’ age varied from 19 to 22, with an average age of 20. Due to the
imbalanced proportion between male and female students in the program of study, all male
students were included in the final sample. Hence, there were a total of 15 essays written by male
students (3, 6, and 6 essays for B1, B2, and C1 levels respectively) versus a total of 75 essays by
female students (27, 24, and 24 essays for levels B1, B2, and C1 respectively).
As for the participants’ familiarity with the IELTS writing tasks, the majority of the
participants at each level reported having only a vague idea of what an IELTS writing task looks
like, with approximately 50% of the participants choosing this answer (see Table 2). Thirteen
participants from the whole sample reported not having any idea, 10 checked the option familiar,
11 indicated that they were familiar and had received some practice, and only 1 participant had
taken the IELTS exam. This participant was at the C1 level.
Table 2
Participants’ Demographics

Level            N    Age Min   Age Max   Age Mean   No idea   Vague idea   Familiar   Familiar + practice   Has taken IELTS
B1 (3 M, 27 F)   30   19        22        20.07      7         15           2          5                     0
B2 (6 M, 24 F)   30   19        20        19.40      3         21           4          1                     0
C1 (6 M, 24 F)   30   19        22        19.93      3         15           4          5                     1
Total            90   19        22        19.87      13        51           10         11                    1
Research Instruments
The research instruments included a short demographic survey, a writing prompt, and a
scoring rubric.
Demographic Survey
Before the participants were given the writing prompt, they were asked to fill in a short
demographic survey including questions about their year in the program, age, sex, and familiarity
with the IELTS exam. A summary of the demographic data is presented in Table 2 above.
Writing Prompt
The participants were asked to write an argumentative essay modeled after the IELTS
Academic Module Writing Task 2 (see Appendix 1). Regarding the difficulty of the essay topic,
advice and assessment from senior lecturers were taken into careful consideration to make sure
that the topic was similar in difficulty to the students’ regular in-class and take-home assignments. The same written
and oral instructions were given to all participants in compliance with the Human Subjects ethics
standards.
Scoring Rubric
Since the essay prompt was modeled after the IELTS writing task, the scoring of the
essays was done following the IELTS Writing Task 2 Descriptors (Public Version). Another
reason for using the IELTS writing descriptors was that they have been validated and are
internationally accepted.
The IELTS writing descriptors include two criteria based on macro skills of writing and
two others covering micro skills. Specifically, macro skills of writing are addressed under the
categories titled Task Response and Coherence and Cohesion, whereas micro skills are assessed
under the categories of Lexical Resource and Grammatical Range and Accuracy. Task Response
is the first sub-construct in the IELTS writing descriptors, which measures how effectively a test-
taker's essay addresses the task, including “a fully developed position in answer to the question
with relevant, fully extended, and well supported ideas”. Coherence and Cohesion, the second
sub-construct in the IELTS writing descriptors, measures test-takers' capability of using cohesive
devices effectively to logically present and connect ideas and supporting evidence in a way that
“attracts no attention” (IELTS, 2015, September 10). Of the two micro skills, Lexical Resource
examines the participants' knowledge of English language vocabulary and their ability to use “a
wide range of vocabulary with very natural and sophisticated control of lexical features”. The
last sub-construct in the IELTS writing descriptors, Grammatical Range and Accuracy, measures
test-takers' ability to use “a wide range of structures with full flexibility and accuracy” (IELTS,
2015, September 10).
The essays were scored on the conventional IELTS 9-band scale. Specifically, the
participants received an overall score and scores for each sub-construct on a scale between 1 and
9. Scores for each sub-construct (Task Response, Coherence and Cohesion, Lexical Resource
and Grammatical Range and Accuracy) were treated equally, accounting for 25% each in the
total score. They were then summed and averaged to the nearest half or whole band for the
overall band score. To ensure fairness, the rounding convention used in IELTS
marking was employed: "If the average across the four skills ends in .25, it is rounded up to the
next half band, and if it ends in .75, it is rounded up to the next whole band" (IELTS, 2015,
September 10).
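
To illustrate the arithmetic behind this convention, a minimal sketch in Python is given below; the function name and structure are illustrative only and are not part of the IELTS materials.

def overall_band(task_response, coherence_cohesion, lexical_resource, grammar):
    """Average the four criterion band scores and apply the quoted rounding rule:
    an average ending in .25 is rounded up to the next half band, and an average
    ending in .75 is rounded up to the next whole band."""
    average = (task_response + coherence_cohesion + lexical_resource + grammar) / 4
    remainder = average % 0.5
    if remainder == 0.25:          # covers averages ending in .25 or .75
        return average + 0.25      # round up to the next half or whole band
    # Otherwise round to the nearest half band.
    return round(average * 2) / 2

For example, criterion scores of 5, 5, 6, and 5 average to 5.25 and would be reported as an overall band of 5.5 under this rule.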
Raters and Scoring
The data collection was carried out by an assistant researcher in Vietnam who obtained
the ethics approval from the Human Studies Committee at Southern Illinois University (SIUC),
USA in December 2015. The collected essays were assessed by two independent raters, the
researcher and another assistant researcher - a native-speaking composition teacher at SIUC - to
ensure the reliability of the scoring process.
Before starting the scoring, both raters went through a training practice with three online
IELTS Writing Task 2 model essays that are publicly available. Each model essay has a score
given by professional IELTS scorers and an explanation of the score assignment. The researcher
and the second rater scored these model essays independently and then compared the scores they
assigned to each criterion with the scores given by the IELTS scorers and with each other. This
scoring of the three model essays showed close agreement between the two raters’ scores and the
official IELTS scores. The scoring principle was that the discrepancy between the two raters should
be kept within a small and acceptable range, corresponding to 90 to 100% agreement.
Once the training was completed, 10 essays were randomly selected out of the 30 essays
for each proficiency level, amounting to a sample of 30 essays. This random sample was scored
by both raters independently and examined for inter-rater reliability through Pearson correlation
analysis and Cronbach’s alpha test for consistency (George & Mallery, 2009).
Table 3 shows the results of the reliability analysis for each of the four sub-constructs for
all three CEFR proficiency levels (B1, B2 and C1). As seen from Table 3, all coefficients range
from .90 to .98. This high level of inter-rater agreement served as assurance
that the researcher was capable of applying the IELTS writing descriptors in a consistent and
objective way. Hence, the rest of the 60 essays were scored by the researcher alone in strict
compliance with the IELTS writing descriptors.
Table 3
Inter-rater Reliability
Level   Task Response   Coherence and Cohesion   Lexical Resource   Grammatical Range and Accuracy
B1      0.96            0.95                     0.98               0.92
B2      0.90            0.90                     0.98               0.97
C1      0.95            0.92                     0.97               0.93
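
As a minimal illustration of how such inter-rater statistics can be computed, the sketch below (in Python, with hypothetical score vectors rather than the study's actual data) calculates the Pearson correlation between two raters' scores and Cronbach's alpha across the two raters.

import numpy as np
from scipy.stats import pearsonr

# Illustrative band scores from two raters for the same ten essays
# (hypothetical values, not the study's data).
rater1 = np.array([5.0, 4.5, 6.0, 5.5, 4.0, 5.0, 6.5, 5.5, 4.5, 5.0])
rater2 = np.array([5.0, 4.0, 6.0, 5.5, 4.5, 5.0, 6.0, 5.5, 4.5, 5.5])

# Pearson correlation between the two raters' scores.
r, p = pearsonr(rater1, rater2)

def cronbach_alpha(ratings):
    """Cronbach's alpha for an (essays x raters) matrix of scores."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                       # number of raters
    item_vars = ratings.var(axis=0, ddof=1)    # variance of each rater's scores
    total_var = ratings.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

alpha = cronbach_alpha(np.column_stack([rater1, rater2]))
print(f"Pearson r = {r:.2f} (p = {p:.3f}), Cronbach's alpha = {alpha:.2f}")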
Data Analysis
The data were first examined for possible outliers and violations of the assumption of
normality through box plots and skewness values. All parameters were within the expected
values, except for two outliers from level B2, which were removed to clean the data. Descriptive
statistics and five one-way ANOVAs were used to address the research questions of the study.
Prior to each ANOVA, Levene’s test was employed in order to check the homogeneity of
variances. Among the 5 one-way ANOVAs, one was performed to compare the three CEFR
levels on the total obtained mean band scores, and four ANOVAs were performed to compare
the three CEFR levels on each of the four sub-constructs, including Task Response, Coherence
and Cohesion, Lexical Resource, and Grammatical Range and Accuracy. Since each ANOVA
involved three proficiency groups, a Tukey multiple comparison post-hoc test was performed as
appropriate to determine which levels of proficiency were significantly different from each other.
Hence, a total of 5 Tukey multiple comparison tests were performed and interpreted. All
statistical tests were carried out at a significance level of α = .05 (95% CI) unless otherwise
noted.
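
A minimal sketch of this analysis workflow in Python is given below, using SciPy for Levene's test and the one-way ANOVA and statsmodels for the Tukey post-hoc comparisons; the array names and values are illustrative assumptions, not the study's data.

import numpy as np
from scipy.stats import levene, f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Illustrative total band scores for the three proficiency groups
# (hypothetical values, not the study's data).
b1 = np.array([4.5, 5.0, 4.0, 5.5, 4.5, 5.0])
b2 = np.array([5.5, 5.0, 6.0, 5.5, 5.0, 6.0])
c1 = np.array([6.0, 6.5, 5.5, 6.0, 6.5, 7.0])

# Levene's test for homogeneity of variances across the three groups.
lev_stat, lev_p = levene(b1, b2, c1)

# One-way ANOVA comparing the three proficiency levels.
f_stat, p_value = f_oneway(b1, b2, c1)

# Tukey HSD post-hoc test to see which pairs of levels differ.
scores = np.concatenate([b1, b2, c1])
groups = ["B1"] * len(b1) + ["B2"] * len(b2) + ["C1"] * len(c1)
tukey = pairwise_tukeyhsd(endog=scores, groups=groups, alpha=0.05)

print(f"Levene: W = {lev_stat:.2f}, p = {lev_p:.3f}")
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")
print(tukey.summary())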
CHAPTER IV
RESULTS
This study investigated the writing skills of Vietnamese learners of English across three
proficiency CEFR levels (B1, B2, and C1). The participants included 90 English-major students
at the University of Foreign Languages in Vietnam. Specifically, the participants were given the
following writing prompt, modeled after an IELTS Academic Module Writing Task 2:
“Education should be accessible to people of all economic backgrounds. All levels of education,
from primary school to tertiary education, should be free. To what extent do you agree with this
opinion?”
The participants were allowed to complete the essay within a 45-minute time limit. The
IELTS Writing Task 2 Descriptors (Public version) and 9 band scale were used in the process of
scoring the results. Accordingly, each sub-construct of the writing skill (Task Response,
Coherence and Cohesion, Lexical Resource, and Grammatical Range and Accuracy) was
analyzed and given a score ranging from 0 to 9, depending on how well the essays met the
standards for each sub-construct. Zero is the lowest score, indicating a failure to meet the
requirements. In contrast, 9 is the highest score, which indicates that all requirements
are met at the highest level. The band scores on each sub-construct were summed up and
averaged to obtain the mean band score for each participant.
The data were explored through descriptive statistics, box plots, skewness values, and 5
one-way ANOVAs, one for the total score and four for the separate sub-constructs. This chapter
presents the results following the order of the research questions.
Results for Research Question 1
The first research question aimed to explore how the writing skill of English language
learners at a Vietnamese university develops across three CEFR proficiency levels (B1, B2, and
C1). The question was stated as: “Does the writing skill of English language learners in a
Vietnamese undergraduate program improve in the course of their studies?”
The total band score was established by summing and averaging the scores of all the sub-
constructs to the nearest half or whole band. Each sub-construct accounted for 25% of the total
score (IELTS, 2015, September 10). The descriptive statistics and skewness values, as shown in
Table 4, reveal that the total band scores were normally distributed.
Table 4
Descriptive Statistics for Proficiency Levels on Total Band Score

Level   N    Mean   SD     95% CI Lower   95% CI Upper   Min    Max    Skewness
B1      30   4.79   0.71   4.52           5.05           3.25   6.00   -.026
B2      30   5.53   0.43   5.37           5.70           4.75   6.38    .098
C1      30   5.98   0.71   5.71           6.24           4.63   7.25    .096
The box plots in Figure 2 show a symmetrical distribution for total band scores with no outliers.
Figure 2. Box plot for proficiency levels on total band score
Prior to conducting the one-way ANOVA, Levene’s test was performed in order to check
for violations of the assumption of homogeneous variances. The results showed that this
assumption was observed at α = .01, F(2, 87) = 3.20, p = .046, and the ANOVA results could be interpreted
without concern for their validity.
The one-way ANOVA revealed that the independent variable proficiency level had a
significant effect on the total band score, F(2, 87) = 27.03, p < .001. The descriptive statistics
showed an increase in mean scores from B1 level upwards (Table 4). To determine whether the
scores for each level were significantly different from each other, a Tukey multiple comparison
post hoc test was performed. The results showed that the two levels B1 and B2 were significantly
different from each other, (p < .001), as the B2 level had a significantly higher mean score than
B1 (mean difference = .75) (see Table 4). There was also a significant difference between the B1
and C1 levels, p < .001, where the C1 level showed a significantly higher ability to write an
academic essay than the B1 level (mean difference = 1.19). Levels B2 and C1 showed a
significant difference, p = .022; however, the mean difference (.44) was rather small compared to
the other two mean differences.
Overall, the total band score increased over time. The total score increased significantly
with each level of proficiency; however, the mean difference between levels B2 and C1 was the
smallest, indicating that the development of the writing skill between levels B2 and C1 was of a
smaller magnitude than that from B1 to B2.
Figure 3. Plot of proficiency level mean scores on total band score
Results for Research Question 2
Next, four one-way ANOVAs were performed with the purpose of obtaining a more
detailed account of how each of the four sub-constructs of the writing skill develops when
learners progress to a higher level of proficiency. Specifically, these four separate one-way
ANOVA analyses were carried out in view of the second research question, which was stated as:
“If improvement is observed, is it observed both on the macro level (Task Response; Coherence
and Cohesion) and on the micro level (Lexical Resource; Grammatical Range and Accuracy)?”
The data analysis is presented in four sub-sections, titled after the sub-constructs.
Task Response
The descriptive statistics and skewness values, as shown in Table 5, reveal normal
distributions of scores in all the proficiency levels for this macro skill.
Table 5
Descriptive Statistics for Proficiency Levels on Task Response

Level   N    Mean   SD    95% CI Lower   95% CI Upper   Min   Max   Skewness
B1      30   4.77   .91   4.43           5.11           3.5   6.5    .17
B2      30   5.48   .64   5.25           5.72           4.5   6.5   -.26
C1      30   5.95   .83   5.64           6.26           4.0   7.5   -.07
The box plots in Figure 4 show that all three distributions were symmetrical and there
were no outlier scores in any proficiency level.
Figure 4. Box plots for proficiency levels on Task Response
Before running the one-way ANOVA, Levene’s test was carried out to examine the
homogeneity of variances. It was shown that the variances were approximately equal across
groups, F(2, 87) = 1.78, p =.174.
The one-way ANOVA showed a significant effect of the independent variable
proficiency level on the Task Response results, F(2, 87) = 16.62, p < .001. The descriptive
statistics showed an increase in mean scores from B1 level upwards (Table 5). The increasing
pattern was further explored through Tukey post hoc tests which showed a significant difference
between the two levels B1 and B2, p =.002. The B2 level mean score was statistically higher
than the B1 counterpart (mean difference = .72) (see Table 5). There was a larger significant
difference between the B1 and C1 levels, p < .001, as the C1 level showed a significantly
higher ability to address a writing task than the B1 level (mean difference = 1.18). However,
there was no significant difference between levels B2 and C1, p = .067, mean difference = .47.
Overall, the results revealed that all three proficiency levels showed an increase in their
Task Response scores, although the patterns varied. The scores increased significantly from B1 to
B2, whereas no significant increase was observed between the B2 and C1 levels. These trends
are illustrated in Figure 5 below.
Figure 5. Plot of proficiency level mean scores on Task Response
Coherence and Cohesion
As can be seen from Table 6, the descriptive statistics and skewness values indicate that
the scores for Coherence and Cohesion for each proficiency level were normally distributed.
Table 6
Descriptive Statistics for Proficiency Levels on Coherence and Cohesion
Level   N    Mean   SD    95% CI Lower   95% CI Upper   Min   Max   Skewness
B1      30   4.75   .79   4.46           5.04           3     6      <.001
B2      30   5.45   .48   5.27           5.63           4.5   6.5    -.04
C1      30   6.00   .87   5.67           6.32           4     7.5    -.095
The box plots in Figure 6 show that the scores for the three proficiency levels were
symmetrically distributed, with no outlier scores observed in any proficiency level.
Figure 6. Box plots for proficiency levels on Coherence and Cohesion
Prior to conducting the ANOVA, Levene's test results showed that the assumption of
homogeneity of variances was observed at α =.01, F(2, 87) = 4.75, p = .011. According to
George and Mallery (2009), this is an acceptable alpha level because Levene’s test is very
sensitive to even small differences.
The one-way ANOVA revealed that the independent variable proficiency level had a
significant effect on the Coherence and Cohesion results, F(2, 87) = 21.70, p < .001. The
descriptive statistics in Table 6 show that there was an increase in mean scores from B1 up to C1
levels. These differences were further explored through a Tukey multiple comparison post hoc
test which showed that the scores for the B1 and B2 levels were significantly different from each
other, p = .001. Specifically, the B2 mean score was significantly higher than the B1 counterpart
(mean difference = .70) (see Table 6). An even larger difference was shown between B1 and C1
scores, with p < .001; mean score for C1 level was shown to be much higher than that for B1
(mean difference = 1.24). The difference between levels B2 and C1 was also statistically
significant (p = .014) although the mean difference (.54) was not as large as for the previous two
comparisons.
To sum up, the one-way ANOVA revealed that the participants’ scores for the Coherence
and Cohesion sub-construct increased significantly across the three proficiency levels. However,
the mean difference between levels B1 and B2 was higher than between levels B2 and C1 (See
Figure 7 below).
Figure 7. Plot of proficiency level mean scores on Coherence and Cohesion.
Lexical Resource
The descriptive statistics and skewness values in Table 7 show that the scores for each
proficiency level were normally distributed for Lexical Resource.
Table 7
Descriptive Statistics for Proficiency Levels on Lexical Resource
Level   N    Mean   SD    95% CI Lower   95% CI Upper   Min   Max   Skewness
B1      30   4.85   .77   4.56           5.14           3     6.5   -.13
B2      30   5.62   .45   5.45           5.78           5     6.5    .422
C1      30   5.97   .67   5.72           6.22           4.5   7.5    .037
The distributions are further illustrated by the box plots in Figure 8, which show no
outliers and fairly symmetrical shapes.
Figure 8. Box plots for proficiency levels on Lexical Resource
Levene's test provided confirmatory evidence that the assumption of homogeneity was
observed at α = .01, F(2, 87) = 3.37, p = .039, which is considered acceptable (George &
Mallery, 2009). The ANOVA results showed statistical differences between the CEFR levels on
the Lexical Resource criteria, F(2, 87) = 23.73, p < .001, which were further clarified by the
Tukey post hoc comparisons. The patterns revealed through the Tukey test were similar to the
ones presented in the first two sections. Overall, the scores increased significantly from B1 to B2
levels but did not increase at the same rate at level C1 (see Figure 9). Specifically, the scores for
the first two levels (B1 and B2) were significantly different from each other (p < .001), with the
B2 mean score significantly higher than the B1 mean score (mean difference = .77). Likewise, the
difference between B1 and C1 scores was also statistically significant (p < .001), with the C1
mean score being significantly higher than the B1 mean score (mean difference = 1.12). In
contrast, the difference between the B2 and C1 mean scores was not statistically significant (p =
.094), with a small mean difference (.35) in favor of level C1. Figure 9 illustrates these patterns.
Figure 9. Plot of proficiency level mean scores on Lexical Resource
Grammatical Range and Accuracy
The descriptive statistics and skewness values of Grammatical Range and Accuracy
revealed a normal distribution of the scores within each level (see Table 8).
Table 8
Descriptive Statistics for Proficiency Levels on Grammatical Range and Accuracy
Level   N    Mean   SD    95% CI Lower   95% CI Upper   Min   Max   Skewness
B1      30   4.78   .82   4.48           5.09           3     6     -.407
B2      30   5.58   .56   5.37           5.79           4.5   6.5   -.51
C1      30   6.00   .74   5.72           6.28           5     7.5    .338
The box plots in Figure 10 show symmetrical distributions for all levels and no outlier
scores in any proficiency level.
Figure 10. Box plots for proficiency levels on Grammatical Range and Accuracy
The assumption of homogeneity of variances was observed with Levene's test results
showing lack of evidence to reject the null hypothesis, F(2, 87) = 2.47, p = .09. The CEFR
proficiency level was found to have a significant effect on participants’ performance on the
criteria titled Grammatical Range and Accuracy, F(2, 87) = 22.48, p < .001. Tukey post-hoc tests
revealed that the B1 and B2 scores were significantly different from each other (p < .001) with a
better performance at the B2 level (mean difference = .80). The B1 and C1 levels were also
significantly different (p < .001) in favor of the C1 level (mean difference = 1.22). Meanwhile,
the ability to demonstrate grammatical range and accuracy did not differ significantly between
the B2 and C1 levels (p = .067); the C1 mean was not statistically higher than the B2 mean (mean
difference = .42). Figure 11 shows the patterns described above.
Figure 11. Plot of proficiency levels mean scores on Grammatical Range and Accuracy
Chapter IV Summary
This chapter presented the results of the data analysis following the order of the research
questions as stated in chapter 3. In relation to the overall band score on the writing task, a
significant increase was observed between all three levels, but the difference between levels B1
and B2 was larger than between levels B2 and C1. Regarding the separate sub-constructs, a
significant increase between all three proficiency levels was observed on Coherence and
Cohesion, whereas for the other three sub-constructs, the significant gain was between levels B1
and B2, but there was no significant increase between levels B2 and C1.
These findings are even more interesting when compared to the expected band scores for
each CEFR proficiency level according to the IELTS-CEFR alignment system presented
in chapter 1. Hence, the obtained results are interpreted in detail in the next chapter, the purpose
of which is to examine the findings of this study in view of the CEFR-IELTS expected band
scores in the context of the Vietnamese English language educational system.
CHAPTER V
DISCUSSION AND CONCLUSIONS
This chapter intertwines two main themes: one of them centers on the findings of the
current study in light of related theory and research, whereas the other positions the results of the
study within the expected performance standards of the CEFR. These two themes are then put
together in extrapolating implications for the L2 assessment and teaching practice in general, and
in the Vietnamese English language education system in particular. A section is also devoted to
the limitations of the study and recommendations for future research. The chapter ends by
highlighting the most important conclusions of the study.
Discussion
The present study examined whether there is an improvement in the writing skills of
Vietnamese learners of English as a result of their study at the University of Foreign Languages
in Danang, Vietnam. As described previously, the study adopted an in-depth approach to
examine the issue of interest by breaking down the writing skill into macro and micro skills, thus
tracing the development on each one of them in addition to the overall trend.
For the study, 90 essays were randomly selected from a pool of 250 argumentative essays
written by Vietnamese learners of English from three consecutive academics years. The essays
were modeled after the IELTS Writing Task 2 and scored against the standards specified in the
IELTS Writing Task 2 Descriptors. Inter-rater reliability was established and the data were
analyzed through five one-way ANOVAs, followed by Tukey multiple comparison tests. The
results revealed interesting patterns and trends both in view of related theory and research and in
relation to the CEFR expectancy levels. The next section discusses these results against the
background of L2 writing theory and empirical research.
Findings about the overall development of the writing skill across proficiency levels
Second language learning is viewed as a dynamic process in a substantial number of
works (Bley-Vroman, 1983; Fischer et al., 2003; Humphreys, et al., 2012; Larsen-Freeman,
2006; Marchman, et al., 2005; Shaw & Liu, 1998; Storch, 2007, 2009; Thelen & Smith, 1996;
Van Geert & Van Dijk, 2002). The dynamic and unpredictable nature of language development
is at the core of the debate about whether L2 proficiency improves as a result of training. This
controversy applies to all four language skills, including L2 writing, which constitutes the
construct investigated by this study.
Empirical findings about the dynamic nature of language learning are also contradictory.
On the one hand, a number of studies have reported significant progress in L2 composition
skills by the end of three to six months of training courses (e.g., Archibald, 2001; Elder &
O’Loughlin, 2003; Hu, 2007; Sasaki, 2007, 2009); on the other hand, a growing body of
research, most of which has examined L2 writing in longer periods of instruction (a year or
longer), has shown a decline in the overall writing performance of ESL learners (e.g.,
Humphreys et al., 2012; Knoch et al., 2014; Storch, 2007, 2009; Tsang & Wong, 2000). A few
other studies provide mixed findings on the effectiveness of English instruction (e.g., Brown,
1998; Green, 2005) as they conclude that improvement is dependent on the types of English
courses or the proficiency level of the learners. For instance, O’Loughlin and Arkoudis (2009)
observe that an improvement in the writing skills is more commonly observed with lower
proficiency than with higher proficiency L2 learners.
In light of these opposing findings, the results of the present study support those studies
(e.g. Archibald, 2001; Elder & O’Loughlin, 2003; Hu, 2007; Sasaki, 2007, 2009) that have
reported positive developments in L2 writing skills as a result of instruction and training.
Specifically, a statistically significant improvement was observed in the writing skills of
Vietnamese English learners at an English-medium university across levels B1, B2, and C1.
Starting from 4.79 for the B1 level, the mean score increased by over half a band (.74), reaching
5.53 at the B2 level. Likewise, the mean score continued to increase during the program and
reached 5.98 at the C1 level. Overall, in the course of a 3-year undergraduate program, learners
improved their writing score by 1.19 bands. In other words, the writing instruction at the
University of Foreign Languages in Danang, Vietnam, was effective as it showed a significant
positive effect on the development of the learners’ writing skills.
As mentioned above, the findings are consistent with previous studies which reported
improved composition skills, even after only three months of training (e.g. Archibald, 2001;
Elder & O’Loughlin, 2003; Hu, 2007; Sasaki, 2007, 2009), but contradict other
studies which have observed no development or even a decline in the writing skills of L2
learners (e.g., Knoch et al., 2014, 2015; Storch, 2007). The discrepancy could be explained by
the postulation made by O’Loughlin and Arkoudis (2009) that writing skills develop at different
rates and to different extents at different levels of proficiency. In this case, the
participants in Knoch et al.’s (2015) study were at a higher level of proficiency (minimum IELTS
score 6.5) than the participants in the current study. Furthermore, the operationalization of time
conditions also differed between the two studies, which may be another factor contributing to the
inconsistency. In Knoch et al.’s (2015) study, the participants were put under a rather tight time
constraint as they were allowed only 30 minutes to finish an argumentative essay. In contrast, in
this current study, the students were put under much less time pressure as they were given 45
minutes to complete a similar writing task.
It is also worth noting that despite an improvement in the writing skills among the
assessed English-major students, the increase was only approximately one band (1.19).
This increase meets neither the IELTS guidelines issued by the IELTS testing agency nor the
BALEAP guidelines. These specify that after a three-month intensive English course ESL
writers’ skills are expected to improve by 1 band, according to IELTS (2002), or by 1.5 bands,
according to BALEAP (2012). Taking into consideration that the participants in this study were
enrolled in a three-year program of academic study, it seems reasonable to expect a larger
increase from level B1 to C1. One possible explanation for the relatively small increase in
performance could be attributed to the fact that although the target students majored in English,
they had only 7 to 10 hours of English classes per week. Moreover, of the given English classes,
only 2 hours were devoted to composition training. Thus, in the course of one academic year, the
participants would get fewer English language writing classes than offered by intensive training
courses. It can also be argued that in an intensive program the curriculum is more focused and
more effectively structured than when stretched over a longer period, the latter making the
learning process rather intermittent.
Another important observation is that the students made less progress in the second phase
between levels B2 and C1. Specifically, the improvement was by less than half a band (.45) after
one year of formal English training (from 5.53 to 5.98 respectively). One explanation for this
could be that B2 and C1 are fairly advanced levels in the program and, hence, it may be
more difficult to make significant gains, or it may take learners at these two levels a longer time
to make significant progress in their writing skills. A similar interpretation is found in Green
(2005) who observed significant developments only for lower level learners (5.0 band score or
lower), whereas for advanced learners (7.0 band score or higher) there was no development or
even a decline in the writing skills. These findings also corroborate the conclusions of a number
of other studies (e.g. Craven, 2012; Elder & O’Loughlin, 2003; O’Loughlin & Arkoudis, 2009)
that it is easier to enhance L2 writing skills in lower-level students.
Juxtaposing the observed writing scores against the CEFR-IELTS standards
A further step in interpreting the results of this study is to juxtapose them against the
CEFR-IELTS band score ranges for each of the three levels of proficiency. This step is necessary
in order to draw informative conclusions about the effectiveness of the teaching of English
academic writing at Vietnamese undergraduate English-major programs. For instance, even
though the findings show an ascending developmental trend from level B1 to C1, for the most
part, except for level B1, the observed scores do not satisfy the corresponding band ranges
according to the CEFR-IELTS standards. The observed deviations are as shown in Figure 12
below.
Figure 12. Juxtaposition of obtained Mean scores with the CEFR-IELTS ranges
[Figure 12 is a bar chart on a 0–9 band scale showing, for levels B1, B2, and C1, the obtained mean band score alongside the lowest and highest expected CEFR-IELTS band scores.]
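
As a simple illustration of this comparison, the sketch below (in Python) checks whether each obtained overall mean reported in this study falls below, within, or above the expected CEFR-IELTS range; the range boundaries and means are taken from the discussion in this chapter, and the code itself is only an illustrative aid, not part of the study's analysis.

# Expected CEFR-IELTS band ranges used in this study and the obtained
# overall mean band scores reported above.
expected_ranges = {"B1": (4.0, 4.5), "B2": (5.0, 6.5), "C1": (7.0, 8.0)}
obtained_means = {"B1": 4.79, "B2": 5.53, "C1": 5.98}

for level, mean in obtained_means.items():
    low, high = expected_ranges[level]
    if mean < low:
        verdict = "below the expected range"
    elif mean > high:
        verdict = "above the expected range"
    else:
        verdict = "within the expected range"
    print(f"{level}: obtained {mean} vs expected {low}-{high} -> {verdict}")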
First of all, it becomes clear that the mean band score (4.8) of the B1 level participants
falls outside the expected range for this level (4.0 to 4.5). In fact, it exceeds the maximum band
score for this level. This means that the placement of the first year students into level B1 was not
appropriate since they were closer to the lower band score required of the B2 level than to the
higher band score associated with the B1 level. This observation carries valuable insights for the
parties involved in the placement of freshmen into proficiency levels at the University of Foreign
Languages in Danang, Vietnam. It suggests the need to reevaluate the existing system
and the criteria by which students are identified with specific CEFR levels of proficiency in
order to resolve the discrepancy revealed by the results of the present study.
Moving on to the next level, B2, the obtained mean score (5.53) falls into the expected CEFR-
IELTS band score range of 5 - 6.5. However, the mean score does not go beyond the mid-point
of the range. This finding also carries insightful implications for the program developers at the
University of Foreign Languages in Danang, Vietnam. Particularly, this finding is important
when juxtaposed with the CEFR-IELTS band score range of level B2, which spans a 1.5 band
difference between the lowest and highest score (5-6.5). Contrasted with the half band between
the lowest and highest scores for level B1, this larger difference suggests that learners at level B2
will need proportionally longer or more intensive training than those at level B1 to progress to
the next level of proficiency. However, the current program of study does not take this fact into
consideration as it labels students at levels B1, B2, and C1 according to their year of study,
giving exactly the same time to each year and not recognizing that the widths of the expected
CEFR-IELTS band score ranges vary across levels.
This oversight becomes even more apparent in the results for level C1. At this stage, the
students are expected to have a band score between 7 and 8. However, the obtained mean score
for level C1 (5.98) falls far below the lowest expected band score of 7. In fact, it does not
even reach the highest expected score for level B2 (6.5). Thus, the identification of third year
students with level C1 seems to be the most problematic as it falls short of meeting the CEFR-
IELTS band score standards for this level. This finding calls for serious reevaluation of the
existing placement system, especially concerning its alignment with the CEFR levels. According
to the results of this study, the third year students cannot be identified with level C1. Their mean
band score falls in the upper part of the B2 range, and they would more accurately be identified as
B2+ level according to the CEFR, as mentioned in chapter I.
Developmental Trends on the Macro and Micro Level
Based on the review of related research in chapter II, most of the existing studies have
examined the development of L2 writing skills as a whole, and have rarely looked at the different
sub-constructs that partake in the formation of L2 writing ability. These sub-constructs are
broadly categorized into macro and micro skills (Brown, 2007). Moreover, the few existing
studies have produced mixed results. Considering this lack in the body of empirical research, the
present study set out to provide a more comprehensive analysis of the development of the writing
skill by examining the effect of academic training on two macro- and two micro aspects at the
heart of L2 writing ability. Specifically, the macro level looked at developments on: 1) Task
Response and 2) Coherence and Cohesion, whereas the micro level examined developments in
learners’: 1) Lexical Resource and 2) Grammatical Range and Accuracy.
Macro Level Results
The results for both macro skills (Task Response and Coherence and Cohesion) reveal an
ascending development from level B1 to C1. In fact, the first year students obtained quite similar
scores for both Task Response and Coherence and Cohesion of 4.77 and 4.75, respectively. They
progressed at a similar rate through level B2, yielding corresponding scores of 5.48 (Task
Response) and 5.45 (Coherence and Cohesion). This advancement was significant for both
macro skills. However, at level C1, the development on Task Response did not reach
significance, whereas on Coherence and Cohesion, it showed a statistically significant
improvement. According to the IELTS writing descriptors, this means that the participants from
level C1 were better at employing cohesive devices and at logically organizing information and
ideas than their counterparts at level B2. In contrast, there was no significant development
between levels B2 and C1 in the ability to express “a fully developed position in answer to the
question with relevant, fully extended, and well supported ideas.” (IELTS writing descriptors).
Thus, the improvement in Coherence and Cohesion, compared with Task Response was
more consistent between the three levels of proficiency. In fact, this skill started with the lowest
score (4.75) of all four macro and micro skills and reached the highest score of 6 at level C1.
This finding corroborates Craven’s (2012) study, which also showed the highest writing
improvement on Coherence and Cohesion, even though it was not statistically significant. One
explanation why learners show a steady progression in their ability to connect and logically
organize their writing could be inherent in the nature of this particular macro skill, which,
compared to the other sub-skills, may be faster and easier to build through instruction and
practice. Moreover, writing teachers may emphasize this particular aspect of writing more than
other aspects by encouraging learners to memorize structural cohesive devices and then use them
to sequence ideas and details logically.
Overall, the ascending development of the macro skills of writing that was observed in
the present study corroborates the findings of Storch (2009) and Storch and Hill (2008), whose
participants also improved their ability to address all parts of the task and organize strong
arguments coherently and cohesively. However, this conclusion is mostly applicable to the
developmental gain between levels B1 and B2, whereas it is not as straightforward between
levels B2 and C1. In fact, the findings also provide partial support for Knoch et al.’s (2014,
2015) results, where no significant improvement in the content and organization of ESL writers’
essays was found.
Juxtaposing the macro level results with the CEFR-IELTS band score ranges.
Similar to the interpretation of the overall writing scores for the three levels, the obtained
scores for Task Response and Coherence and Cohesion were juxtaposed with the band score
ranges for each level according to the CEFR-IELTS standards. This juxtaposition is illustrated in
Figures 13 and 14 below, and shows that although the macro skills developed over the three
examined years, the obtained scores did not fall into the expected ranges for two of the three
target levels. Specifically, for level B1, the mean band scores on Task Response (4.8) and
Coherence and Cohesion (4.75) fall above the maximum expected band score (4 to 4.5). For
level B2, the obtained mean scores fall within the range (5 to 6.5) but rather on its lower side
(Task Response = 5.5; Coherence and Cohesion = 5.45), whereas for level C1 both mean scores
(5.95 and 6) fall below the minimum band score for this level (7 – 8). Thus, even though
development on the macro level was observed, this development does not parallel the expected
performance as specified in the CEFR-IELTS standards. This fact pinpoints an obvious flaw in
the existing system of assigning CEFR levels based on years of study rather than on actual
assessment data.
Figure 13. Obtained and expected band scores for Task Response. Figure 14. Obtained and expected band scores for Coherence and Cohesion.
Micro Level Results
The results for the micro skills (Lexical Resource and Grammatical Range and Accuracy)
revealed a similar trend as the one observed for the macro skills. There was a significant
improvement from level B1 to level B2 in both Lexical Resource (B1 = 4.85; B2 =5.62) and in
Grammatical Range and Accuracy (B1 = 4.78; B2 = 5.58). On the other hand, the mean scores at
level C1 (Lexical = 5.97; Grammatical = 6) were slightly higher than those at level B2 and the
developmental gain did not reach statistical significance. The significant development from level
B1 to B2 is consistent with some previous studies (e.g., Larsen-Freeman, 2006; Tsang & Wong,
2000) in which the students’ accuracy and complexity increased after several months of
instruction. Yet, the improved vocabulary and grammatical use of the participants found in this
current study was not in line with findings from other studies (e.g., Knoch et al., 2014; Serrano et
al., 2012; Shaw & Liu, 1998; Storch, 2007, 2009; Storch & Tapper, 2009; Xudong, Cheng,
Varaprasad, & Leng, 2010) regarding grammatical accuracy and range.
A possible explanation for this discrepancy may lie in the difference in the length of
the English courses. For example, while this current study investigated the development of
writing skills throughout the first three years in a college-level program, the relatively short
duration (12 weeks) in Storch’s (2009) study possibly prevented those students from making
progress in their grammatical competence. Along the same lines, Ortega (2003) observes that it may
take more than 12 months of academic study for any significant developmental trends to occur.
On the other hand, the lack of significant gain between levels B2 and C1 is in support of
Storch’s (2009) conclusions. This seemingly contradictory finding can be explained by the
higher English language proficiency of the participants at level C1 in the current study and in
Storch’s study, where the minimum IELTS band score was 6.5. As previously mentioned,
according to Green (2005) significant gains are more likely to be observed at lower proficiency
levels than at higher levels.
Juxtaposing the micro-level results with the CEFR-IELTS band score ranges.
As shown in Figures 15 and 16, the obtained scores for Lexical Resource and
Grammatical Range and Accuracy were juxtaposed with the band score ranges for each level
according to the CEFR-IELTS standards. In this process, similar problems become apparent as
described in relation to the macro skills. For level B1, the mean band scores on Lexical Resource
(4.85) and Grammatical Range and Accuracy (4.78) fall above the maximum expected band
score (4 to 4.5). For level B2, the obtained mean scores fall within the range (5 to 6.5) but on its
lower side (Lexical = 5.62; Grammar = 5.58), whereas for level C1 both mean scores (5.97 and
6) fall below the minimum band score for this level (7 – 8). Thus, the discrepancy
between the assigned levels of proficiency and the CEFR-IELTS standards is once again
observed in the results for the micro skills. In other words, the participants in this study at the B1
level had better lexical and grammatical competence than specified in the CEFR-IELTS
paradigm, for B2 they demonstrated an acceptable level within the required range, and at level
C1, their lexical and grammatical skills were far below the ones expected for this level.
Figure 15. Obtained and expected band scores for Lexical Resource. Figure 16. Obtained and expected band scores for Grammatical Range and Accuracy.
Macro versus Micro skills
Since this study attempted to provide a more comprehensive analysis of developments in
the L2 writing skill in view of its macro and micro components, it is logical to combine these
analyses and see how the trends observed for the macro skills compare with the ones for the
micro skills. According to previous research, the trends can be competitive or supportive,
depending on the priorities of the learners at different proficiency levels. Competitiveness is
more likely to happen between macro and micro skills at lower proficiency levels (i.e., B1 and
B2 levels), whereas supportiveness is more likely at a more advanced level like C1. The explanation of this
phenomenon could be attributed to the fact that at lower levels, the goals of ESL learners are to
improve their second language in terms of vocabulary and grammar, leaving such macro skills as
organization and form unattended (Van Geert, 2003). In contrast, as they grow more proficient in
their L2 language, they also become more familiar with the organization of the text. Their goal at
this period of time, enabled by their language proficiency, is to produce a text that is complex in
forms, and yet comprehensive and logically connected at the same time.
In the present study, the development of the macro and micro skills followed parallel
trends, which were supportive rather than competitive. As seen from the combined scores in
Table 9, there was a similar progression of the macro and micro elements of the writing skill in
the course of a three-year academic study program at the University of Foreign Languages in
Danang, Vietnam. Even though the mean score (4.76) for the macro skills at level B1 is slightly
lower than the mean score (4.82) for the micro skills, the difference is minute. Moreover, both
mean scores exceed the expected CEFR-IELTS range (4 to 4.5) for this level. For level B2, the
mean score for the macro skills (5.47) is again slightly lower than the one for the micro skills
(5.60), but the difference is very small and both mean scores fall into the expected CEFR-IELTS
range (5 to 6.5), on its lower rather than upper side. Regarding level C1, the mean scores are
almost identical (macro = 5.98; micro = 5.99) and both of them fall outside the expected CEFR-
IELTS range (7-8). In fact, both mean scores fall into the upper part of the B2 range.
Table 9
Macro vs. micro skills
Level Macro Skills Mean Micro Skills Mean
B1 4.76 4.82
B2 5.47 5.60
C1 5.98 5.99
The trends described above support the findings of previous studies (e.g. Robinson, 2001;
Van Geert & Steenbeek, 2005) in which macro and micro skills were found to have interactive
impacts on each other. They suggest that the curriculum at the University of Foreign Languages
in Danang follows a balanced approach and gives similar emphasis on developing learners’
writing ability on both the macro and micro levels. This points to a strength in the teaching
approach, because both macro and micro aspects partake in shaping the overall writing skill. It
seems unlikely for learners with poor lexical and grammatical knowledge to develop good
academic writing skills even if they are very familiar with the conventions of academic writing.
The opposite is also true, that learners with good command of the lexical and grammatical
structure of the target language need to learn how to organize and structure their writing in order
to meet the required standards.
Pedagogical Implications
Extrapolating from the findings discussed above, the following implications for the
teaching of L2 academic writing and placement policies at the University of Foreign Languages
in Danang have been derived.
On the positive side, the findings of this study reveal that as a result of their three-year
course of study in the English major program, the students’ academic writing skills in English
have improved in all macro and micro areas, as measured by the IELTS writing descriptors. This
improvement was linear from the lowest proficiency level (B1) to the highest (C1). The fact that
it was observed on all macro and micro skills suggests that the writing curriculum and the
teaching practices follow a well-balanced approach, which covers all important sub-components
of the writing skill equally.
From a more critical perspective, there are also a few issues that should be brought to the
attention of the parties involved in the alignment of the English major program with the CEFR
proficiency levels and the placement of the students in these levels based on their year of study.
First and foremost, the findings of this study reveal a disparity between the actual English
proficiency levels of the English majors at the University of Foreign Languages and the CEFR
level they were identified with. This mismatch could be explained by the fact that the national
entrance examination is not aligned with the CEFR criteria, which leads to inaccuracies in the
placement of students, especially concerning levels B1 and C1. Specifically, students placed in
level B1 seem to be at a higher level of proficiency than ascribed to this level. On the other hand,
students placed in level C1 appear to be of much lower level of proficiency than suggested by the
C1 descriptors. The findings of the study, therefore, suggest a reevaluation of the entrance
placement for English-language undergraduate programs. Based on the obtained mean scores for
the three target levels, it appears that the difference between the first, second, and third year
students is smaller than conveyed by their current placement levels. It seems more appropriate to
associate the first year students with the B1+ level, the second year students with the B2 level,
and the third year students with the B2+ level.
This reevaluation is highly recommended if the English major program aspires to be
aligned with the CEFR levels and to use the CEFR framework in the assessment of their
students’ progress and in the evaluation of the effectiveness of their curriculum and instruction.
Otherwise, in its present state, the system of assigning CEFR levels is misleading and presents an inaccurate profile
of the real abilities and competences of the students enrolled in the three-year program. This is
especially true for the third year students, who graduate at the C1 level, whereas their actual
competencies are far below the minimum requirements for this level. In fact, this is a rather
hazardous inflation of their actual abilities which can affect both the individuals and their
potential employers or further educational pursuits.
On the other hand, if the program administration aspires to have level C1 in the major,
then, they should either add another year of study to the program (B1+, B2, B2+, and C1) or
make the training at level B2 more intensive and closely aligned with the CEFR descriptors.
This implies curriculum and program changes which should be grounded in the CEFR
framework and in consideration of related theory and research about the most effective ways to
secure learners’ language development in an institutional program of study.
Last but not least, the results of this study suggest that there is a need for a regular and
systematic assessment of students’ progress through the program of study, as well as a
systematic evaluation of the effectiveness of the program in view of the goals, objectives, and the
framework in which it is set up.
Limitations and Recommendations for Future Research
Even though measures were taken to control for confounding factors, this study is, like
most research, liable to certain limitations and delimitations.
First of all, the study is limited to Vietnamese learners of English in the context of
Vietnamese undergraduate programs offering English majors. The results should not be extended
to other cultural and educational settings because they have their own idiosyncratic features that
may produce different results.
Stemming from the above limitation, the proficiency levels that were investigated by the
present study were inaccurately defined as shown by the results of the study. This fact constitutes
both a limitation and one of the major findings of this study, which can be insightful both to the
Vietnamese program administrators and to other educational settings where the CEFR
framework has been or will be adopted for the purpose of language assessment.
Another potential shortcoming of this study relates to the imbalance in sex among the
participants (15 males vs. 75 females). This limitation was inherent in the program from which
the subjects were drawn, whose student population was predominantly female. For the sake of
generalizability, it is highly recommended to have a sex-balanced sample because sex may be a
variable that affects students’ motivation and progress through a program of study.
Furthermore, the findings and conclusions of this study are based on the results of only
one essay, which cannot capture the entirety of students’ writing skills and competences. In the
interest of related research and pedagogical practices, a more systematic gathering and analysis
of diverse data is highly recommended. Assessment is an ongoing process, and numerous
previous studies (Bailey & Brown, 1999; H. D. Brown, 1980, 2004;
Larsen-Freeman & Anderson, 2013) have highlighted the importance of ongoing assessment in
the development of learners’ language ability. If this study had analyzed a number of essays
collected throughout the program, the findings might have been a more reliable indicator of the
students’ writing abilities (Cumming, Busch, & Zhou, 2002).
Furthermore, the measures of grammatical and lexical range used in the study could be a
limitation as well. The results found from the study were mainly analytic scores, based on the
IELTS writing descriptors. Although a great deal of existing research has used this scoring
method in assessing ESL learners’ writing ability, some other studies have also employed a variety of
discourse-analytic measures such as word count and ratio of words to T-units (e.g., Knoch et al.,
2014, 2015; Larsen-Freeman, 2006; Serrano et al., 2012; Storch & Tapper, 2009; Tsang &
Wong, 2000) and the ratio of error-free clauses (e.g., Shaw & Liu, 1998; Storch & Tapper,
2009). Future research needs to consider employing a wider range of measures in order to give a
better insight into the features of composition development in L2 student writers.
Finally, it is important to mention here that the observed patterns and the conclusions
stemming from them are based on the average performance of the participants in each group,
namely on the group mean scores. Although the groups were homogeneous with no outliers, the
individual data revealed that in group C1 there were 5 participants who reached the lower bound
of the CEFR range for this level (a band score of 7 within the 7–8 range), thus showing that, even though a minority, these five participants
were rightly identified with the C1 level. From a pedagogical and research point of view, it would
be interesting to examine the factors that have contributed to the higher level of these five
participants. These issues can be clarified through follow-up interviews, which are highly
recommended in future research on related issues.
Contribution of the Study
Despite the limitations mentioned above, the current study has added some valuable insights to the existing body of research on L2 writing development. One of its major contributions is that it is the first to investigate the alignment of an English-major undergraduate program in Vietnam, which has adopted the CEFR criteria, with the corresponding CEFR proficiency levels. As a result, the study has identified some flaws in the operationalization of the CEFR levels and has offered specific recommendations for resolving the discrepancies between the existing system of assigning levels and the students’ actual levels. The suggestions from the study may also inform other programs that have adopted or will adopt the CEFR framework.
Additionally, the results of the study may motivate future research aimed at further exploring program alignment with the CEFR criteria. They will also be useful to program evaluators, especially with regard to the implementation of the CEFR framework in educational settings across different countries and contexts.
Another contribution of the study is that it examined development in L2 academic writing at both the macro and micro levels, thus offering a more comprehensive picture of the different components of the writing skill and of their development over a course of study. In contrast, existing research has either looked at the writing skill holistically or focused on one or a few of its elements, but has rarely approached writing as a balanced composite of macro and micro skills.
REFERENCES
Archibald, A. (2001). Targeting L2 writing proficiencies: Instruction and areas of change in
students' writing over time. International Journal of English Studies, 1(2), 153-174.
Astin, A. W. (1993). What matters in college? Four critical years revisited. San Francisco:
Jossey-Bass.
Bailey, K., & Brown, J. (1999). Learning about language assessment: Dilemmas, decisions, and directions & new ways of classroom assessment. Learning, 4(2), 1-8.
BALEAP. (2015, September 10). Course guide. Retrieved from
http://www.baleap.org.uk/content/courses/index.htm.
Bitchener, J., Young, S., & Cameron, D. (2005). The effect of different types of corrective
feedback on ESL student writing. Journal of Second Language Writing 14(3), 227–258.
Bley-Vroman, R. (1983). The comparative fallacy in interlanguage studies: The case of systematicity. Language Learning, 33(1), 1-17.
Brown, H. D. (1980). Principles of language learning and teaching. Englewood Cliffs, N.J:
Prentice-Hall.
Brown, H. D. (2004). Language assessment: Principles and classroom practices. New York: Pearson/Longman.
Brown, H. D. (2007). Teaching by principles: An interactive approach to language pedagogy.
White Plains, NY: Pearson Education.
Brown, J. D. (1998). Does IELTS preparation work? An application of the context-adaptive
model of language program evaluation. IELTS Research Reports, 1, 20-37.
Cambridge ESOL. (2011). Using the CEFR: Principles of good practice. Cambridge, UK: Cambridge ESOL, 1-48.
Craven, E. (2012). The quest for IELTS Band 7.0: Investigating English language proficiency
development of international students at an Australian university. IELTS Research
Reports, 13, 1-61.
Crystal, D. (2003). English as a global language. Cambridge, UK; New York: Cambridge
University Press.
Cumming, A., Busch, M., & Zhou, A. (2002). Investigating learners’ goals in the context of adult
second-language writing. New directions for research in L2 writing (pp. 189-208). New
York, NY: Springer.
DeKeyser, R. M. (2007). Study abroad as foreign language practice. In R. DeKeyser (Ed.),
Practice in a second language: Perspectives from applied linguistics and cognitive
psychology, 208-226. New York: Cambridge University Press.
Educational Testing Service. (2001). Test and score data summary for TOEFL® internet-based and paper-based tests: 2001-2002 test data. Retrieved September 10, 2015, from https://www.ets.org/Media/Research/pdf/TOEFL-SUM-0102.pdf
Educational Testing Service. (2002). Test and score data summary for TOEFL® internet-based and paper-based tests: 2002-2003 test data. Retrieved September 10, 2015, from https://www.ets.org/Media/Research/pdf/TOEFL-SUM-0203-DATA.pdf
Educational Testing Service. (2006). Test and score data summary for TOEFL® internet-based and paper-based tests: 2005-2006 test data. Retrieved September 10, 2015, from https://www.ets.org/Media/Research/pdf/TOEFL-SUM-0506-CBT.pdf
Elder, C., & O’Loughlin, K. (2003). Investigating the relationship between intensive English
language study and band score gain on IELTS. IELTS Research Reports, 4(6), 207-254.
Council of Europe. (2002). Common European Framework of Reference for Languages: Learning, teaching, assessment. Cambridge, UK: Cambridge University Press.
Ferris, D. R. (1994). Lexical and syntactic features of ESL writing by students at different levels
of L2 proficiency. TESOL Quarterly, 28(2), 414-420.
Ferris, D. (1999). The case for grammar correction in L2 writing classes: A response to Truscott
(1996). Journal of Second Language Writing, 8(1), 1-11.
Ferris, D. R., & Hedgcock, J. (2013). Teaching L2 composition: Purpose, process, and practice.
New York; London: Routledge.
Fischer, K. W., Yan, Z., & Stewart, J. (2003). Adult cognitive development: Dynamics in the
developmental web. Handbook of developmental psychology, 491-516.
George, D., & Mallery, P. (2009). SPSS for Windows step by step: A simple guide and reference
16.0 update. Boston, MA: Allyn and Bacon.
Goodall, K., & Roberts, J. (2003). Only connect: Teamwork in the multinational. Journal of
World Business, 38(2), 150-164.
Green, A. (2005). EAP study recommendations and score gains on the IELTS Academic Writing
test. Assessing Writing, 10(1), 44-60.
Green, A., & Weir, C. (2003). Monitoring score gain on the IELTS academic writing module in
EAP programmes of varying duration. Phase 2 report. Cambridge, UK: UCLES,
Cambridge.
Hill, K., & Storch, N. (2008). What happens to international students' English after one semester
at university? Australian Review of Applied Linguistics, 31(1), 1-17.
Hu, G. (2007). Developing an EAP writing course for Chinese ESL students. RELC
Journal, 38(1), 67-86.
Humphreys, M. P., Haugh, M., Fenton-Smith, B., Lobo, A., Michael, R., & Walkinshaw, I.
(2012). Tracking international students’ English proficiency for the first semester of
undergraduate study. IELTS Research Reports Online Series, 41. Melbourne: IELTS
Partners: British Council, Cambridge English Language Assessment and IDP: IELTS
Australia 2012. Retrieved from
https://www.researchgate.net/profile/Ian_Walkinshaw/publication/282664695_Tracking_
international_students'_English_proficiency_over_the_first_semester_of_undergraduate_
study/links/5617660308ae1a8880037176.pdf
Hyland, K. (2013). Writing in the university: Education, knowledge, and reputation. Language
Teaching, 46(1), 53-70.
Institute of International Education. (2014). Open Doors Fact Sheet Country Report 2014.
Retrieved from http://www.iie.org/Research-and-Publications/Open-Doors/Data/Fact-
Sheets-by-Country/2014, on 2015 September 10.
International English Language Testing System [IELTS]. (2015). Retrieved September 10, 2015, from http://www.ielts.org/researchers/score_processing_and_reporting.aspx#Writing
International English Language Testing System [IELTS]. (2002). The IELTS handbook.
Cambridge: University of Cambridge Local Examinations Syndicate, The British
Council, IDP Australia.
Jenkins, J. (2009). English as a lingua franca: Interpretations and attitudes. World
Englishes, 28(2), 200-207.
Kellogg, R. T., & Raulerson, B. A. (2007). Improving the writing skills of college
students. Psychonomic Bulletin & Review, 14(2), 237-242.
Knoch, U., Rouhshad, A., Oon, S. P., & Storch, N. (2015). What happens to ESL students’
writing after three years of study at an English medium university? Journal of Second
Language Writing, 28, 39-52.
Knoch, U., Rouhshad, A., & Storch, N. (2014). Does the writing of undergraduate ESL students
develop after one year of study in an English-medium university? Assessing Writing, 21,
1-17.
Kreidler, C. J. (1971). Effective use of visual aids in the ESOL classroom. TESOL Quarterly, 5,
19-37.
Larsen-Freeman, D. (2006). The emergence of complexity, fluency, and accuracy in the oral and
written production of five Chinese learners of English. Applied Linguistics, 27(4), 590-
619.
Larsen-Freeman, D., & Anderson, M. (2013). Techniques and principles in language teaching (3rd ed.). Oxford, UK: Oxford University Press.
Marchman, V., Thal, D., Tomasello, M., & Slobin, D. I. (2005). Words and grammar. Beyond nature-nurture: Essays in honor of Elizabeth Bates, 141-164. Mahwah, NJ, US: Lawrence Erlbaum Associates Publishers.
Null, W. (2011). Curriculum: From theory to practice. Lanham, MD: Rowman & Littlefield
Publishers.
O’Loughlin, K., & Arkoudis, S. (2009). Investigating IELTS exit score gains in higher
education. IELTS Research Reports, 10(3), 1-86.
Ortega, L. (2003). Syntactic complexity measures and their relationship to L2 proficiency: A research synthesis of college-level L2 writing. Applied Linguistics, 24(4), 492-518.
Robinson, P. (2001). Task complexity, task difficulty, and task production: Exploring
interactions in a componential framework. Applied Linguistics, 22(1), 27-57.
Robinson, B. F., & Mervis, C. B. (1998). Disentangling early language development: Modeling
lexical and grammatical acquisition using an extension of case-study
methodology. Developmental Psychology, 34(2), 363-375.
Sasaki, M. (2007). Effects of study-abroad experiences on EFL writers: A multiple-data analysis. The Modern Language Journal, 91(4), 602-620.
Sasaki, M. (2009). Changes in English as a foreign language students’ writing over 3.5 years: A
sociocognitive account. Writing in foreign language contexts: Learning, teaching, and
research, 49-76. Great Britain: MPG Books Group.
Saville-Troike, M. (1984). What really matters in second language learning for academic achievement? TESOL Quarterly, 18(2), 199-219. doi:10.2307/3586690. Retrieved from http://www.jstor.org/stable/3586690
Seidlhofer, B. (2004). Research perspectives on teaching English as a lingua franca. Annual Review of Applied Linguistics, 24, 209-239.
Seidlhofer, B. (2011). Understanding English as a lingua franca. Oxford, UK: Oxford
University Press.
Selinker, L. (1972). Interlanguage. IRAL-International Review of Applied Linguistics in
Language Teaching, 10, 209-232.
Serrano, R., Tragant, E., & Llanes, À. (2012). A longitudinal analysis of the effects of one year
abroad. Canadian Modern Language Review, 68(2), 138-163.
Shaw, P., & Liu, E. T. K. (1998). What develops in the development of second-language
writing? Applied Linguistics, 19(2), 225-254.
Sizer, T. R. (2004). Horace's compromise: The dilemma of the American high school. Boston,
MA: Houghton Mifflin Harcourt.
Spack, R. (1988). Initiating ESL students into the academic discourse community: How far
should we go? TESOL Quarterly, 22(1), 29-51.
Storch, N. (2007). Development in L2 writing after a semester of study in an Australian
university. IJELT, 2(2), 173-190.
Storch, N. (2009). The impact of studying in a second language (L2) medium university on the
development of L2 writing. Journal of Second Language Writing, 18(2), 103-118.
Storch, N., & Tapper, J. (2009). The impact of an EAP course on postgraduate writing. Journal
of English for Academic Purposes, 8(3), 207-223.
Sung, C. C. M. (2014). English as a lingua franca and global identities: Perspectives from four
second language learners of English in Hong Kong. Linguistics and Education, 26, 31-39.
Thelen, E., & Smith, L. B. (1996). A dynamic systems approach to the development of cognition and action. Cambridge, MA: MIT Press.
Tini, T. (1998, October 3). English may be ticket to better job in Vietnam. Los Angeles Times.
Retrieved from http://articles.latimes.com/1998/oct/03/news/mn-28937
Truscott, J. (1996). The case against grammar correction in L2 writing classes. Language
Learning, 46(2), 327-369.
Tsang, W. K., & Wong, M. (2000). Giving grammar the place it deserves in process writing. Prospect, 15(1), 34-45. Adelaide, Australia: AMEP Research Center.
Van Geert, P. (2003). Dynamic systems approaches and modeling of developmental processes. Handbook of developmental psychology, 640-672.
Van Geert, P., & Steenbeek, H. (2005). A complexity and dynamic systems approach to development: Measurement, modeling and research. In K. W. Fischer, A. Battro, & P. Lena (Eds.), Mind, brain, and education (pp. 1-27). Cambridge: Cambridge University Press.
Van Geert, P., & Van Dijk, M. (2002). Focus on variability: New tools to study intra-individual variability in developmental data. Infant Behavior and Development, 25(4), 340-374.
Xudong, D., Cheng, L. K., Varaprasad, C., & Leng, L. M. (2010). Academic writing development of ESL/EFL graduate students in NUS. Reflections on English Language Teaching, 9(2), 119-138.
APPENDICES
APPENDIX A
CONSENT FORM
Dear participant,
My name is Ha Nguyen, and I am a graduate student in the Linguistics Department at Southern Illinois University at Carbondale, IL, USA. I have been granted approval by the Human Subjects Committee at SIUC to contact you to request your participation in a research study, which I am conducting as part of my thesis research. The purpose of my study is to investigate the micro and macro skills in second language academic writing in a Vietnamese educational setting.
Participation in this study is VOLUNTARY, but your willingness to take part in it is greatly
appreciated because I need to collect an adequate amount of data for my study. Please, choose
from the following two options:
1) If you do not want to participate, return this form to your teacher without signing it.
While your classmates who agree to participate are writing the essay, you will be given a
task related to your Writing class that you will complete silently. You will receive
constructive feedback on this task without a grade.
2) If you agree to participate, sign this form and return it to your teacher. After your
teacher receives the signed form, she will give you a writing task which is modeled
after an IELTS Academic Module Writing Task 2. You will have to write an academic
essay in response to the writing task. The essay takes 45 minutes to complete.
If you agree to participate, you will not be given extra credit for participation. However, doing the task will be helpful for your preparation to take the IELTS exam. After the essays are scored, you will receive extensive feedback on the areas of your writing that you need to work on in order to perform better on the actual IELTS. This feedback will help you learn, but will not be used in forming your class grades.
I can assure you that your responses will be kept confidential. The people who will have
access to the essay are: my thesis chair, Dr. Krassimira Charkova, Research Advisor, Department
of Linguistics, and myself. Our contact information is given in the next paragraph.
Questions about this study can be directed to me, Ha Nguyen, address: 800 East Grand Avenue
Apt 2C, tel: (858)381-7141; email: [email protected] or my thesis chair, Dr. Krassimira
Charkova, Research Advisor, Department of Linguistics, Faner Building 3225 SIUC,
Carbondale, IL, 62901, office tel: (618) 453 3425, email: [email protected].
Thank you for your precious collaboration and assistance in this research.
Signing this form indicates voluntary consent to participate in this study.
Signature _______________________________________________________________
This project has been reviewed and approved by the SIUC Human Subjects Committee.
Questions concerning your rights as a participant in this research may be addressed to the
Committee Chairperson, Office of Sponsored Projects Administration, Southern Illinois
University, Carbondale, IL 62901-4709. Phone (618) 453-4533. E-mail: [email protected]
APPENDIX B
RESEARCH INSTRUMENT
Dear participant,
Thank you for agreeing to take part in the study conducted by Ms. Ha Nguyen. As already
specified on the Consent Form that you have read and signed, you will complete a writing task
within a 45-minute time limit. Before this, please complete the following demographic
questionnaire:
1) Your name: ______________________________________________________________
2) Your Gender: Male ______________ Female _________________
3) Your Age: ___________________
4) Your Year of study: Second Year _____________ Fourth Year ____________
5) How familiar are you with the IELTS writing tasks?
a) I have no idea of what IELTS writing tasks are.
b) I have a vague idea of what IELTS writing tasks are.
c) I am familiar with the IELTS writing tasks.
d) I am familiar with the IELTS writing tasks and I have received practice in how to write
IELTS essays.
e) I am familiar with the IELTS writing tasks and I have taken the IELTS exam.
Now, go to the next page in order to see the question that you need to address in writing the
essay. Read it carefully, and begin writing your response.
WRITING TASK
Education should be accessible to people of all economic backgrounds. All levels of education, from primary school to tertiary education, should be free.

To what extent do you agree with this opinion?

[Lined space provided on the original instrument for the handwritten essay.]
APPENDIX C
IELTS WRITING TASK 2 DESCRIPTORS
VITA
Graduate School
Southern Illinois University
Ha Thi Thanh Nguyen
College of Foreign Languages, Danang University, Vietnam
Bachelor of English Studies, English Department, June 2009
Special Honors and Awards:
- Fulbright scholarship for Master’s program at Southern Illinois University Carbondale, Illinois, USA, 2014-2016.
- Alpha Delta Kappa scholarship for Master’s program at Southern Illinois University Carbondale, Illinois, USA, 2014-2016.
- Southern Illinois University Carbondale (SIUC) Women’s Civic Institute Award for Women Leadership.
- Top 10 Graduation, College of Foreign Languages, Danang University, Vietnam.
Thesis Title:
Macro and Micro Skills in Second Language Academic Writing: A Study of Vietnamese
Learners of English
Major Professor: Dr. Krassimira Charkova