
The Impact of Conferencing Assessment on EFL Students’ Grammar Learning

Impacto de la evaluación mediante conferencias en el aprendizaje de la gramática en estudiantes de inglés como lengua extranjera

 

Sasan Baleghizadeh*

Shahid Beheshti University, G.C., Iran

Zahra Zarghami**

Allameh Tabataba’i University, Iran

*sasanbaleghizadeh@yahoo.com

**zarghami_z@yahoo.com

This article was received on January 31, 2012, and accepted on June 1, 2012.


This article reports on a study that was carried out in order to examine the impact of conferencing assessment on students’ learning of English grammar.  Forty-two Iranian intermediate university students were randomly assigned to an experimental and a control group. The participants in the experimental group took part in four individual and four whole class conferences. The participants in the control group studied the same grammatical points but they were not involved in conferencing assessment. The results of the study showed that the experimental group performed significantly better than the control group on the given post-test. Moreover, the attitudes of the participants toward grammar learning in the experimental group significantly changed from the first administration of a questionnaire to its second administration.

Key words: Alternatives in language assessment, assessment for learning, conferencing assessment.


En este artículo se reporta un estudio llevado a cabo con el fin de examinar el impacto de la evaluación mediante conferencias en el aprendizaje de gramática inglesa. Cuarenta y dos estudiantes universitarios iraníes, de nivel intermedio, fueron asignados aleatoriamente a dos grupos: uno experimental y otro de control. Los estudiantes del grupo experimental participaron en cuatro entrevistas individuales y cuatro con toda la clase. Los del grupo de control estudiaron los mismos elementos gramaticales pero no estuvieron involucrados en conferencias de evaluación. Los resultados del estudio mostraron que el grupo experimental tuvo un desempeño significativamente mejor que el del grupo de control en el examen que se realizó al final del proceso investigativo. Además, se halló que las actitudes de los participantes del grupo experimental hacia el aprendizaje de la gramática cambiaron entre la primera y la segunda aplicación de un cuestionario.

Palabras clave: alternativas en evaluación del lenguaje, evaluación del aprendizaje, evaluación mediante conferencias.


Introduction

In every country, educational policymakers place great emphasis on tests and test scores. Tests are considered to be measurement tools and, more often than not, important decisions about people are made based on their test scores. People usually think that it is the test itself and the score on the test which are so important. However, the fact is that “it is the use to which we put their results that can be appropriate or inappropriate” (Bailey, 1998, p. 204).

Tests, however, are just one of the possible methods of assessment. Practitioners in the field of education make a distinction between tests and assessment. As Brown (2004) states, “tests are formal procedures, usually administered within strict time limitations, to sample the performance of a test-taker in a specified domain” while “Assessment includes all occasions from informal impromptu observations and comments up to and including tests” (p. 251).

The reliability and validity of a test were considered to be the two most important issues in designing traditional tests such as multiple-choice items and other standardized tests. Such tests were constructed to ensure both objectivity and ease of administration and scoring. Since the 1990s, there has been a major paradigm shift in the domain of language testing and assessment. The shortcomings of standardized tests convinced specialists to replace traditional tests with new kinds of language assessment. Portfolios, journals, self- and peer-assessment, and many other techniques have been introduced in order to evaluate students’ achievement. Brown and Hudson (1998, p. 657) state that using the term “alternative assessment” for the newly introduced methods of language assessment is counterproductive because the term implies something completely new and distinct from what was done before. They suggest the term “alternatives in language assessment” for these methods of language assessment (Brown, 2004, p. 252).

The last decade has also witnessed another widespread change in language assessment concepts and methods. One of the main reasons for such a shift is the growing interest of practitioners in the concept of “assessment for learning,” which means considering teaching, learning, and assessment as an integrated and interdependent chain of events (Lee, 2007). Assessment for learning is best defined as a process by which assessment information is used by teachers to adjust their teaching strategies, and by students to adjust their learning strategies. Based on this view, assessment, teaching, and learning are interdependently linked, as each one imposes its own effect on the others (Assessment Reform Group, 2002).

Conferences, a special type of purposeful conversation or discussion between teachers and learners, can be regarded as a new form of evaluating students’ achievement in different educational settings. Genesee and Upshur (1996) argue that conferences involve teachers and learners meeting in an office or classroom to discuss the students’ performance in their learning process. They stress that during a conference the focus of the instructor should be on the learners and their needs in the learning process they are experiencing.

Since the inception of alternative assessment methods, a number of researchers have tried to investigate the effectiveness of these new methods of assessment on the language learning of different groups of students. Ross (2005) investigated the impact of using formative methods of assessment on foreign language proficiency growth by involving eight cohorts of foreign language learners in an eight-year longitudinal study. The results of this study indicate that formative assessment practices yield very positive effects on language proficiency growth. Cheng and Warren (2005) investigated the benefits of peer-assessment in English language programs. In their study, undergraduate engineering students attending a university in Hong Kong were asked to assess the English language proficiency of their peers. Their study also compared peer and teacher assessments. The findings suggest that students had a less positive attitude toward assessing their peers’ language proficiency, but they did not score their peers’ language proficiency very differently from the other assessment criteria. Firooz-Zareh (2006) examined the relationship between alternative assessment techniques and Iranian students’ reading proficiency. Throughout a whole semester, the two techniques of self-assessment and conferencing were put into practice in the experimental group. The findings of his study support the inclusion of alternative assessment techniques in assessment and instruction. Likewise, Besharati (2004) looked into the impact of alternative assessment techniques on Iranian students’ listening comprehension. Again, a combination of the two techniques of self-assessment and conferencing was put into practice in the experimental group. The results of this study pointed to the positive effects of incorporating alternative assessment procedures on the listening comprehension skills of Iranian university learners.

Linn, Baker, and Dunbar (1991) have proposed eight criteria for the validation of performance-based assessments, such as many alternative assessment methods, as follows:

Serious validation of alternative assessments needs to include evidence regarding the intended and unintended consequences, the degree to which performance on specific assessment tasks transfers, and the fairness of the assessment. Evidence is also needed regarding the cognitive complexity of the processes students employ in solving problems and the meaningfulness of the problems for students and teachers. In addition, a basis for judging both content quality and the comprehensiveness of the content coverage needs to be provided. Finally, the cost of the assessment must be justified. (p. 20)

The Present Study

A review of the available literature reveals that much of the research regarding the efficacy of alternative assessment methods has been carried out in English as a Second Language (ESL) contexts and that these studies have focused on reading and writing skills. The application of alternative assessment methods, however, has grown rapidly beyond the ESL context to many varied situations, particularly English as a Foreign Language (EFL) contexts. To date, the effectiveness of alternative assessment methods incorporating principles of assessment for learning has not been investigated in the EFL learning context of Iran. Therefore, more empirical research is required to examine the impact of alternative assessment methods and assessment for learning techniques on language learners’ attitudes and achievements. The present study thus aims to investigate the efficacy of the conferencing assessment procedure on the grammar learning of Iranian EFL students and their attitudes toward formal grammar learning by seeking to answer the following research questions:

1.       Does conferencing assessment have any impact on Iranian EFL students’ grammar learning?

2.       Does conferencing assessment change the attitude of Iranian EFL students toward formal grammar learning?

3.       Does traditional summative assessment change the attitude of Iranian EFL students toward formal grammar learning?

4.       Is there any change in the attitude of the students in both groups (conferencing versus traditional assessment) toward formal grammar learning?

Method

Participants

The participants for this study were 42 Iranian intermediate EFL students (22 females and 20 males) majoring in different fields (information technology, computer engineering, accounting, etc.) in one of the branches of the University of Applied Science and Technology in Tehran, Iran. They were freshmen with an average age of 22. The participants were members of two classes taking a course named General English I. These classes, both taught by the same teacher, were randomly assigned to one experimental group (n=20) and one control group (n=22).

Instruments

The main instrument used in this study was a 50-item grammar test consisting of 25 multiple-choice and 25 error recognition items. The test was administered to both groups in the first and last sessions of the experimental period. The questions were based on the topics students were supposed to study during the course General English I, namely, (a) verb tenses (including simple present, simple past, future, present continuous, past continuous, future continuous, present perfect, past perfect, future perfect, present perfect continuous, past perfect continuous, and future perfect continuous), (b) auxiliary verbs, (c) coordination (including coordinating conjunctions and conjunctive adverbs), and (d) subordinators (including relative pronouns and adverbial subordinating conjunctions). There were 12 items from each topic except verb tenses, which included 14 items in the test, most of which were adopted from previous actual samples of the Test of English as a Foreign Language (TOEFL), officially released by Educational Testing Service  (ETS). Some of the test items are provided in Appendix A.

Given that the items were selected and adopted from various sources, there was a need to check the reliability as well as the content validity of the test. The reliability of the test, measured through the Kuder-Richardson 21 formula, turned out to be 0.89, and its content validity was approved by the course instructor as well as by an EFL university professor.
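
For reference, the Kuder-Richardson 21 coefficient reported above depends only on the number of items and on the mean and variance of the total scores. The short sketch below illustrates the computation in Python; the score list is purely hypothetical and is not the study's data, which yielded a coefficient of 0.89.

```python
# Illustrative sketch: Kuder-Richardson 21 (KR-21) reliability estimate.
# The total scores below are hypothetical placeholders for a 50-item test.

def kr21(total_scores, k):
    """KR-21 = (k / (k - 1)) * (1 - M * (k - M) / (k * Var))."""
    n = len(total_scores)
    mean = sum(total_scores) / n
    var = sum((s - mean) ** 2 for s in total_scores) / n  # population variance
    return (k / (k - 1)) * (1 - mean * (k - mean) / (k * var))

hypothetical_totals = [31, 28, 40, 35, 22, 38, 30, 27, 42, 33]  # out of k = 50
print(round(kr21(hypothetical_totals, k=50), 2))
```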

This test served three purposes in this study: It was used as the pre-test as well as the post-test. Moreover, it functioned as an instrument to determine the homogeneity of both groups at the beginning of the study in terms of their grammatical knowledge.

To find out the attitude of Iranian university students regarding formal grammar learning, a questionnaire developed by Schulz (2001) was used. This questionnaire was administered twice (once at the beginning and once at the end of the treatment period) to determine whether or not the participants’ responses on the first administration would differ from their responses on the second administration. The questionnaire had a five-point Likert scale (strongly disagree, disagree, undecided, agree, and strongly agree). The minimum and maximum scores on this questionnaire were 13 and 65, respectively. The reliability of this questionnaire, calculated through Cronbach’s alpha, was acceptable (α > .60).
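
Since the questionnaire consists of 13 five-point Likert items, the minimum and maximum totals of 13 and 65 follow directly (13 × 1 and 13 × 5). The sketch below illustrates, with an invented response matrix, how such totals and a Cronbach's alpha estimate are typically computed; it is not the authors' scoring procedure.

```python
# Illustrative sketch: scoring a 13-item, 5-point Likert questionnaire and
# estimating Cronbach's alpha. The response matrix is hypothetical.

def cronbach_alpha(responses):
    """responses: list of respondents, each a list of item scores (1-5)."""
    k = len(responses[0])   # number of items (13 here)

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([r[i] for r in responses]) for i in range(k)]
    total_var = var([sum(r) for r in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Three hypothetical respondents; totals range from 13 (all 1s) to 65 (all 5s).
responses = [
    [4, 3, 5, 4, 4, 2, 3, 4, 5, 3, 4, 4, 3],
    [2, 2, 3, 2, 1, 2, 3, 2, 2, 3, 2, 1, 2],
    [5, 4, 4, 5, 5, 4, 3, 5, 4, 4, 5, 4, 4],
]
print([sum(r) for r in responses])          # questionnaire totals
print(round(cronbach_alpha(responses), 2))  # internal consistency estimate
```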

In order to collect appropriate data for the study, the following steps were taken. In the first session of the treatment, the grammar test was administered to both groups. In addition, the questionnaire was given to all the participants and some explanations were given by the instructor to help learners complete the questionnaire.

Throughout the ten-week semester, the conferencing assessment technique was utilized in the experimental group based on the grammar points programmed to be taught in the class. The technique was implemented through a conference checklist, a set of questions to be asked by the instructor, which served as the specific treatment for this group. It can be considered a kind of treatment in that the participants gave the instructor feedback on their strengths and weaknesses in grammar tasks, and the instructor provided them with the necessary feedback regarding their problems and helped them overcome their weaknesses. The main purposes of these conference sessions were the following: (a) to allow the instructor and the students to talk about learning different grammar points constructively, (b) to provide both the teacher and the students with an invaluable source of information about the students’ progress in their learning, (c) to identify the gaps in the students’ understanding of the subject matter as well as to provide them with the necessary positive feedback to motivate them, and (d) to create a supportive atmosphere for the students to experience problem solving and information sharing processes (Stiggins & Chappuis, 2005). The checklist (Appendix B) included two sets of questions, namely:

a.       The questions asked in the first conference. Examples:

-         What do you think about your grammar ability?

-         How do you try to learn grammar?

Sample responses from the learners:

It’s terrible. I don’t like grammar. (Student 4)

Yes, if I try very hard I can be successful. (Student 8)

A good learner is someone who is really careful about all the details. A good grammar learner is also somebody who has a very good memory. (Student 17)

b.       These questions were asked after covering each of the grammar structures mentioned before:

-         Do you think you have been successful in learning this grammatical structure?

-         What is your weakness in this lesson?

Sample responses from the learners:

Yes, I have been successful to some extent. (Student 2)

Now I am able to answer grammar questions easily. I can use these structures in my language accurately. (Student 19)

I always had problems with different tenses, especially in my speaking, but now I can use them accurately in my writing and speaking. (Student 11)

In the first session of individual conferencing, each of the learners was supposed to answer the first set of questions of the conference checklist. At this time, the instructor was required to create a comfortable setting to perform face-to-face conferences which would allow the learners to talk about their problems freely. The students were advised to feel relaxed in all the conference sessions. They were assured that the main purpose of the conferencing assessment was to identify their thoughts, strengths, and weaknesses in order to help them improve their learning. After completing each grammar point, the participants were required to respond to the second set of questions of the conference checklist either individually or in whole class conferences. Based on their answers, the instructor provided them with appropriate oral feedback to help them overcome their problems in learning that specific grammatical feature. The instructor’s feedback was supposed to be consistent with the following English language teaching rules:

-         Giving relevant, practical, and constructive feedback.

-         Making feedback specific rather than general.

-         Giving feedback as immediately as possible.

-         Focusing on the points that may help or lead to more achievements.

-         Concentrating on one particular point at a time.

-         Using non-threatening language, especially for giving negative feedback.

-         Considering the learners’ needs and wants.

-         Making sure that the feedback is understood by the learners.

On the whole, the participants took part in eight conferences (four individual conferences for each learner and four whole class conferences). All the conferences were conducted orally in English and on average lasted for eight minutes. The instructor gave the participants ample time to talk about their problems and then provided them with appropriate feedback.

In the control group, the routine syllabus, based on the presentation, practice, and production (PPP) model, was followed without any resort to alternative assessment techniques. The procedure was as follows: the instructor taught the units and then the participants did the exercises. The participants were not involved in any individual or whole class conferences and were passive most of the time, except when they were doing the exercises.

At the end of the treatment period (about ten weeks), the participants in both groups were given the post-test. Since the ten-week interval was considered long enough for the participants not to remember the items from the first administration, the same test used as the pre-test was administered as the post-test. In addition, the same grammar learning attitude questionnaire was given to all the participants to see whether their responses on the first administration differed from their answers on the second administration.

The researchers analyzed the participants’ scores on the pre- and post-tests of grammar by using an independent samples t-test. The scores of the participants on the pre- and post-course questionnaire were analyzed by using both paired and independent samples t-tests. All statistical analyses were carried out using the Statistical Package for the Social Sciences (SPSS), version 15.0, with alpha set at .05.
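
As a rough illustration of these analyses, the following sketch reproduces the two kinds of tests (independent and paired samples t-tests with alpha at .05) in Python with SciPy rather than SPSS; the score vectors are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch of the statistical analyses described above, using
# SciPy instead of SPSS. All score vectors are hypothetical placeholders.
from scipy import stats

ALPHA = 0.05

# Independent samples t-test: experimental vs. control post-test scores.
experimental_post = [42, 38, 45, 40, 44, 39, 41, 43]   # hypothetical
control_post      = [33, 35, 30, 36, 32, 34, 31, 35]   # hypothetical
t_ind, p_ind = stats.ttest_ind(experimental_post, control_post)
print(f"independent t = {t_ind:.2f}, p = {p_ind:.3f}, significant: {p_ind < ALPHA}")

# Paired samples t-test: one group's questionnaire scores before vs. after.
pre_questionnaire  = [40, 38, 45, 36, 42, 39, 41, 37]  # hypothetical
post_questionnaire = [44, 41, 47, 40, 45, 41, 45, 39]  # hypothetical
t_pair, p_pair = stats.ttest_rel(pre_questionnaire, post_questionnaire)
print(f"paired t = {t_pair:.2f}, p = {p_pair:.3f}, significant: {p_pair < ALPHA}")
```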

Results

After administering the pre-test, the participants’ scores were used to check for the homogeneity of both groups at the outset of the study. The descriptive statistics of the pre-test are presented in Table 1.

An independent samples t-test was used to see if there was any statistically significant difference between these two groups. Table 2 shows the results.

The results indicate that there was not a statistically significant difference between the mean scores of both groups t (40)=.82, p=.93. Thus, it can be concluded that both groups of the students participating in this study met the condition of homogeneity.

After the ten-week treatment period, consisting of 18 sessions, the post-test was administered. The descriptive statistics of the post-test are presented in Table 3.

To answer the first research question of the study, an independent samples t-test was used to compare the mean scores of both groups (see Table 4).

The results revealed that there was a significant difference between the mean scores of both groups t (40)=7.37, p=.001. This suggests that the participants in the experimental group significantly outperformed their peers in the control group on the post-test. Therefore, the first research question was answered in the affirmative, which indicates that conferencing assessment played a substantial role in the grammar learning of the participants in the experimental group.

In this study, the same questionnaire was administered at the beginning and at the end of the experiment to compare the participants’ attitudes toward formal grammar learning before and after the treatment period.

To answer the second research question, two paired samples t-tests were used to compare the probable differences between the participants’ attitudes in each group toward formal grammar learning prior to and after the treatment period.

Table 5 displays the data obtained from the experimental group. The results show that the mean difference (3.60) is statistically significant t (19)=3.70, p=.002, which suggests that the conferencing technique worked with the participants in the experimental group and changed their attitudes as well. Thus, the second research question was answered positively, too.

However, as Table 6 shows, a paired samples t-test evaluating the control group participants’ attitudes toward formal grammar learning revealed that there was no significant difference before and after taking part in the traditional summative assessment, t (19)=-.66, p=.51. This suggests that the answer to the third research question is negative.

In order to investigate the fourth research question, an independent samples t-test was performed on the post-course questionnaire scores of both groups. The descriptive statistics of the post-course questionnaire are presented in Table 7.

The results of an independent samples t-test revealed that there was a significant difference between the mean scores of both groups on the post-course questionnaire, t (40)=3.32, p=.002 (see Table 8). Thus, it can be concluded that using alternative assessment procedures positively changed the attitudes of the participants toward formal grammar learning.

Discussion

This study set out to investigate the efficacy of alternative assessment methods in EFL contexts. More specifically, the main purpose of this study was to examine the impact of conferencing assessment on Iranian EFL students’ grammar learning. Furthermore, the data shed light on possible differences in terms of the participants’ attitudes toward grammar learning prior to and after implementing different treatment conditions.

In brief, the results reported above revealed two related findings. First, the participants who took part in the conferencing assessment showed significantly more improvement than their peers in the control group. Second, these students revealed positive attitudes toward formal grammar learning after experiencing this alternative assessment method. Therefore, it can be concluded that integrating teaching, learning, and assessment processes through alternative assessment procedures may have positive effects on EFL learners’ achievements in grammar learning and on their attitudes toward learning it.

The findings of the present study corroborate those of studies conducted by Besharati (2004), Firooz-Zareh (2006), and Ross (2005) in that incorporating alternative assessment procedures in language classes has a positive effect on students’ learning. Viewing language tests as part of an ongoing process of assessment can change their nature from assessment tools to learning tools, making them an effective way to improve students’ learning and their attitudes toward it.

Considering the first research question, the findings of the study pointed to the significant effects of this alternative assessment procedure on Iranian EFL students’ grammar learning. These results lead us to conclude that implementing alternative assessment procedures and applying principles of assessment for learning have promoted grammar learning of the participants of this study more than the traditional summative assessment technique.

The results obtained in this study can be attributed to the following reasons:

1.       Feedback based on assessment is one of the most powerful factors in teaching and learning. Maximizing the quality, appropriateness, and use of feedback should be a core aim of all assessment procedures. Feedback can drive a loop of continuous change and improvement for both the teacher and the student, as both learn from each other (Stiggins, 2002).

2.       The assessment procedure used in this study may have encouraged the participants to take responsibility for their own learning by engaging them in self-assessment, reflection, goal setting, monitoring, and communicating their own progress (Anderson,  1998; Rash, 1997). As Stiggins (2005) states, when students actively participate in assessing their learning by interpreting their performance, they are in a better position to recognize the important moments of personal learning. This helps them identify their own strengths and needs and discover how to make better instructional decisions.

3.       A desirable aim of teaching and assessment is to encourage independence in learners by making them capable of controlling their own learning. The alternative assessment procedure utilized in this study propelled the participants of the experimental group into independence by involving them in the assessment process, decision making, and goal setting. It has been argued that participating in alternative assessments can assist learners in becoming skilled judges of their own strengths and weaknesses, which can develop their capacity to become self-directed and autonomous learners and thus develop lifelong learning skills (Brindley, 2001).

4.       By encouraging learners to observe and analyze target grammar items for themselves, alternative assessment procedures reinforce their natural tendency and ability to make sense of language and to systematize it. The alternative assessment technique used in this study involved learners in doing consciousness-raising tasks which highlighted certain grammatical topics for them and encouraged them to learn for themselves (Ellis, 1993).

5.       One of the most important purposes of assessment for learning is the role it plays in students’ motivation. Knowledge and understanding of what is to be achieved are not enough. Students must want to make the effort and must be willing to keep on engaging, even when they find the learning task difficult. Assessment that encourages learning promotes motivation by emphasizing progress and achievement rather than failure (Harmer, 1987; Stiggins, 2004). In this study, the participants of the experimental group were in a position to judge whether success was within or beyond reach, whether learning was worth the required effort, and whether they should strive for it.

As for the second and third research questions of the study, which were intended to investigate the impact of alternative and traditional assessment procedures on Iranian EFL students’ attitudes toward formal grammar learning, it was found that conferencing assessment, over the course of the study, had significantly changed the students’ attitudes, whereas traditional summative assessment had not.

Each of the following issues can be considered as a probable reason for the change in the students’ attitudes toward formal grammar learning in this study.

1.       Experiencing a new assessment method (Guskey, 2003; Ho, 2003; Scouller, 1996; Spavold, 2005). Students’ previous learning experiences, formed mainly in teacher-centered grammar translation classes, had shaped negative attitudes toward grammar learning. Being involved in an innovative learning situation, in which the learners are asked to speak about their strengths and weaknesses in learning different grammatical points and the teacher’s main focus is on helping them overcome their problems, is likely to enhance their attitudes toward learning.

2.       Assuring the learners that they are capable of learning (Stiggins & Popham, 2008). Encouraging learners to talk about the learning processes they are experiencing can help them become more aware of what they are learning as well as how they are learning it. This can increase the students’ desire to learn and consequently affect their attitudes toward learning.

3.       Providing learners with a set of clearly defined learning goals (Stiggins, 2002). In conferencing assessment the learners are encouraged to talk about their improvements in learning the subject matter. In this process, they are not just thinking about what they have learnt, but how they are learning. In thinking about how they learn, they can achieve a better understanding of the learning goals and develop positive feelings regarding the learning processes they have undertaken.

4.       Motivating the learners to learn (Race, 1995). Involving learners in the assessment and decision-making processes is an effective way to increase their self-esteem and motivate them to learn more. With the conferencing method, the focus of instruction and assessment is on the learners’ ideas, beliefs, and needs in a specific learning situation. In such cases, the learners feel ownership of the assessment and are, therefore, more motivated to learn.

Concerning the last research question, the significant difference between the attitudes of the conferencing group and those of the control group on the post-course questionnaire lends support to the valuable role of communication and face-to-face interaction in changing learners’ attitudes toward grammar learning. As Harris and Bell (1994) indicate, “[a]ssessing without communication is of doubtful value: communication between the teacher and the learner is an essential part of the learning process and should be on a regular basis” (p. 18). Interactive communication between the instructor and the students during the conferences in this study might have informed the instructor’s teaching by providing her with more information about each student’s personality type, learning styles and strategies, feelings toward the learning processes they were involved in, and desires and needs in the course of study. All this information helped the instructor support, guide, monitor, and teach the students more effectively. When the students perceived such relevance between what they wanted and what they received from the teacher during the teaching and assessment processes, they might have been more motivated to learn, which could have affected their attitudes toward learning in positive ways.

The results of this study also point to the importance of considering the learners’ needs and ideas in teaching, learning, and assessment processes (Kaufman, 2000). Student-involved classroom assessment can be effective by providing teachers with constant needs analyses and increased understanding of the students’ concerns and problems. It can also be helpful for learners by encouraging them to identify their own strengths and weaknesses, to promote their autonomy and independent learning skills, and to increase responsibility for their own learning. Students’ involvement in decision-making and assessment processes can enhance their motivation by creating a situation for optimal learning, introducing expected  learning goals, providing appropriate feedback, promoting meaningful learning, and facilitating students’ development in independent learning, which in turn can affect their attitudes toward learning.

Conclusion

The purpose of this study was to investigate the impact of conferencing assessment on the grammar learning of Iranian EFL students and their attitudes toward formal grammar learning. The overall picture drawn from this study suggests that conferencing assessment has a positive impact on EFL students’ grammar learning and can change their attitudes toward grammar learning. Using process-oriented assessment procedures like conferencing assessment can provide ample opportunities for teachers and students to communicate with each other. Hence, teachers can facilitate learning by providing students with appropriate descriptive feedback during their learning process and help them identify their problems. In this way, students and teachers can work as assessment partners who have clear-cut learning goals and specific assessment tasks. This process can lead students to take control of their own success and to accept responsibility for their own learning, which will naturally motivate them toward more effective learning and greater achievement. Finally, it should be mentioned that assessment should not be considered something independent of instruction. To be more authentic, assessment should be based on the learners’ behaviors exhibited during formative and continuous evaluation, and students must be aware of the expected outcomes of instruction and assessment, the processes involved, and the criteria on which they will be evaluated.


References

Anderson, R. S. (1998). Why talk about different ways to grade? The shift from traditional assessment to alternative assessment. New Directions for Teaching and Learning, 74, 5-16.

Assessment Reform Group. (2002). Assessment for learning: 10 principles. Port Melbourne: Cambridge University Press.

Bailey, K. M. (1998). Learning about language assessment: Dilemmas, decisions, and directions. Boston: Heinle & Heinle.

Besharati, F. (2004). The impact of alternative assessment techniques on Iranian students’ achievements in listening comprehension skills (Unpublished master’s thesis). Al-Zahra University, Iran.

Brindley, G. (2001). Outcomes-based assessment in practice: Some examples and emerging insights. Language Testing, 18(4), 393-407.

Brown, H. D. (2004). Language assessment: Principles and classroom practice. White Plains, NY: Pearson Education.

Brown, J. D., & Hudson, T. (1998). The alternatives in language assessment. TESOL Quarterly, 32(4), 653-675.

Cheng, W., & Warren, M. (2005). Peer assessment of language proficiency. Language Testing, 22(3), 93-121.

Ellis, R. (1993). Second language acquisition and the structural syllabus. TESOL Quarterly, 27(1), 91-113.

Firooz-Zareh, A. R. (2006). The effectiveness of alternative assessment and traditional methods of testing on Iranian EFL adult learners’ reading proficiency (Unpublished master’s thesis). Allameh Tabataba'i University, Iran.

Genesee, F., & Upshur, J. (1996). Classroom-based evaluation in second language education. Cambridge: Cambridge University Press.

Guskey, T. R. (2003). How classroom assessments improve learning. Educational Leadership, 60(5), 6-11.

Harmer, J. (1987). Teaching and learning grammar. London: Longman.

Harris, D., & Bell, C. (1994). Evaluating and assessing for learning. London: Kogan Page.

Ho, L. (2003). Self- and peer-assessments: Vehicles to improve learning. CDTL Brief, 6(3). Retrieved from http://www.cdtl.nus.edu.sg/brief/v6n3/sec5.htm

Kaufman, L. M. (2000). Student-written tests: An effective twist in teaching language. The Journal of the Imagination in Language Learning and Teaching, V, 1-5.

Lee, I. (2007). Assessment for learning: Integrating assessment, teaching, and learning in the ESL/EFL writing classroom. The Canadian Modern Language Review, 64(1), 199-214.

Linn, R. L., Baker, E. L., & Dunbar, S. B. (1991). Complex, performance-based assessment: Expectations and validation criteria. Educational Researcher, 20(8), 15-21.

Race, P. (1995). What has assessment done for us and to us? In P. Knight (Ed.), Assessment for learning in higher education (pp. 61-74). London: Kogan Page.

Rash, A. M. (1997). An alternative method of assessment: Using student created problems. Primus, 7, 89-95.

Ross, S. (2005). The impact of assessment method on foreign language proficiency growth. Applied Linguistics, 26(3), 317-342.

Schulz, R. A. (2001). Cultural differences in student and teacher perceptions concerning the role of grammar instruction and corrective feedback: USA-Colombia. The Modern Language Journal, 85(2), 244-256.

Scouller, K. M. (1996). Influence of assessment method on students’ learning approaches, perceptions, and preferences: Assignment essay versus short answer examination. Research and Development in Higher Education, 19(3), 776-781.

Spavold, Z. (2005). Using formative assessment to raise pupil motivation: A small classroom-based study. School Science Review, 86(317), 119-123.

Stiggins, R. J. (2002). Assessment crisis: The absence of assessment FOR learning. Phi Delta Kappan, 83(10), 758-765.

Stiggins, R. J. (2004). New assessment beliefs for a new school mission. Phi Delta Kappan, 86(1), 22-27.

Stiggins, R. J. (2005). From formative assessment to assessment for learning: A path to success in standard-based schools. Phi Delta Kappan, 87(4), 324-328.

Stiggins, R. J., & Chappuis, J. (2005). Using student-involved classroom assessment to close achievement gaps. Theory into Practice, 44(1), 11-18.

Stiggins, R. J., & Popham, W. J. (2008). Assessing students’ affect related to assessment for learning. Washington, DC: Council of Chief State School Officers.


About the Authors

Sasan Baleghizadeh is an assistant professor of TEFL at Shahid Beheshti University, G.C. in Iran, where he teaches courses in applied linguistics, syllabus design, and materials development. His published articles appear in journals like PROFILE, ELT Journal, and Modern English Teacher.

Zahra Zarghami holds an MA degree in TEFL from Allameh Tabataba’i University in Iran. She has vast experience of English language teaching at different proficiency levels. Her research interest lies in issues related to assessment for learning.


Appendix A: Sample Test Items


Appendix B: Conference Checklist

Directions

The following questions will be asked in a comfortable setting. The session will be carried out face-to-face. The students should feel safe and comfortable, without any worry about the assessment atmosphere. They should be assured that the teacher is only interested in their thoughts, strengths, and weaknesses and in helping them learn grammar. The teacher can ask the students to elaborate on their answers by using questions like:

-         Can you tell me more about it?

-         What else do you suggest?

The more the students talk, the more insight the teacher can gain into them and their learning process.

Part 1

The following questions will be asked in the very first session before doing anything:

-         What do you think about your grammar ability?

-         Do you think you are successful in learning grammar?

-         Who is a good grammar learner?

-         How do you try to learn grammar?

-         Which strategies do you use in learning grammar?

-         What do you do if you have problems in background information in grammar?

-         What do you do if you have problems with the meaning of key words in the process of learning grammar?

-         What do you do if you have difficulty in comprehending the structure of the context that you are going to learn?

-         What does your teacher do in helping you to improve your weaknesses in learning grammar?

-         What do you do in removing your friend’s problems in learning grammar?

Part 2

The following questions will be asked after covering each unit of the book:

-         Do you think you have been successful in learning the grammar structures?

-         What is the reason for your success/failure in learning these grammar points?

-         What is your strength in this lesson? Why do you think so?

-         What is your weakness in this lesson? Why?

-         In which part do you have a problem? Educational background, vocabulary, or comprehension of the passages?

-         Why do you think so? What is your reason?

-         Which strategy do you utilize in the process of learning grammar?

-         Which strategy do you utilize in overcoming your barriers?
