<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE article
  PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "https://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article article-type="research-article" dtd-version="1.1" specific-use="sps-1.9" xml:lang="en" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">
	<front>
		<journal-meta>
			<journal-id journal-id-type="publisher-id">prf</journal-id>
			<journal-title-group>
				<journal-title>Profile Issues in Teachers’ Professional Development</journal-title>
				<abbrev-journal-title abbrev-type="publisher">profile</abbrev-journal-title>
			</journal-title-group>
			<issn pub-type="ppub">1657-0790</issn>
			<issn pub-type="epub">2256-5760</issn>
			<publisher>
				<publisher-name>Departamento de Lenguas Extranjeras, Universidad Nacional de Colombia.</publisher-name>
			</publisher>
		</journal-meta>
		<article-meta>
			<article-id pub-id-type="doi">10.15446/profile.v24n2.90518</article-id>
			<article-categories>
				<subj-group subj-group-type="heading">
					<subject>Issues Based on Reflections and Innovations</subject>
				</subj-group>
			</article-categories>
			<title-group>
				<article-title>English Language Teachers’ Perceived Classroom Assessment Knowledge and Practice: Developing and Validating a Scale</article-title>
				<trans-title-group xml:lang="es">
					<trans-title>Percepciones de docentes de inglés sobre el conocimiento y la práctica de la evaluación en el aula: desarrollo y validación de una escala</trans-title>
				</trans-title-group>
			</title-group>
			<contrib-group>
				<contrib contrib-type="author">
					<contrib-id contrib-id-type="orcid">0000-0002-0430-6408</contrib-id>
					<name>
						<surname>Tajeddin</surname>
						<given-names>Zia</given-names>
					</name>
					<xref ref-type="aff" rid="aff1"><sup>*</sup></xref>
				</contrib>
				<contrib contrib-type="author">
					<contrib-id contrib-id-type="orcid">0000-0003-2165-8891</contrib-id>
					<name>
						<surname>Saeedi</surname>
						<given-names>Zari</given-names>
					</name>
					<xref ref-type="aff" rid="aff2"><sup>**</sup></xref>
				</contrib>
				<contrib contrib-type="author">
					<contrib-id contrib-id-type="orcid">0000-0003-1007-1841</contrib-id>
					<name>
						<surname>Panahzadeh</surname>
						<given-names>Vahid</given-names>
					</name>
					<xref ref-type="aff" rid="aff3"><sup>***</sup></xref>
				</contrib>
			</contrib-group>
			<aff id="aff1">
				<label>*</label>
				<institution content-type="original"> Tarbiat Modares University, Tehran, Iran, tajeddinz@modares.ac.ir</institution>
				<institution content-type="normalized">Tarbiat Modarres University</institution>
				<institution content-type="orgname">Tarbiat Modares University</institution>
				<addr-line>
					<city>Tehran</city>
				</addr-line>
				<country country="IR">Iran</country>
				<email>tajeddinz@modares.ac.ir</email>
			</aff>
			<aff id="aff2">
				<label>**</label>
				<institution content-type="original"> Allameh Tabataba’i University, Tehran, Iran, saeedi.za@atu.ac.ir</institution>
				<institution content-type="normalized">Allameh Tabatabaii University</institution>
				<institution content-type="orgname">Allameh Tabataba’i University</institution>
				<addr-line>
					<city>Tehran</city>
				</addr-line>
				<country country="IR">Iran</country>
				<email>saeedi.za@atu.ac.ir</email>
			</aff>
			<aff id="aff3">
				<label>***</label>
				<institution content-type="original"> Allameh Tabataba’i University, Tehran, Iran, panahzadeh_vahid93@atu.ac.ir</institution>
				<institution content-type="normalized">Allameh Tabatabaii University</institution>
				<institution content-type="orgname">Allameh Tabataba’i University</institution>
				<addr-line>
					<city>Tehran</city>
				</addr-line>
				<country country="IR">Iran</country>
				<email>panahzadeh_vahid93@atu.ac.ir</email>
			</aff>
			<pub-date date-type="pub" publication-format="electronic">
				<day>24</day>
				<month>07</month>
				<year>2022</year>
			</pub-date>
			<pub-date date-type="collection" publication-format="electronic">
				<season>Jul-Dec</season>
				<year>2022</year>
			</pub-date>
			<volume>24</volume>
			<issue>2</issue>
			<fpage>247</fpage>
			<lpage>264</lpage>
			<history>
				<date date-type="received">
					<day>13</day>
					<month>09</month>
					<year>2020</year>
				</date>
				<date date-type="accepted">
					<day>14</day>
					<month>02</month>
					<year>2022</year>
				</date>
			</history>
			<permissions>
				<license license-type="open-access" xlink:href="https://creativecommons.org/licenses/by-nc-nd/4.0/" xml:lang="en">
					<license-p>This is an Open Access article distributed under the terms of the Creative Commons license Attribution-NonCommercial-NoDerivatives 4.0 International License.</license-p>
				</license>
			</permissions>
			<abstract>
				<title>Abstract</title>
				<p>This study sought to develop and validate a classroom-based language assessment literacy scale to measure teachers’ perceived classroom-based assessment knowledge and practice. The scale was administered to 348 Iranian English as a foreign language teachers. Exploratory factor analysis revealed that the scale items clustered around four factors: (a) purposes of assessment and grading, (b) assessment ethics, (c) student involvement in assessment, and (d) feedback and assessment interpretation. The findings showed that the majority of the teachers reported being literate in classroom-based language assessment and agreed that more space should be allocated to classroom-based language assessment in teacher education courses. The findings suggest that the newly developed scale can serve as a valid and reliable tool to explore language teachers’ classroom-based assessment literacy.</p>
			</abstract>
			<trans-abstract xml:lang="es">
				<title>Resumen</title>
				<p>Este estudio desarrolló y validó una escala de literacidad de evaluación del lenguaje para identificar las percepciones de 348 docentes de inglés iraníes sobre el conocimiento y la práctica de la evaluación en el aula. Los ítems de la escala se dividieron en: a) propósitos de evaluación y calificación, b) ética de la evaluación, c) participación de los estudiantes en la evaluación y d) retroalimentación e interpretación de la evaluación. Varios participantes informaron ser competentes en la evaluación del lenguaje en el aula y estuvieron de acuerdo con asignar más tiempo para este aspecto en los cursos de formación docente. La escala desarrollada puede ser una herramienta válida y confiable para explorar la competencia de evaluación en el aula de los profesores de idiomas.</p>
			</trans-abstract>
			<kwd-group xml:lang="en">
				<title><italic>Keywords:</italic></title>
				<kwd>classroom assessment</kwd>
				<kwd>classroom-based language assessment literacy</kwd>
				<kwd>English as a foreign language teachers</kwd>
				<kwd>scale development</kwd>
			</kwd-group>
			<kwd-group xml:lang="es">
				<title><italic>Palabras clave:</italic></title>
				<kwd>desarrollo de escalas</kwd>
				<kwd>evaluación en el aula</kwd>
				<kwd>evaluación del lenguaje en el aula</kwd>
				<kwd>literacidad</kwd>
				<kwd>profesores de inglés</kwd>
			</kwd-group>
			<counts>
				<fig-count count="0"/>
				<table-count count="6"/>
				<equation-count count="0"/>
				<ref-count count="46"/>
				<page-count count="18"/>
			</counts>
		</article-meta>
	</front>
	<body>
		<sec sec-type="intro">
			<title>Introduction</title>
			<p>Recent years have witnessed calls for teachers to view classroom assessment as an inseparable part of the teaching and learning process, and to use assessment data to enhance instruction and promote students’ learning (<xref ref-type="bibr" rid="B11">DeLuca et al., 2018</xref>; <xref ref-type="bibr" rid="B36">Shepard, 2013</xref>). In line with the educational shift in the conceptualization of assessment, it has been contended that teachers need to gain competency in utilizing a variety of methods in assessing students’ learning, irrespective of whether the assessment is employed to support learning through the provision of feedback (<xref ref-type="bibr" rid="B24">Lee, 2017</xref>) or to measure learning outcomes (<xref ref-type="bibr" rid="B4">Campbell, 2013</xref>). Despite repeated calls for teachers’ capitalization on various assessment methods and skills (e.g., <xref ref-type="bibr" rid="B40">Taylor, 2013</xref>), research has generally shown that teachers lack adequate assessment proficiency (or what has come to be known as assessment literacy) to take advantage of assessment and inform their instructional practice (<xref ref-type="bibr" rid="B7">DeLuca &amp; Bellara, 2013</xref>; <xref ref-type="bibr" rid="B15">Fives &amp; Barnes, 2020</xref>; <xref ref-type="bibr" rid="B33">Popham, 2009</xref>). In one of the early attempts to introduce the concept of assessment literacy, <xref ref-type="bibr" rid="B37">Stiggins (1995)</xref> defined it as teachers’ understanding of “the difference between sound and unsound assessment” (p. 240). <xref ref-type="bibr" rid="B34">Popham (2011)</xref> defined assessment literacy as “an individual’s understandings of the fundamental assessment concepts and procedures deemed likely to influence educational decisions” (p. 265).</p>
			<p>The term language assessment literacy (LAL) has recently appeared in the literature on assessment literacy owing to the distinctive features of the context of language teaching and learning (<xref ref-type="bibr" rid="B18">Inbar-Lourie, 2008</xref>; <xref ref-type="bibr" rid="B26">Levi &amp; Inbar-Lourie, 2019</xref>). Although the concept of LAL is relatively new, it is a broad and gradually developing construct in applied linguistics, which has been conceptualized in various ways in the literature (<xref ref-type="bibr" rid="B16">Fulcher, 2012</xref>; <xref ref-type="bibr" rid="B18">Inbar-Lourie, 2008</xref>, <xref ref-type="bibr" rid="B20">2017</xref>; <xref ref-type="bibr" rid="B23">Lan &amp; Fan, 2019</xref>; <xref ref-type="bibr" rid="B25">Lee &amp; Butler, 2020</xref>). Given the increasing importance of LAL in meeting the growing demand for, and use of, assessment data by language teachers and other stakeholders (<xref ref-type="bibr" rid="B19">Inbar-Lourie, 2013</xref>; <xref ref-type="bibr" rid="B41">Tsagari &amp; Vogt, 2017</xref>), researchers have paid close attention to the investigation of teachers’ LAL (e.g., <xref ref-type="bibr" rid="B11">DeLuca et al., 2018</xref>; <xref ref-type="bibr" rid="B22">Lam, 2019</xref>; <xref ref-type="bibr" rid="B45">Xu &amp; Brown, 2017</xref>). The majority of these studies have utilized multiple-choice or scenario-based scales inquiring into preservice or in-service language teachers’ assessment knowledge, beliefs, and/or practice (e.g., <xref ref-type="bibr" rid="B29">Ölmezer-Öztürk &amp; Aydin, 2018</xref>; <xref ref-type="bibr" rid="B39">Tajeddin et al., 2018</xref>). The scales used in these studies have been tightly aligned with the seven Standards for Teacher Competence in Educational Assessment for Students (<xref ref-type="bibr" rid="B1">American Federation of Teachers [AFT] et al., 1990</xref>). 
However, as <xref ref-type="bibr" rid="B3">Brookhart (2011)</xref> noted, the 1990 standards are outdated in that they do not consider current conceptions of formative assessment knowledge and skills or accountability concerns. Moreover, the previous scales have only briefly touched on teachers’ classroom-based assessment literacy. According to <xref ref-type="bibr" rid="B43">Xu (2017)</xref>, classroom assessment literacy refers to “teachers’ knowledge of assessment in general and of the contingent relationship between assessment, teaching, and learning, as well as abilities to conduct assessment in the classroom to optimize such contingency” (p. 219). The present study, therefore, addresses the classroom-based assessment gap by developing a classroom-based language assessment literacy (CBLAL) scale to come up with items that solicit realistic and meaningful data applicable to the classroom context. Also, the study seeks to explore the status of Iranian English as a foreign language (EFL) teachers’ classroom-based language assessment knowledge and practice using the newly developed scale.</p>
		</sec>
		<sec>
			<title>Literature Review</title>
			<sec>
				<title>Conceptualization of Language Assessment Literacy</title>
				<p>The term LAL has been conceptualized by many scholars in the past two decades (e.g., <xref ref-type="bibr" rid="B16">Fulcher, 2012</xref>; <xref ref-type="bibr" rid="B18">Inbar-Lourie, 2008</xref>; <xref ref-type="bibr" rid="B31">Pill &amp; Harding, 2013</xref>; <xref ref-type="bibr" rid="B40">Taylor, 2013</xref>). LAL was defined by <xref ref-type="bibr" rid="B18">Inbar-Lourie (2008)</xref> as “having the capacity to ask and answer critical questions about the purpose for assessment, about the fitness of the tool being used, about testing conditions, and about what is going to happen on the basis of the test results” (p. 389). <xref ref-type="bibr" rid="B16">Fulcher (2012)</xref> defined LAL as “the knowledge, skills and abilities required to design, develop, maintain or evaluate, large-scale standardized and/or classroom-based tests, familiarity with test processes, and awareness of principles and concepts that guide and underpin practice, including ethics and codes of practice” (p. 125).</p>
				<p>
					<xref ref-type="bibr" rid="B40">Taylor (2013)</xref> put forward a model of assessment competency and expertise for different stakeholder groups. Placing language teachers at an intermediary position between measurement specialists and the general public, Taylor argued that LAL is best defined in terms of the particular needs of each stakeholder group. Likewise, <xref ref-type="bibr" rid="B31">Pill and Harding (2013)</xref> rejected a dichotomous view of “literacy” or “illiteracy,” arguing for viewing LAL in terms of a continuum from “illiteracy” to “multidimensional literacy.” They contended that non-practitioners do not require assessment literacy at the “multidimensional level” or the “procedural level;” rather, it would be desirable for policy makers and other non-practitioners to gain a “functional level” of assessment literacy in order to deal with language tests.</p>
			</sec>
			<sec>
				<title>Research on Language Assessment Literacy</title>
				<p>The last two decades have witnessed an increasing number of studies investigating teachers’ self-described levels of assessment literacy (e.g., <xref ref-type="bibr" rid="B23">Lan &amp; Fan, 2019</xref>; <xref ref-type="bibr" rid="B42">Vogt &amp; Tsagari, 2014</xref>; <xref ref-type="bibr" rid="B45">Xu &amp; Brown, 2017</xref>), approaches to assessment (e.g., <xref ref-type="bibr" rid="B11">DeLuca et al., 2018</xref>; <xref ref-type="bibr" rid="B9">DeLuca et al., 2019</xref>; <xref ref-type="bibr" rid="B39">Tajeddin et al., 2018</xref>), perceptions about language assessment (e.g., <xref ref-type="bibr" rid="B41">Tsagari &amp; Vogt, 2017</xref>), and assessment use confidence (e.g., <xref ref-type="bibr" rid="B2">Berry et al., 2019</xref>). For instance, <xref ref-type="bibr" rid="B42">Vogt and Tsagari (2014)</xref> explored the current level of language testing and assessment (LTA) literacy of foreign language teachers from seven European countries through questionnaires and teacher interviews. They found that the LTA literacy of teachers was not very well developed. <xref ref-type="bibr" rid="B45">Xu and Brown (2017)</xref>, utilizing an adapted version of the Teacher Assessment Literacy Questionnaire developed by <xref ref-type="bibr" rid="B32">Plake et al. (1993)</xref>, investigated 891 Chinese university English teachers’ assessment literacy levels and the effects of their demographic characteristics on assessment literacy performance. The findings of the study revealed that the vast majority of the teachers had very basic to minimally acceptable competencies in certain dimensions of assessment literacy. <xref ref-type="bibr" rid="B39">Tajeddin et al. (2018)</xref> aimed to explore 26 novice and experienced language teachers’ knowledge and practices with regard to speaking assessment purposes, criteria, and methods. 
The researchers concluded that, although divergence between novice and experienced teachers’ knowledge and practice of assessment purpose was moderate, the data revealed more consistency in the experienced teachers’ assessment literacy for speaking.</p>
				<p>
					<xref ref-type="bibr" rid="B11">DeLuca et al. (2018)</xref> sought to explore 404 Canadian and American teachers’ perceived skills in classroom assessment across their career stage (i.e., teaching experience). The researchers observed that more experienced teachers, as opposed to less experienced teachers, reported greater skill in monitoring, analyzing, and communicating assessment results as well as assessment design, implementation, and feedback.</p>
				<p>More recently, <xref ref-type="bibr" rid="B9">DeLuca et al. (2019)</xref> looked into 453 novice teachers’ classroom assessment approaches using five assessment scenarios. The researchers observed that teachers were quite consistent regarding their learning principles. However, they showed some differences in their actual classroom practice, indicating the situated nature of classroom assessment practice. In another study, <xref ref-type="bibr" rid="B23">Lan and Fan (2019)</xref> explored the LAL status of 344 in-service Chinese EFL teachers. They observed that teachers’ classroom-based LAL was at the functional level, namely “sound understanding of basic terms and concepts” (<xref ref-type="bibr" rid="B31">Pill &amp; Harding, 2013</xref>, p. 383). The researchers concluded that teacher education courses should acquaint teachers with the necessary knowledge and skills for conducting classroom-based assessment.</p>
				<p>Considering the variety of conceptualizations of the term LAL and its intricacies, more studies must be carried out in local contexts (<xref ref-type="bibr" rid="B20">Inbar-Lourie, 2017</xref>) with a focus on language teachers’ perspectives (<xref ref-type="bibr" rid="B25">Lee &amp; Butler, 2020</xref>) in order to help the field come to grips with the dynamics of the issue. Also, as the above literature review shows, we still have a limited understanding of language teachers’ classroom-based assessment literacy. Against this backdrop, the main purpose of this study is twofold: developing and validating a new CBLAL scale to assess language teachers’ perceived classroom-based assessment knowledge and practice, and looking into Iranian EFL teachers’ perceived classroom-based language assessment knowledge and practice. The following are the questions that guided this study:</p>
				<p>
					<list list-type="order">
						<list-item>
							<p>Which factors underlie language teachers’ perceived classroom-based assessment knowledge and practice?</p>
						</list-item>
						<list-item>
							<p>What is Iranian EFL teachers’ perceived classroom-based assessment knowledge and practice?</p>
						</list-item>
					</list>
				</p>
			</sec>
		</sec>
		<sec sec-type="methods">
			<title>Method</title>
			<sec>
				<title>Participants</title>
				<p>The participants of the study for the initial piloting of the scale were 54 Iranian EFL teachers, including 23 male (42.6%) and 31 female (57.4%) teachers. It should be noted that the pilot study participants, although comparable to those in the main study, did not take part in it. A total of 346 EFL teachers, including 143 men (41.3%) and 203 women (58.7%), participated in the development and validation of the scale as well as the investigation of the classroom-based assessment literacy of EFL teachers. The teachers were all teaching general English (i.e., the four language skills integrated) to various levels and age groups in private language schools in the Iranian context. In these language schools, teachers are required to follow a fixed syllabus using well-known communication-oriented international textbook series (<xref ref-type="bibr" rid="B35">Sadeghi &amp; Richards, 2015</xref>). Private school supervisors commonly use written examinations and interviews to recruit qualified teachers and regularly observe their performance for promotion and career growth purposes (<xref ref-type="bibr" rid="B35">Sadeghi &amp; Richards, 2015</xref>).</p>
				<p>The participants’ ages ranged from 18 to 67, with an average age of 32. All teachers, based on a convenience sampling procedure, voluntarily took part in the study. More than half of the teachers had majored in teaching EFL. Moreover, almost half of the teachers had taken a language testing/assessment course at university (see <xref ref-type="table" rid="t1">Table 1</xref>).</p>
				<p>
					<table-wrap id="t1">
						<label>Table 1</label>
						<caption>
							<title>Demographic Information of the Participants</title>
						</caption>
						<table>
							<colgroup>
								<col/>
								<col/>
								<col/>
								<col/>
							</colgroup>
							<thead>
								<tr>
									<th align="left"> </th>
									<th align="center">Category</th>
									<th align="center"><italic>n</italic></th>
									<th align="center">%</th>
								</tr>
							</thead>
							<tbody>
								<tr>
									<td align="left" rowspan="2">Gender</td>
									<td align="left">Male</td>
									<td align="center">143</td>
									<td align="center">41.3</td>
								</tr>
								<tr>
									<td align="left">Female</td>
									<td align="center">203</td>
									<td align="center">58.7</td>
								</tr>
								<tr>
									<td align="left" rowspan="6">Educational level</td>
									<td align="left">BA student</td>
									<td align="center">71</td>
									<td align="center">20.5</td>
								</tr>
								<tr>
									<td align="left">BA graduate</td>
									<td align="center">50</td>
									<td align="center">14.5</td>
								</tr>
								<tr>
									<td align="left">MA student</td>
									<td align="center">38</td>
									<td align="center">11</td>
								</tr>
								<tr>
									<td align="left">MA graduate</td>
									<td align="center">130</td>
									<td align="center">37.6</td>
								</tr>
								<tr>
									<td align="left">PhD student</td>
									<td align="center">41</td>
									<td align="center">11.8</td>
								</tr>
								<tr>
									<td align="left">PhD graduate</td>
									<td align="center">16</td>
									<td align="center">4.6</td>
								</tr>
								<tr>
									<td align="left" rowspan="5">Field of education</td>
									<td align="left">TEFL</td>
									<td align="center">187</td>
									<td align="center">54</td>
								</tr>
								<tr>
									<td align="left">English language literature</td>
									<td align="center">108</td>
									<td align="center">31.2</td>
								</tr>
								<tr>
									<td align="left">Translation studies</td>
									<td align="center">14</td>
									<td align="center">4.1</td>
								</tr>
								<tr>
									<td align="left">Linguistics</td>
									<td align="center">12</td>
									<td align="center">3.5</td>
								</tr>
								<tr>
									<td align="left">Other</td>
									<td align="center">25</td>
									<td align="center">7.2</td>
								</tr>
								<tr>
									<td align="left" rowspan="2">Taken assessment/testing course at university</td>
									<td align="left">Yes</td>
									<td align="center">167</td>
									<td align="center">48.3</td>
								</tr>
								<tr>
									<td align="left">No</td>
									<td align="center">179</td>
									<td align="center">51.7</td>
								</tr>
							</tbody>
						</table>
					</table-wrap>
				</p>
			</sec>
			<sec>
				<title>Scale Development</title>
				<p>The process of developing the CBLAL scale began with a review of previously validated assessment literacy scales in the literature (e.g., <xref ref-type="bibr" rid="B16">Fulcher, 2012</xref>; <xref ref-type="bibr" rid="B28">Mertler &amp; Campbell, 2005</xref>; <xref ref-type="bibr" rid="B32">Plake et al., 1993</xref>; <xref ref-type="bibr" rid="B46">Zhang &amp; Burry-Stock, 1994</xref>). It was observed that most previous studies exploring teachers’ assessment literacy used the Teacher Assessment Literacy Questionnaire (<xref ref-type="bibr" rid="B32">Plake et al., 1993</xref>) or its adapted version, the Assessment Literacy Inventory (<xref ref-type="bibr" rid="B28">Mertler &amp; Campbell, 2005</xref>). Both scales included 35 items assessing teachers’ understanding of general concepts about testing and assessment, tightly aligned with the seven Standards for Teacher Competence in Educational Assessment for Students (<xref ref-type="bibr" rid="B1">AFT et al., 1990</xref>).</p>
				<p>To address the recent conceptualizations of classroom-based assessment needs of language teachers (<xref ref-type="bibr" rid="B3">Brookhart, 2011</xref>), we set out to develop a CBLAL scale based on <xref ref-type="bibr" rid="B44">Xu and Brown’s (2016)</xref> six-component interrelated framework of teacher assessment literacy in practice (TALiP). Assessment knowledge base, constituting the basis of the framework, was used for developing the CBLAL scale in this study. According to Xu and Brown, teacher assessment knowledge base refers to “a core body of formal, systematic, and codified principles concerning good assessment practice” (p. 155). The key domains of teacher assessment knowledge base are briefly defined next:</p>
				<p>
					<list list-type="bullet">
						<list-item>
							<p>Disciplinary knowledge and pedagogical content knowledge: knowledge of the content and the general principles regarding how it is taught or learned.</p>
						</list-item>
						<list-item>
							<p>Knowledge of assessment purposes, content, and methods: knowledge of the general objectives of assessment and the relevant assessment tasks and strategies.</p>
						</list-item>
						<list-item>
							<p>Knowledge of grading: knowledge of rationale, methods, content, and criteria for grading and scoring students’ performance. </p>
						</list-item>
						<list-item>
							<p>Knowledge of feedback: knowledge of the types and functions of various feedback strategies for enhancing learning.</p>
						</list-item>
						<list-item>
							<p>Knowledge of assessment interpretation and communication: knowledge of effective interpretation of assessment results and how to communicate them to stakeholders.</p>
						</list-item>
						<list-item>
							<p>Knowledge of student involvement in assessment: knowledge of the benefits and strategies of engaging students in the assessment process.</p>
						</list-item>
						<list-item>
							<p>Knowledge of assessment ethics: knowledge of observing ethical and legal considerations (i.e., social justice) in the assessment process.</p>
						</list-item>
					</list>
				</p>
				<p>It should be noted that as the present study sought to develop a language assessment-specific scale to be used for pinpointing language teachers’ classroom-based assessment knowledge and practice, the first domain of the assessment knowledge base (i.e., disciplinary knowledge and pedagogical content knowledge) was not taken into account in the scale development process. It was reasoned that an adequate understanding of the disciplinary content and the general principles regarding how it is taught or learned is a pre-requisite for all teachers regardless of the content area they are teaching (<xref ref-type="bibr" rid="B3">Brookhart, 2011</xref>; <xref ref-type="bibr" rid="B14">Firoozi et al., 2019</xref>).</p>
				<p>After reviewing existing assessment literacy scales, the researchers identified 25 assessment knowledge items (i.e., 62.5%) that corresponded to the subcomponents of Xu and Brown’s framework (i.e., the assessment knowledge base). These items were borrowed and reworded in order to measure teachers’ (a) perceived classroom-based assessment knowledge and (b) perceived classroom-based assessment practice. According to <xref ref-type="bibr" rid="B12">Dörnyei (2003</xref>, p. 52), “borrowing questions from established questionnaires” is one of the sources that successful item designers mostly rely on. The authors then developed 15 original assessment knowledge items (i.e., 37.5%) to ensure an acceptable number of items for each subcomponent. Later, 40 assessment practice items were developed corresponding to the 40 assessment knowledge items. The final draft of the CBLAL scale comprised a demographics part and two other sections. The demographics part consisted of both open-ended and close-ended items inquiring into the participants’ demographic information. The first of the two remaining sections aimed to explore language teachers’ knowledge of classroom-based assessment. A pool of 40 items was generated in line with the six components of Xu and Brown’s framework. The items asked teachers to evaluate their own knowledge of assessment on a Likert scale ranging from 1 (<italic>strongly disagree</italic>), 2 (<italic>slightly disagree</italic>), 3 (<italic>moderately agree</italic>), to 4 (<italic>strongly agree</italic>). The items in the second of these sections corresponded to those in the first in terms of the targeted construct (i.e., the six components of Xu and Brown’s framework). However, they were modified to probe into the teachers’ perceptions of their classroom-based assessment practices. 
This last section comprised a total of 40 items on a Likert scale ranging from 1 (<italic>never</italic>), 2 (<italic>rarely</italic>), 3 (<italic>sometimes</italic>), 4 (<italic>very often</italic>), to 5 (<italic>always</italic>).</p>
				<p>Prior to subjecting the scale to psychometric analysis, it was filled out by six teachers to check the intelligibility of the items. Having resolved the ambiguities and unintelligible items based on the teachers’ feedback, the researchers pilot-tested the scale on 54 teachers to estimate its reliability using Cronbach’s alpha. As for the validity of the scale, a panel of four instructors pursuing their PhDs in applied linguistics was consulted to review its content validity. Also, exploratory factor analysis (EFA) was performed in the main phase of the study to extract the major factors and item loadings of the scale. To do so, the scale was distributed among a large pool of language teachers through both online media (e.g., email) and personal contacts. Overall, a total of 346 teachers filled out the scale, and their responses were subjected to EFA. Regarding the status of language teachers’ classroom-based assessment knowledge and practice, descriptive statistics were run based on the teachers’ responses to the CBLAL scale.</p>
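				<p>As an illustrative aside, the reliability check mentioned above can be expressed compactly. The following sketch is not the authors’ actual analysis (which was presumably run in standard statistical software); it computes Cronbach’s alpha from a hypothetical respondents-by-items score matrix:</p>

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot-style data: 30 respondents x 4 Likert items (1-4),
# built around a shared trait so that the items co-vary.
rng = np.random.default_rng(42)
trait = rng.integers(1, 5, size=(30, 1))
scores = np.clip(trait + rng.integers(-1, 2, size=(30, 4)), 1, 4).astype(float)
alpha = cronbach_alpha(scores)
```

Items that all measure the same underlying trait push alpha toward 1; uncorrelated items push it toward 0.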
			</sec>
		</sec>
		<sec sec-type="results">
			<title>Results</title>
			<p>Having validated and administered the CBLAL scale to Iranian EFL teachers, the researchers explored the status of teachers’ classroom-based assessment knowledge and practice. The findings are presented in the following sections.</p>
			<sec>
				<title>Exploratory Factor Analysis</title>
				<p>The initial 80 items of teachers’ CBLAL, including 40 classroom-based assessment knowledge items on a 4-point Likert scale and 40 classroom-based assessment practice items on a 5-point Likert scale, were subjected to EFA, namely principal axis factoring (PAF) with direct Oblimin rotation. The suitability of data for factor analysis was investigated prior to performing PAF. First, the normality of the distribution of the data was checked by considering the skewness and kurtosis measures of the items. It was found that all items’ statistics ranged between -2 and +2, satisfying the assumption of normality (<xref ref-type="bibr" rid="B38">Tabachnick &amp; Fidell, 2013</xref>). Second, the Kaiser-Meyer-Olkin (KMO) measure was used to estimate the sampling adequacy for the analysis. As can be seen in <xref ref-type="table" rid="t2">Table 2</xref>, the KMO value was .91 for assessment knowledge items and .92 for assessment practice items, exceeding the recommended minimum value of .6 (<xref ref-type="bibr" rid="B30">Pallant, 2016</xref>; <xref ref-type="bibr" rid="B38">Tabachnick &amp; Fidell, 2013</xref>). Further, as shown in <xref ref-type="table" rid="t2">Table 2</xref>, Bartlett’s test of sphericity reached statistical significance for both measures, which indicated that correlations between items were sufficiently large for PAF.</p>
				<p>
					<table-wrap id="t2">
						<label>Table 2</label>
						<caption>
							<title>Kaiser-Meyer-Olkin and Bartlett’s Test</title>
						</caption>
						<table>
							<colgroup>
								<col span="2"/>
								<col/>
								<col/>
							</colgroup>
							<thead>
								<tr>
									<th align="left" colspan="2"/>
									<th align="center">Assessment knowledge</th>
									<th align="center">Assessment practice</th>
								</tr>
							</thead>
							<tbody>
								<tr>
									<td align="left" colspan="2">Kaiser-Meyer-Olkin measure of sampling adequacy </td>
									<td align="center">.916</td>
									<td align="center">.922</td>
								</tr>
								<tr>
									<td align="left">Bartlett’s test of sphericity</td>
									<td align="center">Approx. Chi-square</td>
									<td align="center">9367.020</td>
									<td align="center">9664.243</td>
								</tr>
								<tr>
									<td align="left"> </td>
									<td align="center"><italic>df</italic></td>
									<td align="center">780</td>
									<td align="center">780</td>
								</tr>
								<tr>
									<td align="left"> </td>
									<td align="center">Sig.</td>
									<td align="center">.000</td>
									<td align="center">.000</td>
								</tr>
							</tbody>
						</table>
					</table-wrap>
				</p>
				<p>After running PAF on the assessment knowledge items, an initial 7-factor solution emerged with eigenvalues exceeding 1, explaining 40.4%, 5.8%, 4.7%, 4.2%, 3.1%, 2.8%, and 2.6% of the variance, respectively. However, an inspection of the scree plot and parallel analysis showed only four factors with eigenvalues exceeding the corresponding criterion values for a randomly generated data matrix of the same size (40 variables × 346 respondents; <xref ref-type="bibr" rid="B30">Pallant, 2016</xref>). The final 4-factor solution of the assessment knowledge measure explained a total of 55.3% of the variance. The internal consistency of the assessment knowledge scale as a whole was estimated, and its Cronbach’s alpha was found to be .95.</p>
				<p>Regarding the assessment practice items, an initial 7-factor solution emerged with eigenvalues exceeding 1, explaining 40.6%, 7.2%, 4.5%, 4.0%, 3.3%, 2.9%, and 2.5% of the variance, respectively. The inspection of the scree plot and parallel analysis, however, yielded a 4-factor solution for the assessment practice scale, which explained a total of 56.5% of the variance. The internal consistency of the assessment practice scale as a whole was estimated, and its Cronbach’s alpha was found to be .94.</p>
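The parallel analysis used above to choose the number of factors retains only those eigenvalues that exceed the mean eigenvalues from random data of the same shape. Below is a minimal sketch of this logic with a synthetic 346 × 40 matrix built from four underlying factors; the loadings and noise level are assumptions for illustration only.

```python
import numpy as np

def parallel_analysis(data: np.ndarray, n_iter: int = 100, seed: int = 0) -> int:
    """Count factors whose observed eigenvalue exceeds the mean eigenvalue
    obtained from random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    observed = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        sim = rng.normal(size=(n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False))[::-1]
    criterion = rand.mean(axis=0)  # mean random eigenvalues, largest first
    keep = 0
    for obs, crit in zip(observed, criterion):
        if obs > crit:
            keep += 1
        else:
            break
    return keep

# Illustrative data: 346 respondents x 40 items driven by 4 latent factors
rng = np.random.default_rng(2)
loadings = rng.normal(size=(40, 4))
scores = rng.normal(size=(346, 4))
data = scores @ loadings.T + rng.normal(scale=2.0, size=(346, 40))
n_factors = parallel_analysis(data)
```

With four genuine latent factors, the first four observed eigenvalues clear the random criterion and the remaining noise eigenvalues fall below it, mirroring the 4-factor decisions reported above.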
				<p>To aid in the interpretation of the extracted factors, Oblimin rotation was performed. In addition, only variables with loadings of .4 and above were interpreted, as suggested by <xref ref-type="bibr" rid="B13">Field (2013)</xref>. It should be noted that Items 25 and 26 were omitted from the assessment knowledge scale due to their low coefficients (see <xref ref-type="table" rid="t3">Table 3</xref>). Moreover, Item 21 was omitted from the assessment knowledge scale due to cross-loading. Regarding the assessment practice scale pattern matrix (<xref ref-type="table" rid="t4">Table 4</xref>), Items 14, 15, 16, and 40 were suppressed by SPSS from the factor solution because of their low coefficients.</p>
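The .4 interpretation cutoff and the two grounds for dropping items (no loading reaching the cutoff, or loading on multiple factors) can be expressed as a simple masking rule. The pattern matrix below is hypothetical, not taken from Tables 3 or 4; it only illustrates the screening logic.

```python
import numpy as np

# Hypothetical 5-item x 2-factor pattern matrix (illustrative values)
pattern = np.array([
    [0.82,  0.10],
    [0.55, -0.12],
    [0.31,  0.38],   # no loading reaches the cutoff: candidate for omission
    [0.05, -0.62],   # negative loadings are judged by absolute magnitude
    [0.44,  0.47],   # cross-loading: candidate for omission
])

cutoff = 0.40
mask = np.abs(pattern) >= cutoff   # loadings retained for interpretation
loads_per_item = mask.sum(axis=1)
low_items = np.where(loads_per_item == 0)[0]    # suppressed entirely
cross_items = np.where(loads_per_item >= 2)[0]  # cross-loading items
```

Items flagged in `low_items` correspond to cases like Items 25 and 26 above, and `cross_items` to cases like Item 21.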
				<p>
					<table-wrap id="t3">
						<label>Table 3</label>
						<caption>
							<title>Pattern Matrix of the Extracted Factors for Assessment Knowledge</title>
						</caption>
						<table>
							<colgroup>
								<col span="6"/>
							</colgroup>
							<thead>
								<tr>
									<th align="center" colspan="6">Assessment knowledge factors </th>
								</tr>
							</thead>
							<tbody>
								<tr>
									<td align="left">Item #</td>
									<td align="left"> </td>
									<td align="center">1</td>
									<td align="center">2</td>
									<td align="center">3</td>
									<td align="center">4</td>
								</tr>
								<tr>
									<td align="left">1</td>
									<td align="left">I am familiar with using classroom tests (e.g., quizzes) to pinpoint students’ strengths and weaknesses to plan further instruction.</td>
									<td align="center">.875</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">2</td>
									<td align="left">I know how to use classroom tests for the purpose of assigning grades to students.</td>
									<td align="center">.766</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">3</td>
									<td align="left">I know how to use classroom tests to track students’ progress during the course.</td>
									<td align="center">.603</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">4</td>
									<td align="left">I am knowledgeable about using classroom tests for the purpose of planning future instruction.</td>
									<td align="center">.721</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">5</td>
									<td align="left">I have sufficient knowledge to use classroom tests to help me divide students into different groups for instructional purposes.</td>
									<td align="center">.704</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">6</td>
									<td align="left">I know how to use various types of classroom tests (e.g., speaking tests or grammar quizzes) depending on the intended course objectives.</td>
									<td align="center">.770</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">7</td>
									<td align="left">I can adapt tests found in teachers’ guidebooks to fit intended course objectives.</td>
									<td align="center">.577</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">8</td>
									<td align="left">I know how to use a detailed description of intended course objectives to develop classroom tests.</td>
									<td align="center">.754</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">9</td>
									<td align="left">I am familiar with using different types of classroom tests to assign grades to students.</td>
									<td align="center">.636</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">10</td>
									<td align="left">I maintain detailed records of each student’s classroom test results to help me assign grades.</td>
									<td align="center">.497</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">11</td>
									<td align="left">I know how to grade each student’s classroom test performance against other students’ test performance.</td>
									<td align="center">.623</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">12</td>
									<td align="left">I can develop rating scales to help me grade students’ classroom test performance.</td>
									<td align="center">.703</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">13</td>
									<td align="left">I have sufficient knowledge about grading students’ classroom test performance against certain achievement goals.</td>
									<td align="center">.742</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">14</td>
									<td align="left">I know how to consult with experienced colleagues about rating scales they use to grade students’ classroom test performance.</td>
									<td align="center">.555</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">15</td>
									<td align="left">I know how to consult with my colleagues about assigning grades to students.</td>
									<td align="center">.486</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">33</td>
									<td align="left">I recognize students’ cultural diversity and eliminate offensive language and content of classroom tests.</td>
									<td align="center"> </td>
									<td align="center">.745</td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">34</td>
									<td align="left">I know how to match the contents of my classroom tests with the contents of my teaching and intended course objectives.</td>
									<td align="center"> </td>
									<td align="center">.464</td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">35</td>
									<td align="left">I know how to use the same classroom tests and the same rating scales for all students to avoid bias.</td>
									<td align="center"> </td>
									<td align="center">.631</td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">36</td>
									<td align="left">I observe assessment fairness by avoiding giving lower grades to students from lower socioeconomic status.</td>
									<td align="center"> </td>
									<td align="center">.497</td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">37</td>
									<td align="left">I am knowledgeable about how to help students with learning disability during classroom tests.</td>
									<td align="center"> </td>
									<td align="center">.586</td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">38</td>
									<td align="left">I know how to avoid using new items in my classroom tests which did not appear on the course syllabus.</td>
									<td align="center"> </td>
									<td align="center">.440</td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">39</td>
									<td align="left">I know how to inform students of the test item formats (e.g., multiple choice or essay) prior to classroom tests.</td>
									<td align="center"> </td>
									<td align="center">.417</td>
									<td align="center"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">40</td>
									<td align="left">I know how to announce students’ classroom test scores individually, rather than publicly, to avoid making them get embarrassed.</td>
									<td align="left"> </td>
									<td align="center">.459</td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">27</td>
									<td align="left">I encourage students to assess their own classroom test performance to enhance their learning.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.472</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">28</td>
									<td align="left">I help my students learn how to grade their own classroom test performance.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.446</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">29</td>
									<td align="left">I can ask top students in my class to help me assess other students’ classroom test performance.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.811</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">30</td>
									<td align="left">I know how to encourage students to provide their classmates with feedback on their classroom test performance.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.740</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">31</td>
									<td align="left">I know how to give students clear rating scales by which they can assess each other’s classroom test performance.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.635</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">32</td>
									<td align="left">I know how to explain to students the rating scales I apply to grade their classroom test performance.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.464</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">16</td>
									<td align="left">I provide students with regular feedback on their classroom test performance.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center">-.529</td>
								</tr>
								<tr>
									<td align="left">17</td>
									<td align="left">I provide students with specific, practical suggestions to help them improve their test performance.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center">-.690</td>
								</tr>
								<tr>
									<td align="left">18</td>
									<td align="left">I praise students for their good performance on classroom tests.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center">-.688</td>
								</tr>
								<tr>
									<td align="left">19</td>
									<td align="left">I know how to remind students of their strengths and weaknesses in their classroom test performance to help them improve their learning.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center">-.620</td>
								</tr>
								<tr>
									<td align="left">20</td>
									<td align="left">I know how to encourage my students to improve their classroom test performance according to the feedback provided by me.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center">-.689</td>
								</tr>
								<tr>
									<td align="left">21</td>
									<td align="left">I use classroom test results to determine if students have met course objectives.</td>
									<td align="center">.411</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center">-.465</td>
								</tr>
								<tr>
									<td align="left">22</td>
									<td align="left">I know how to use classroom test results to decide whether students can proceed to the next stage of learning.</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center">-.606</td>
								</tr>
								<tr>
									<td align="left">23</td>
									<td align="left">I can construct an accurate report about students’ classroom test performance to communicate it to both parents and/or institute managers.</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center">-.466</td>
								</tr>
								<tr>
									<td align="left">24</td>
									<td align="left">I speak understandably with students about the meaning of the report card grades to help them improve their test performance.</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center">-.597</td>
								</tr>
							</tbody>
						</table>
					</table-wrap>
				</p>
				<p>The assessment knowledge items that clustered around the same factors (bolded items) in the pattern matrix presented in <xref ref-type="table" rid="t3">Table 3</xref> suggested that factor one, containing 15 items, represented “Knowledge of Assessment Use and Grading.” The items elicit teachers’ familiarity with the purpose of classroom tests and how to choose appropriate classroom tests to fit intended course objectives. Factor one also probes into teachers’ knowledge of grading students’ classroom test performance and how to seek assistance from experienced colleagues in this regard. Factor two comprised 8 items which represented “Knowledge of Assessment Ethics.” It elicits teachers’ knowledge of how to observe assessment fairness and avoid assessment bias in classroom tests. Factor three comprised 6 items which tapped into “Knowledge of Student Involvement in Assessment.” The items examine teachers’ knowledge of strategies to encourage students to assess their own and their peers’ classroom test performance. Finally, factor four consisted of 8 items which represented “Knowledge of Feedback and Assessment Interpretation.” The items inquire into teachers’ knowledge of providing students with regular feedback (i.e., practical suggestions) to help them improve their test performance. The items also elicit teachers’ familiarity with reporting and communicating students’ classroom test performance to both parents and/or institute managers.</p>
				<p>
					<table-wrap id="t4">
						<label>Table 4</label>
						<caption>
							<title>Pattern Matrix of the Extracted Factors for Assessment Practice</title>
						</caption>
						<table>
							<colgroup>
								<col span="6"/>
							</colgroup>
							<thead>
								<tr>
									<th align="center" colspan="6">Assessment practice factors </th>
								</tr>
							</thead>
							<tbody>
								<tr>
									<td align="left">Item #</td>
									<td align="left"> </td>
									<td align="center">1</td>
									<td align="center">2</td>
									<td align="center">3</td>
									<td align="center">4</td>
								</tr>
								<tr>
									<td align="left">1</td>
									<td align="left">Using classroom tests (e.g., quizzes) to pinpoint students’ strengths and weaknesses to plan further instruction.</td>
									<td align="center">.790</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">2</td>
									<td align="left">Using classroom tests for the purpose of assigning grades to students.</td>
									<td align="center">.700</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">3</td>
									<td align="left">Using classroom tests to track students’ progress during the course.</td>
									<td align="center">.739</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">4</td>
									<td align="left">Using classroom tests for the purpose of planning future instruction.</td>
									<td align="center">.786</td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">5</td>
									<td align="left">Using classroom tests to help me divide students into different groups for instructional purposes.</td>
									<td align="center">.620</td>
									<td align="center"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">6</td>
									<td align="left">Using various types of classroom tests (e.g., speaking tests or grammar quizzes) depending on the intended course objectives.</td>
									<td align="center">.648</td>
									<td align="center"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">7</td>
									<td align="left">Adapting tests found in teachers’ guidebooks to fit intended course objectives.</td>
									<td align="center">.558</td>
									<td align="center"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">8</td>
									<td align="left">Using a detailed description of intended course objectives to develop classroom tests.</td>
									<td align="center">.636</td>
									<td align="center"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">9</td>
									<td align="left">Using different types of classroom tests to assign grades to students. </td>
									<td align="center">.560</td>
									<td align="center"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">10</td>
									<td align="left">Maintaining detailed records of each student’s classroom test results to help me assign grades.</td>
									<td align="center">.461</td>
									<td align="center"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">11</td>
									<td align="left">Grading each student’s classroom test performance against other students’ test performance.</td>
									<td align="center">.621</td>
									<td align="center"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">12</td>
									<td align="left">Developing rating scales to help me grade students’ classroom test performance.</td>
									<td align="center">.698</td>
									<td align="center"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">13</td>
									<td align="left">Grading students’ classroom test performance against certain achievement goals.</td>
									<td align="center">.574</td>
									<td align="center"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">17</td>
									<td align="left">Providing students with specific, practical suggestions to help them improve their test performance.</td>
									<td align="center"> </td>
									<td align="center">.429</td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">32</td>
									<td align="left">Explaining to students the rating scales I apply to grade their classroom test performance.</td>
									<td align="center"> </td>
									<td align="center">.497</td>
									<td align="left"> </td>
									<td align="left"> </td>
								</tr>
								<tr>
									<td align="left">33</td>
									<td align="left">Recognizing students’ cultural diversity and eliminating offensive language and content of classroom tests. </td>
									<td align="left"> </td>
									<td align="center">.724</td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">34</td>
									<td align="left">Matching the contents of my classroom tests with the contents of my teaching and intended course objectives.</td>
									<td align="left"> </td>
									<td align="center">.591</td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">35</td>
									<td align="left">Using the same classroom tests and the same rating scales for all students to avoid bias.</td>
									<td align="left"> </td>
									<td align="center">.752</td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">36</td>
									<td align="left">Observing assessment fairness by avoiding giving lower grades to students from lower socioeconomic status.</td>
									<td align="left"> </td>
									<td align="center">.582</td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">37</td>
									<td align="left">Helping students with learning disability during classroom tests.</td>
									<td align="left"> </td>
									<td align="center">.546</td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">38</td>
									<td align="left">Avoiding using new items in my classroom tests which did not appear on the course syllabus.</td>
									<td align="left"> </td>
									<td align="center">.758</td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">39</td>
									<td align="left">Informing students of the test item formats (e.g., multiple choice or essay) prior to classroom tests.</td>
									<td align="left"> </td>
									<td align="center">.651</td>
									<td align="center"> </td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">26</td>
									<td align="left">Participating in discussion with institute managers about important changes to the curriculum based on students’ classroom test results.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.477</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">27</td>
									<td align="left">Encouraging students to assess their own classroom test performance to enhance their learning.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.587</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">28</td>
									<td align="left">Helping my students learn how to grade their own classroom test performance.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.729</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">29</td>
									<td align="left">Asking top students in my class to help me assess other students’ classroom test performance.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.803</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">30</td>
									<td align="left">Encouraging students to provide their classmates with feedback on their classroom test performance.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.860</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">31</td>
									<td align="left">Giving students clear rating scales by which they can assess each other’s classroom test performance. </td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center">.771</td>
									<td align="center"> </td>
								</tr>
								<tr>
									<td align="left">18</td>
									<td align="left">Praising students for their good performance on classroom tests.</td>
									<td align="left"> </td>
									<td align="center"> </td>
									<td align="center"> </td>
									<td align="center">-.586</td>
								</tr>
								<tr>
									<td align="left">19</td>
									<td align="left">Reminding students of their strengths and weaknesses in their classroom test performance to help them improve their learning.</td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="center">-.723</td>
								</tr>
								<tr>
									<td align="left">20</td>
									<td align="left">Encouraging my students to improve their classroom test performance according to the feedback provided by me.</td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="center">-.775</td>
								</tr>
								<tr>
									<td align="left">21</td>
									<td align="left">Using classroom test results to determine if students have met course objectives.</td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="center">-.422</td>
								</tr>
								<tr>
									<td align="left">22</td>
									<td align="left">Using classroom test results to decide whether students can proceed to the next stage of learning.</td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="center">-.507</td>
								</tr>
								<tr>
									<td align="left">23</td>
									<td align="left">Constructing an accurate report about students’ classroom test performance to communicate it to both parents and/or institute managers.</td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="center">-.412</td>
								</tr>
								<tr>
									<td align="left">24</td>
									<td align="left">Speaking understandably with students about the meaning of the report card grades to help them improve their test performance.</td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="center">-.469</td>
								</tr>
								<tr>
									<td align="left">25</td>
									<td align="left">Speaking understandably with parents, if needed, about the decisions made or recommended based on classroom test results.</td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="left"> </td>
									<td align="center">-.507</td>
								</tr>
							</tbody>
						</table>
					</table-wrap>
				</p>
				<p>As for the assessment practice items, those that clustered around the same factors (bolded items) in the pattern matrix presented in <xref ref-type="table" rid="t4">Table 4</xref> suggested that factor one, comprising 13 items, represented “Assessment Purpose and Grading.” It elicits the frequency of the use of classroom tests to track students’ progress during the course, to plan future instruction, and to assign grades to students. Factor two, consisting of 9 items, represented “Assessment Ethics.” Factor two items inquire into observing assessment fairness and avoiding assessment bias in classroom tests. Factor three, comprising 6 items, represented “Student Involvement in Assessment.” It explores the frequency of assisting students to learn how to grade their own and their peers’ classroom test performance. Finally, factor four, with 8 items, refers to “Feedback and Assessment Interpretation and Communication.” The items elicit the frequency of reminding students of their strengths and weaknesses in their classroom test performance and of using classroom test results for decision-making purposes.</p>
				<p>Overall, the classroom-based assessment knowledge and practice items loaded on four thematic areas, with 37 items falling under classroom-based assessment knowledge and 36 items relating to classroom-based assessment practice.</p>
			</sec>
			<sec>
				<title>Teachers’ Classroom-Based Assessment Knowledge and Practice</title>
				<p>The newly developed CBLAL scale was then used to probe Iranian EFL teachers’ knowledge and practice across the four factors of their classroom-based assessment knowledge base (<xref ref-type="bibr" rid="B44">Xu &amp; Brown, 2016</xref>). <xref ref-type="table" rid="t5">Table 5</xref> presents the percentages of the teachers’ responses on the classroom-based assessment knowledge factors.</p>
				<p>
					<table-wrap id="t5">
						<label>Table 5</label>
						<caption>
							<title>Mean and Response Percentages of Classroom-Based Assessment Knowledge Factors</title>
						</caption>
						<table>
							<colgroup>
								<col/>
								<col/>
								<col/>
								<col/>
								<col/>
								<col/>
								<col/>
							</colgroup>
							<thead>
								<tr>
									<th align="left">Factors</th>
									<th align="center">Strongly disagree</th>
									<th align="center">Slightly disagree</th>
									<th align="center">Moderately agree</th>
									<th align="center">Strongly agree</th>
									<th align="center"><italic>Mean</italic></th>
									<th align="center"><italic>SD</italic></th>
								</tr>
							</thead>
							<tbody>
								<tr>
									<td align="left">Knowledge of Assessment Use and Grading</td>
									<td align="center">3.93%</td>
									<td align="center">19.47%</td>
									<td align="center">41.25%</td>
									<td align="center">35.35%</td>
									<td align="center">3.08</td>
									<td align="center">0.82</td>
								</tr>
								<tr>
									<td align="left">Knowledge of Assessment Ethics</td>
									<td align="center">2.60%</td>
									<td align="center">14.93%</td>
									<td align="center">34.55%</td>
									<td align="center">47.93%</td>
									<td align="center">3.27</td>
									<td align="center">0.78</td>
								</tr>
								<tr>
									<td align="left">Knowledge of Student Involvement in Assessment</td>
									<td align="center">6.73%</td>
									<td align="center">22.18%</td>
									<td align="center">39.93%</td>
									<td align="center">31.15%</td>
									<td align="center">2.95</td>
									<td align="center">0.87</td>
								</tr>
								<tr>
									<td align="left">Knowledge of Feedback and Assessment Interpretation</td>
									<td align="center">0.95%</td>
									<td align="center">10.99%</td>
									<td align="center">35.69%</td>
									<td align="center">52.38%</td>
									<td align="center">3.39</td>
									<td align="center">0.69</td>
								</tr>
							</tbody>
						</table>
					</table-wrap>
				</p>
				<p>As can be seen in <xref ref-type="table" rid="t5">Table 5</xref>, the teachers in the present study reported being knowledgeable about classroom assessment, moderately or strongly agreeing with the items. More specifically, around 76% of the teachers reported being knowledgeable about the uses of assessment and grading procedures in language classrooms. Regarding knowledge of assessment ethics, around 82% believed they were knowledgeable about ethical considerations in the classroom. As for knowledge of student involvement in assessment, 71% believed they knew how to encourage students to assess their own and their peers’ classroom performance. Finally, regarding knowledge of feedback and assessment interpretation, 88% believed they knew how to provide accurate feedback and to report on students’ class performance.</p>
				<p>As for the teachers’ classroom-based assessment practice, <xref ref-type="table" rid="t6">Table 6</xref> presents the percentages of their responses on a 5-point Likert scale.</p>
				<p>
					<table-wrap id="t6">
						<label>Table 6</label>
						<caption>
							<title>Mean and Response Percentages of Classroom-Based Assessment Practice Factors</title>
						</caption>
						<table>
							<colgroup>
								<col/>
								<col/>
								<col/>
								<col/>
								<col/>
								<col/>
								<col/>
								<col/>
							</colgroup>
							<thead>
								<tr>
									<th align="left">Factors</th>
									<th align="center">Never</th>
									<th align="center">Rarely</th>
									<th align="center">Sometimes</th>
									<th align="center">Very often</th>
									<th align="center">Always</th>
									<th align="center"><italic>Mean</italic></th>
									<th align="center"><italic>SD</italic></th>
								</tr>
							</thead>
							<tbody>
								<tr>
									<td align="left">Assessment Purpose and Grading</td>
									<td align="center">4.03%</td>
									<td align="center">10.54%</td>
									<td align="center">24.08%</td>
									<td align="center">37.22%</td>
									<td align="center">24.13%</td>
									<td align="center">3.66</td>
									<td align="center">1.05</td>
								</tr>
								<tr>
									<td align="left">Assessment Ethics</td>
									<td align="center">2.73%</td>
									<td align="center">6.10%</td>
									<td align="center">19.33%</td>
									<td align="center">34.19%</td>
									<td align="center">37.64%</td>
									<td align="center">3.97</td>
									<td align="center">1.01</td>
								</tr>
								<tr>
									<td align="left">Student Involvement in Assessment</td>
									<td align="center">7.98%</td>
									<td align="center">13.63%</td>
									<td align="center">26.90%</td>
									<td align="center">31.50%</td>
									<td align="center">19.98%</td>
									<td align="center">3.41</td>
									<td align="center">1.15</td>
								</tr>
								<tr>
									<td align="left">Feedback and Assessment Interpretation and Communication</td>
									<td align="center">1.98%</td>
									<td align="center">5.50%</td>
									<td align="center">19.90%</td>
									<td align="center">37.06%</td>
									<td align="center">35.56%</td>
									<td align="center">3.98</td>
									<td align="center">0.94</td>
								</tr>
							</tbody>
						</table>
					</table-wrap>
				</p>
				<p>
					<xref ref-type="table" rid="t6">Table 6</xref> shows that around 61% of the teachers reported using classroom assessment to grade students’ performance as well as to inform future instruction. Also, around 71% reported that they very often consider ethical issues in their everyday classroom assessment tasks. As for student involvement in assessment, more than half (51%) of the teachers reported helping students learn how to grade their own classroom test performance. Finally, regarding feedback and assessment interpretation and communication, around 72% reported providing constructive feedback in their classes to help students set goals for their future success.</p>
			</sec>
		</sec>
		<sec sec-type="discussion">
			<title>Discussion</title>
			<p>The present study sought to develop and validate a new scale to tap teachers’ classroom-based assessment knowledge and practice. To do so, a total of 40 items on classroom-based assessment knowledge and a corresponding set of 40 classroom-based assessment practice items were developed and subjected to EFA. It was revealed that the six factors of <xref ref-type="bibr" rid="B44">Xu and Brown’s (2016)</xref> teacher assessment knowledge base collapsed into four factors in the context of the present study. The items on the scale clustered around the following themes: (a) purposes of assessment and grading, (b) assessment ethics, (c) student involvement in assessment, and (d) feedback and assessment interpretation. It can be reasoned that since a large number of students prepare for international examinations (e.g., IELTS) at private language schools in Iran (<xref ref-type="bibr" rid="B35">Sadeghi &amp; Richards, 2015</xref>), teachers’ view of assessment is largely quantitative (i.e., grade-based), and they see grading as the principal purpose of assessment procedures. As a result, the items on assessment purpose and grading loaded together in the EFA.</p>
			<p>Moreover, not only were knowledge of assessment feedback and knowledge of assessment communication and interpretation found to be interrelated in the context of the study, but a high negative correlation was also found between the “feedback and assessment interpretation” factor and the other three factors. It may be contended that teachers look at feedback and interpretation as part of “good” teaching practice rather than good assessment (<xref ref-type="bibr" rid="B2">Berry et al., 2019</xref>). In other words, teachers regard the process of providing feedback as a teaching mechanism that helps students notice their strengths and weaknesses in order to facilitate their learning. Such a perspective has, however, been described as “assessment for learning” in the literature (<xref ref-type="bibr" rid="B24">Lee, 2017</xref>).</p>
			<p>Having validated the scale, the researchers investigated the status of Iranian EFL teachers’ classroom-based assessment knowledge and practice. The findings reveal that most participants (around 80%) reported being knowledgeable, to different extents, about classroom-based assessment by moderately or strongly agreeing with the items on the four factors. Also, most teachers (around 65%) reported practicing classroom-based assessment quite frequently in their classrooms in terms of the four classroom-based assessment practice factors. These findings corroborate those of <xref ref-type="bibr" rid="B6">Crusan et al. (2016)</xref>, whose teachers reported being familiar with writing assessment concepts and using various assessment procedures in their classes. The findings of the study, however, run counter to those of <xref ref-type="bibr" rid="B23">Lan and Fan (2019)</xref>, <xref ref-type="bibr" rid="B41">Tsagari and Vogt (2017)</xref>, <xref ref-type="bibr" rid="B42">Vogt and Tsagari (2014)</xref>, and <xref ref-type="bibr" rid="B45">Xu and Brown (2017)</xref>. For instance, <xref ref-type="bibr" rid="B42">Vogt and Tsagari (2014)</xref>, exploring the language testing and assessment (LTA) literacy of European teachers, concluded that the teachers’ LTA literacy was not well developed, which they attributed to a lack of training in assessment. Similarly, <xref ref-type="bibr" rid="B23">Lan and Fan (2019)</xref> investigated in-service Chinese EFL teachers’ classroom-based LAL. They observed that teachers lacked sufficient procedural and conceptual LAL for conducting classroom-based language assessment.</p>
			<p>Language teachers’ high levels of self-reported assessment knowledge and practice in the present study might also be attributed to demographic (e.g., educational level) and affective (e.g., motivation) variables. Since many teachers teaching in Iranian private language schools hold an MA or PhD in teaching EFL, they may have been familiarized with some of the recent developments on the use of language assessment in classrooms (<xref ref-type="bibr" rid="B35">Sadeghi &amp; Richards, 2015</xref>). Also, as private language school teachers face relatively fewer “meso-level” (i.e., school policy) constraints (<xref ref-type="bibr" rid="B17">Fulmer et al., 2015</xref>) and enjoy considerably more autonomy (i.e., space and support) in their teaching context (<xref ref-type="bibr" rid="B22">Lam, 2019</xref>), their chances of developing practical assessment knowledge and of trying out renewed conceptualizations of assessment in their classes are increased.</p>
			<p>Reviewing the teachers’ responses to the open-ended questions in the first section of the CBLAL scale (i.e., demographic information) showed that more than half of the participants (51.7%) had not taken any course with a particular focus on assessment. The remaining teachers (48.3%) all reported having taken such a course at university, indicating that the teacher training courses (TTCs) offered at private language schools do not hold a special position in shaping teachers’ assessment literacy. To further support this contention, the teachers’ responses to the open-ended questions revealed that the topics covered in language testing courses offered at universities (including designing test items, test reliability and validation, and testing language skills and components) did not equip them for the dynamics of classroom assessment. Moreover, the responses confirm that most teachers (around 80%) agreed that more time should be allocated to the language assessment component of their preservice and/or in-service TTCs. They conceived of assessment as an essential component of the teaching process without which instruction would not lead to desirable outcomes, and they unanimously stressed the need to include a practical classroom-based assessment component in preservice and in-service teacher education courses.</p>
		</sec>
		<sec sec-type="conclusions">
			<title>Conclusion</title>
			<p>The principal purpose of the present study was to develop and validate a new CBLAL scale and then to explore the status of Iranian EFL teachers’ classroom-based assessment knowledge and practice. It was found that teachers’ assessment knowledge base comprises four main themes. Also, the private language school teachers’ responses to the CBLAL scale revealed that they considered themselves moderately assessment literate. However, the findings of the open-ended section of the scale painted a different picture: The majority of teachers expressed the need for a specific course on language assessment, which demands a cautious interpretation of the findings of the study.</p>
			<p>The findings imply that, in the absence of an assessment for learning component in teacher education courses to equip language teachers with recent updates on classroom assessment, teachers would resort to their past experiences as students, or what <xref ref-type="bibr" rid="B42">Vogt and Tsagari (2014)</xref> call “testing as you were tested.” Teacher education courses should then adopt Vygotsky’s sociocultural theory, particularly his concept of the “zone of proximal development.” With its emphasis on the contribution of mediation and dialogic interaction to teachers’ professional development (<xref ref-type="bibr" rid="B21">Johnson, 2009</xref>), a sociocultural theory perspective on teacher education can offer teachers an opportunity to “articulate and synthesize their perspectives by drawing together assessment theory, terminology, and experience” (<xref ref-type="bibr" rid="B8">DeLuca et al., 2013</xref>, p. 133). By externalizing their current understanding of assessment and then reconceptualizing and recontextualizing it, teachers can discover their own assessment mindset orientation and develop alternative ways of engaging in the activities associated with assessment (<xref ref-type="bibr" rid="B5">Coombs et al., 2020</xref>; <xref ref-type="bibr" rid="B9">DeLuca et al., 2019</xref>; <xref ref-type="bibr" rid="B21">Johnson, 2009</xref>).</p>
			<p>The findings have important implications for language teacher education and professional development in that, by identifying teachers’ classroom-based assessment needs and considering them in teacher education pedagogies, teachers’ conceptions and practices can be transformed (<xref ref-type="bibr" rid="B27">Loughran, 2006</xref>). The findings also have implications for materials developers, who are responsible for providing and sequencing the content of teaching materials. By becoming cognizant of the intricacies of classroom assessment, materials developers can include appropriate topics and discussions in their materials to help teachers acquire the necessary classroom-based assessment knowledge.</p>
			<p>A number of limitations of this study, however, need to be acknowledged. The study only probed language teachers’ self-reported accounts of classroom-based assessment knowledge and practice, without any evidence of their actual practice. A further limitation is that teachers may have intentionally rated the assessment knowledge and practice items positively for behavioral and “social desirability” reasons (<xref ref-type="bibr" rid="B5">Coombs et al., 2020</xref>). Therefore, future studies are invited to observe language teachers’ assessment practices (rather than their perceptions) to obtain a more realistic picture of their classroom-based assessment literacy. Future studies could also consider implementing “focused instruction” (<xref ref-type="bibr" rid="B10">DeLuca &amp; Klinger, 2010</xref>) on the use of both formative and summative assessments in classrooms and examine the potential impact of such a course on language teachers’ classroom-based assessment literacy development.</p>
		</sec>
	</body>
	<back>
		<ref-list>
			<title>References</title>
			<ref id="B1">
				<mixed-citation>American Federation of Teachers, National Council on Measurement in Education, &amp; National Education Association. (1990). <italic>Standards for teacher competence in educational assessment of students</italic> (ED323186). ERIC. <ext-link ext-link-type="uri" xlink:href="https://files.eric.ed.gov/fulltext/ED323186.pdf">https://files.eric.ed.gov/fulltext/ED323186.pdf</ext-link>
				</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<collab>American Federation of Teachers</collab>
						<collab>National Council on Measurement in Education</collab>
						<collab>National Education Association</collab>
					</person-group>
					<year>1990</year>
					<source>Standards for teacher competence in educational assessment of students</source>
					<pub-id pub-id-type="other">ED323186</pub-id>
					<publisher-name>ERIC</publisher-name>
					<ext-link ext-link-type="uri" xlink:href="https://files.eric.ed.gov/fulltext/ED323186.pdf">https://files.eric.ed.gov/fulltext/ED323186.pdf</ext-link>
				</element-citation>
			</ref>
			<ref id="B2">
				<mixed-citation>Berry, V., Sheehan, S., &amp; Munro, S. (2019). What does language assessment literacy mean to teachers? <italic>ELT Journal</italic>, <italic>73</italic>(2), 113-123. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1093/elt/ccy055">https://doi.org/10.1093/elt/ccy055</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Berry</surname>
							<given-names>V.</given-names>
						</name>
						<name>
							<surname>Sheehan</surname>
							<given-names>S.</given-names>
						</name>
						<name>
							<surname>Munro</surname>
							<given-names>S.</given-names>
						</name>
					</person-group>
					<year>2019</year>
					<article-title>What does language assessment literacy mean to teachers?</article-title>
					<source>ELT Journal</source>
					<volume>73</volume>
					<issue>2</issue>
					<fpage>113</fpage>
					<lpage>123</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1093/elt/ccy055">https://doi.org/10.1093/elt/ccy055</ext-link>
				</element-citation>
			</ref>
			<ref id="B3">
				<mixed-citation>Brookhart, S. M. (2011). Educational assessment knowledge and skills for teachers. <italic>Educational Measurement: Issues and Practice</italic>, <italic>30</italic>(1), 3-12. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1111/j.1745-3992.2010.00195.x">https://doi.org/10.1111/j.1745-3992.2010.00195.x</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Brookhart</surname>
							<given-names>S. M.</given-names>
						</name>
					</person-group>
					<year>2011</year>
					<article-title>Educational assessment knowledge and skills for teachers</article-title>
					<source>Educational Measurement: Issues and Practice</source>
					<volume>30</volume>
					<issue>1</issue>
					<fpage>3</fpage>
					<lpage>12</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1111/j.1745-3992.2010.00195.x">https://doi.org/10.1111/j.1745-3992.2010.00195.x</ext-link>
				</element-citation>
			</ref>
			<ref id="B4">
				<mixed-citation>Campbell, C. (2013). Research on teacher competency in classroom assessment. In J. H. McMillan (Ed.), <italic>Sage handbook of research on classroom assessment</italic> (pp. 71-84). Sage. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.4135/9781452218649.n5">https://doi.org/10.4135/9781452218649.n5</ext-link>
				</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Campbell</surname>
							<given-names>C.</given-names>
						</name>
					</person-group>
					<year>2013</year>
					<chapter-title>Research on teacher competency in classroom assessment</chapter-title>
					<person-group person-group-type="editor">
						<name>
							<surname>McMillan</surname>
							<given-names>J. H.</given-names>
						</name>
					</person-group>
					<source>Sage handbook of research on classroom assessment</source>
					<fpage>71</fpage>
					<lpage>84</lpage>
					<publisher-name>Sage</publisher-name>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.4135/9781452218649.n5">https://doi.org/10.4135/9781452218649.n5</ext-link>
				</element-citation>
			</ref>
			<ref id="B5">
				<mixed-citation>Coombs, A., DeLuca, C., &amp; MacGregor, S. (2020). A person-centered analysis of teacher candidates’ approaches to assessment. <italic>Teaching and Teacher Education</italic>, <italic>87</italic>. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.tate.2019.102952">https://doi.org/10.1016/j.tate.2019.102952</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Coombs</surname>
							<given-names>A.</given-names>
						</name>
						<name>
							<surname>DeLuca</surname>
							<given-names>C.</given-names>
						</name>
						<name>
							<surname>MacGregor</surname>
							<given-names>S.</given-names>
						</name>
					</person-group>
					<year>2020</year>
					<article-title>A person-centered analysis of teacher candidates’ approaches to assessment</article-title>
					<source>Teaching and Teacher Education</source>
					<volume>87</volume>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.tate.2019.102952">https://doi.org/10.1016/j.tate.2019.102952</ext-link>
				</element-citation>
			</ref>
			<ref id="B6">
				<mixed-citation>Crusan, D., Plakans, L., &amp; Gebril, A. (2016). Writing assessment literacy: Surveying second language teachers’ knowledge, beliefs, and practices. <italic>Assessing Writing</italic>, <italic>28</italic>, 43-56. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.asw.2016.03.001">https://doi.org/10.1016/j.asw.2016.03.001</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Crusan</surname>
							<given-names>D.</given-names>
						</name>
						<name>
							<surname>Plakans</surname>
							<given-names>L.</given-names>
						</name>
						<name>
							<surname>Gebril</surname>
							<given-names>A.</given-names>
						</name>
					</person-group>
					<year>2016</year>
					<article-title>Writing assessment literacy: Surveying second language teachers’ knowledge, beliefs, and practices</article-title>
					<source>Assessing Writing</source>
					<volume>28</volume>
					<fpage>43</fpage>
					<lpage>56</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.asw.2016.03.001">https://doi.org/10.1016/j.asw.2016.03.001</ext-link>
				</element-citation>
			</ref>
			<ref id="B7">
				<mixed-citation>DeLuca, C., &amp; Bellara, A. (2013). The current state of assessment education: Aligning policy, standards, and teacher education curriculum. <italic>Journal of Teacher Education</italic>, <italic>64</italic>(4), 356-372. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0022487113488144">https://doi.org/10.1177/0022487113488144</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>DeLuca</surname>
							<given-names>C.</given-names>
						</name>
						<name>
							<surname>Bellara</surname>
							<given-names>A.</given-names>
						</name>
					</person-group>
					<year>2013</year>
					<article-title>The current state of assessment education: Aligning policy, standards, and teacher education curriculum</article-title>
					<source>Journal of Teacher Education</source>
					<volume>64</volume>
					<issue>4</issue>
					<fpage>356</fpage>
					<lpage>372</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0022487113488144">https://doi.org/10.1177/0022487113488144</ext-link>
				</element-citation>
			</ref>
			<ref id="B8">
				<mixed-citation>DeLuca, C., Chavez, T., Bellara, A., &amp; Cao, C. (2013). Pedagogies for preservice assessment education: Supporting teacher candidates’ assessment literacy development. <italic>The Teacher Educator</italic>, <italic>48</italic>(2), 128-142. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/08878730.2012.760024">https://doi.org/10.1080/08878730.2012.760024</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>DeLuca</surname>
							<given-names>C.</given-names>
						</name>
						<name>
							<surname>Chavez</surname>
							<given-names>T.</given-names>
						</name>
						<name>
							<surname>Bellara</surname>
							<given-names>A.</given-names>
						</name>
						<name>
							<surname>Cao</surname>
							<given-names>C.</given-names>
						</name>
					</person-group>
					<year>2013</year>
					<article-title>Pedagogies for preservice assessment education: Supporting teacher candidates’ assessment literacy development</article-title>
					<source>The Teacher Educator</source>
					<volume>48</volume>
					<issue>2</issue>
					<fpage>128</fpage>
					<lpage>142</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/08878730.2012.760024">https://doi.org/10.1080/08878730.2012.760024</ext-link>
				</element-citation>
			</ref>
			<ref id="B9">
				<mixed-citation>DeLuca, C., Coombs, A., MacGregor, S., &amp; Rasooli, A. (2019). Toward a differential and situated view of assessment literacy: Studying teachers’ responses to classroom assessment scenarios. <italic>Frontiers in Education</italic>. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/feduc.2019.00094">https://doi.org/10.3389/feduc.2019.00094</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>DeLuca</surname>
							<given-names>C.</given-names>
						</name>
						<name>
							<surname>Coombs</surname>
							<given-names>A.</given-names>
						</name>
						<name>
							<surname>MacGregor</surname>
							<given-names>S.</given-names>
						</name>
						<name>
							<surname>Rasooli</surname>
							<given-names>A.</given-names>
						</name>
					</person-group>
					<year>2019</year>
					<article-title>Toward a differential and situated view of assessment literacy: Studying teachers’ responses to classroom assessment scenarios</article-title>
					<source>Frontiers in Education</source>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/feduc.2019.00094">https://doi.org/10.3389/feduc.2019.00094</ext-link>
				</element-citation>
			</ref>
			<ref id="B10">
				<mixed-citation>DeLuca, C., &amp; Klinger, D. A. (2010). Assessment literacy development: Identifying gaps in teacher candidates’ learning. <italic>Assessment in Education: Principles, Policy &amp; Practice</italic>, <italic>17</italic>(4), 419-438. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/0969594x.2010.516643">https://doi.org/10.1080/0969594x.2010.516643</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>DeLuca</surname>
							<given-names>C.</given-names>
						</name>
						<name>
							<surname>Klinger</surname>
							<given-names>D. A.</given-names>
						</name>
					</person-group>
					<year>2010</year>
					<article-title>Assessment literacy development: Identifying gaps in teacher candidates’ learning</article-title>
					<source>Assessment in Education: Principles, Policy &amp; Practice</source>
					<volume>17</volume>
					<issue>4</issue>
					<fpage>419</fpage>
					<lpage>438</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/0969594x.2010.516643">https://doi.org/10.1080/0969594x.2010.516643</ext-link>
				</element-citation>
			</ref>
			<ref id="B11">
				<mixed-citation>DeLuca, C., Valiquette, A., Coombs, A., LaPointe-McEwan, D., &amp; Luhanga, U. (2018). Teachers’ approaches to classroom assessment: A large-scale survey. <italic>Assessment in Education: Principles, Policy &amp; Practice</italic>, <italic>25</italic>(4), 355-375. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/0969594x.2016.1244514">https://doi.org/10.1080/0969594x.2016.1244514</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>DeLuca</surname>
							<given-names>C.</given-names>
						</name>
						<name>
							<surname>Valiquette</surname>
							<given-names>A.</given-names>
						</name>
						<name>
							<surname>Coombs</surname>
							<given-names>A.</given-names>
						</name>
						<name>
							<surname>LaPointe-McEwan</surname>
							<given-names>D.</given-names>
						</name>
						<name>
							<surname>Luhanga</surname>
							<given-names>U.</given-names>
						</name>
					</person-group>
					<year>2018</year>
					<article-title>Teachers’ approaches to classroom assessment: A large-scale survey</article-title>
					<source>Assessment in Education: Principles, Policy &amp; Practice</source>
					<volume>25</volume>
					<issue>4</issue>
					<fpage>355</fpage>
					<lpage>375</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/0969594x.2016.1244514">https://doi.org/10.1080/0969594x.2016.1244514</ext-link>
				</element-citation>
			</ref>
			<ref id="B12">
				<mixed-citation>Dörnyei, Z. (2003). <italic>Questionnaires in second language research: Construction, administration, and processing</italic> (1<sup>st</sup> ed.). Lawrence Erlbaum Associates.</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Dörnyei</surname>
							<given-names>Z.</given-names>
						</name>
					</person-group>
					<year>2003</year>
					<source>Questionnaires in second language research: Construction, administration, and processing</source>
					<edition>1st</edition>
					<publisher-name>Lawrence Erlbaum Associates</publisher-name>
				</element-citation>
			</ref>
			<ref id="B13">
				<mixed-citation>Field, A. (2013). <italic>Discovering statistics using IBM SPSS statistics</italic> (4<sup>th</sup> ed.). Sage.</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Field</surname>
							<given-names>A.</given-names>
						</name>
					</person-group>
					<year>2013</year>
					<source>Discovering statistics using IBM SPSS statistics</source>
					<edition>4th</edition>
					<publisher-name>Sage</publisher-name>
				</element-citation>
			</ref>
			<ref id="B14">
				<mixed-citation>Firoozi, T., Razavipour, K., &amp; Ahmadi, A. (2019). The language assessment literacy needs of Iranian EFL teachers with a focus on reformed assessment policies. <italic>Language Testing in Asia</italic>, <italic>9</italic>(2). <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1186/s40468-019-0078-7">https://doi.org/10.1186/s40468-019-0078-7</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Firoozi</surname>
							<given-names>T.</given-names>
						</name>
						<name>
							<surname>Razavipour</surname>
							<given-names>K.</given-names>
						</name>
						<name>
							<surname>Ahmadi</surname>
							<given-names>A.</given-names>
						</name>
					</person-group>
					<year>2019</year>
					<article-title>The language assessment literacy needs of Iranian EFL teachers with a focus on reformed assessment policies</article-title>
					<source>Language Testing in Asia</source>
					<volume>9</volume>
					<issue>2</issue>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1186/s40468-019-0078-7">https://doi.org/10.1186/s40468-019-0078-7</ext-link>
				</element-citation>
			</ref>
			<ref id="B15">
				<mixed-citation>Fives, H., &amp; Barnes, N. (2020). Navigating the complex cognitive task of classroom assessment. <italic>Teaching and Teacher Education</italic>, <italic>92</italic>. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.tate.2020.103063">https://doi.org/10.1016/j.tate.2020.103063</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Fives</surname>
							<given-names>H.</given-names>
						</name>
						<name>
							<surname>Barnes</surname>
							<given-names>N.</given-names>
						</name>
					</person-group>
					<year>2020</year>
					<article-title>Navigating the complex cognitive task of classroom assessment</article-title>
					<source>Teaching and Teacher Education</source>
					<volume>92</volume>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.tate.2020.103063">https://doi.org/10.1016/j.tate.2020.103063</ext-link>
				</element-citation>
			</ref>
			<ref id="B16">
				<mixed-citation>Fulcher, G. (2012). Assessment literacy for the language classroom. <italic>Language Assessment Quarterly</italic>, <italic>9</italic>(2), 113-132. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/15434303.2011.642041">https://doi.org/10.1080/15434303.2011.642041</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Fulcher</surname>
							<given-names>G.</given-names>
						</name>
					</person-group>
					<year>2012</year>
					<article-title>Assessment literacy for the language classroom</article-title>
					<source>Language Assessment Quarterly</source>
					<volume>9</volume>
					<issue>2</issue>
					<fpage>113</fpage>
					<lpage>132</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/15434303.2011.642041">https://doi.org/10.1080/15434303.2011.642041</ext-link>
				</element-citation>
			</ref>
			<ref id="B17">
				<mixed-citation>Fulmer, G. W., Lee, I. C. H., &amp; Tan, K. H. K. (2015). Multi-level model of contextual factors and teachers’ assessment practices: An integrative review of research. <italic>Assessment in Education: Principles, Policy &amp; Practice</italic>, <italic>22</italic>(4), 475-494. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/0969594x.2015.1017445">https://doi.org/10.1080/0969594x.2015.1017445</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Fulmer</surname>
							<given-names>G. W.</given-names>
						</name>
						<name>
							<surname>Lee</surname>
							<given-names>I. C. H.</given-names>
						</name>
						<name>
							<surname>Tan</surname>
							<given-names>K. H. K.</given-names>
						</name>
					</person-group>
					<year>2015</year>
					<article-title>Multi-level model of contextual factors and teachers’ assessment practices: An integrative review of research</article-title>
					<source>Assessment in Education: Principles, Policy &amp; Practice</source>
					<volume>22</volume>
					<issue>4</issue>
					<fpage>475</fpage>
					<lpage>494</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/0969594x.2015.1017445">https://doi.org/10.1080/0969594x.2015.1017445</ext-link>
				</element-citation>
			</ref>
			<ref id="B18">
				<mixed-citation>Inbar-Lourie, O. (2008). Constructing a language assessment knowledge base: A focus on language assessment courses. <italic>Language Testing</italic>, <italic>25</italic>(3), 385-402. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0265532208090158">https://doi.org/10.1177/0265532208090158</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Inbar-Lourie</surname>
							<given-names>O.</given-names>
						</name>
					</person-group>
					<year>2008</year>
					<article-title>Constructing a language assessment knowledge base: A focus on language assessment courses</article-title>
					<source>Language Testing</source>
					<volume>25</volume>
					<issue>3</issue>
					<fpage>385</fpage>
					<lpage>402</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0265532208090158">https://doi.org/10.1177/0265532208090158</ext-link>
				</element-citation>
			</ref>
			<ref id="B19">
				<mixed-citation>Inbar-Lourie, O. (2013). Guest Editorial to the special issue on language assessment literacy. <italic>Language Testing</italic>, <italic>30</italic>(3), 301-307. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0265532213480126">https://doi.org/10.1177/0265532213480126</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Inbar-Lourie</surname>
							<given-names>O.</given-names>
						</name>
					</person-group>
					<year>2013</year>
					<article-title>Guest Editorial to the special issue on language assessment literacy</article-title>
					<source>Language Testing</source>
					<volume>30</volume>
					<issue>3</issue>
					<fpage>301</fpage>
					<lpage>307</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0265532213480126">https://doi.org/10.1177/0265532213480126</ext-link>
				</element-citation>
			</ref>
			<ref id="B20">
				<mixed-citation>Inbar-Lourie, O. (2017). Language assessment literacy. In E. Shohamy, L. G. Or, &amp; S. May (Eds.), <italic>Language testing and assessment: Encyclopedia of language and education</italic> (3<sup>rd</sup> ed., pp. 257-270). Springer International Publishing. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1007/978-3-319-02261-1_19">https://doi.org/10.1007/978-3-319-02261-1_19</ext-link>
				</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Inbar-Lourie</surname>
							<given-names>O.</given-names>
						</name>
					</person-group>
					<year>2017</year>
					<chapter-title>Language assessment literacy</chapter-title>
					<person-group person-group-type="editor">
						<name>
							<surname>Shohamy</surname>
							<given-names>E.</given-names>
						</name>
						<name>
							<surname>Or</surname>
							<given-names>L. G.</given-names>
						</name>
						<name>
							<surname>May</surname>
							<given-names>S.</given-names>
						</name>
					</person-group>
					<source>Language testing and assessment: Encyclopedia of language and education</source>
					<edition>3rd</edition>
					<fpage>257</fpage>
					<lpage>270</lpage>
					<publisher-name>Springer International Publishing</publisher-name>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1007/978-3-319-02261-1_19">https://doi.org/10.1007/978-3-319-02261-1_19</ext-link>
				</element-citation>
			</ref>
			<ref id="B21">
				<mixed-citation>Johnson, K. E. (2009). <italic>Second language teacher education: A sociocultural perspective</italic>. Routledge. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.4324/9780203878033">https://doi.org/10.4324/9780203878033</ext-link>
				</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Johnson</surname>
							<given-names>K. E.</given-names>
						</name>
					</person-group>
					<year>2009</year>
					<source>Second language teacher education: A sociocultural perspective</source>
					<publisher-name>Routledge</publisher-name>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.4324/9780203878033">https://doi.org/10.4324/9780203878033</ext-link>
				</element-citation>
			</ref>
			<ref id="B22">
				<mixed-citation>Lam, R. (2019). Teacher assessment literacy: Surveying knowledge, conceptions and practices of classroom-based writing assessment in Hong Kong. <italic>System</italic>, <italic>81</italic>, 78-89. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.system.2019.01.006">https://doi.org/10.1016/j.system.2019.01.006</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Lam</surname>
							<given-names>R.</given-names>
						</name>
					</person-group>
					<year>2019</year>
					<article-title>Teacher assessment literacy: Surveying knowledge, conceptions and practices of classroom-based writing assessment in Hong Kong</article-title>
					<source>System</source>
					<volume>81</volume>
					<fpage>78</fpage>
					<lpage>89</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.system.2019.01.006">https://doi.org/10.1016/j.system.2019.01.006</ext-link>
				</element-citation>
			</ref>
			<ref id="B23">
				<mixed-citation>Lan, C., &amp; Fan, S. (2019). Developing classroom-based language assessment literacy for in-service EFL teachers: The gaps. <italic>Studies in Educational Evaluation</italic>, <italic>61</italic>, 112-122. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.stueduc.2019.03.003">https://doi.org/10.1016/j.stueduc.2019.03.003</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Lan</surname>
							<given-names>C.</given-names>
						</name>
						<name>
							<surname>Fan</surname>
							<given-names>S.</given-names>
						</name>
					</person-group>
					<year>2019</year>
					<article-title>Developing classroom-based language assessment literacy for in-service EFL teachers: The gaps</article-title>
					<source>Studies in Educational Evaluation</source>
					<volume>61</volume>
					<fpage>112</fpage>
					<lpage>122</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.stueduc.2019.03.003">https://doi.org/10.1016/j.stueduc.2019.03.003</ext-link>
				</element-citation>
			</ref>
			<ref id="B24">
				<mixed-citation>Lee, I. (2017). <italic>Classroom writing assessment and feedback in L2 school contexts</italic>. Springer. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1007/978-981-10-3924-9">https://doi.org/10.1007/978-981-10-3924-9</ext-link>
				</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Lee</surname>
							<given-names>I.</given-names>
						</name>
					</person-group>
					<year>2017</year>
					<source>Classroom writing assessment and feedback in L2 school contexts</source>
					<publisher-name>Springer</publisher-name>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1007/978-981-10-3924-9">https://doi.org/10.1007/978-981-10-3924-9</ext-link>
				</element-citation>
			</ref>
			<ref id="B25">
				<mixed-citation>Lee, J., &amp; Butler, Y. G. (2020). Reconceptualizing language assessment literacy: Where are language learners? <italic>TESOL Quarterly</italic>, <italic>54</italic>(4), 1098-1111. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1002/tesq.576">https://doi.org/10.1002/tesq.576</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Lee</surname>
							<given-names>J.</given-names>
						</name>
						<name>
							<surname>Butler</surname>
							<given-names>Y. G.</given-names>
						</name>
					</person-group>
					<year>2020</year>
					<article-title>Reconceptualizing language assessment literacy: Where are language learners?</article-title>
					<source>TESOL Quarterly</source>
					<volume>54</volume>
					<issue>4</issue>
					<fpage>1098</fpage>
					<lpage>1111</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1002/tesq.576">https://doi.org/10.1002/tesq.576</ext-link>
				</element-citation>
			</ref>
			<ref id="B26">
				<mixed-citation>Levi, T., &amp; Inbar-Lourie, O. (2019). Assessment literacy or language assessment literacy: Learning from the teachers. <italic>Language Assessment Quarterly</italic>, <italic>17</italic>(2), 168-182. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/15434303.2019.1692347">https://doi.org/10.1080/15434303.2019.1692347</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Levi</surname>
							<given-names>T.</given-names>
						</name>
						<name>
							<surname>Inbar-Lourie</surname>
							<given-names>O.</given-names>
						</name>
					</person-group>
					<year>2019</year>
					<article-title>Assessment literacy or language assessment literacy: Learning from the teachers</article-title>
					<source>Language Assessment Quarterly</source>
					<volume>17</volume>
					<issue>2</issue>
					<fpage>168</fpage>
					<lpage>182</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/15434303.2019.1692347">https://doi.org/10.1080/15434303.2019.1692347</ext-link>
				</element-citation>
			</ref>
			<ref id="B27">
				<mixed-citation>Loughran, J. J. (2006). <italic>Developing a pedagogy of teacher education: Understanding teaching and learning about teaching</italic>. Routledge.</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Loughran</surname>
							<given-names>J. J.</given-names>
						</name>
					</person-group>
					<year>2006</year>
					<source>Developing a pedagogy of teacher education: Understanding teaching and learning about teaching</source>
					<publisher-name>Routledge</publisher-name>
				</element-citation>
			</ref>
			<ref id="B28">
				<mixed-citation>Mertler, C. A., &amp; Campbell, C. (2005). <italic>Measuring teachers’ knowledge &amp; application of classroom assessment concepts: Development of the assessment literacy inventory</italic> (ED490355). ERIC. <ext-link ext-link-type="uri" xlink:href="https://files.eric.ed.gov/fulltext/ED490355.pdf">https://files.eric.ed.gov/fulltext/ED490355.pdf</ext-link>
				</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Mertler</surname>
							<given-names>C. A.</given-names>
						</name>
						<name>
							<surname>Campbell</surname>
							<given-names>C.</given-names>
						</name>
					</person-group>
					<year>2005</year>
					<source>Measuring teachers’ knowledge &amp; application of classroom assessment concepts: Development of the assessment literacy inventory</source>
					<pub-id pub-id-type="other">ED490355</pub-id>
					<publisher-name>ERIC</publisher-name>
					<ext-link ext-link-type="uri" xlink:href="https://files.eric.ed.gov/fulltext/ED490355.pdf">https://files.eric.ed.gov/fulltext/ED490355.pdf</ext-link>
				</element-citation>
			</ref>
			<ref id="B29">
				<mixed-citation>Ölmezer-Öztürk, E., &amp; Aydin, B. (2018). Toward measuring language teachers’ assessment knowledge: Development and validation of Language Assessment Knowledge Scale (LAKS). <italic>Language Testing in Asia</italic>, <italic>8</italic>(20). <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1186/s40468-018-0075-2">https://doi.org/10.1186/s40468-018-0075-2</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Ölmezer-Öztürk</surname>
							<given-names>E.</given-names>
						</name>
						<name>
							<surname>Aydin</surname>
							<given-names>B.</given-names>
						</name>
					</person-group>
					<year>2018</year>
					<article-title>Toward measuring language teachers’ assessment knowledge: Development and validation of Language Assessment Knowledge Scale (LAKS)</article-title>
					<source>Language Testing in Asia</source>
					<volume>8</volume>
					<issue>20</issue>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1186/s40468-018-0075-2">https://doi.org/10.1186/s40468-018-0075-2</ext-link>
				</element-citation>
			</ref>
			<ref id="B30">
				<mixed-citation>Pallant, J. (2016). <italic>SPSS survival manual: A step by step guide to data analysis using IBM SPSS</italic> (6<sup>th</sup> ed.). McGraw-Hill Education.</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Pallant</surname>
							<given-names>J.</given-names>
						</name>
					</person-group>
					<year>2016</year>
					<source>SPSS survival manual: A step by step guide to data analysis using IBM SPSS</source>
					<edition>6th</edition>
					<publisher-name>McGraw-Hill Education</publisher-name>
				</element-citation>
			</ref>
			<ref id="B31">
				<mixed-citation>Pill, J., &amp; Harding, L. (2013). Defining the language assessment literacy gap: Evidence from a parliamentary inquiry. <italic>Language Testing</italic>, <italic>30</italic>(3), 381-402. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0265532213480337">https://doi.org/10.1177/0265532213480337</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Pill</surname>
							<given-names>J.</given-names>
						</name>
						<name>
							<surname>Harding</surname>
							<given-names>L.</given-names>
						</name>
					</person-group>
					<year>2013</year>
					<article-title>Defining the language assessment literacy gap: Evidence from a parliamentary inquiry</article-title>
					<source>Language Testing</source>
					<volume>30</volume>
					<issue>3</issue>
					<fpage>381</fpage>
					<lpage>402</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0265532213480337">https://doi.org/10.1177/0265532213480337</ext-link>
				</element-citation>
			</ref>
			<ref id="B32">
				<mixed-citation>Plake, B. S., Impara, J. C., &amp; Fager, J. J. (1993). Assessment competencies of teachers: A national survey. <italic>Educational Measurement: Issues and Practice</italic>, <italic>12</italic>(4), 10-39. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1111/j.1745-3992.1993.tb00548.x">https://doi.org/10.1111/j.1745-3992.1993.tb00548.x</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Plake</surname>
							<given-names>B. S.</given-names>
						</name>
						<name>
							<surname>Impara</surname>
							<given-names>J. C.</given-names>
						</name>
						<name>
							<surname>Fager</surname>
							<given-names>J. J.</given-names>
						</name>
					</person-group>
					<year>1993</year>
					<article-title>Assessment competencies of teachers: A national survey</article-title>
					<source>Educational Measurement: Issues and Practice</source>
					<volume>12</volume>
					<issue>4</issue>
					<fpage>10</fpage>
					<lpage>39</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1111/j.1745-3992.1993.tb00548.x">https://doi.org/10.1111/j.1745-3992.1993.tb00548.x</ext-link>
				</element-citation>
			</ref>
			<ref id="B33">
				<mixed-citation>Popham, W. J. (2009). Assessment literacy for teachers: Faddish or fundamental? <italic>Theory Into Practice</italic>, <italic>48</italic>(1), 4-11. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/00405840802577536">https://doi.org/10.1080/00405840802577536</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Popham</surname>
							<given-names>W. J.</given-names>
						</name>
					</person-group>
					<year>2009</year>
					<article-title>Assessment literacy for teachers: Faddish or fundamental?</article-title>
					<source>Theory Into Practice</source>
					<volume>48</volume>
					<issue>1</issue>
					<fpage>4</fpage>
					<lpage>11</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/00405840802577536">https://doi.org/10.1080/00405840802577536</ext-link>
				</element-citation>
			</ref>
			<ref id="B34">
				<mixed-citation>Popham, W. J. (2011). Assessment literacy overlooked: A teacher educator’s confession. <italic>The Teacher Educator</italic>, <italic>46</italic>(4), 265-273. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/08878730.2011.605048">https://doi.org/10.1080/08878730.2011.605048</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Popham</surname>
							<given-names>W. J.</given-names>
						</name>
					</person-group>
					<year>2011</year>
					<article-title>Assessment literacy overlooked: A teacher educator’s confession</article-title>
					<source>The Teacher Educator</source>
					<volume>46</volume>
					<issue>4</issue>
					<fpage>265</fpage>
					<lpage>273</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/08878730.2011.605048">https://doi.org/10.1080/08878730.2011.605048</ext-link>
				</element-citation>
			</ref>
			<ref id="B35">
				<mixed-citation>Sadeghi, K., &amp; Richards, J. C. (2015). Teaching spoken English in Iran’s private language schools: Issues and options. <italic>English Teaching: Practice &amp; Critique</italic>, <italic>14</italic>(2), 210-234. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1108/etpc-03-2015-0019">https://doi.org/10.1108/etpc-03-2015-0019</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Sadeghi</surname>
							<given-names>K.</given-names>
						</name>
						<name>
							<surname>Richards</surname>
							<given-names>J. C.</given-names>
						</name>
					</person-group>
					<year>2015</year>
					<article-title>Teaching spoken English in Iran’s private language schools: Issues and options</article-title>
					<source>English Teaching: Practice &amp; Critique</source>
					<volume>14</volume>
					<issue>2</issue>
					<fpage>210</fpage>
					<lpage>234</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1108/etpc-03-2015-0019">https://doi.org/10.1108/etpc-03-2015-0019</ext-link>
				</element-citation>
			</ref>
			<ref id="B36">
				<mixed-citation>Shepard, L. A. (2013). Foreword. In J. H. McMillan (Ed.), <italic>Sage handbook of research on classroom assessment</italic> (pp. xix-xxii). Sage Publications.</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Shepard</surname>
							<given-names>L. A.</given-names>
						</name>
					</person-group>
					<year>2013</year>
					<chapter-title>Foreword</chapter-title>
					<person-group person-group-type="editor">
						<name>
							<surname>McMillan</surname>
							<given-names>J. H.</given-names>
						</name>
					</person-group>
					<source>Sage handbook of research on classroom assessment</source>
					<fpage>xix</fpage>
					<lpage>xxii</lpage>
					<publisher-name>Sage Publications</publisher-name>
				</element-citation>
			</ref>
			<ref id="B37">
				<mixed-citation>Stiggins, R. J. (1995). Assessment literacy for the 21<sup>st</sup> century. <italic>Phi Delta Kappan</italic>, <italic>77</italic>(3), 238-245.</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Stiggins</surname>
							<given-names>R. J.</given-names>
						</name>
					</person-group>
					<year>1995</year>
					<article-title>Assessment literacy for the 21st century</article-title>
					<source>Phi Delta Kappan</source>
					<volume>77</volume>
					<issue>3</issue>
					<fpage>238</fpage>
					<lpage>245</lpage>
				</element-citation>
			</ref>
			<ref id="B38">
				<mixed-citation>Tabachnick, B. G., &amp; Fidell, L. S. (2013). <italic>Using multivariate statistics</italic> (6<sup>th</sup> ed.). Pearson Education.</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Tabachnick</surname>
							<given-names>B. G.</given-names>
						</name>
						<name>
							<surname>Fidell</surname>
							<given-names>L. S.</given-names>
						</name>
					</person-group>
					<year>2013</year>
					<source>Using multivariate statistics</source>
					<edition>6th</edition>
					<publisher-name>Pearson Education</publisher-name>
				</element-citation>
			</ref>
			<ref id="B39">
				<mixed-citation>Tajeddin, Z., Alemi, M., &amp; Yasaei, H. (2018). Classroom assessment literacy for speaking: Exploring novice and experienced English language teachers’ knowledge and practice. <italic>Iranian Journal of Language Teaching Research</italic>, <italic>6</italic>(3), 57-77.</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Tajeddin</surname>
							<given-names>Z.</given-names>
						</name>
						<name>
							<surname>Alemi</surname>
							<given-names>M.</given-names>
						</name>
						<name>
							<surname>Yasaei</surname>
							<given-names>H.</given-names>
						</name>
					</person-group>
					<year>2018</year>
					<article-title>Classroom assessment literacy for speaking: Exploring novice and experienced English language teachers’ knowledge and practice</article-title>
					<source>Iranian Journal of Language Teaching Research</source>
					<volume>6</volume>
					<issue>3</issue>
					<fpage>57</fpage>
					<lpage>77</lpage>
				</element-citation>
			</ref>
			<ref id="B40">
				<mixed-citation>Taylor, L. (2013). Communicating the theory, practice and principles of language testing to test stakeholders: Some reflections. <italic>Language Testing</italic>, <italic>30</italic>(3), 403-412. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0265532213480338">https://doi.org/10.1177/0265532213480338</ext-link></mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Taylor</surname>
							<given-names>L.</given-names>
						</name>
					</person-group>
					<year>2013</year>
					<article-title>Communicating the theory, practice and principles of language testing to test stakeholders: Some reflections</article-title>
					<source>Language Testing</source>
					<volume>30</volume>
					<issue>3</issue>
					<fpage>403</fpage>
					<lpage>412</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1177/0265532213480338">https://doi.org/10.1177/0265532213480338</ext-link>
				</element-citation>
			</ref>
			<ref id="B41">
				<mixed-citation>Tsagari, D., &amp; Vogt, K. (2017). Assessment literacy of foreign language teachers around Europe: Research, challenges and future prospects. <italic>Papers in Language Testing and Assessment</italic>, <italic>6</italic>(1), 41-63.</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Tsagari</surname>
							<given-names>D.</given-names>
						</name>
						<name>
							<surname>Vogt</surname>
							<given-names>K.</given-names>
						</name>
					</person-group>
					<year>2017</year>
					<article-title>Assessment literacy of foreign language teachers around Europe: Research, challenges and future prospects</article-title>
					<source>Papers in Language Testing and Assessment</source>
					<volume>6</volume>
					<issue>1</issue>
					<fpage>41</fpage>
					<lpage>63</lpage>
				</element-citation>
			</ref>
			<ref id="B42">
				<mixed-citation>Vogt, K., &amp; Tsagari, D. (2014). Assessment literacy of foreign language teachers: Findings of a European study. <italic>Language Assessment Quarterly</italic>, <italic>11</italic>(4), 374-402. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/15434303.2014.960046">https://doi.org/10.1080/15434303.2014.960046</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Vogt</surname>
							<given-names>K.</given-names>
						</name>
						<name>
							<surname>Tsagari</surname>
							<given-names>D.</given-names>
						</name>
					</person-group>
					<year>2014</year>
					<article-title>Assessment literacy of foreign language teachers: Findings of a European study</article-title>
					<source>Language Assessment Quarterly</source>
					<volume>11</volume>
					<issue>4</issue>
					<fpage>374</fpage>
					<lpage>402</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1080/15434303.2014.960046">https://doi.org/10.1080/15434303.2014.960046</ext-link>
				</element-citation>
			</ref>
			<ref id="B43">
				<mixed-citation>Xu, H. (2017). Exploring novice EFL teachers’ classroom assessment literacy development: A three-year longitudinal study. <italic>The Asia-Pacific Education Researcher</italic>, <italic>26</italic>, 219-226. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1007/s40299-017-0342-5">https://doi.org/10.1007/s40299-017-0342-5</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Xu</surname>
							<given-names>H.</given-names>
						</name>
					</person-group>
					<year>2017</year>
					<article-title>Exploring novice EFL teachers’ classroom assessment literacy development: A three-year longitudinal study</article-title>
					<source>The Asia-Pacific Education Researcher</source>
					<volume>26</volume>
					<fpage>219</fpage>
					<lpage>226</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1007/s40299-017-0342-5">https://doi.org/10.1007/s40299-017-0342-5</ext-link>
				</element-citation>
			</ref>
			<ref id="B44">
				<mixed-citation>Xu, Y., &amp; Brown, G. T. L. (2016). Teacher assessment literacy in practice: A reconceptualization. <italic>Teaching and Teacher Education</italic>, <italic>58</italic>, 149-162. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.tate.2016.05.010">https://doi.org/10.1016/j.tate.2016.05.010</ext-link>
				</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Xu</surname>
							<given-names>Y.</given-names>
						</name>
						<name>
							<surname>Brown</surname>
							<given-names>G. T. L.</given-names>
						</name>
					</person-group>
					<year>2016</year>
					<article-title>Teacher assessment literacy in practice: A reconceptualization</article-title>
					<source>Teaching and Teacher Education</source>
					<volume>58</volume>
					<fpage>149</fpage>
					<lpage>162</lpage>
					<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.1016/j.tate.2016.05.010">https://doi.org/10.1016/j.tate.2016.05.010</ext-link>
				</element-citation>
			</ref>
			<ref id="B45">
				<mixed-citation>Xu, Y., &amp; Brown, G. T. L. (2017). University English teacher assessment literacy: A survey-test report from China. <italic>Papers in Language Testing and Assessment</italic>, <italic>6</italic>(1), 133-158.</mixed-citation>
				<element-citation publication-type="journal">
					<person-group person-group-type="author">
						<name>
							<surname>Xu</surname>
							<given-names>Y.</given-names>
						</name>
						<name>
							<surname>Brown</surname>
							<given-names>G. T. L.</given-names>
						</name>
					</person-group>
					<year>2017</year>
					<article-title>University English teacher assessment literacy: A survey-test report from China</article-title>
					<source>Papers in Language Testing and Assessment</source>
					<volume>6</volume>
					<issue>1</issue>
					<fpage>133</fpage>
					<lpage>158</lpage>
				</element-citation>
			</ref>
			<ref id="B46">
				<mixed-citation>Zhang, Z., &amp; Burry-Stock, J. A. (1994). <italic>Assessment practices inventory</italic>. The University of Alabama.</mixed-citation>
				<element-citation publication-type="book">
					<person-group person-group-type="author">
						<name>
							<surname>Zhang</surname>
							<given-names>Z.</given-names>
						</name>
						<name>
							<surname>Burry-Stock</surname>
							<given-names>J. A.</given-names>
						</name>
					</person-group>
					<year>1994</year>
					<source>Assessment practices inventory</source>
					<publisher-name>The University of Alabama</publisher-name>
				</element-citation>
			</ref>
		</ref-list>
		<fn-group>
			<fn fn-type="other" id="fn1">
				<label>How to cite this article (APA, 7th ed.):</label>
				<p> Tajeddin, Z., Saeedi, Z., &amp; Panahzadeh, V. (2022). English language teachers’ perceived classroom assessment knowledge and practice: Developing and validating a scale. <italic>Profile: Issues in Teachers’ Professional Development</italic>, <italic>24</italic>(2), 247-264. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.15446/profile.v24n2.90518">https://doi.org/10.15446/profile.v24n2.90518</ext-link>
				</p>
			</fn>
		</fn-group>
		<fn-group>
			<title>About the Authors</title>
			<fn fn-type="other" id="fn2">
				<label>Zia Tajeddin</label>
				<p> is a professor of Applied Linguistics at Tarbiat Modares University, Iran. He is the co-editor of two international journals: <italic>Applied Pragmatics</italic> (John Benjamins) and <italic>Second Language Teacher Education</italic> (Equinox). His research interests center on L2 pragmatics, teacher education, and EIL/ELF pedagogy.</p>
			</fn>
			<fn fn-type="other" id="fn3">
				<label>Zari Saeedi</label>
				<p> received her PhD in Applied Linguistics from Trinity College Cambridge (UK) and is an associate professor at Allameh Tabataba’i University, Iran. She has presented at various national and international conferences and has published papers and books on a range of topics, including language learning and culture.</p>
			</fn>
			<fn fn-type="other" id="fn4">
				<label>Vahid Panahzadeh</label>
				<p> is a PhD candidate in TEFL at Allameh Tabataba’i University, Iran. He has presented papers at national and international conferences. His main area of interest is language assessment.</p>
			</fn>
		</fn-group>
	</back>
</article>