
Abstract

This study investigated mathematics teachers’ knowledge, preparation, and use of assessment rubrics for teaching and learning 3D-geometry at selected secondary schools in the Dodoma region of Tanzania. The study was motivated by teachers’ limited understanding of classroom assessment and was based on the premise that the assessment practices teachers currently use in the teaching and learning of mathematics are inadequate. A mixed-methods research design was applied. Fifty-one mathematics teachers, purposively selected from 20 secondary schools, participated in the study. Data were collected through documentary reviews, classroom observations, questionnaires, and interviews. Qualitative data were analyzed thematically, while quantitative data were analyzed using descriptive statistics such as means, standard deviations, frequencies, and percentages. The study revealed that teachers have limited knowledge of assessment rubrics; hence, most do not prepare or use rubrics for assessing students. Rubrics could help teachers pinpoint the specific point at which a particular learner has encountered a learning obstacle, allowing quick and pertinent adjustments that increase the effectiveness of teaching and learning. The study recommends improving mathematics teachers’ knowledge of planning and using assessment rubrics through professional development in order to improve the teaching and learning of 3D-geometry.

Keywords: 3D-geometry, Preparation of assessment rubrics, Classroom assessment, Mathematics teachers.

Received: 22 February 2023/ Revised: 31 May 2023/ Accepted: 4 August 2023/ Published: 23 August 2023

Contribution/ Originality

This study sheds light on the current state of teachers' knowledge, preparation, and use of assessment rubrics in the teaching and learning of mathematics. The research can help education stakeholders make plans for the necessary capacity building to improve classroom assessment practices and, in turn, increase teaching and learning efficiency.

1. INTRODUCTION

Globally, interest in assessment practices in education has been growing. There is an emerging consensus among educational policymakers, researchers, and practitioners that assessment is an important component of the teaching and learning process (Azim & Khan, 2012; Black & Wiliam, 2018; Siarova, Sternadel, & Mašidlauskaitė, 2017; Tarmo, 2022; Wiliam, 2017; Wiliam, 2018). Assessment and students’ learning are viewed as inseparable. Among assessment practices, Assessment for Learning (AfL) is considered the best classroom assessment practice. According to Black and Wiliam (2018), AfL is any assessment practice whose ultimate goal is to improve teaching and learning. Teachers use the information elicited through AfL to adjust their teaching, and students use it to adjust their learning. In this view, AfL practices should be emphasized to promote meaningful teaching and learning.

Syaifuddin (2020) noted that effective AfL practice involves the use of assessment rubrics. A rubric is a tool used in the process of assessing students’ work (Dawson, 2017). Assessment rubrics provide a set of criteria used to evaluate a student’s level of performance in achieving planned learning competence; students can therefore use them to visualize their next steps in learning (Brookhart, 2013). Various authors have pointed out the features of effective assessment rubrics. According to Popham (1997), a good assessment rubric should contain three important features: (1) success criteria that express what to look for in the work; (2) quality definitions for those criteria at particular levels; and (3) a scoring strategy that instantiates those criteria in students’ work at varying quality levels, from minimum to maximum performance. Assessment rubrics have also been categorized by focus into general holistic, general analytic, task-specific analytic, and task-specific holistic rubrics (Jonsson & Svingby, 2007).

Further, Reddy and Andrade (2010) distinguished teacher-created rubrics from co-created rubrics, which are developed jointly by teachers and their students. Although they described the strengths and weaknesses of each category, task-specific analytic rubrics that are co-created by teachers and their students are thought to embody the important features of AfL practice (Dawson, 2017). AfL emphasizes students’ learning by engaging them in setting success criteria; students are then able to monitor their thinking and become aware of the achievement benchmarks from the start of the lesson. In this way, through the use of assessment benchmarks and lesson objectives, teachers and students can monitor teaching and learning progress (Aji, Hudha, Huda, Nandiyanto, & Abdullah, 2018; Andrade, 2014). A task-specific assessment rubric expresses the specific facts or procedures that a student’s response to a task should contain and is therefore designed to elicit the extent to which the student is able to perform that particular task. Table 1 shows a sample assessment rubric for the task of calculating the area and perimeter of a basketball key (Task # 7.G.4/ 7.G.6).

Table 1. Sample of analytic-holistic assessment rubrics in geometric problems.

| Task number | 4: Thoroughly meets the standards | 3: Meets standards | 2: Approaching standards | 1: Not yet approaching standards | 0: No attempt |
|---|---|---|---|---|---|
| #1 7.G.4 / 7.G.6 | Student correctly finds the area of the basketball key, with organized work that clearly shows their thinking, including a correct and labeled equation, with no calculation errors and using correct units. | Student uses a correct strategy to find the area of the basketball key, with work that shows their thinking, including an equation. May include minor calculation errors or incorrect units. | Student uses a partially correct strategy to find the area, but does not correctly find the area of the basketball key, or student has correct answers but shows no work. | Student attempts to find the area but does not correctly find the area of any part of the basketball key, or student has incorrect answers and shows no work. | No evidence of attempting the problem. |
| 7.G.4 / 7.G.7 | Student correctly finds the perimeter of the basketball key, with work that clearly shows thinking, including a correct and labeled equation, with no calculation errors, and using correct units. | Student uses a correct strategy to find the perimeter of the basketball key, with work that shows their thinking, including an equation. May include minor calculation errors or incorrect units. | Student uses a partially correct strategy to find the perimeter, but does not correctly find the perimeter of the basketball key, or student has correct answers but shows no work. | Student attempts to find the perimeter but does not correctly find the perimeter of any part of the basketball key, or student has incorrect answers and shows no work. | No evidence of attempting the problem. |

Note: San Francisco Unified School District (2020).

Table 1 illustrates an analytic-holistic assessment rubric for assessing a student’s ability to calculate the area and perimeter of a basketball key. The key elements of an assessment rubric are included: success criteria, definitions of quality levels of performance, and a scoring strategy.
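To make the structure of such a rubric concrete, the sketch below models a task-specific analytic rubric as a simple data structure mirroring the three elements named above: criteria, quality-level definitions, and a scoring strategy. It is illustrative only; the names (`Criterion`, `score_work`) and the abbreviated level descriptions are our own and are not drawn from the study or from the SFUSD rubric.

```python
# A minimal, illustrative model of a task-specific analytic rubric.
# The structure mirrors Popham's (1997) three elements: success criteria,
# quality definitions per level, and a scoring strategy. All names here
# are hypothetical, not from the study.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str                # success criterion, e.g. "Finds the area"
    levels: dict[int, str]   # quality definition for each score level

area_criterion = Criterion(
    name="Finds the area of the basketball key",
    levels={
        4: "Correct area; organized work, labeled equation, correct units",
        3: "Correct strategy with equation; minor errors or wrong units",
        2: "Partially correct strategy, or correct answer with no work",
        1: "Attempts the task; incorrect answer and no supporting work",
        0: "No evidence of attempting the problem",
    },
)

def score_work(criterion: Criterion, observed_level: int) -> str:
    """Scoring strategy: map an observed quality level to its definition."""
    return f"{criterion.name}: level {observed_level} - {criterion.levels[observed_level]}"

# Example: a student whose work matches the level-3 description.
print(score_work(area_criterion, 3))
```

Sharing such a structure with students at the start of a lesson is what allows both sides to monitor progress against the same benchmarks.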

In recent years, a number of studies have focused on the preparation and use of assessment rubrics in teaching and learning. For example, Jönsson and Panadero (2017) investigated how the preparation and use of assessment rubrics support AfL in higher education. The results showed that preparing and using assessment rubrics helped teachers score performance tasks more consistently, promoted learning and improved instruction by making assessment expectations clear, and improved the feedback process. Kilgour, Northcote, Williams, and Kilgour (2020) conducted an intervention that engaged higher education students, in collaboration with their teachers, in constructing and using assessment rubrics in the teaching and learning process. The study revealed that the co-construction of assessment rubrics between teachers and their students enhanced students’ understanding and ownership of their assessment experiences. Other findings also indicate that lessons guided by assessment rubrics are more directed and show greater improvement in the learning process than lessons that are not (Reddy & Andrade, 2010; Smith, 2017).

The studies highlighted above show that rubrics positively influence teaching and learning. However, most of these studies were conducted in higher education rather than at lower levels, such as secondary schools. Moreover, studies on the preparation and use of assessment rubrics in the teaching and learning of 3D-geometry in particular are difficult to find. The topic of 3D-geometry is among the most poorly performed topics in ordinary-level national examinations (National Examination Council of Tanzania, 2017, 2018, 2019, 2020, 2021). Among other factors, limited classroom assessment practices have been cited as contributing to low performance on the topic (William & Kitta, 2021). Since assessment rubrics are an important assessment technique in teaching and learning (Andrade & Du, 2005; Panadero & Jonsson, 2013; Popham, 1997; Reddy & Andrade, 2010), and since teachers’ knowledge, preparation, and use of rubrics influence students’ performance in the topic, this study explored mathematics teachers’ knowledge, preparation, and use of assessment rubrics in the teaching and learning of 3D-geometry.

The aspects of assessment rubrics mentioned by most authors, namely success criteria for meeting learning intentions, definitions of quality levels of performance, and scoring criteria (Andrade & Du, 2005; Panadero & Jonsson, 2013; Popham, 1997; Reddy & Andrade, 2010), were treated in this study as the key indicators of the quality of assessment rubrics, since they cover almost all the aspects that need to be integrated into AfL practices. Specifically, the study answers the following research questions:

  1. How do mathematics teachers prepare assessment rubrics for teaching and learning of 3D-geometry?
  2. Do mathematics teachers use assessment rubrics in the teaching and learning of 3D-geometry?
  3. What is the knowledge level of mathematics teachers on assessment rubrics in teaching and learning?

1.1. Significance of the Study

This study provides information on the current status of mathematics teachers' knowledge, preparation, and use of assessment rubrics in teaching and learning. Subsequently, it can help education authorities plan appropriate professional development programs for enhancing mathematics teachers’ assessment practices.

2. METHODOLOGY

2.1. Research Design

The study followed a mixed-methods research approach to gather information on teachers’ preparation, use, and knowledge of assessment rubrics. Specifically, it followed an explanatory sequential design, in which quantitative findings inform subsequent qualitative data collection (Creswell, 2014; Villiers & Fouché, 2015). The data were collected in two phases: in phase one, data from lesson planning documents, classroom observations, and classroom reflections were analyzed quantitatively; these results then informed phase two, in which qualitative data were collected through interviews and analyzed thematically.

2.2. Sample and Population

The target population of this study comprised mathematics teachers teaching in ordinary-level secondary schools in Dodoma, Tanzania. Ordinary-level schools were selected because of students’ poor performance in mathematics (Ministry of Education Science and Technology, 2020). Among the reported factors for this low performance is teachers’ limited knowledge of assessment practices (Kitta & Likinjie, 2020; Kyaruzi, Strijbos, Ufer, & Brown, 2018; Lema & Maro, 2016; William & Kitta, 2021). It was therefore considered desirable to explore mathematics teachers’ current assessment practices, particularly their knowledge, preparation, and use of assessment rubrics, to inform future interventions. The Dodoma region was randomly picked from a pool containing the names of the 31 regions of Tanzania. The same sampling method was used to pick two districts within the region, Chamwino and Dodoma municipalities. Because of their role in teaching mathematics, 51 mathematics teachers from 20 secondary schools within the two districts were purposively selected to participate in the study. Table 2 shows the demographic information of the participating mathematics teachers.

Table 2. Demographic information of participants.

| Characteristics | Category | Frequency | Percentage |
|---|---|---|---|
| Gender | Female | 17 | 33.3 |
| | Male | 34 | 66.7 |
| Age (years) | Less than 30 | 9 | 17.7 |
| | 30 - 35 | 11 | 21.5 |
| | 36 - 40 | 17 | 33.3 |
| | 41 - 45 | 14 | 27.5 |
| Academic qualification | Bachelor degree | 49 | 96.1 |
| | Diploma | 2 | 3.9 |
| Working experience (years) | Less than 5 | 2 | 3.9 |
| | 5 - 10 | 27 | 52.9 |
| | More than 10 | 22 | 43.2 |

The demographic characteristics of participating teachers were collected to profile the sample involved in the study. It is clear from Table 2 that the sample was dominated by male teachers. The majority of teachers held a bachelor’s degree and had more than five years of working experience.

2.3. Instruments and Data Collection Procedures

Phase 1: Collection of quantitative data. The first phase of data collection involved quantitative data to answer the first two research questions, which concerned mathematics teachers’ preparation and use of assessment rubrics. Document reviews of lesson planning documents, classroom observations, and classroom reflections were used to collect data as follows:

The documents reviewed were lesson planning documents, such as lesson plans, lesson guides, and prepared assessment documents. Document reviews were used to answer the first research question, which explored how mathematics teachers prepared assessment rubrics. The documents were reviewed using a checklist to see whether an assessment rubric was among the prepared assessment documents. If assessment rubrics were not available, then aspects of assessment rubrics, such as assessment tasks, success criteria, and definitions of performance levels for the task(s), were checked in the other prepared assessment documents. The checklist consisted of three points: not indicated, not clearly indicated, and clearly indicated, as shown in Table 3.

Table 3. Sample of checklist aspects checked from lesson planning documents.

| Attributes | Not indicated | Not clearly indicated | Clearly indicated |
|---|---|---|---|
| Assessment task(s) | Descriptions of assessment activities are not provided | Descriptions of assessment activities lack clarity in determining the desired level of competence to be achieved | Descriptions of assessment activities are thoroughly explained, enabling clear identification of the desired level of competence to be achieved |
| Statements of success criteria | Benchmarks on qualities to be demonstrated to meet the desired level of performance are not provided | Benchmarks on qualities to be demonstrated to meet the desired level of performance are not explicitly provided | Benchmarks on qualities to be demonstrated to meet the desired level of performance are well provided |
| Definitions of performance levels | Achievement levels for assessing competence are not provided | Achievement levels for assessing competence are not well specified | Achievement levels for assessing competence are well specified |
| Scoring criteria | A set of standards for evaluating the level of performance is not provided | A set of standards for evaluating performance is not clearly described in terms of levels of achievement | A set of standards for evaluating the level of performance is explicitly described |

In Table 3, "not indicated" was marked when the assessment rubric, or a particular aspect of it, was absent from the lesson planning documents. "Not clearly indicated" was marked when the assessment rubric was not well defined, that is, when the aspects of quality assessment rubrics were not explicitly defined; "clearly indicated" was marked when the assessment rubric was well defined or the aspects of quality assessment rubrics were defined in other assessment documents. Furthermore, classroom observation was used to collect data for the second research question, which asked how mathematics teachers use assessment rubrics to facilitate effective teaching and learning of 3D-geometry. As shown in Table 4, data were collected from classroom sessions using a classroom observation protocol (COP) to check how the planned assessment aspects were implemented in class.

Table 4. Sample of aspects in the COP.

| Attributes | Not implemented | Not well implemented | Well implemented |
|---|---|---|---|
| Teacher provides assessment task(s) to students | The teacher does not assign assessment activities to students | The teacher provides only general instructions on assessment activities, without targeting a specific competence to be reached | The teacher provides instructions for each assessment activity and specifies the competence to be met by students |
| Teacher shares success criteria with students | The teacher does not share with students the benchmarks on qualities to be demonstrated to meet the desired level of performance | The teacher shares with students a general competence to be met, without specifying a benchmark for each performance quality to be demonstrated | The teacher shares with students the benchmarks on qualities to be demonstrated to meet the desired level of performance |
| Teacher uses scoring criteria to evaluate achievement of a task | The teacher does not provide students with a set of standards for evaluating the level of performance | The teacher provides a general standard to be met, without specifying a standard for each level of performance | The teacher provides students with a set of standards for evaluating their level of performance in each assessment activity |

The COP, as shown in Table 4, consists of a three-point scale: "not implemented," "not well implemented," and "well implemented." "Not implemented" was marked when the teacher did not demonstrate a particular assessment rubric aspect at all; "not well implemented" when the teacher attempted to demonstrate the aspect, but not to a level that students could follow; and "well implemented" when the teacher demonstrated the aspect clearly enough for students to follow the instructions. A classroom evaluation questionnaire (CEQ) was also used for mathematics teachers’ self-evaluation of their use of assessment rubrics in teaching and learning. The questionnaire used a three-point scale: "No," "Yes," and "Not sure." A sample of items included in the teachers’ CEQ is shown in Table 5.

Table 5. Sample of statements in the CEQ.

| Statement | No | Yes | Not sure |
|---|---|---|---|
| I have planned assessment rubrics related to learning intentions | The teacher does not prepare assessment rubrics | The teacher prepares assessment rubrics | The teacher is not sure whether the set of assessment procedures prepared constitutes assessment rubrics |
| I have shared success criteria with students at the start of teaching | The teacher does not provide students with specific benchmarks to demonstrate the desired level of performance | The teacher provides students with specific benchmarks to demonstrate the desired level of performance | The teacher is uncertain whether the guidelines provided to students describe success criteria |
| Assessment rubrics helped to evaluate the success of the lesson | The teacher disagrees that assessment rubrics helped to evaluate the success of the lesson | The teacher agrees that assessment rubrics helped to evaluate the effectiveness of the lesson | The teacher is unsure whether assessment rubrics or other methods supported lesson evaluation |

In Table 5, teachers were marked in the "No" column when they disagreed with the statement, in the "Yes" column when they agreed, and in the "Not sure" column when they were uncertain. The checklist, COP, and Likert-scale questionnaire were peer-reviewed and validated by two lecturers with more than 15 years of experience in mathematics education. The validity results for each instrument were evaluated; items that received a score of 75% or above were kept, while those that scored lower were removed. The analysis of the quantitative data from the document review, classroom observation, and classroom evaluation stages indicated that the majority of teachers could not prepare or use assessment rubrics in their teaching and learning. These results motivated further exploration of mathematics teachers’ knowledge of assessment rubrics through semi-structured interviews, to gain insight into their understanding of assessment rubrics and to see whether their level of understanding influenced their planning and implementation of assessment rubrics in lesson delivery.

Phase 2: Collection of qualitative data. Of the 51 mathematics teachers, ten (10) were randomly selected for interviews. The interview questions were adjusted for each teacher based on the practice demonstrated in lesson planning and implementation. The semi-structured interview guide was peer-reviewed and appraised by mathematics education experts; some questions were removed, some were adjusted, and others required no correction. Each interview took an average of 45 minutes. The interview sessions were recorded with a sound recorder and transcribed verbatim afterwards. Table 6 shows a sample of the interview questions administered to mathematics teachers.

Table 6. Sample of interview questions administered to mathematics teachers.

| Constructs | Interview questions (with teacher Twasu) |
|---|---|
| Construction of assessment rubrics | What materials do you prepare for assessing the teaching and learning in your lesson? Do you prepare assessment rubrics? Why? |
| Understanding of assessment rubrics | What do you know about assessment rubrics? Which factors hinder you in the preparation and use of assessment rubrics in teaching and learning? |

2.4. Ethical Considerations

Research permission was obtained from the relevant authorities, starting with the University of Rwanda, followed by the President’s Office, regional authorities, and local government in Tanzania. In addition, participants signed consent forms indicating their willingness to participate in the study. Moreover, to ensure the confidentiality of participants, pseudonyms were used instead of real names.

2.5. Data Analysis

The data from the checklist, COP, and questionnaire were analyzed quantitatively using Microsoft Excel. To determine the extent to which mathematics teachers demonstrated a specific assessment rubric aspect, the checklist and COP items were analyzed in terms of the mean and standard deviation (M ± SD) of the demonstrated performances. The questionnaire data were analyzed in terms of frequencies and percentages of teachers’ evaluations of their use of assessment rubrics and presented using a column graph. The analysis of the semi-structured interview data was guided by Braun and Clarke’s (2006) model, which has six steps: familiarizing oneself with the data, generating initial codes, searching for themes, reviewing themes, defining and naming themes, and producing the report. The data from each type of instrument were analyzed as follows:

1. From the checklist, the three checking points were assigned values: 0 for "not indicated," 1 for "not clearly indicated," and 2 for "clearly indicated." The mean value and standard deviation of each aspect over the whole sample of 51 teachers were then computed:

$$\bar{x}_a = \frac{1}{51}\sum_{i=1}^{51} x_{ai}$$

where $x_{ai}$ is the checklist value of aspect $a$ for teacher $i$. For example, a mean close to 1 corresponds to "not clearly indicated," so the interpretation is that teachers did not clearly indicate that aspect. After computing the mean value of each aspect, the overall mean of mathematics teachers’ ability to prepare assessment rubrics (or aspects of quality assessment rubrics) was calculated as

$$\bar{X} = \frac{1}{k}\sum_{a=1}^{k} \bar{x}_a$$

where $k$ is the number of aspects checked.

2. For the data from the COP, the same procedure applied to the checklist was used to find the mean of mathematics teachers’ ability to implement assessment rubric aspects in the teaching and learning process.

3. Responses from the Likert-scale questionnaire were analyzed in Excel, where self-evaluation of implementing assessment rubrics in the lesson was obtained by calculating the percentage of mathematics teachers’ responses on the ability to implement each assessment rubric aspect:

$$\text{Percentage of teachers implementing an aspect} = \frac{\text{number of teachers who responded "Yes"}}{51} \times 100$$

For example, the percentage of mathematics teachers who shared success criteria with students was the number of teachers who responded "Yes" to that statement, divided by 51 and multiplied by 100. The same procedure was used to obtain the percentages of teachers with "No" or "Not sure" responses. (A brief computational sketch of steps 1–3 is provided at the end of this subsection.)

4. Analysis of the semi-structured interviews was done thematically using Braun and Clarke’s (2006) model:

Familiarization with the data entailed verbatim transcription, as well as reading and re-reading the data to determine the depth of the participants’ responses. Codes were generated inductively from the respondents’ explanations. Groups of related codes were brought together to form subthemes, and the subthemes were organized into broader patterns of overarching ideas, called themes. Peers with expertise in qualitative data analysis were consulted to review the emerging themes, checking whether they related to the formulated codes and collected data and whether they answered the research question. The experts’ input was used to refine the themes further. The last stage was report writing, which involved weaving the analytical narrative and data segments together in light of the existing literature.
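The sketch below illustrates, in Python rather than Excel, the quantitative computations described in steps 1–3 above: aspect means and standard deviations, the overall mean, and response percentages. The score lists are invented placeholders for illustration only; the study’s own data were processed in Microsoft Excel.

```python
# Illustrative reimplementation of the quantitative analysis (steps 1-3).
# The score lists below are invented placeholders, not the study's data.
import statistics

N_TEACHERS = 51

# Checklist scores per aspect: 0 = not indicated,
# 1 = not clearly indicated, 2 = clearly indicated.
checklist = {
    "Planned assessment tasks": [2] * 44 + [1] * 7,          # placeholders
    "Statements of success criteria": [1] * 40 + [2] * 11,   # placeholders
    "Definitions of performance levels": [0] * N_TEACHERS,
    "Scoring criteria": [0] * N_TEACHERS,
}

def interpret(mean: float) -> str:
    """Map a mean score back to the checklist's verbal levels."""
    if mean < 0.5:
        return "Not indicated"
    if mean < 1.5:
        return "Not clearly indicated"
    return "Clearly indicated"

aspect_means = {}
for aspect, scores in checklist.items():
    m = statistics.mean(scores)
    sd = statistics.stdev(scores)
    aspect_means[aspect] = m
    print(f"{aspect}: mean={m:.2f}, SD={sd:.3f} -> {interpret(m)}")

# Overall mean over the k aspects, as in the formula above.
overall = sum(aspect_means.values()) / len(aspect_means)
print(f"Overall indication: {overall:.3f} -> {interpret(overall)}")

# Step 3: percentage of "Yes" responses to a CEQ statement.
yes_count = 9  # placeholder frequency
print(f"Shared success criteria: {100 * yes_count / N_TEACHERS:.1f}%")
```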

3. RESULTS

As previously stated, the purpose of this study was to investigate how mathematics teachers prepared and used assessment rubrics for assessing the teaching and learning of 3D geometry. The study went further to examine mathematics teachers' levels of understanding of assessment rubrics to see if their level of understanding influenced their preparation and degree of integration in lesson delivery. The findings are presented following the flow of research questions.

3.1. Mathematics Teachers’ Preparation of Assessment Rubrics

The analysis of the checklist data revealed that the majority of teachers could state assessment activities. Most of them prepared some questions to be practiced during teaching and learning, and some prepared solutions to those questions. However, teachers did not explicitly indicate the success criteria for achieving the planned assessment tasks. The assessment tasks were prepared together with the lesson guiding notes, and the assessment methods were stated in the lesson plan book. Among all the documents reviewed, there was no document that could be termed an assessment rubric, nor were there definitions of quality levels of performance or scoring criteria for the planned assessment task(s). Overall, the level of indication of assessment rubric elements in the lesson planning documents was Mean = 0.785, SD = 0.027, which corresponds to "not clearly indicated." The summary of the analysis is shown in Table 7.

Table 7. Participants’ level of indication of assessment rubric aspects in the lesson planning stage.

| Descriptions | Mean | SD | Level of indication |
|---|---|---|---|
| Planned assessment tasks | 1.85 | 0.088 | Clearly indicated |
| Explicit statement of success criteria | 1.29 | 0.021 | Not clearly indicated |
| Definition of quality levels of performance | 0.00 | 0.000 | Not indicated |
| Stated scoring criteria | 0.00 | 0.000 | Not indicated |
| Overall indication | 0.785 | 0.027 | Not clearly indicated |
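As a quick arithmetic check, the overall indication reported in Table 7 is the unweighted average of the four aspect means:

$$\bar{X} = \frac{1.85 + 1.29 + 0.00 + 0.00}{4} = 0.785$$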

The results show that mathematics teachers could prepare assessment tasks; however, they could not explicitly indicate success criteria for achieving the maximum level of performance on those tasks. It was also found that teachers did not state quality levels of performance or scoring criteria for the assessment tasks.

3.2. Mathematics Teachers’ Use of Assessment Rubrics in Teaching and Learning of 3D-Geometry

As explained in the data collection procedures, the findings on mathematics teachers’ use of assessment rubrics were collected using the classroom observation protocol and the classroom evaluation questionnaire. The findings are presented in the following two subsections.

3.2.1. Classroom Observation Protocol

An analysis of the COP data shows that most teachers provided assessment tasks to students. However, in all lessons, no teacher shared success criteria with the students. For example, teacher Twaka provided an assessment task of constructing a cylinder using a manila sheet, but did not go further to express success criteria that might help students reach the targeted level of performance. The teacher neither engaged students in formulating the criteria nor shared the scoring criteria with them. Generally, teachers’ level of performance in using assessment rubrics in the teaching and learning process was very limited; the overall mean and standard deviation of performance on the use of assessment rubrics in class were Mean = 0.54, SD = 0.011. The summary is shown in Table 8.

Table 8. Participants’ level of implementing assessment rubric elements.

| Description | Mean | SD | Level of implementation |
|---|---|---|---|
| Teacher provides assessment tasks | 1.83 | 0.051 | Well implemented |
| Teacher shares definitions of success criteria with students | 0.89 | 0.006 | Not well implemented |
| Teacher shares definitions of quality levels of performance of a task | 0.00 | 0.000 | Not implemented |
| Teacher uses scoring criteria to evaluate achievement of a task | 0.00 | 0.000 | Not implemented |
| Provision of supportive feedback based on planned success criteria | 0.00 | 0.000 | Not implemented |
| Overall implementation | 0.54 | 0.011 | Not implemented |
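As with Table 7, the overall implementation score is the unweighted average of the five aspect means:

$$\bar{X} = \frac{1.83 + 0.89 + 0.00 + 0.00 + 0.00}{5} = 0.544 \approx 0.54$$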

Table 8 presents the observed assessment attributes as demonstrated by mathematics teachers in classrooms. The results show that teachers provided assessment tasks to students but, to a much lesser extent, shared success criteria for those tasks. Teachers did not share scoring criteria for the assessment tasks with their students; hence, the provision of feedback was not guided by the success criteria.

3.2.2. Classroom Evaluation Questionnaires

Teachers’ self-evaluation of lessons was conducted with the help of a questionnaire on the use of assessment rubrics in teaching and learning. A summary of the analysis is shown in Figure 1.

Figure 1. Mathematics teachers’ views on the use of assessment rubrics.

Most mathematics teachers declared that they did not consider assessment rubrics in teaching and learning: more than 80% stated that they did not prepare or use assessment rubrics in their teaching, and assessment rubrics guided neither their lesson evaluation process nor their provision of supportive feedback to students.

3.3. Mathematics Teachers’ Knowledge Level on Assessment Rubrics in Teaching and Learning

The sub-themes generated from the interview transcripts fell into two categories: the concept of assessment rubrics, and knowledge of the preparation and use of assessment rubrics in the teaching and learning process.

3.3.1. Concept of Assessment Rubrics

The responses from mathematics teachers indicated limited knowledge of the concept of assessment rubrics. About 90% of respondents could not give a clear meaning of assessment rubrics. For example, when asked what they knew about assessment rubrics, most teachers responded that they could not clearly express their meaning. This is evidenced by the response of teacher Twopa, who asserted: "I cannot explain exactly what assessment rubrics mean, but I can try by saying that assessment rubrics are like the questions that you provide to students to assess their understanding during the teaching and learning process." Twopa’s statement implies a limited understanding of the assessment rubric concept. Similarly, teacher Twele contended: "I think assessment rubrics are the assessment techniques used for assessing teaching and learning," giving brainstorming, quizzes, and exercises as examples of techniques for ensuring student understanding. Participants’ descriptions of assessment rubrics reflected assessment tasks and tools; they did not mention aspects of assessment rubrics such as success criteria or definitions of quality levels of performance on the assessment tasks. The concept of assessment rubrics was therefore still a challenge to participants.

3.3.2. Knowledge of Preparation and Use of Assessment Rubrics

When asked what materials they prepared for assessing the teaching and learning of 3D-geometry, most teachers said they prepared some guiding questions, with solutions, which they used to assess students’ understanding in class. Teacher Twasu had the following to say: "When I prepare my lesson, I prepare a lesson plan, lesson notes, and teaching aids. In the lesson notes, I also give some examples of questions and their solutions that guide me in teaching." Furthermore, teacher Tweme stated: "I prepare assessment materials by writing some questions for each learning activity, and I also prepare a marking guide, so after providing a learning activity, I give those questions to students, and when they respond, I follow the solution that I prepared to check if they are getting it right or wrong."

The quotations above signify that teachers prepare assessment tasks and solution guides. The aspect of scoring criteria is reflected in the preparation of a marking guide. However, the marking guide was prepared to guide the marking process rather than for learning adjustment, as the teacher did not mention involving students either in its preparation or in its use during the teaching and learning process.

On the same point, teacher Twesa stated: "You know the topic of three-dimensional figures requires a lot of creativity in making teaching and learning materials. So sometimes I use the students themselves to prepare teaching and learning materials such as boxes, cones, or pyramids. I assess their creativity through hands-on activities I provide them." This excerpt describes assessment in which students are involved in hands-on activities, which could be linked to the preparation of assessment rubrics. However, this kind of assessment is outcome-based and shows no criteria set prior to the assessment. In that regard, participants had a limited understanding of preparing assessment rubrics.

When asked whether they prepared assessment rubrics, teachers stated that they did not, because rubrics were not part of the curriculum guidelines. This was evidenced by teacher Twasu, who said: "I do not prepare assessment rubrics because they are not in the curriculum guidelines. As for me, I'm not sure what the assessment rubrics are or how they're created. If it could be a necessary curriculum document, we could have been taught it in teacher training colleges, but we did not learn it there." This claim suggests that teachers are not trained in how to prepare and use assessment rubrics in the classroom.

Some teachers claimed that they prepared assessment rubrics; however, their actual practice went contrary to their claims. Even their explanations of the use of assessment rubrics did not support their claims. For example, some of them referred to marking guides as assessment rubrics, so they said that they prepared them for marking end of midterm tests or end of term examinations. In this view, teachers have limited knowledge about the preparation and use of assessment rubrics. For example, one of the respondents, Teacher Twesa, said: “For my side I do not prepare assessment rubrics; I can say I use unplanned assessment rubrics that are not written. Because when I teach, I try to assess thoroughly how my students are gaining. For example, when I instruct them to indicate the edges or vertices in a figure, I can assess how many students were able to do that, and I find ways of helping those who were not able to indicate.”

On the other hand, teachers commented that a heavy teaching workload limited their time for preparing teaching and learning materials, including assessment rubrics. This was echoed by teacher Twaka, who asserted: "I am the only mathematics teacher in this school, so I have to teach mathematics in all classes with a total of almost six hundred students. I can't afford to prepare all materials because I prefer to spend my time teaching rather than preparing assessment rubrics. I just use some questions from the textbooks and past papers and let students practise in the class. I can tell whether they understand or not after marking their exercises." It is clear from the preceding quotes that assessment rubrics were not prepared by participants.

About 80% of the teachers interviewed commented that assessment rubrics were not part of the curriculum materials taught during their pre-service programs. The majority of the teachers interviewed were uncertain about the concept of assessment rubrics: some referred to assessment tasks without success criteria or scoring criteria, while others equated rubrics with marking guides that directed the teacher in marking assessment task(s) without being shared with students. Some participants claimed to use unplanned assessment rubrics; however, when asked to briefly describe the key aspects of those rubrics, they mentioned only assessment tasks and other assessment tools such as tests and quizzes. The indication is that assessment rubrics were not being used to the required standard, or were not being used at all.

4. DISCUSSION

4.1. Mathematics Teachers’ Preparation of Assessment Rubrics

The findings from the lesson planning documents indicate that mathematics teachers in Tanzania’s ordinary-level secondary schools did not prepare assessment rubrics; consequently, their classroom assessment practices were not guided by rubrics. These results are similar to those obtained by Putri (2016), whose research in Indonesia on how teachers prepared lesson planning documents showed that, when preparing lesson assessment tasks, teachers do not prepare assessment rubrics for those tasks. The results also align with the findings of Kitta and Tilya (2018), who asserted that secondary school mathematics teachers in Tanzania did not prepare assessment rubrics for their assessment tasks. As explained in the data collection procedures, once it was found that teachers did not prepare assessment rubrics, the researchers checked whether aspects of assessment rubrics, such as success criteria, quality definitions of performance levels, and scoring criteria, were included in other lesson planning documents such as lesson plan templates, lesson guides, lesson notes, or other planned assessment tasks. The majority of teachers showed a high level of indication (Mean = 1.85, SD = 0.088) of preparing assessment tasks for assessing the teaching and learning of 3D-geometry. However, the assessment tasks were not accompanied by success criteria for performing them. These results contrast with those of Lema (2022), who reported that teachers did plan success criteria and defined quality levels of performance for prepared assessment tasks, which could guide both teachers and students in adjusting the teaching and learning process.

4.2. Mathematics Teachers’ Use of Assessment Rubrics in Teaching and Learning of 3D-Geometry

The findings from classroom observation show that none of the teachers used assessment rubrics during the teaching and learning of 3D-geometry, implying that most teachers do not use assessment rubrics in their teaching. These results run contrary to Brookhart (2018), who assumed that assessment rubrics were used at lower levels of education, such as K-12, while noting that only a limited number of studies had examined general rubric practices there. The present study points to limited assessment rubric practice at lower levels of education, particularly secondary schools, as most studies in the literature have been conducted at higher levels such as teacher training colleges and universities (Panadero & Jonsson, 2013; Reddy & Andrade, 2010; Seifert & Feliks, 2019; Smith, 2017).

When observing whether teachers used the aspects of assessment rubrics, it was found that teachers’ level of implementation of assessment rubric aspects was very low (Mean = 0.54, SD = 0.011). This finding is similar to that of Crichton and McDaid (2016), who found that teachers did not share assessment criteria with their learners during lesson implementation. Teachers did not consider sharing quality levels of performance or scoring criteria when implementing assessment tasks. In this view, secondary school teachers have been inadequately equipped with the skills needed to use assessment rubrics in teaching and learning processes.

4.3. Mathematics Teachers’ Knowledge of Assessment Rubrics

The findings from the interviews revealed that teachers have limited knowledge of assessment rubrics. They claimed that assessment rubrics were not among the curriculum materials that teachers were required to have, and that they did not learn about assessment rubrics during pre-service training programs. These results are supported by the findings of Kitta and Tilya (2018) that mathematics teachers in Tanzanian secondary schools have limited knowledge of assessment rubrics. However, the National Curriculum Framework of Tanzania stipulates that, apart from the curriculum materials listed in the curriculum, teachers should use a variety of creative approaches, which may include assessment rubrics, to ensure effective teaching and learning (Tanzania Institute of Education, 2019).

5. CONCLUSION

The intention of this study was to investigate how mathematics teachers prepare and use assessment rubrics for effective teaching and learning of 3D-geometry, and to examine their knowledge of assessment rubrics. The findings revealed that very few mathematics teachers were aware of assessment rubrics; as a result, the majority could not prepare or use them in teaching and learning. Although mathematics teachers claimed that the preparation and use of assessment rubrics are not required by Tanzania’s educational curriculum, the suggested teaching, learning, and assessment methods are not fixed but flexible, and teachers may enhance teaching and learning through various creative approaches, including assessment rubrics.

6. RECOMMENDATIONS

Researchers have shown that assessment rubrics improve teaching and learning. However, the current study observed that the majority of mathematics teachers have limited knowledge of assessment rubrics and hence did not prepare or use them in the teaching and learning process. This study therefore recommends training mathematics teachers on how to prepare assessment rubrics that can enhance effective teaching and learning of 3D-geometry or any other subject. In addition, the study recommends that curriculum developers include assessment rubrics in pre-service and in-service training to equip teachers with the knowledge and skills needed to prepare and use assessment rubrics in teaching and learning.

Funding: This research is supported by African Centre of Excellence for Innovative Teaching and Learning Science and Mathematics, Rwanda (Grant number: ACE II(P151847)).
Institutional Review Board Statement: The Ethical Committee of the University of Rwanda, Rwanda has granted approval for this study on 15 December 2021 (Ref. No. 03/DRI-CE/084/EN/gi/2021).
Transparency: The authors state that the manuscript is honest, truthful, and transparent, that no key aspects of the investigation have been omitted, and that any differences from the study as planned have been clarified. This study followed all writing ethics.

Competing Interests: The authors declare that they have no competing interests.

Authors’ Contributions: All authors contributed equally to the conception and design of the study. All authors have read and agreed to the published version of the manuscript.

REFERENCES

Aji, S. D., Hudha, M. N., Huda, C., Nandiyanto, A. B. D., & Abdullah, A. G. (2018). The improvement of learning effectiveness in the lesson study by using e-rubric. Journal of Engineering Science and Technology, 13(5), 1181-1189.

Andrade, H., & Du, Y. (2005). Student perspectives on rubric-referenced assessment. Practical Assessment, Research, and Evaluation, 10(1), 3. https://doi.org/10.7275/g367-ye94

Andrade, M. S. (2014). Dialogue and structure: Enabling learner self-regulation in technology-enhanced learning environments. European Educational Research Journal, 13(5), 563-574. https://doi.org/10.2304/eerj.2014.13.5.563

Azim, S., & Khan, M. (2012). Authentic assessment: An instructional tool to enhance students learning. Academic Research International, 2(3), 314-320.

Black, P., & Wiliam, D. (2018). Classroom assessment and pedagogy. Assessment in Education: Principles, Policy & Practice, 25(6), 551-575. https://doi.org/10.1080/0969594X.2018.1441807

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101. https://doi.org/10.1191/1478088706QP063OA

Brookhart, S. (2018). Appropriate criteria: Key to effective rubrics. Frontiers in Education, 3(22), 1-12.

Brookhart, S. M. (2013). How to create and use rubrics for formative assessment and grading. Retrieved from https://books.google.co.tz/books?hl=en&lr=&id=v_9QBAAAQBAJ&oi=fnd&pg=PP1&dq=How+to+create+and+use+rubrics+for+formative+assessment+and+grading&ots=xzbaghB2qp&sig=E6IPvZQXsLXru7s05hiZO-b2kC0&redir_esc=y#v=onepage&q=How%20to%20create%20and%20use%20rubrics%2

Creswell, J. W. (2014). A concise introduction to mixed methods research. Thousand Oaks: Sage Publications.

Crichton, H., & McDaid, A. (2016). Learning intentions and success criteria: Learners' and teachers' views. The Curriculum Journal, 27(2), 190-203. https://doi.org/10.1080/09585176.2015.1103278

Dawson, P. (2017). Assessment rubrics: Towards clearer and more replicable design, research and practice. Assessment & Evaluation in Higher Education, 42(3), 347-360. https://doi.org/10.1080/02602938.2015.1111294

Jönsson, A., & Panadero, E. (2017). The use and design of rubrics to support assessment for learning. Scaling up Assessment for Learning in Higher Education, 5, 99-111. https://doi.org/10.1007/978-981-10-3045-1_7

Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2(2), 130-144. https://doi.org/10.1016/j.edurev.2007.05.002

Kilgour, P., Northcote, M., Williams, A., & Kilgour, A. (2020). A plan for the co-construction and collaborative use of rubrics for student learning. Assessment & Evaluation in Higher Education, 45(1), 140-153. https://doi.org/10.1080/02602938.2019.1614523

Kitta, S., & Likinjie, M. (2020). What are the relevant techniques for assessing mathematics in the context of a competency-based curriculum? Turkish Journal of Teacher Education, 9(2), 120-133. https://doi.org/10.1007/s10671-013-9145-5

Kitta, S., & Tilya, F. (2018). Assessment status of learner-centred learning in Tanzania in the context of the competence-based curriculum. Papers in Education and Development, (29).

Kyaruzi, F., Strijbos, J.-W., Ufer, S., & Brown, G. T. (2018). Teacher AfL perceptions and feedback practices in mathematics education among secondary schools in Tanzania. Studies in Educational Evaluation, 59, 1-9. https://doi.org/10.1016/j.stueduc.2018.01.004

Lema, G., & Maro, W. (2016). Secondary school teachers’ utilization of feedback in the teaching and learning of mathematics in Tanzania. Papers in Education and Development, 36, 104-117.

Lema, G. S. (2022). Assessment for learning: Factors influencing utilization of learning targets and criteria for success in teaching and learning mathematics. The International Journal of Humanities & Social Studies, 10(8), 1-7. https://doi.org/10.24940/theijhss/2022/v10/i8/hs2208-001

Ministry of Education Science and Technology. (2020). Basic education statistics of Tanzania. Ministry of Education, Science and Technology. Retrieved from https://www.tamisemi.go.tz/storage/app/media/uploaded-files/BEST%202020%20Regional%20Data_Final.pdf

National Examination Council of Tanzania. (2017). Candidates’ response analysis report for form four national examination: Mathematics, National Examination Council of Tanzania. Retrieved from https://onlinesys.necta.go.tz/cira/csee/2017/041_BASIC_MATHEMATICS.pdf

National Examination Council of Tanzania. (2018). Candidates’ response analysis report for form four national examination: Mathematics. National Examination Council of Tanzania. Retrieved from https://onlinesys.necta.go.tz/cira/csee/2018/041_BASIC_MATHS.pdf

National Examination Council of Tanzania. (2019). Candidates’ response analysis report for form four national examination: Mathematics. National Examination Council of Tanzania. Retrieved from https://onlinesys.necta.go.tz/cira/csee/2019/041_BASIC_MATHEMATICS.pdf

National Examination Council of Tanzania. (2020). Candidates’ response analysis report for form four national examination: Mathematics. National Examination Council of Tanzania. Retrieved from https://onlinesys.necta.go.tz/cira/csee/2020/041_BASIC_MATHEMATICS.pdf

National Examination Council of Tanzania. (2021). Candidates’ response analysis report for form four national examination: Mathematics. National Examination Council of Tanzania. Retrieved from https://onlinesys.necta.go.tz/cira/csee/2021/041_BASIC_MATHEMATICS.pdf

Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: A review. Educational Research Review, 9, 129-144. https://doi.org/10.1016/j.edurev.2013.01.002

Popham, W. J. (1997). What's wrong--and what's right--with rubrics. Educational Leadership, 55(2), 72-75.

Putri, A. (2016). EFL teachers' understanding in developing lesson plan. Indonesian EFL Journal, 2(1), 1-11.

Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435-448. https://doi.org/10.1080/02602930902862859

San Francisco Unified School District. (2020). Sample 4-point holistic rubric, mathematics department. Retrieved from https://www.sfusdmath.org/rubrics.html

Seifert, T., & Feliks, O. (2019). Online self-assessment and peer-assessment as a tool to enhance student-teachers’ assessment skills. Assessment & Evaluation in Higher Education, 44(2), 169-185. https://doi.org/10.1080/02602938.2018.1487023

Siarova, H., Sternadel, D., & Mašidlauskaitė, R. (2017). Assessment practices for 21st century learning: Review of evidence, NESET II report. Luxembourg: Publications Office of the European Union.

Smith, J. S. (2017). Assessing creativity: Creating a rubric to effectively evaluate mediated digital portfolios. Journalism & Mass Communication Educator, 72(1), 24-36. https://doi.org/10.1177/1077695816648866

Syaifuddin, M. (2020). Implementation of authentic assessment on mathematics teaching: Study on junior high school teachers. European Journal of Educational Research, 9(4), 1491-1502. https://doi.org/10.12973/eu-jer.9.4.1491

Tanzania Institute of Education. (2019). National curriculum framework for basic and teacher education. Tanzania Institute of Education. Retrieved from https://www.tie.go.tz/uploads/documents/sw/1568799160-National%20Curriculum%20Framework%20for%20Basic%20and%20Teacher%20Education.pdf

Tarmo, A. (2022). Integrating assessment for learning into the teaching and learning of secondary school biology in Tanzania. CEPS Journal, 12(2), 239-265. https://doi.org/10.26529/cepsj.958

Villiers, R. R. D., & Fouché, J. P. (2015). Philosophical paradigms and other underpinnings of the qualitative and quantitative research methods: An accounting education perspective. Journal of Social Sciences, 43(2), 125-142. https://doi.org/10.1080/09718923.2015.11893430

Wiliam, D. (2017). Assessment and learning: Some reflections. Assessment in Education: Principles, Policy & Practice, 24(3), 394-403. https://doi.org/10.1080/0969594X.2017.1318108

Wiliam, D. (2018). How can assessment support learning? A response to Wilson and Shepard, Penuel, and Pellegrino. Educational Measurement: Issues and Practice, 37(1), 42-44. https://doi.org/10.1111/emip.12192

William, F., & Kitta, S. (2021). Impact of digital content on mathematics teachers’ pedagogical change: Experiences from retooling of secondary school mathematics teachers in Tanzania. Papers in Education and Development, 38(2), 152-177.

Views and opinions expressed in this article are the views and opinions of the author(s), International Journal of Education and Practice shall not be responsible or answerable for any loss, damage or liability etc. caused in relation to/arising out of the use of the content.