4.1.6 Performance Levels for Assessors

by Sharon Jensen (Nursing, Seattle University)

A major role of an assessor is to describe and analyze data accurately and then share conclusions without judgment so that it is easy for the assessee to use the information for growth. The Assessment Methodology (4.1.4) gives guidelines for conducting the assessment process. The SII Method for Assessment Reporting (4.1.9) gives guidelines for structuring an assessment report. The purpose of this module is to identify and describe five important behaviors of an assessor, ranging from novice to expert. By examining the quality of one’s assessment efforts against this rubric, assessors can become more aware of their current level of performance and can formulate improvements needed for higher-level performance.

Performance Areas

There are five core aspects of assessor performance: values, criteria, evidence, interpretation, and reporting. Values relate to the extent to which the assessor appreciates the potential of the assessment process in personal and interpersonal development. Criteria describe the assessor’s ability to set specific, but complete, performance criteria so that both the assessor and assessee are clear about what is being assessed. Evidence describes the assessor’s ability to gather accurate data and analyze it in a systematic manner. Interpretation refers to the assessor’s ability to create meaning from observations, analysis, and understanding of context. Reporting describes the assessor’s ability to communicate to an assessee the assessment findings and their implications in a compelling manner.

The rubric presented in Table 1 defines five levels of performance in each of these core areas. The lowest level is that of a rookie who is unaware of the role of assessment in personal development and has little commitment to its practice. The next level is that of a learner who can see the benefits of assessment, but is in the beginning stages of implementation so that important data are often missed or disorganized. The middle level is that of a guide. At this level, an assessor feels comfortable performing assessment activities and is working on giving higher quality feedback that facilitates growth. The fourth level is that of a mentor who possesses strong skills in collecting and extracting meaning from assessment data, including nonjudgmental insights that are valuable for growth. The highest level is that of a veteran who uses assessment continuously and effectively with individuals and groups across contexts.

Elevating Assessor Skills

As explained in the Overview of Assessment (4.1.1) and in Distinctions Between Assessment and Evaluation (4.1.2), feedback that can be used for future growth is much more meaningful than evaluative feedback. Part of valuing the assessment process is respecting the assessee's intentions and abilities. One begins by conferring with the assessee to define the criteria that will be examined during the performance. During the performance, use all of your senses, paying particular attention to non-verbal behavior. Write down observations frequently, using the performance criteria as an outline. After the performance, review your notes and reflect on gaps, omissions, and incongruities in the data (Anderson & McFarlane, 2000). Consider whether the data accurately reflect the performance; if not, supplement them to make them more complete. Remove judgmental or inaccurate information. Be sure to consider only criteria that were negotiated with the assessee. As new issues outside the performance criteria emerge, identify and negotiate them separately with the assessee.

Avoid drawing premature conclusions. Carefully consider the performance and data for each criterion. Ideally, a pattern of performance emerges; rarely is a single piece of data enough to warrant an intervention. Be honest if your data are incomplete. Collaborate with the assessee to make the observations richer and more complete; rarely can one assessor gather all pertinent data on all students at one time. Validate your data with the assessee, who may have a different perspective on the data and their meaning. Listen carefully to the assessee as you share your assessments.

Assessment activities have the greatest impact when they are combined with constructive interventions that are multidimensional and integrated (AAHE, 2003). Appropriate interventions help assessees and assessors shift priorities, connect with new resources, and improve performances that have declined over time. It is essential to listen, provide support, and be honest. Remember that plans for improvement succeed most often when they are accompanied by positive strategies that focus on specifics.

Concluding Thoughts

Interaction with an assessee provides fertile ground for obtaining feedback on five core aspects of assessment. By examining your performance as an assessor against the rubric provided in this module, you will become more aware of the behaviors that allow you to make successful interventions in student learning as well as peer collaborations. You will also have a better idea about which behaviors are most limiting and how you might work on these to move to the next level of assessor.

References

Anderson, E. T., & McFarlane, J. (2000). Community analysis and nursing diagnosis. In Community as partner (3rd ed.). Philadelphia: Lippincott.

Astin, A. W., Banta, T. W., Cross, K. P., El-Khawas, E., Ewell, P. T., Hutchings, P., et al. (2003). Nine principles of good practice for assessing student learning. Washington, DC: American Association for Higher Education.


Table 1  Performance Levels for Assessors

Veteran

Values: Continuously seeks assessment in varied contexts while respecting assessee needs
Criteria: Clearly articulates a comprehensive set of measurable criteria accepted by assessees
Evidence: Collects data that are complete, well-documented, and supportive of assessment criteria
Interpretation: Insightfully connects key performance areas with relevant assessee actions
Reporting: Supplies integrated and robust plans of action in positive, assessee-centered language

Mentor

Values: When asked, uses real-time assessment to improve the performance of self and others
Criteria: Accurately proposes criteria for individuals and groups in a variety of contexts
Evidence: Is able to observe, record, and recall key aspects of a performance as they relate to criteria
Interpretation: Accurately identifies strengths, improvements, and insights in familiar and some unfamiliar contexts
Reporting: Regularly creates non-judgmental reports that are relevant and valuable to assessees

Guide

Values: Feels comfortable scoping out an assessment plan with an assessee
Criteria: Selects key performance criteria for a specific context with the help of an assessee
Evidence: Collects and organizes data from a performance related to major criteria
Interpretation: Accurately identifies strengths, improvements, and insights in familiar contexts
Reporting: Occasionally provides helpful feedback to assessees on key performance issues

Learner

Values: Believes in the value of assessment, but is not always convinced that it is worth the effort
Criteria: Within familiar contexts, is able to set some performance criteria of interest to the assessee
Evidence: Is able to collect data that support criteria, but has limited ability to analyze these data
Interpretation: Is able to see strengths in key areas, but has difficulty giving guidance on how to improve
Reporting: Provides superficial, often evaluative, feedback on items that may not be related to criteria

Rookie

Values: Understands the concept of assessment, but is not interested in engaging in supportive activities
Criteria: Proposes some relevant and some invalid criteria without asking for assessee input
Evidence: Gathers data that do not align with criteria while missing important data that do
Interpretation: Frequently misinterprets performance data and is unable to support conclusions with data
Reporting: Comments only on obvious performance issues, often injecting personal bias in feedback