Collecting assessment evidence is not as simple as it sounds. It is about the whole assessment system within an RTO. Assessment systems are often where an RTO’s operations fall down. They show up time and time again in audit reports, and statistics from ASQA confirm they are an area that needs support. This area is the most critical for demonstrating quality outcomes for your students and your RTO. This post looks at considerations for your assessment system, your tools and the evidence to be collected.

First, let’s look at the assessment tools themselves and how they fit into the Principles of Assessment (fairness, validity, flexibility and reliability) and the Rules of Evidence (validity, authenticity, sufficiency, currency).

Information in your RTO assessment tools

There should be instructions for the learner.

  • WHAT has to be done
    • Information on how to complete all activities, even when it may appear self-explanatory
  • Include information relating to the task such as
    • The tools they may need to use
    • The PPE they need to wear
  • WHERE the assessment is to take place
  • WHY the task is being done and HOW it is to be completed
  • The number of resubmission attempts permitted

There should also be instructions for the trainer.

These detail the activities the student is to carry out so that the Assessor can observe the tasks – validity. The instructions include:

  • Under what conditions – duration; environment; etc?
  • What resources? Detail all of the resources required to undertake the assessment
  • What support is allowable / available?
  • What constitutes reasonable adjustment?

Consider notes for the assessor if they don’t see the required behaviour. Do the tools allow the assessment to be conducted in a different way? What is the framework for reasonable adjustment? This covers flexibility and fairness.

A tip is to keep the assessment activities and the assessor instructions in separate documents. When the assessor instructions are mixed in with the candidate’s instructions, it often becomes confusing for the candidate.

The actual tools

Your assessment system includes tools made up of a number of items, and you need to draw from a range of assessment methods (flexibility). Marking guides, questions, projects, tasks to be completed, tasks an Assessor can observe, observation checklists and the unit of competency itself all make up your assessment system.

Let’s talk about the tasks and the observation of them first. When developing a task and an observation checklist, you need to think about the observable behaviours you expect to see.

Not all Observation checklists are clear.

It is your role to make them clearer (without compromising the integrity of the unit). In my work as an auditor of RTOs, I often see observation checklists copied and pasted straight from the unit of competency’s performance criteria. This will not get you through audit, as it does not show that the learner has the skills, knowledge and any other attributes described in the unit of competency and associated assessment requirements – validity.

Make sure each behaviour is observable. Assessment activities used to produce evidence of a candidate’s skills will always require a task to be completed. The task is done under the conditions and standards relevant to the unit of competency, and gives candidates an opportunity to demonstrate the skills required to perform it.

Competency-based assessment is an evidence-based system.

Therefore we need to see evidence that the person can perform the skills to the level of detail required in the unit. We need to see that a judgement is made based on evidence collected – reliability. It is your role to describe that evidence and how it was collected. When assessment tools such as observation checklists don’t have specific criteria, they become just a tick-and-flick operation. Consider: how credible does that make the evidence?

So often I go to validation sessions and see just that: a tick-and-flick checklist. It doesn’t tell me where the assessment was done, how it was done, who was around, what tools were used, or when it was conducted (day/night/hot/cold). Sometimes there is a work permit or other supporting document that tells a bit of a story. However, those cases are few and far between.

The same applies if you are assessing the delivery of a service or the production of a product. You need to compare the outcomes, or features and attributes, to the relevant Element/PC and essential skills/knowledge from the unit. You must prove the assessment evidence (product) is from the present or the very recent past. Currency. It is often a great idea to keep photos or written records of the characteristics of the product or service as your evidence. This also helps with authenticity, proving the evidence presented for assessment is the learner’s own work.

Validity, Reliability and Fairness

You want to move away from a tick-and-flick operation in your tools, so allow room for comments. Comments provide rigour and give context as to how the assessment was undertaken.

Relying on questions alone for your assessment is another area I see come up over and over in audits and validation sessions. Knowing is not the same as doing, and the VET sector is about doing and showing competence. You need to collect assessment evidence of knowledge and skills being integrated with practical application. If you have questions, relate them to the tasks at hand. You need to show that the person knows how to perform the task considering all the workplace requirements, AND that they can demonstrate these skills and knowledge in other similar situations. Validity. The quality, quantity and relevance of the evidence collected must support the assessor’s judgement. Validity, reliability, currency.

Provide clear marking guides for your Assessors. Once upon a time I argued that the Assessor is the expert and should know the answers and criteria. But that was some 20 years ago, and I have since realised that not all Assessors think the same way and results can be quite different. Over the years managing trainers and assessors, and now as a consultant, I fully support clear marking guides that show Assessors what you, as the RTO, are looking for. After all, it is your compliance.

Assessment evidence – Consider your audience

Assess at the correct AQF level. Look at the verbs and the level of knowledge required – basic factual and procedural knowledge, for example. Make sure you assess at the level at which the unit first appeared. Consider the LLN needs of your learner cohorts.

Think about the activities required in the workplace. Consider the activities and tasks: do they relate to activities in the workplace? For example, if reading signs is required, take that into consideration. If numeracy or measuring skills are used, take that into consideration too. Fairness.

One of the best ways of checking this is to ask industry whether your assessments make sense. Make sure the instruments meet the requirements of your industry sector, and keep the evidence (great for consultation evidence). These considerations all need to happen up front.

Mapping

Mapping is a great way to show that the assessment tool meets the Training Package, right down to the relevant Element/PC and the skills/knowledge from the unit. Make sure you capture the essential knowledge and skills as well as any conditions of assessment. The mapping you create will show that you haven’t over-assessed or under-assessed. It is here you can show sufficiency: show how the quality, quantity and relevance of the assessment evidence will enable a judgement to be made of a learner’s competency.

Creating your Assessment system and the assessment evidence to be collected is about telling a story. A story of how the candidate is able to use the new skills and knowledge consistently, and apply them in different contexts and situations.

The quality test of any of your assessment tools is the capacity for another assessor to use and replicate the assessment procedures without any need for further clarification from the tool developer. Once all are prepared, remember your validation processes to make sure the tools you are using are appropriate. This tests the reliability of the assessment tools.