Sunday, May 21, 2017

Feedback Strategy - Praise-Question-Polish (PQP)

One of the key takeaways from the recent webinar is the "PQP" (Praise-Question-Polish) strategy for providing feedback.
(once again, thanks to ETD colleagues who organised the virtual event, and thanks to Dr Tay for her sharing.)

Well, PQP does not only apply to students, i.e. to guide them to give constructive/ useful feedback; it is also useful for teachers to frame our feedback to our students. Personally, I would make a more conscious effort to adopt this strategy and role model it.

Mindset change:

Oops! Having said that, I must clarify that this does not mean I do not provide feedback to my students. Of course, some of us would say: what kind of feedback do you expect Maths teachers to give? Isn't it a tick or a cross, or at most circling the step where the error occurred? Well, yes, that's one form of feedback commonly practised by many of us, especially in our beginning years as teachers. Isn't "wordy" feedback more relevant to other disciplines such as languages or the humanities?

Not true.
I came to learn that there is a place for this in the subject (as I have polished my craft over the years)!
Where there's a will, there's a way! (事在人为)

There is a place for less conventional exercises! For example, alternative assessments or performance tasks that come with rubrics! While writing qualitative feedback is something that I was not used to, I think this is one way students benefit more (rather than having to 'interpret' the quality of their work from a tick or a mark).

Based on my humble experience, the easiest way to give feedback is to make reference to the rubrics - not by lifting the entire description, but by contextualising (or customising) it to the work students have submitted. Students will learn through the feedback and make the relevant changes/ improvements/ enhancements to better a piece of work of a similar nature. Personally, I observed this change in quality in the viva voce assignment (as compared to the practice task). Undeniably, it takes time to give feedback, and tasks of this scale are not assigned on a regular basis (like once a month). My point here is: given a relevant amount of feedback, students will respond and make progress. It's worth the time and effort.

My Personal Reflection:

The next question is: is making reference to the rubrics enough?
You see, I'm trying to make a connection between what I have been doing all this while and the PQP strategy. In my current practice, "Question" and "Polish" are the two aspects I'm familiar with, being very focused on identifying what has gone wrong or is incomplete and targeting improvement. What's lacking (though not totally missing) - and the area I will need to work on - is the first "P": Praise.

What we could do: 

Generic praise (such as "excellent" or "well done") without elaboration is not useful, even though it makes the day for the learner who receives it. Praise needs to be targeted so that it affirms and (hopefully) spurs the learner to do even better.

Similarly, when it comes to getting students to give peer feedback, we as teachers need to provide scaffolds so that learners are really giving useful feedback to their peers, directed to the context (i.e. related to the task - the disciplinary knowledge and skills involved).

"Good job", "Awesome" without elaboration are not very helpful though it motivates the learner.
In fact, before getting students to provide peer feedback, we should ask the following questions
  1. Do students know the purpose/ intent of feedback?
  2. Do they know what areas they should specifically be looking out for when giving feedback (e.g. the strategy used, or the aesthetics of the piece of work)?
  3. Do they know what makes feedback useful to the recipient?
If these are not clear, they are just going through the motions of giving what they think is feedback.

In my opinion, to make peer feedback productive, it is necessary to discuss the three questions above with those new to giving feedback. With a clear rationale, the PQP strategy would make more sense to the students. In addition, stem questions would be necessary and very useful to help them kickstart the peer evaluation process. Though this seems very guided, it takes time for one to become familiar with the vocabulary before adopting it. Last but not least, the teacher needs to model its use so that it becomes something the students are familiar with before they take on a similar role (of giving feedback).

Indeed, I think we should seek clarity on the following before embarking on the strategy and getting others (i.e. students) involved:
  1. Why Praise?
  2. Why Question?
  3. Why Polish?
As mentioned before, "praise" is not prominent in my practice. However, I recognise that it helps to point out to the learner what they have done correctly so that they know they are heading in the right direction. It could be an approach or a strategy they adopted, or even a decision that is sound and grounded in good reasons. It is a form of feedback. As such, we should elaborate so that the message (feedback) is reinforced. The outcome? One continues to do well in the area highlighted, or one re-strategises/ re-focuses to work on areas that need more attention. It is also a means to motivate the learner.

To "question" seems obvious, isn't it? It seems to be a forte of most teachers; but sometimes we might "mix it up" with "polish". To question is about asking "why"? To seek understanding. To probe for clarity. To ask for elaboration. It is quite natural that we could have already passed a judgement to the answer/ working (sub-consciously).
  • Does this sound like the "fault finding" stage? To some extent, yes - with the intent to surface the error/ issue that one might have overlooked so that they can be addressed. So, we need to be mindful of the words and tone used in this stage when giving feedback. And it is important to keep in mind that the feedback is targeted at work and not the person.
  • Indeed, it is important that the recipient of the feedback is clear of the rationale so that he/ she would receive the feedback with an open mind, and not attempt to defend for the sake of defending.
To "polish" is about giving suggestions to make improvements. I guess where we usually come from is to correct the error made, or point out the gaps, or suggest what could be added to the make the solution more complete. On another note, I think it could also be about sharing a different perspective so that the learner could gain a fuller picture on the subject matter that he/ she has been looking into. Of course, this "alternative" may also come in at the "Question" stage. All-in-all, the purpose of this stage is to enhance the outcome. It could be a case of correcting a misconception, or extending the breadth or depth to the matter of discussion.

Point to Ponder:

Indeed, after crafting the previous section (of the post), a thought popped up - shouldn't we give the learner some wait time between the "Question" and "Polish" stages? With the question posed, shouldn't we give the learner time to think, process and respond? If we "polish" almost immediately after the "questions", are we depriving them of learning by handing them our 'prescription'? I wonder.

Well, I guess it all depends on how rigorous the learning activity requires the peer feedback to be.
In fact, I think sometimes it would be more effective if we "re-organise and customise" the strategy to become:
  • "Praise" followed by "Question"
  • "Praise" followed by "Polish" 

Scaffolds:

It is not difficult to find useful stem questions to aid the crafting of feedback for the three stages (PQP).

Here are some examples...

Praise
  • I like the part where...
  • I like the way you explained...
  • I like the order you used in your writing because...
  • I think the ending was...
Question
  • I was confused with...
  • I didn't understand...
  • Why did you leave out...
  • Why did you include....
Polish
  • Don't forget to add...
  • Are your paragraphs in the best order....?
  • Could you add more to this part....?
  • Have  you tried...?

Technologies for PQP: 

To facilitate documentation so that learners can refer back and learn (in the process of giving/ receiving feedback), we can consider the following tools (note: I've only selected those that I think are common and easy to use):

(i) Padlet(s)

As suggested in the example shared by Dr Tay - Padlet could be used to support the facilitation of the three stages of PQP. In the illustration, one padlet was divided into 3 sections.

One suggestion to enhance the use of the technology:
If one is expecting several responses at each of the stages, it would be good to have 3 different padlets, each serving one stage, so that the feedback does not get messy.

I would possibly use this when first introducing the PQP strategy to students, to see how they provide feedback (for each of the stages) to a common problem. From the padlet posts, they would learn (from their peers) how to provide relevant feedback to the task presented.


Based on experience, the stickies may overlap each other - so it might be a bit challenging to manage. In addition, it is much neater to set the posting layout to be organised in a grid or linear fashion:


One could also activate the "Comment" feature for the padlet to allow the recipient to respond to the feedback given:

An illustration of how it could look:

The assumption: Re-classification of stickies is not necessary.


(ii) Google Form/ Sheets


Google Form is one of the most convenient tools, and a form can be 'recycled' to serve a few exercises if what's shown in the form is generic enough.
  • Google Form is easy to set up. Broadly, the structure would be to collect the responses using a few generic fields (e.g. the problem number the feedback refers to, followed by the Praise, Question and Polish comments), which then flow into the linked spreadsheet.

What I like is that the feedback can be sorted by Problem numbers (or even colour-coded).
In other words, we can use the same form in a classroom where students provide feedback to different sets of answers/ presentations at the same time.


The only drawback is that the learner cannot respond to the feedback in the spreadsheet unless he/ she has the right to edit it. Again, depending on the maturity of the class, the teacher may choose whether or not to give editing rights.
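
For those comfortable with a little scripting, here is a minimal Google Apps Script sketch (TypeScript-style syntax, assuming the usual Apps Script typings) of how the feedback in the linked spreadsheet could be grouped by problem number for easier reading. The sheet and column names used here ("Form Responses 1", "Problem No.", "Sorted") are assumptions to be adapted to one's own form.

```typescript
// Sketch: group PQP feedback rows by problem number onto a "Sorted" sheet,
// so each presenting group can read only the feedback meant for them.
// Assumed (hypothetical) header: "Problem No." plus the Praise/Question/Polish columns.

function groupFeedbackByProblem(): void {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  const source = ss.getSheetByName('Form Responses 1');   // default sheet created by the form
  if (!source) return;

  const rows = source.getDataRange().getValues();          // first row is the header
  const header = rows[0];
  const problemCol = header.indexOf('Problem No.');

  // Collect the rows under each problem number.
  const grouped: { [problem: string]: any[][] } = {};
  for (const row of rows.slice(1)) {
    const key = String(row[problemCol]);
    if (!grouped[key]) grouped[key] = [];
    grouped[key].push(row);
  }

  // Write the grouped feedback to a "Sorted" sheet, one block per problem.
  const output = ss.getSheetByName('Sorted') || ss.insertSheet('Sorted');
  output.clearContents();
  output.appendRow(header);
  Object.keys(grouped).sort().forEach((problem) => {
    grouped[problem].forEach((row) => output.appendRow(row));
  });
}
```

The same idea can, of course, be achieved without any scripting by simply sorting or filtering the response sheet on the problem-number column.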

(iii) Google Slides or Docs

If Google Slides is used to collate the work - for example, in a language class where students work in pairs to annotate the visual text on an assigned slide - then at the end of the activity, the pairs can be assigned to provide feedback to one or two other groups.

We can simply insert the PQP 'structure' into the speaker notes of the slide so that students can enter their feedback for their peers there.
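
If the deck has many slides, pasting the PQP structure into every slide's speaker notes by hand can be tedious. Below is a minimal Apps Script sketch (an assumption about how one might automate this, not a built-in Slides feature; the stem wording is illustrative) that pre-fills the template:

```typescript
// Sketch: write a PQP template into the speaker notes of every slide in the
// active presentation, so the reviewing pair simply types under each heading.

const PQP_TEMPLATE = [
  'PRAISE: I like the part where...',
  'QUESTION: I was confused with... / Why did you...',
  'POLISH: Could you add more to... / Have you tried...',
].join('\n');

function insertPqpStructure(): void {
  const slides = SlidesApp.getActivePresentation().getSlides();
  slides.forEach((slide) => {
    const notes = slide.getNotesPage().getSpeakerNotesShape().getText();
    // Only add the template if the notes are still empty, so existing feedback is not overwritten.
    if (notes.asString().trim() === '') {
      notes.setText(PQP_TEMPLATE);
    }
  });
}
```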




Reflection: Learning "HOW" and "WHAT" learners are thinking

Technology is a powerful tool that enables us to "see" and "document" what's going on in our minds, hence creating a much richer learning experience, for self and others!

Well, the above may seem too broad and generic, especially the second half of the statement. "Isn't it a given?" some of us might think. Well, what assumption do we make if we say it's a given? That the first half of the statement is TRUE for the learners!

Episodes of learning activities flashed across my head when attending (and mentally preparing to attend) the recent webinar. Indeed, I value this opportunity, as I hardly have the chance to sit down, think about T&L more thoroughly, and make good connections across my practices during this assessment period.

Sometimes, we tend to claim (and therefore generate excuses) that, due to the nature of the subject discipline, there is little room for technology to come into the picture as often as it could. Well, it often looks like a "yes" if we look at what's expected of the students in the syllabus document, e.g. they are expected to carry out proofs in Mathematics - the application of knowledge from various units and the writing to present their train of thought, all on paper. On the other hand, how do we know the students are not just regurgitating what they have memorised?
  • Drawing from my personal experience, sad to say, that's how I cleared my Physics at "O" Levels, and I went through the same when I did Calculus in my university days! I am glad that I joined the teaching profession; over the years, I have gained better insights into how learners learn and what teachers could do differently to enable learning. I believe that if I had been shown how to learn abstract topics differently (or if my teachers had approached teaching differently), my learning experience and perspectives would have been rather different today. The teacher/lecturer had taught; but I had not learnt well.
  • From another perspective, this experience has helped me to better understand the struggle that learners in a similar situation have to go through. It helped me to recognise and appreciate how important it is for teachers to possess the necessary and relevant knowledge and skills to bring about positive experiences for the learner.
So, how do we know how well students have learnt? Is doing the questions on paper correctly enough? How do we assess the thinking behind a set of solutions? Many very experienced educators would share that the design of the assessment items must be clear: which aspect of learning are we trying to assess? That is the clarity of intent in the design of assessment items. The next questions are: how often, how regularly, and at what juncture should we carry out the 'checks'? How tedious is the process? How big a group can we reach out to and cater for (using traditional methods, without technology) in a timely manner? The entire purpose of formative assessment is timely intervention.
  • To conclude this paragraph: it's about the type of assessment item (to draw out what we want to know about our learner) and the timeliness of our response to meet the need.
What are some ways that enable us to find out how learners think, or to probe their thoughts? This is one challenge that I notice students who have just joined the Sec 1 classes would face. Not because they are not thinking - they have already proven themselves after braving the Primary 6 Mathematics problem sums! They are great problem solvers who apply heuristics that would surprise many of us! The next question is, how well can they articulate how they think? Or are they simply applying some "sure-work" method that they learnt?

Let's refer to the Singapore Mathematics Framework (Syllabus document, p14) - which aspect(s) do we try to develop/ inculcate in our students? We are good in some areas outlined in the 'pentagon' (in particular, the skills, concepts and processes). Metacognition is an area that we should work harder on; and some aspects of "processes" could be further strengthened by riding on the affordances of technologies!

Making thinking visible is one approach that enables us to "see" how students think and how they process data/ information, and it definitely enables us to diagnose any misconceptions. It does not stop here! Because we can "see" the misconceptions, it enables us to surface the difficult points (Three-Point Framework; Yang & Ricks, 2012) which we, as educators, might not be aware of! There are many generic thinking routines that we can pick and choose from. One way of implementation is to make the routine "really" visible - flag it each time it is used. Another way is to embed it more seamlessly in the way the problem/ question is put across - which I think is more 'natural'.

To probe how they think, often we get them to elaborate the thinking process, which includes articulating:
  • What are the key words they identify - which frames the direction or way they will approach the question/ problem?
  • What are the info/ data they would use or need to solve the question/ problem?
  • What strategies would they consider? How do they decide which to use if there is more than one way (what are the criteria)?
  • How do they articulate the steps clearly - where the concepts are explained?
  • How would they check that the answer is reasonably correct? 
Well, does the above sound like Polya's 4 steps to problem solving?

How do we do this in a less elaborate but deliberate manner during lesson, and at the same time, reach out to as many learners in the class as possible?

I guess that's where technology makes its entrance and presence felt in this learning ecosystem.
Its key is accessibility. It's not about physical accessibility (no doubt, students are well provided with technological devices in Singapore schools); rather, it provides a channel for teachers to access learners' thinking far faster and more easily. With the appropriate choice of tools, used in a well-thought-through manner, we are able to "see" how students think!

One of my favourite examples is getting students to explain via viva voce  how they manage a problem (notice that I use 'manage' instead of 'solve'?): They
  • are required to articulate the background (concepts)
  • describe how they dissect the question (to understand what's given and what's required)
  • list the strategies they can apply, and
  • articulate what considerations they need to take into account to make a choice of the method; 
  • and eventually, how they would check the reasonableness of their proposed solution or answer!
Of course, we don't expect students to be able to do all of the above on the first attempt. It requires them to rehearse the same process over and over again and to improve with the feedback given along the way. That's where we help them to see their potential!

Students do not know and do not carry out the above steps naturally. It's not something that they are born with. Nevertheless, quite often, as teachers, we assume that all learners are quick to pick this up, forgetting that they need time to hone their skills and that feedback is necessary for them to make positive progress. Scaffolding is a must to bring students through all these stages, and the guidance given in their first attempt would be far more than in subsequent attempts. Rubrics help. But we must explain the rubrics and give illustrations so that they become aware of the expectations and polish their craft along the way.



Over the years, error analysis has been introduced into mathematics classrooms. It serves dual purposes. Indeed, it can be used as a strategy for differentiated learning, though I did not notice this when I first introduced it to my students.

The original intent, which still remains one of the primary intents, is for students to identify and explain what's wrong with a given working (which is usually the result of a misconception). Thinking more deeply, what is the underlying assumption behind this?
  • In order to identify the error, the assumption is that students already know the concept well enough to see or identify the step that went wrong. There is another group who can identify the error but are unable to explain the misconception - simply because, through their own experience, they might have made enough similar mistakes (or had similar mistakes pointed out to them before) to spot the mistake very quickly. It is possible that this latter group could make the same mistake themselves because they have yet to understand the concept behind the operation. To differentiate these two groups (to see who really understands what's going on) would require them to explain the thinking behind the error. This is the first step in developing their metacognition. The ability to articulate demonstrates how well they know the concepts (both in isolation and in how they link to others).
  • On the other hand, error-analysis-type exercises would quickly filter out those who are still grappling with the basics or still functioning at the operational level. The likelihood is that the working reflects the very misconception these students already hold, so they would not be able to surface any mistake in the working at all! Hence their conclusion is "nothing is wrong" - and I guess that is the worrying part!
  • We also have rote, procedural learners who would probably re-attempt the entire question using the 'correct' method. To surface the error in the given working, they "spot" the difference and try to describe the mistake. They are unlikely to be able to explain the thinking behind the mistake (i.e. the misconception), but they are able to tell, operationally, what has gone wrong.
Indeed, error analysis is a very powerful strategy to differentiate learners - as they interact and exchange their observations during discussions. On another note, for this to play out well in a classroom setting, it is necessary to maintain a safe learning environment and to cultivate an open mindset among the learners.


How does technology come in to support the above?

It can be as simple as getting students to provide a brief explanation to support/ justify their answer. No sophisticated technology is required to facilitate the above processes - a blog (comments) or a Google Form/ Spreadsheet (embedded in a blog) would do.





Saturday, May 20, 2017

Webinar: Assessment for Learning (AfL) - what does it encompass?

Attended a webinar (organised by ETD) yesterday and glad to have one solid hour of learning and engagement :)

At the start of the webinar, a poll was conducted among the participants - to indicate which of the following they thought would describe assessment for learning most appropriately:
  • Thermometer 
  • Stethoscope
  • Mirror 
  • GPS (Global Positioning System)
Dr Tay spoke about the big picture (to set the context) before sharing a number of classroom examples to illustrate how one could assess learners' understanding formatively (in particular, by tapping on the affordances of technology). Indeed, the overarching idea when designing the AfL learning experience is to ask the following THREE questions:
  • Where are the learners?
  • How best to get there?
  • Where do the learners need to be?
This is accompanied by two important principles: Find the gap; Fix the gap

Based on the way the questions are phrased, there is no doubt that GPS seemed to be the most appropriate "answer".

Personally, I think all four items are relevant and contribute to the big picture, especially if we talk about "effective" assessment for learning.

It was obvious that GPS best describes the "moves" involved:
To begin with the end in mind (where the learners need to be eventually), we need to know our learners' strengths and gaps, what they know, what they do not know, what their learning challenges/ difficulties are and who they are (which would involve other areas like learning styles). Only then can we identify the most appropriate/ suitable approach to integrate into our lesson/ assessment design so that they can benefit from the process.

How about the rest (thermometer, mirror and stethoscope)? Can we do without them?
  • Thermometer - it's about measurement. Indeed, I would equate it with our sensing - to sense the readiness of students to move on, or to sense whether they are stuck at a difficult point. It also means being sensitive and responsive to the students' needs, which requires us to be flexible and responsive. For example, do we complete what's planned for the 60-min lesson and insist on finishing it at all costs, or should we "detour" or change path because our sensing tells us that students need additional support or scaffolds we might not have anticipated? Will we end up in a situation where "we have taught, but have the students learnt"? This is essential to alert us to where the learners are.
  • Stethoscope - an item that I would associate with diagnosis, which I think is an important step. If the diagnosis is inaccurate, we would plan and implement something that is irrelevant or inappropriate, and the good intent would never lead to a fruitful or productive outcome. Diagnosis does not necessarily refer only to what's immediate; sometimes it is where we surface other underlying issues, which may or may not be something we can address immediately. We need to suit the remedy to the ailment (对症下药), or else all our earlier effort goes to waste (前功尽弃).
  • Mirror - a close association with reflection (on practices). I would regard this as a 'given' - a teacher would regularly draw on his/ her past experiences, and review and reflect on his/ her practices to hone his/ her skills. The reflection also means he/ she can better match the approaches/ strategies to what is needed (surfaced through the diagnosis).
Hence, I think all these complement each other and, when synergised, would create a 1 + 1 + 1 + 1 > 4 impact.

Kahoot! in Learning - a Reflection

Kahoot https://getkahoot.com/

I was introduced to "Kahoot" several years ago.
It became, and still is, very popular among colleagues as well as students. Teachers use this tool for short quizzes for review purposes (most of the time); students have also used it when designing activities to engage the student community, and even parents, in outreach programmes!

Without the need to carry out any serious research - based on on-site observations in schools, feedback from those who have used it, and online articles/ postings - it is not difficult to understand why it is so popular. Here are some reasons:
  1. The tool is very easy to use! For those of us who 'hang around' and "play" with web applications, we would agree that the navigation in this platform is intuitive. Kahoot even has a guide to help newcomers use it.
  2. It is hassle-free - only the creator needs to sign up for an account to create the quizzes. To administer a quiz, the creator simply clicks the start button to generate a code, and participants enter that code at a 'generic' website, https://kahoot.it, and sign in with a name. It's ready to go! The administration of the activity is simple! And you can involve many people at a go!
  3. One of the 'winning' elements of this tool is that all participants do the 'same' thing (i.e. look at a common screen) at the same time. In other words, the pace of attempting the quiz is "synchronised". All participants are directed to the same objective - wait for the question to appear and choose the correct answer! Everyone sees each question at the same time and is given the same amount of time to process it. The only 'differentiation' is that the system times the response, hence the "higher-ability" participant stands to gain more points! (The assumption here is that a higher-ability participant takes a shorter time to process the question.)
  4. Of course, the unique music that comes with all the Kahoots creates the excitement (or perhaps the anxiety for some), and it is what stirs the emotion and anxiety throughout the quiz! Imagine playing Kahoot without the music (which I have done before) - the level of engagement would drop by more than half! The music is the soul of the entire activity!
So, what's the 'educational' value of this platform? What can we achieve through it? There is no doubt (and it is undeniable) that Kahoot engages the participants. Do you notice that I chose to use "participant" instead of "learner"?

The engagement is high. (again it depends on how we define engagement!)

On another note, we need to bear in mind that engagement does not  necessarily equate to learning.
  • What is going on in the participants' minds when they go through the entire activity? 
  • Is the urge to win the game (which could be hyped up by the music) far greater than taking the time to look more deeply at what the questions expect? Are the participants simply regurgitating facts and answering non-challenging questions "mindlessly"?
    • Indeed, what does the designer of the quiz want to achieve? With this operating environment, to what extent will the objective be achieved?
  • This is another assumption - that all participants work well in this 'operating environment', with the 'music' hyping up their emotions as they try to process the answer with excitement (or anxiety!). It probably works for many, but not necessarily all!
Kahoot is commonly cited in articles that discuss/ recommend applications suitable for formative assessment. It is probably useful for a quick review, and perhaps for questions that do not require too much brain power to process. To assess more complex concepts or ideas, will students be able to make good judgements based on what they know, or will they simply take a 25% chance? Something for us to think about when choosing the tool for our activity or when conducting the activity.

This is how it is often used/ implemented in class:
  • All the questions will be administered at one go - the pace and question projection are all automated. 
  • Participants usually stare at the questions and (try to) process them in their heads before making a choice (since these are multiple-choice questions).
  • Because of the pace, participants who answered incorrectly would not get the opportunity to figure out what went wrong, or the chance to clarify at all.
Well, this approach works well for questions that have a clear right or wrong answer. Here's an example I found on the Kahoot website: Figuring out Figurative Language, in which students make a choice to confirm their understanding of a definition (whether it is a simile or a metaphor).
To transfer this notion to lower secondary maths learning, we can create quizzes that get students to identify terms in quadratic expressions (e.g. coefficient, constant, variable), identify the number systems a given number belongs to, map linear equations to graphs (using observable characteristics like positive/ negative/ zero/ undefined gradients and positive/ negative intercepts), or read data from graphical representations, just to name a few. It is quite obvious from these examples that the questions in such quizzes cannot be too complex, and certainly there are some limitations.

Similarly, for Science, it could be used in areas like identifying the correct chemical structure of a compound, characteristics of the elements, tests for the presence of selected compounds, and so on (just to name a few).

Well, I'm not suggesting giving Kahoot a total miss... but I think the point to emphasise here is that we need to be clear on the objective(s) we have in mind for the learning experience we intend to create, or what we hope to accomplish through the activity.

To stretch the tool a bit more, since we can't change the features in the tool, we can try using the tool differently. (That's where the user makes a difference!)

The pace at which the questions are released can be managed by the user! If we look at the platform more closely, we can set a different duration for each question within the same quiz!


Here's my takeaway when I modified the way the quiz was administered:

(1) Depending on the level of difficulty of each question, a different duration was set (a simple planning sketch is shown below).
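
To illustrate the planning (this is just an illustrative sketch, not a Kahoot feature or API - the question prompts and the difficulty-to-duration mapping are made up for this example), one might rough out the items and their intended time limits before keying them into the Kahoot editor:

```typescript
// Sketch: plan quiz items and per-question time limits before building the Kahoot.

interface PlannedQuestion {
  prompt: string;
  difficulty: 'recall' | 'routine' | 'multi-step';
  timeLimitSeconds: number;   // the duration to set for this question in the Kahoot editor
}

const quizPlan: PlannedQuestion[] = [
  { prompt: 'Identify the coefficient of x in 3x^2 - 5x + 7', difficulty: 'recall', timeLimitSeconds: 20 },
  { prompt: 'Which graph has a negative gradient and a positive y-intercept?', difficulty: 'routine', timeLimitSeconds: 60 },
  { prompt: 'Which line passes through (1, 2) and (3, 8)?', difficulty: 'multi-step', timeLimitSeconds: 120 },
];

// Quick check of the total quiz duration when planning the lesson segment.
const totalSeconds = quizPlan.reduce((sum, q) => sum + q.timeLimitSeconds, 0);
console.log(`Planned quiz time (excluding discussion): ${totalSeconds} seconds`);
```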


(2) Prior to the start of the quiz, students were pre-empted to have their writing materials, notebooks and calculators ready so that they could work out the answers.

I was glad that a handful of students actually tried working out the answer before making their choice. As anticipated, most of them did not pick up their pens to write or scribble anything, even when the questions became more complex. It was quite obvious that many just randomly picked an answer (after eliminating what was clearly not possible).

(3) I paused after all students had submitted their responses. We took a bit of time to discuss how to 'dissect' the question and to talk aloud about how to solve the problem. Indeed, I think that helped when similar questions popped up subsequently: more students picked up a pen to scribble a couple of lines before making a choice. The emphasis here is accuracy over speed. When given a reasonable amount of time, the participants became more willing to process the question carefully. This was actually what I hoped to see and achieve - students processing the questions mindfully, with learning not compromised by the excitement and the urge to win!

(4) Learning does not end here! In anticipation that some students would like to revisit the questions (which they might not be able to access easily), I extracted the questions into a set of slides and made them available. Suggested working is also provided so that they can see how each question could be tackled (from the teacher's perspective).

Last but not least, borrowing a Chinese saying - rules are fixed, but people are flexible (规矩是死的, 人是活的) - one has to exercise flexibility, assess the nature of the topic, and be very clear about the objective before designing the quiz items and deciding how to administer the quiz.