Placing the Department at the heart of research-driven initiatives
I hit nearly all the characteristics of the Introvert that @susancain delineates in her book ‘Quiet…’. In research terms I would happily go off and devise my own research question, conduct the consequent procedures, and consider the findings at leisure – and probably keep them to myself. However, that can lead to a whole heap of bias, inappropriate methods, corrupted results and self-deluded interpretations. So it’s more rigorous to seek a wider group to peer-review the validity of the question, the process, the analysis and the conclusions. At the workshop I presented at the delightful Huntington School in York for #NTENRED on May 3rd I suggested that, in secondary schools at least, the subject department is a natural body of peer critique for the following reasons:
Subject expertise – we all know our own subject best, providing a wealth of information
Organisational – a defined group of people
Functional – addressing a challenge, issue or initiative that we all face so there is common cause
Developmental – as individuals feeding into a subject team we can move forward in what we learn
Logistical – finding opportunities to meet together which, in many schools, are garnered in department meeting time
In thinking back over the research occasions punctuating my career, some of the most surprising results were delivered by individual pieces of research (for my BA Geog – on the classroom geography of student seating patterns – it probably led to me wanting to teach; for my MEd – on the required qualities & attributes of Deputy Heads – I have to admit it veered me away from wanting to become a Deputy Head). But the most satisfying are those that we have conducted as a department. The sharing and bouncing of ideas and the collaborative purpose of quizzing an issue together yielded, I’m certain, more resilient information and professional development spin-offs as a result of collective endeavour.
The procedures have invoked a combination of Quantitative and Qualitative data-collection; what Colin Robson, in his extremely helpful handbook ‘Real World Research’, refers to as ‘Fixed’ and ‘Flexible’ inquiry approaches respectively. Advocating the value of both, Robson asserts that it is the Research Question that should drive the selection of approach, and that the question is derived from the initial purpose of the investigation and the theoretical (conceptual) framework that informs it.
Departments are rarely homogenous beasts; they are a collection of ages, experiences and convictions. Cultivating the most productive zone of ‘confident doubt’ within the department for an effective research process is worth the discussion, reassurance and challenge. Students don’t want teachers in front of them who are clearly unsure about the process of teaching and learning, but we may sometimes think the polar opposite of this is ultimate certainty in what and/or how we are doing. Yet an effective research team needs to be able to question, doubt and query current practices, which – if we’re absolutely honest about being ‘Learning Institutions’ – should not diminish our position in students’ eyes. I would argue that students benefit from realising that their teachers are questioning their practices and trying to devise ways to improve them: it shows that learning is for everyone in the building, and that learning is a life-long challenge (some would say, duty).
It’s perfectly valid to conduct a research procedure without collecting data – collecting it is often the initial stumbling block to a departmental enquiry, as colleagues, legitimately, ask ‘when have we got the time?’ A systematic literature/evidence review of secondary sources is often a realistic starting point: devolve the reading material or research areas amongst the department team and collectively review the thinking or evidence that is unearthed.
The challenge of restricting the variables inherent in any classroom situation can sometimes militate against Fixed (Quantitative) research procedures. But such procedures have been accessibly described by Damian Benney @Benneypenyrheol in his analytical measurement of the effect size of a specific marking intervention with a science class here.
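For readers new to the term, ‘effect size’ in this kind of classroom study is commonly calculated as Cohen’s d: the difference between group means divided by the pooled standard deviation. The sketch below uses invented scores (not Damian’s data) purely to show the arithmetic:

```python
# A minimal sketch of an effect-size (Cohen's d) calculation.
# The score lists are hypothetical, for illustration only.
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_sd = (((n_a - 1) * stdev(group_a) ** 2 +
                  (n_b - 1) * stdev(group_b) ** 2) /
                 (n_a + n_b - 2)) ** 0.5
    return (mean(group_b) - mean(group_a)) / pooled_sd

# Hypothetical test scores before and after a marking intervention
before = [54, 61, 58, 49, 66, 57, 60, 52]
after = [60, 65, 63, 55, 70, 61, 66, 58]

print(round(cohens_d(before, after), 2))
```

A value around 0.4 or above is often treated, following Hattie’s benchmarks, as an educationally meaningful effect, though with a single class the result is indicative rather than conclusive.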
I was particularly struck by the comment from Damian’s head of department who, far from pointing out possible design & interpretive flaws in the study, comments that the key element is what the teacher and student learnt and how it made them feel. (Research should challenge the head; but if it doesn’t move the heart in some way it can become lumpen. I find the most engaging research provokes an emotional reaction, be that discomfort, disquiet – or satisfaction in having moved forward in what we think we know.) That sounds like just the kind of supportive department context in which research can be initiated and deliver benefits by encouraging the posing of valid questions.
I think the Flexible (Qualitative) approach to research sometimes needs support. Often we hear or read about Quantitative mega-scale research designs yielding vast quantities of data, subjected to fiendish-looking statistical analysis and muscling all else aside with their veneration of number. It can leave teachers feeling that it’s not for us. Not to do, anyway – too complicated, too much time required, not the necessary resources. We’ll just be the passive recipients of someone else’s study. But that’s not it at all. The flexible design approach, involving – as it may – the case-study investigation, the interview, participant observation or the action-research project, has just as strong a claim as a valid research approach and can yield far more powerful results – in the sense that we can conduct it. What we research, we own. And that means it can move us in new directions in our professional choices and decisions. It’s all too easy to trawl through a catalogue of research findings carried out by others to supplement our own confirmation biases and discredit research that doesn’t fit with our current world-view. Plunging into the research maelstrom ourselves, however clumsy our doggy-paddle, is likely to have a more transformative effect on our practice and cause us to keep posing those questions that are fundamental to self-directed professional development. Robson outlines a range of possible considerations:
The Case-Study research design
- May be exploratory, explanatory or confirmatory
- Could be based on an individual, set of individuals or group
- Might focus on organisations, events, roles or incidents
The Ethnographic research design
- A study of a group of which you form a part (NQTs, Subject leader, A level teachers, tutors….)
- Useful where you seek insight into an area which is new or different, or paradoxically, an area you’re very familiar with
The Grounded Theory research design
- Useful when investigating something that is complex & puzzling. Starts with no theory: collect data, analyse, collect more data, analyse, and continue until ‘categories of data’ emerge and become saturated with information – then formulate a theory based on what has emerged.
Jill Berry @jillberry102 presented a workshop on the day outlining her use of a Case-Study approach in her PhD investigation into the transition from Deputy Headship to Headship. In my session I talked through the Grounded Theory approach my department has been using this year to investigate why our GCSE exam results were so different to those we were anticipating, based on the students having already accumulated 50% of their marks through their CA and modular exam results. Robson describes Grounded Theory: “It is close to the common-sense approach which one might use when trying to understand something which is complex and puzzling.” The important difference, this year, was that we carried out the investigation as a department; usually I will do exam analysis on my own and hand out the data when we meet again in September. To try to get a handle on why so many students had performed less well than three experienced teachers (who were getting excellent results at A level) had forecast meant we all had to be involved in uncovering what was going on. By making it a collective research project it took the notion of individual ‘blame’ out of the equation. We could step aside from the subjective and be as explicitly objective as possible in our approach, challenging each other’s findings – not the person. After the detailed exam analysis by teaching group, student sub-group, the analysis of unit by unit, tier by tier, question by question, year on year comparison, we moved on to script return and remark requests, and the form of a theory began to emerge that was consistent with all the evidence. This was reinforced by contact with other schools’ departments who had entered students for the same paper.
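The layered analysis described above, comparing forecast against actual marks group by group and question by question, can be sketched in a few lines. The records and field names below are invented for illustration, not our actual results:

```python
# A hedged sketch, with invented data, of layered exam analysis:
# comparing forecast and actual marks question by question to see
# where a cohort's results diverged from expectations.
from collections import defaultdict
from statistics import mean

# Hypothetical records: (teaching_group, tier, question, forecast_mark, actual_mark)
results = [
    ("11A", "Higher", "Q1", 6, 6),
    ("11A", "Higher", "Q2", 8, 5),
    ("11B", "Foundation", "Q1", 5, 5),
    ("11B", "Foundation", "Q2", 7, 4),
    ("11C", "Higher", "Q1", 6, 7),
    ("11C", "Higher", "Q2", 8, 4),
]

# Average shortfall (actual minus forecast) per question across all groups
shortfall = defaultdict(list)
for group, tier, question, forecast, actual in results:
    shortfall[question].append(actual - forecast)

for question, diffs in sorted(shortfall.items()):
    print(question, round(mean(diffs), 2))
```

Slicing the same records by tier or teaching group instead of by question follows the same pattern; the point of the grounded approach is that each cut of the data either supports or challenges the emerging theory.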
So where did the research take us? What action(s) resulted? We have entered the majority of our students for Higher tier papers this year and we are adjusting our teaching and exam preparation for the demands of this. Two further analysis diagnostics have been used here: one, a self-reflection checklist for those of us teaching the course and one for students to try to isolate the key changes we (they) need to focus on. I certainly don’t think I could have asked colleagues in the department to complete the teacher diagnostic without having wrapped it within the collective, objective research frame that we had engaged in. As it was it yielded, for each of us, at least one area to action that was self-identified, self-actualised, and self-realised. That has to be one of the most difficult changes to engender but is at the heart of self-directed CPD.
The styles of research approach are not mutually exclusive. Mixed designs may begin with a fixed study that then proceeds into a flexible one to try to explain the mechanisms behind the feature(s) captured, or vice versa – a flexible study may generate a theory of what is happening that subsequent fixed procedures attempt to quantify. Evaluation frameworks or Action-research often combine the two. Depending on the size and make-up of the department there is a range of research options:
- Full team conducting similar research proposal simultaneously.
- Team members pursuing delegated research element towards same proposal
- Use of some taught groups as ‘control’ groups whilst others receive ‘intervention’
- Some team members research mechanism; others then test emerging theory
- Some team members research theory; others then test emergent predictions
- Team continue longitudinal study over 5 (7) years with identified sub-group
- Team sub-groups approaching systematic review from a range of perspectives
- Individual research proposals according to interest and appropriateness using the team for systematic peer review
For a worked example of the third option, see Alex Quigley’s study of intervention with selected groups here. With all the formats, there are certain ‘approach with care’ areas to consider to do with both the ethics of the study, the politics of it within school and, not least, the organisational challenges. Some key considerations include:
- What could your department research that is likely to have the most significant effect for the most needful group?
- Would it be more effective, long-term, to research a question that is more likely to be successful as a research proposal?
- How could a research proposal empower your team members and place them in a position to drive a positive influence and/or make more informed choices?
- Will research findings result in giving teachers an additional function to add to what they already do? Can a research proposal consider what could be abandoned without an appreciable loss of learning?
- Can debate, disagreement and conflicting evidence be handled within the team for mutual benefit?
- Research may not give a clear answer and may add to the confusion of the overall picture. But it allows you to self-direct your own CPD. Research may rarely answer questions, but it’s the deliberate ‘asking’ that provides the most lasting value. Can this value become central?
With the last point in mind, our school is moving towards a teacher development model based less on lesson judgements and more on the ability of teachers to evidence their professional growth. With any accountability system there has to be evaluation, but I think we, like many schools, have found that the lesson observation/grading/judgement circus has had its day and been pushed out of town. For the majority of staff we are evolving a format based on the concept of the Learning School. If students are to be taught and learn most effectively, then the influences on that process – the teachers, the management operations, the parents, support staff – all have to be developing too. Up until now, like many schools, this has relied largely on a ‘done-to’ approach.
We are moving to accountability based on a wider range of evidence brought to the table by both teacher and development-advisor. Out go lesson grades. The accountability driver will be, for those teachers who are above baseline standard, to identify an area they select to develop (which could be based on either a deficit or a sharpening-competence model), pursue a programme of self-improvement – which might well be research-based – and show evidence of how it has resulted in growth of insight, practice and performance. It’s an attempt to put Dylan Wiliam’s urge – ‘teachers need to improve, not because they are not good enough, but because they can be better’ – at the heart of the learning school, adding ‘…and because they want to…’ The done-to CPD approach has too low an impact. Groups of enquiry-led teachers who ask questions of themselves and of their peers are what we think will do it more effectively; those ‘philosopher teachers’ that John Tomsett spoke of wanting to work amongst in his opening talk. It’s all about the learning; and the learning being career-long.
A learning community, based on learning teachers, driven by posing critical questions, researching/evidencing the progress and using it to propose the follow-up question. We will never be completely sure we’ve got it right – but that’s why the profession is so constantly intriguing; whatever insights we glean should serve to provoke further questions. It’s what researchers do. And it moves us, however marginally, along a transect of knowing a little more, and feeling a quantum more engaged.
With thanks to @nmgilbride for recommending Colin Robson’s ‘Real World Research’ and being a strong advocate of the value of Qualitative research. Do check out his blog.
Last word to Dylan…. Amen
Postscript: I was asked at the session if I thought the changes we implemented would improve our results for summer 2014. I replied that it was too early to tell; they were enacted in trust. In fact, the students achieved our best ever departmental results, with a higher A*-C percentage than had ever been achieved before and, for the first time, higher than both the national average and the similar-centres average A*-C.