Tuesday, May 23, 2017

June 22… Biesta’s Democratic Research


Consider Biesta’s vision of democratic research. What, if anything, about it seems novel or unusual?  From what you’ve learned thus far, what sort of place do you see for this kind of work in the world of educational research?  

13 comments:

  1. I loved Biesta’s interpretation of the current state of educational research and its implications for democracy. While not directly quoting him, Biesta seemed to pick up where Flyvbjerg left off (that social science research should aspire to phronesis) by adding in Dewey’s concept of knowing and de Vries’ concept of the dynamic relationship between the technical role and cultural role of research. From this solid foundation, he was able to make a convincing argument that democracy is undermined when there is a consensus about the aims of education, such as only focusing on evidence-based research (“what works”). When thinking about this kind of work in the world of educational research, I keep thinking about de Vries’ concept of cultural research. It seems crucial to work to help practitioners envision alternative possibilities. Similarly to how science fiction writing can help us imagine new technologies, research that engages in both technical work and cultural work can help us work toward new and desirable educational horizons.

  2. Before getting into the arguments Biesta puts forth, I feel compelled to comment on his writing. His use of italics for emphasis drove me nuts. I learned that if you want a word to have emphasis, you achieve that by structuring your sentences so that the emphasis falls where you want it. Using italics, or underlining, or capslock, or whatever, is either lazy (because you can't figure out how to make the sentence work without the added emphasis) or condescending (because you don't trust your readers to pick up on what should be emphasized).

    To get to B-man's actual argument, though, I thought it was pretty underwhelming. I think I agree with the John Dewey/William James idea that educational research is not in the business of providing definitive answers and is more appropriate for providing heuristics and frameworks for educators to use, but I think Biesta takes this idea a bit too far. And I don't think he argues very well for the conclusions he tries to make.

    There's an amusing hypocrisy in Biesta's ideas about what can be definitive. That is, he objects to the widespread adoption of "evidence-based practice" in education because, to paraphrase, context matters. He argues that "knowledge available through research is not about what works and will work, but about what has worked in the past" (p. 17), which he uses to suggest that there can't be definitive statements about "what works." And then he goes on at the end of his article to say "I have already shown that educational practice..." (p. 18), which is itself a definitive statement ("shown" implying that this is the case and he has revealed it to the readers). It strikes me as absurd that evidence re: educational interventions or practice must be contextualized and tentative, yet a few pages of fairly shoddy argumentation can somehow be definitive.

    Further, Biesta criticizes ed research's focus on means rather than ends. I don't necessarily disagree with the spirit of that argument (certainly, there are lots of things we want kids to get from school), but again, I don't think he makes the argument particularly well. The distinction between means and ends is almost always artificial and depends a lot on scope. For example, my end for today might be to learn how to run a t-test in SPSS, and the means to that end might be to read and annotate articles. But if we take a step back, knowing how to run a t-test might be a means to conducting and publishing a study (the ends). And if we take another step back, publishing a study might be the means to change how students write in their English classes (the ends), etc. There's almost always a higher end we can abstract out to, which implies that we can think of just about anything as means to some end. Given this, I think Biesta's argument about questioning the ends loses some of its bite. That is, the ends that he seems to align with evidence-based practices (e.g. learning to do math better) are really means to different ends.

    Like I said before, I really like the idea of ed research being a tool/heuristic/guide/whatever for practitioners and policymakers. I just don't think Biesta does a great job arguing that point.

  3. Biesta makes some interesting points in his comparison of educational and medical research. I can understand wanting to model educational research after medical research. Medical research is straightforward (issue, control group, intervention, measurable outcome, inference about causality). However, how far can we go with that notion in educational research, and is it stifling progress? In educational research, I think there is a tendency to look back and question why something did not work, which is appropriate whether done by researcher or practitioner. Maybe I am revealing my lack of experience with the p-12 setting, but I tend to think the practitioner could probably explain, or at least have an idea why a specific intervention/approach did not work for a particular student. I do not believe the same approach is used in medical research. If the outcome of a drug trial shows promise for the majority of the intervention group, do the medical researchers then spend significant time deliberating on why it was not a success for all participants in the intervention group? --Amy

  4. Tom here--I didn't perceive a vision of well-conducted research in this article. Biesta offers a critique of the "what works" philosophy, but doesn't necessarily offer a vision. The essay seems more like a political argument against government's current enthusiasm for "what works".

    I feel mildly guilty for admitting this, but I'm just not bothered by the focus on "what works". We teach microskills in counseling--specific listening and intervention strategies that can be used with clients. Counselors can use these tools in a variety of situations. The microskills don't replace good counselors--they're tools; nothing more.

    "What works" interventions seem similar to microskills. Interventions are tools at the disposal of master teachers. Teachers can utilize research-proven strategies that work. But those same teachers decide when to back off, try something different, encourage, withdraw, etc. "What works" doesn't supplant a master teacher.

    This essay made me feel like all is right with the world. Governments push researchers to offer relevant, meaningful, and results-oriented publications. Researchers push governments to back off and retain a philosophical perspective on the importance of a broader, considered research approach.

  5. Michelle Boulanger Thompson, June 21, 2017 at 11:01 PM

    As a student in the Research-to-Policy and Advocacy cohort in the discipline of special education I am interested in closing the gap between research, policy, and practice. For this reason David Hargreaves' suggestion that education needs a "double transformation", of both educational research and practice, to fully transform education into an evidence-based practice is attractive to me at first glance. However, following this line of thinking leads to a narrow view of "what works" in order to generalize findings and leads to even narrower methodologies for research to "prove" what works. What bothers me with this particular argument in favor of evidence-based practice is that the teacher's opinion is devalued in favor of "scientific" evidence and "values" seem to be lost in this pursuit.

    I think that evidence-based practices work well in the field of medicine because the goal of these practices is clearly defined while in education the goal of education, or what is "educationally desirable" is not so clear and not as easily measured. I like that Biesta sees a continuous expansion of the inter-relationships between research, policy, and practice as a way to invite more participants to the conversation, stepping away from purely scientific analysis of effective intervention. Another way to look at the difference between education and medicine is that medicine is an external process where techniques or procedures are applied to the individual, while in education the learning process is an internal job.

    I like where he concludes that for educators the question isn't about the effectiveness of their actions (teaching) but the potential value of their actions.

    I agree that the field of education differs from the field of medicine in that education is better understood through the lens of systems theory with the student learning via interpretations, while medicine primarily can be understood as "causal" or technical.

    Looking at the interaction of research, practice, and policy through the democratic lens of John Dewey's transactional theory of knowing allows education to be a "living" process that we look at as a possibility, not retrospectively, as when we technically assess "what worked" in medicine.

  6. Stephanie - I agree with Biesta that it is difficult in education to definitively determine what works since there are so many factors involved. However, like Tom, I am not too bothered by researchers making claims about what works. When looking for ideas to improve learning/test scores/whatever in the classroom, we look for interventions/ideas that have been successful. Then, we adapt them for the population of students that we are working with. Often it has to be adapted for different blocks and/or different individuals. However, we still consider this to be “what works” using the broad nature of the intervention; it is the details that need to be adjusted. He then makes his case for not using certain interventions because of the lack of desirability. While I understand the idea that he was promoting, I felt like some of his examples were of extreme cases and one reminded me of Brave New World.
    I felt that in this article, Biesta identified the nuanced nature of education and the need to respect the autonomy and professionalism of teachers. At times, I believe he did this very well, but at other points, I had difficulty following his argument.

  7. Although much of Biesta’s writing was distracting, he brings up some thought-provoking points, such as the idea that we need to refocus ed research around what is educationally desirable and not just a fix to a ‘problem’. This is where he is headed when he talks about the issues with the ‘ends’ of evidence-based research. Because research is so often framed as a technique for finding a solution, he argues against this idea of ‘what worked’. I believe there is validity in his argument that research shows us what worked, not what will work in the future. As much as we desire and try to reach generalizability, we have all to some degree experienced the class, student, or situation for which the evidence-based practice does not seem to work. This is where professional judgment is needed, as Biesta would argue. Once again, as we have talked about in class, ed research and its ‘power’ or ‘validity’ is connected to the views and trust of practitioners.

    While I agree with Eric that sometimes his arguments are lacking, the big picture he is trying to paint is helpful in reframing the focus and purpose of ed research. Education deals with humans, and fixing them or ignoring their nature and context is problematic. At the same time, the participants in education are very different from those in many fields, and culture and context cannot be set aside for a technical solution. If we want to make the most impact, we need to move beyond just finding the tools/solution to fix and see our work as much more: as Biesta and Kurt have argued, a moral endeavor. I will end with this: how many of us can really remember all, or even some, of the content in our classes from P-12 or even our undergrad work? Or do we remember the way we were treated by our teachers and the larger ‘life lessons’ that came across in their classes?

  8. Kendra- For me this feels like a “both-and” scenario. I agree with David Hargreaves' suggestion that education needs a "double transformation", of both educational research and practice. In many ways it seems, and has been argued in this class, that a clear divide exists between what is practiced and researched. On the whole, I also agree with Biesta’s argument that a focus on the ends is not always sufficient in educational research. It seems troubling on some level to model educational research after a field that does not operate in the same way due to constructions of power, knowledge, impact, etc. Amy raises an interesting question. Do medical researchers go back to figure out why 8% (totally made that up) of participants did not respond positively to an intervention? I don’t know the answer. But certainly, given the many nuances and complexities inherent in the field of education and in any particular learning environment, that feels like a worthwhile pursuit.

    I don’t know that he presented anything that we haven’t already discussed throughout the course. As individuals engaged in many different iterations of education, learning environments, helping fields, etc. it seems common sense that you’d want multiple perspectives involved in the creation of large claims. This feels particularly important when those claims will be used to inform policy. I think we knew that already.

  9. Like Kendra, I see Biesta mirroring what we've heard from other scholars thus far this semester: there's a disconnect between what the government wants and what teachers want, and that randomized controlled trials are the One True Research Methodology when it comes to obtaining funding and use as the basis for policy. Additionally, "for practice to be based on evidence, that evidence must come from experiments in real contexts" (7)... but it is virtually impossible to do a randomized controlled experiment in any real educational context.

    The crux of this issue seems to be that, as Biesta notes on page 9, "even if we were able to identify the most effective way of achieving a particular end, we might still decide not to act accordingly." I think parents get this, for the most part; they want schools to treat their kids as individuals and to teach them life skills like sharing in addition to the three Rs. (Those three Rs--and, for some, getting into college--are certainly important too, though.) Policymakers are solely focused on academic attainment, though, as measured in test scores. Because, perhaps, the individual differentiation and soft skills aren't measurable? Given the need for hard data, statistics, ROI, etc. in government to determine what to fund, getting policymakers to embrace this more comprehensive view of education seems unlikely.

    I wonder if it would ever be possible for the NEA, for example, to be given the kind of governing authority over educational policy that the ABA has over lawyers, the AMA over doctors, etc. (Perhaps more possible now than it would have been a few months ago...?)

  10. So let me say that I just lost my whole post... uggh, so here is the shortened version.

    I agree with Tom that Biesta does not present a well conducted research approach.

    I agree with Kendra that Biesta does not present anything new from what we have already been discussing in class.

    I like the thought but don't know if he offers a viable way to do it. I appreciate the difference noted between the technical role and the cultural role: looking beyond what works, and recognizing that the research can only tell us what workED. I believe in educational practice you have to take into account context, values, judgments, the environment, the population, and many other factors to determine if something is actually effective or in determining the reality of the impact. He acknowledges that educational research should be about finding, testing, and evaluating action, but he takes it a step further to say that it can also be about acquiring a different understanding of practice. This is a valid argument, especially in education. We have to take into account so many variables and must understand that there can't be just one way of doing things that will work and provide the exact same results each time. Education does not work in that manner, and this is why educational research can never be an exact science nor follow the medical research model. Can we learn some things? Absolutely.

    Marsha

  11. Evandra ----- “The problem with evidence-based education is that it is not sufficiently aware of the role of norms and values in educational decision making; the problem is that it also limits the opportunities for educational professionals to exert their judgment about what is educationally desirable in particular situations. This is one instance in which the democratic deficit in evidence-based education becomes visible.” This particular quote from Biesta details the tension between democratic and evidence-based research. Democratic research takes a different approach, and it addresses the needs and concerns of policymakers in the policy-making process. Democratic research is similar to action research in that it accounts for external factors as well. The link between policy and practice is not completely technical and should account for the data and perspectives democratic research is able to tease out. Biesta makes some great points regarding the two types of research. He states, “A democratic society is, in other words, characterized by the existence of an open and informed discussion about problem definitions and the aims and ends of our educational endeavors.” This definition addresses the methodology and epistemology sought by educational researchers.

  12. I think I am partly with Tom and Stephanie in that I am not too bothered by ed. research conducting inquiries into "what works." However, my reservation has more to do with the way that such research can be recontextualized, or reframed, eventually becoming mandated practice that teachers must implement with fidelity. If teachers were granted the professional respect to make decisions about how to use "what works" in their classroom, or were invited to take part in conversations about the practices under consideration, I might jump on board more willingly. I have heard the term "slender autonomy" (Smaller) to refer to what agency teachers actually have in the classroom, and Tom's microskills seem to fall under that umbrella. Unfortunately, there are many teachers who enter teaching without a variety of such tools or skills and feel unsure of whether or not they can deviate from the script. So, they use "what the district says works," even when it is not in the best interest of their students.

    I like Biesta’s discussion of Dewey and “’old’ knowledge.” He says that in “reflective problem solving we do not use ‘old’ knowledge to tell us what we should do; we use ‘old’ knowledge to guide us first in our attempts to understand…and then in the intelligent selection of possible lines of action” (p. 16). While I value some tried and true methods, such as reading to a child from a young age to build a foundation for literacy success, I see how often the same strategies become “best practice” and then never leave the classroom, even in light of their ineffectiveness. We need to continue to be reflective and critical of classroom practices. I understand Eric’s point about Biesta’s structural hypocrisy, but still agree with Biesta that “what works” now may not work forever.

  13. Looking at Biesta, it seems that democratic approaches are being suppressed in educational research. On page 20: "This shows that whether research can play a technical or a cultural role does not solely depend upon the decisions and intentions of researchers but is influenced in a significant way by the environment in which researchers operate." This quote seems to emphasize that what works in one area may not work in another, and that techniques effective in one time period may change in effectiveness in another.

    Oftentimes, educational research is implemented without the consent of the teachers. This can lead to resistance, which may result in partial or no implementation of actions. Even at the building level, educational research would have to be re-contextualized and molded into a form that can work for both the teacher and the students. Some research may not be implementable at all due to other factors as well.


June 13---Bonus Post----The Special Edition of Ed Researcher

Now that you have heard about most/all of the articles in the special issue on ed. research, post a comment about what they do (or don't)...