Even those who are interested in, who "buy in" to, (some forms of) assessment -- like myself -- do admittedly often look at the sorts of requirements SACS imposes upon its constituents, at the reams of data it falls upon us to generate, to organize, to report upon, as onerous make-work. Having been drawn into the ongoing process of generating our Quality Enhancement Plan at Fayetteville State University -- first as a subject matter expert in both Critical Thinking and CLA Performance Tasks, a purely advisory role, and then afterwards as a member of the QEP Writing Committee -- I was afforded an illuminating vantage point from which to observe at least some of the workings of SACS and the QEP activity.
For me, as a philosopher, this involvement has been very interesting on a number of different levels. Our plan itself focuses thematically on critical thinking and practical reasoning, two areas in which philosophers are not the only experts and disciplinary practitioners, but fields to which we definitely make central contributions. Going beyond the content of the plan, the very process of our QEP's development -- our attempts to interpret and meet requirements imposed by SACS by drawing upon our available faculty strengths, leveraging our current involvements, and setting up tracks for future ongoing faculty development -- has been a set of test cases, or experiments, in practical reasoning. And reasoning in difficult circumstances, at that: fewer resources available than we would have liked, consensuses yet to be formed, buy-in yet to be achieved -- basically, the sort of real-life circumstances in which most practical reasoning actually takes place, far from the clean contours of classroom examples, or the highly abstract thought-experiments and textbook cases with which most philosophers tend to be more comfortable.
I'm personally quite proud of what we did accomplish: a well-thought-out, many-sided plan, offering multiple opportunities for faculty, staff, and student involvement, which -- if it is implemented thoughtfully and followed through consistently -- should result in measurable gains in student learning in Critical Thinking. The full report, nearly 100 pages including the appendices, is available online, as is a one-page executive summary, but perhaps it might be useful to provide a discussion somewhere between those two extremes in length and complexity. I attempted to do something like that in this video, which I produced mainly to give students some idea of what was heading their way.
Suffice it to say that that study articulates something those of us who teach in the trenches already know well: our students come in with weak abilities and knowledge-bases in critical thinking, analytical reasoning, assessing and using evidence, problem solving, and practical reasoning. And, unfortunately, many of them leave having made relatively little improvement. I've argued in another blog entry why this is such a significant problem for these students -- employers demand these skills. And why? Because they are absolutely necessary for working in the new knowledge economy that college is supposed to be training our graduates to participate in -- those lacking these skills and dispositions after 4-6 years of college will simply not get hired, or, if they do get hired, will not keep their positions, because they can be replaced by those who are adequately prepared for that level of work.
Like many universities, we have a Critical Thinking class at FSU, but expecting that class to provide students with an adequate grounding in the subject in one semester -- students many of whom come in with already weak critical thinking, reading, writing, and study skills -- is quixotic. The skills acquired, the dispositions developed, the knowledge bit by bit gleaned and applied in the Critical Thinking class have to be infused and reinforced throughout the curriculum. To the credit of the institution, FSU University College -- responsible for assessing and to some extent structuring the core curriculum -- has made this a high priority, and has led the way by deliberately moving away from meaningless multiple-choice tests to using CLA Performance Tasks to assess the skills of Entering Freshmen and Rising Juniors. The Provost's office has also been a very strong supporter of infusing critical thinking and of incorporating CLA in the Classroom. Even though there is still so much work to be done, excellent groundwork has already been laid.
Our QEP builds on the foundation already long laid and well established, ambitiously proposing the next logical step: critical thinking, understood as evidence-based decision making, in the majors. It is one thing to assess how students are doing in the core, where the Critical Thinking course is essentially housed. It is another to see critical thinking, problem solving, and practical reasoning still being promoted consistently, being developed further, incorporated into the central goals and preoccupations -- even the student learning outcomes -- of content-based, discipline-situated classes, courses that form components of a major discipline.
So, what does evidence-based decision making entail and include? What are its measures, its characteristics? How do we coherently promote it, test it, teach to it? These are important general questions, to which there are well-articulated answers going considerably beyond our QEP (for an example of one discussion, correlating evidence-based decision making with several models of critical thinking, see this white paper). Here, though, I'm going to address three more focused questions: How will FSU measure evidence-based decision making in the majors? How will students be readied and aided to provide evidence that they actually are using evidence to make good, justifiable decisions? And how will the faculty be better enabled to guide students in that newly identified direction?
The plan is that the same tool will be used to measure critical thinking in the majors as in the Core -- a surprisingly flexible instrument, the CLA Performance Task. We have used instructor-generated CLA Performance Tasks for the 2010 Rising Junior Examination and the 2010 Entering Freshman Examination, so adding a senior-level Exit Examination for the departments participating in the QEP will build on previously developed expertise and an existing framework. These performance tasks differ greatly in content from examination to examination, but have the same basic structure: students are provided with a set of documents containing information pertaining to some problem, task, decision, or controversy, on which they have to articulate a coherent, well-argued, evidence-based stand. What is provided in the documents is a mix of relevant, irrelevant, misleading, even contradictory information -- and typically there will be arguments, some good, some bad, made in the documents as well. The student's task, essentially, is to transform information into evidence for a decision. A common generic rubric -- quite well designed -- is employed to assess student performances on these tasks.
The CLA is a robust tool, not only for assessment, but also for classroom use, for teaching -- and even for reflecting on -- critical thinking skills. But it is only one means for fostering critical thinking, and FSU's QEP actually includes five mutually supportive tracks of faculty development, each of which is intended to enhance faculty members' own skills and knowledge and to enable them to better assist students in their own development. Faculty development is a combination of providing faculty with needed resources and teaching the teachers, and its ultimate payoff is in the classroom, in improvements in student learning.
Faculty members whose departments participate in the QEP will be offered five different faculty development opportunities. Each of these is a way of fostering important aspects of critical thinking and evidence-based decision making. Each is a path of faculty development along which some faculty at FSU have already made progress. And each is a program for which we currently possess either faculty experts with a track record of facilitating that type of faculty development, or partnerships with outside experts who can provide it.
So, what are they? Writing Across the Curriculum; Reading Across the Curriculum; Information Literacy (through our excellent and highly competitive Chesnutt Library Fellowship program); Integrated Course Design (facilitated through online courses with course design expert Dee Fink); and finally CLA in the University (FSU's own permutation of CLA in the Classroom, in which I may continue involvement as an outside expert after I leave FSU).
The basic idea behind the Quality Enhancement Plan is then this: through these five pathways, faculty will be equipped to thoughtfully, competently, progressively infuse Critical Thinking into their classes -- both in the core curriculum and in the major disciplines -- and this will pay off in much more coordinated, developed, coherent student learning in Critical Thinking, assessed and demonstrated through CLA senior exit examinations in the majors.
Is this really about assessment, about numbers, though? The QEP includes that as a necessary component. In fact, I think being able to provide stakeholders evidence that an FSU education demonstrably "adds value," as they say, is not only very useful, very prudent to have at hand in these tough times of budget cutting -- it is practically an ethical requirement, a moral duty the institution bears towards those who foot the bills: the stakeholders, the taxpayers, the parents, the members of the larger community. And, going deeper, I would say that unless they can turn things around and stop churning out graduates lacking the vital assemblage of skills, knowledge, and dispositions we call "critical thinking," colleges and universities are failing the very students to whom they have given passing grades.
The CLA is not a silver bullet, and until it is actually implemented FSU's QEP remains just a piece of paper (really, a PDF!), but both are reasonable models offering hope for substantive improvement -- and it was evidence-based decision making on the part of faculty, staff, and administration, practical reasoning at work through committee and consensus, that produced the plan now ready to be implemented, provided SACS approves it. For my part, as a philosopher, what we've effectively done is smuggle in measures to improve and augment good practical reasoning on the part of faculty and students alike, both in the development of the plan and as an effect of it. For in the end, all educational jargon and assessment-speak aside, that is what evidence-based decision making comes down to: practical rationality at work.