If the research we conduct is built on research that is less than solid, are we asking the right questions? When the works cited as foundational to our own discovery process are imperfect, we are expected to recognize and use their salvageable elements, if no better works exist. We do so because of three realizations: a) no perfect work exists; b) the post-positivist recognition that truth is relative and conditional; and c) LIS research is incomplete, lacking in coverage and rigor.
It means something rather tiresome, which is that works cited in a piece of research must themselves be critically evaluated - a fundamental part of the EBLIP model. But it could really make you tear your hair out to then realize: what about the works they cite, and so on? Perhaps it's like retrospective cataloging of a collection, where the decision may be to go forth and sin no more - and in the case of research, to go forth having thoroughly evaluated the sins of past excursions, salvaging bits of wisdom. I have this vision of broken amphorae, patched in with contrasting colors so that we can see what the original was, with full realization of missing parts.
Here our amphorae are surveys done on Medlib-L or Publib-L, responded to by a self-selected audience and decidedly not bias-free. Or reports of 'how I done it at my library,' program descriptions about a new class, with the self-satisfied conclusion that attendees were pleased. They might be blog postings, conference abstracts, or poster sessions - unindexed, with little or no consideration of previous research. But they remain, artifacts of experience and reflection.
I have talked with others about their use of resources such as the CSA LISA database in supporting decisions (and please note, here is an unsupported, observationally based conclusion): many have said that they can never find things in LISA, or that what they find is insufficient. Numerous journal editors have bemoaned the lack of rigor and quantity in our research (this is far from unsupported - if you want cites, just ask). In this posting about questioning questions, I end with one: can we afford to walk away from our amphorae?
Monday, September 22, 2008
Sunday, September 14, 2008
Teaching EBLIP
This semester (fall 2008), I'm teaching a class with Joanne Marshall on evidence-based library & information practice (hereinafter referred to as the rather ungainly EBLIP). It's our second collaborative effort, and what an adventure! First, though I've taught small groups and one-on-one for years, it's always been as a coaching or one-time session, rather than a full semester experience. I feel a bit as if I'd jumped in the deep end, pedagogically, but then this is precisely the sort of thing I'd like to be teaching. I have loved the seminar-style courses that really engage people in learning and teaching, so that we're exploring topics in a systematic and open way - this is definitely a great way to learn, but it's a challenge to teach. I'm very glad to be working with Joanne on this.
We based the first full-semester course we taught around the Booth and Brice book(1) as our official text and a modular tutorial I created(2). We brought in speakers from the UNC campus (social work, nursing, public health) to discuss the adaptation of EBP to their professions, and also had the serendipitous opportunity to participate during class time in the Canadian Library Association workshop on EBLIP led by Su Cleyle(3).
The experience was a very good one, and in fact we wrote a chapter on how we conceived and conducted the course for Elizabeth Connor's edited work, Evidence-based librarianship: Case studies and active learning exercises(4). But this summer, thinking hard about what I felt might be the really seminal aspects of the course, I had several realizations. It didn't hurt that I've simultaneously been reading deeply in the literature of EBM, EBL, and all the many publications exploring librarians' publication patterns.
One was that the tutorial places undue emphasis on EBM and its development, and too little on the examination of issues related to the adoption and adaptation of EBP in library settings. I think it may be important to understand the roots of EBP development, but it seems at least as important to place it (EBM, I mean) in the context of contributing factors, and to provide contrast with EBP in other fields of study (such as social work, education, and nursing) so that we do not get bogged down in healthcare contexts.
There is so much to say about EBM - practice-setting decision making (questions, diagnosis, etc.), regulatory factors, culture, education - that one day I'd really like to write the book: Comparative Evidence-based Practice. I had begun something on this at the tail end of the first EBLIP course, because I was so inspired by what I'd learned about the adaptation of EBP by other disciplines. There is much, much more to say on this, and I think it will be instructive for our own profession to think about what's been done in order to see what might be done. In fact, one of the papers to be written this semester (and in the last class) will be one where students take another area of practice and summarize how EBP has been adopted and adapted, considering contextual issues and, finally, whether LIS could benefit from specific tools or practices around that adaptation process.
Second, while we did practice critical evaluation of LIS research, I felt it wasn't, somehow, enough. I wanted the class to be a transformative experience, and envisioned a journal club, with rotating responsibilities for facilitation, where we read and critically evaluate LIS research. Part of the discussion is around the use of critical evaluation tools: Do they work? What sort of evaluative tools do we need to build? We had a first try at critical evaluation last week using a bibliometric study(5), finding the process engaging and raising many questions about the tools and much more. We'll be doing this throughout the entire semester, and instructors are not exempt :).
Perhaps as a direct consequence of my dissertation work, I have also found my initial understanding of EBLIP to be shallow, with assumptions unexamined (such as the more-or-less automatic acceptance of the evidence pyramid). I find myself not so likely to say to students: Well, here's the topic. Let's practice the skills now. It's not about that - it cannot be, not if we are intending to truly engage in discourse. It's a matter of walking the walk, eh? I am less sure of how to direct this process (tending to want to meander off, seeing all sorts of digressive issues) and am grateful for the presence of a syllabus and the time-boundaries of a weekly class, which we're teaching via Blackboard and teleconferencing.
(1) Booth A & Brice A. (2004). Evidence-based practice for information professionals: A handbook. London, England: Facet Publishing.
(2) EBL Course Syllabus, used the first time we taught the course.
(3) Cleyle S. (2005, Sept. 21). Evidence based library and information practice - the Canadian scene. CLA Teleconference Series.
(4) Perryman C & Marshall JG. (2007). Designing a curriculum in evidence-based practice for Master's students in library and information science. In E. Connor (Ed.), Evidence-based librarianship: Case studies and active learning exercises. Oxford, England: Chandos Publishing.
(5) Antonisse MJ, Burright MA, & Hahn TB. (2005). Understanding information use in a multidisciplinary field: A local citation analysis of neuroscience research. College & Research Libraries, 66(3), 198-210.
Saturday, September 13, 2008
First thoughts
There are so few resources out there for librarians interested in EBL/EBLIP (whatever you'd like to call it)! I don't know if I'll keep this up but it seemed like a good idea at the time. Here's my primary goal, other than finishing my dissertation and returning to gainful employment: enabling an active discourse among multi-type librarians about the profession itself, especially about how we make decisions.
I keep quoting Pat Thibodeau, Duke University Medical Center's Library director, who made a half-joking comment that 'librarians are anarchists.' It is her sense that we focus so much on serving our communities, defining collections, policies, and activities by those whom we serve, that the mere thought of identifying best practices in many areas is anathema. I don't mean to jest about this any more than she did, though. In a profession defined by service, it is most appropriate to shape practice in response to need.
One question that occurs about this, however, is just how we do that (I know, I know - LibQUAL+). That's a terrific advance. But what if, at your library, you recognize a need that is not adequately measured by such tools, or if you are unprepared to go beyond user satisfaction measurements - and then find you need to do so, or risk budgets?
There's another question, too, and this one is rather self-evident as well. What do we say we teach patrons, as the overarching goal, in any kind of library where teaching is involved? Isn't it information literacy... critical thinking? (And I see you nodding.) Where is our own form of literacy - does it stop at competency?