I read Malcolm Gladwell’s Outliers this week and it is a great book... but then, I like all his stuff. The section on Asian students, math, and work ethic is really good, and I liked his analysis of sports and age-cutoff deadlines. There is some good parenting advice in there as well.
Anyway, my colleague Jon Bodnar and I are working on a handful of writing projects, and this theme of outliers started to stand out for me. We are currently processing a ton of ARL stats, and I became very interested in the idea of who is outperforming the overall trends. This post is in no way authoritative, statistically valid, or what have you; just something I found interesting and wanted to share. Jon’s office is right beside mine, and so we’ve been discovering these types of epiphanies all week long. It has been a good week.
Reference Questions
Overall the trend is downward. Nothing new here; we’ve all been hearing that for years. From 1995 to 2005 the ARL average dropped about 47%. That is a lot fewer questions!
Ah, but not at Oregon. The Ducks report a +51% increase in reference queries during this time period. Interesting, but why? Did they add service points? Did they renovate or change the layout? Did they alter the way they record stats? I am sure there are many variables, but I just found it fascinating that while everyone else declined, they surged so much.
Other Reference Outliers:
- Columbia (+46%)
- Toronto (+40%)
- Washington (+38%)
- Cincinnati (+36%)
There were a total of 12 libraries that saw an increase during this period; the other 83 saw a decline.
NOTE: Not all member libraries provided data.
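For anyone curious about the arithmetic behind these lists, here is a minimal sketch in Python of how you might compute the ten-year percent changes and flag the libraries that bucked the downward trend. The library names and counts are made up for illustration; they are not the actual ARL figures.

```python
# Minimal sketch: compute ten-year percent change in reference questions
# and flag the libraries that rose while the overall trend fell.
# NOTE: the names and counts below are invented placeholders, not ARL data.

counts = {
    "Library A": (120_000, 62_000),   # (1995 count, 2005 count)
    "Library B": (80_000, 121_000),
    "Library C": (95_000, 51_000),
}

def pct_change(start, end):
    """Percent change from start to end, e.g. 100 -> 151 is +51%."""
    return (end - start) / start * 100

for name, (y1995, y2005) in counts.items():
    change = pct_change(y1995, y2005)
    tag = "outlier: up" if change > 0 else "follows the trend: down"
    print(f"{name}: {change:+.0f}% ({tag})")
```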
Total Circulation
Total circulation is another area that has seen an overall decline. From 1995 to 2005 the average total circ count per ARL library is down 11%. Not bad, really. The steepest drop is -68%, but overall -11% seems decent. I would have suspected about -30% or -40%, but what do I know? One thing I do know is that patrons at Ohio State love to borrow books! During this same period OSU saw a +110% increase in circulation. I didn’t believe it at first, but I visited their website to double-check, and indeed it is true. So again, why? They have been building a new library for years, but I would think that would hurt circulation. Are they circulating more DVDs? More reserves? Why are they bucking the overall trend? That is a substantial increase!
Other Circulation Outliers:
- Columbia (+66%)
- Harvard (+59%)
- Southern Illinois (+52%)
- Utah (+40%)
There were a total of 32 libraries that saw an increase during this period; the other 73 saw a decline.
NOTE: Not all member libraries provided data.
Instruction
Last but not least: instruction. ARL asks for the number of groups as well as the total participants involved with instruction. My feeling is that there is some inconsistency here. I mean, do you only count classroom sessions? What about orientations? What about tours? This is very subjective. If I help four students working together on an assignment, does that count as a group, or is that four reference queries? Maybe both? Or maybe just one query? In other words, take this section with a grain of salt. I doubt all 100+ libraries are following the same counting practices.
From 1995 to 2005 there has been a significant rise in instruction, or at least in attendance. The stat for “presentation participants” went up +40%. That’s huge.
And who saw the largest increase? Toronto, with +392%. That’s right: they taught just over 6,000 students in 1996 and then reached over 30,000 in 2005. Texas gets the top honor for most participants in 2005 with 61,042, but how did Toronto surge so much in such a short time? What’s the story? Did they get new classrooms? More librarians? A change in the curriculum?
Yale was another top outlier with a +361% increase. In 2005 they had over 10,000 people participate in instruction sessions; that is pretty good for a school with just 5,300 undergraduates.
In fact, there were 22 libraries that saw triple-digit percentage increases during this 10-year period. Only 19 schools saw a decrease in instruction, while the bulk fell somewhere between +20% and +90%. Here is a quick trend line:
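If you wanted to rough out that kind of trend line yourself, here is a quick sketch that buckets ten-year percent changes into bands, which is essentially all the chart shows. Again, the values here are invented placeholders, not the real ARL numbers.

```python
# Quick sketch: bucket ten-year percent changes into bands to see the
# shape of the distribution. The values below are invented placeholders.
from collections import Counter

changes = [-35, -12, 8, 22, 38, 55, 61, 78, 85, 130, 259, 392]

def band(change):
    """Assign a percent change to one of three coarse bands."""
    if change < 0:
        return "decrease"
    if change < 100:
        return "increase under 100%"
    return "triple-digit increase"

tallies = Counter(band(c) for c in changes)
for label, n in tallies.items():
    print(f"{label}: {n} libraries")
```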
So again, that question: why did Toronto (+392%), Yale (+361%), Boston (+276%), Columbia (+259%), and Emory (+227%) pull far ahead of everyone else? Did these organizations decide to really push instruction? Did the faculty really embrace information literacy? Did they all start counting the ten-minute overviews that librarians give at new student orientation events as “instructional” sessions? Who knows? Columbia ended up in the top five of all three categories. Why? Why them instead of Cornell or Harvard or Wisconsin or UCLA?
As I mentioned, Jon and I are hacking at some data that really has nothing to do with this topic; this is my tangent. I just find it incredibly fascinating to look at the data and piece together stories. Gladwell, of course, would go three levels deeper, rationalize what happened at each place, and tell us why Columbia was so successful. I am just aiming for an entertaining winter break blog post.
Interesting stuff, Brian - and you ask the right questions: why are some up and some down? Who is counting what? When we were at OSU and the main building was closed, we heard that not as many students were going to the substitute library - so how is circulation up? Or do the numbers reflect the time between the substitute building closing and the opening of the new building? If it was artificially low the year before, it would be a skewed increase - not indicative of rising interest in the library, just a return to normal. But I ask if any of these numbers matter other than to tell us we're getting more or fewer inputs. Even with instruction, the numbers might not mean much more than - as you say - quickie sessions being counted. What good are the numbers if ARL libraries cannot provide evidence of how they are helping students achieve learning outcomes, helping faculty be more productive and more likely to know the library's resources and teach them to students, or showing how the library is a good investment of institutional resources (as the recent UIUC ROI study does indicate)? I am wondering if the next generation of ARL directors would be wise to move away from the number counting - which to some extent creates a competitive, who's-number-one mentality - and ultimately doesn't tell us if we are really successful where it counts.
Posted by: stevenb | December 18, 2008 at 05:40 PM
@Bell: Thanks for the comment. I thought about looking for a correlation between LibQUAL+ data and the outliers, but ARL gets mad when I talk about other people’s results. OSU has had a dip in circ numbers the last few years, but has seen a steady rise since 1995. Leslie suggested that their delivery options might have a positive impact. Why bother going to the library to search for a book when you can just request it for pick-up? As for inputs vs. outcomes: yes, we’re always on the same page there. The data was available, so I decided to look at it; really just a fun diversion, but of course you know I love the ranked lists.
Posted by: brianmathews | December 19, 2008 at 04:37 AM
Yes indeed, it's important to ask the right questions. As ARL Survey Coordinator at my institution, I can tell you that we struggle with interpreting ARL question definitions, especially when a definition no longer meets our management needs. Here's an example. Several years ago our reference and instruction department began a research consultation service. This service has become increasingly successful: in just two years research consultations increased by 611% (by comparison, our group instruction sessions increased 89%). So how to report "research consultations" to ARL? On the local level we consider them part of our instruction program: a student meets with the librarian in his or her office, not at the reference desk, and most consultations are by appointment. The ARL definition, however, considers one-on-one help sessions as reference questions, even if they take 30-45 minutes, as most of ours do. On the other hand, a similar consultation with two students would be considered "instruction." So yes, a lot depends on how the reporting institution interprets (or takes liberty with) the definition. We discuss these things at survey coordinators' meetings, so ARL is aware of local differences in interpretation. So continue asking "why" when looking at outliers, and go one step further: take the time to ask an outlier institution "how do you count x or y?" as well as "how come?"
Great post, BTW. Our campus bookstore has Gladwell's book on sale and I'm on my way to buy a copy.
Posted by: Sharyn Ladner | December 19, 2008 at 08:58 AM
Very impressive increases, indeed! Are you going to be contacting any of the librarians at the libraries you mention to see if they can shed some light on what they have done that's led to these successes? I also wonder, if Toronto has a +392% increase in instruction participants, how they're actually coping with this. Have they increased the number of staff to be able to cater to so many more students? Are they using lots of different methods to reach their clientele?
Thanks for this post - have just ordered The Outliers for the collection here at MPOW.
Posted by: CW | December 21, 2008 at 04:47 PM