Thursdays at 12 noon were the only available times in December. Agreement: lunch is not a great time to meet, and we should find a new time/day.
Decision: We will record our meetings.
Decision: We affirm that we will abide by the SILS work practices/responsibilities and team charter.
@Ellen Augustiniak will carry out two additional scheduling polls (one for December, then one for January/February)
2. UCL/OP annual stats: Background, current uses, current process (Danielle, 5m)
Purpose: Further context, including OP (Office of the President) needs and interests; develop shared understanding and identify outstanding questions (any questions that can't be answered in the room will be answered in writing afterwards).
Currently, what we collect, how it is used, and by whom is largely opaque; we can resolve this through our work and documentation.
The UC Libraries have been collecting systemwide data for 75+ years; CDL responsible for receiving data from campuses and compiling since 1997/8; prior to CDL, another UCOP office (systemwide library planning) collected this data - likely why we still refer to the activity as “UCOP statistics”
While CDL collects the data, our data requirements and instructions (how we define the ways in which we meet requirements) are defined by the libraries (collective “us”) and the Council of University Librarians (CoUL) is the ultimate decision-maker
Previously, our UCL/OP data requirements tracked closely to ARL statistics; at some point (dww believes around 2010?) the ARL stats changed, and we decided to keep some of what ARL dropped; but that was a shared library decision.
For 21/22 data collection, CoUL endorsed the removal of schedule B; this change came about through campus-based advocacy (kudos to Sarah Lindsey of UCSC, who spearheaded this) and was reviewed by all campus library leadership teams before the decision was endorsed
UCOP has long used some of the data that we collect annually
The Liability and Property Programs Office sends Schedule D data to our insurance provider
The Budget Office includes aggregated data from several schedules in the UC Budget for Current Operations, in the Academic Support-Libraries chapter (this year, ILL-related data wasn't shared, as reporting of that data has been delayed to December). Not all data in these schedules is shared (e.g., Electronic Reference Sources, quantity and searches, remains a data point only collected for the libraries). The budget report is public; we've heard it has also been cited to the Regents and to the Governor's office before.
Changes to the data shared with the Budget and Liability offices are possible - we just need to talk with them, socialize the change, and understand the impact.
I’ve received requests from campus library leadership to summarize/report systemwide and campus specific data; with expanded capacity on data analysis, this service can be improved now.
ARL and ACRL are currently revising their data instruments; changes to their data submission requirements are likely 1-2 years away (we’ve had two university librarians involved in this work: UCR’s Steve Mandeville-Gamble and UCSD’s Erik Mitchell). When CoUL voted to endorse this project team, it also endorsed the decision that the UC Libraries should wait to review and revise the UCL/OP annual statistics until after the new ARL and ACRL data requirements are released, assuming that the UC Libraries will want to mirror those data requirements to the extent possible. This determination does not restrict this project team from making recommendations on how to best harmonize and streamline UC annual statistics, or documenting when existing data points are no longer supported/viable in the SILS environment.
Questions:
The RLFs provide stats and Shared Print collects statistics; what about those data collections? Are they related to this exercise? (We should consult with our RLF and SP colleagues; the RLFs have historically submitted ILL/intercampus data as part of this process; there is an outstanding question of how campuses should report their RLF holdings moving forward; there is one SP data point included in our systemwide annual stats; SP also manages its own more detailed stats collection - that collection is ultimately outside of our scope.)
Action: ALL - If folks have any additional questions about the background, context, or present processes for reporting data, they should post those questions to the listserv and/or Slack.
3. Current process + top pain points (Campus members, 20m)
Purpose: Increase awareness of different options.
UCB: First year using Alma Analytics; data still needs to be cleaned up (material type includes inaccuracies); using historical data as a comparator; newly added records (since implementation), including type, are good. Moving forward, for collection type counts, want to rely fully on Alma. For certain types (microfiche), many are uncatalogued at the item level (estimates). UCB has long been reporting estimates, as opposed to ILS-generated data.
UCD: Utilized Analytics (have several years' experience); reporting responsibility has been shared - will report back with more detail.
UCI: Main + Law report their data separately (both fill out schedules). Using Alma Analytics. Lack of confidence in material type is shared; year-to-year changes in reporting and how we’re defining various data requirements (across time, some differences in what’s reported).
UCLA: Accustomed to using Voyager to export data; used the June workshop to learn Alma. Lack of communication around whether CDL would pull the data for this year; then understanding data in Alma; problems with material types - determining what has been a manual count and carrying that forward; the UCLA Alma Analytics study group discussed this - John presented to them, including observations of this process.
UCM: Responsibility shared across teams. With WMS, used external spreadsheets. With switch to Alma, this has changed. Analytics has been great, but still trying to understand it. Based on UC webinar, using information shared to develop the right practice. Every year, have to re-reflect/consider how we’re defining everything.
UCR: Realized locally that we needed to change what we were doing. Used Analytics/dashboards. Tried to carry out this work in a shared space with limited edit/delete rights - frustrating not to have full capacity. Dates were a pain (date prompt across the dashboard). Serials currently received was problematic - reported what I had. Matching material types to categories was imperfect (creating a crosswalk). What is an electronic reference resource? A lot of change required new workflows.
UCSF: Definitions have been tricky - we have local documentation to keep track of how we've defined things previously. Once you migrate, the confidence level decreases; new functionality is possible. The "why" of what we're doing: risk assessment data that we share systemwide; locally, we're now also having to report electronic resources more comprehensively (greater guidance would be helpful).
UCSC: Responsibility divided across a team; advanced search gave us strange results, so Lisa was brought on last year to combine advanced search, Analytics, and scripting. Legacy data continues to be a pain point, including one item record for a drawer of materials, etc. This year, used Analytics for the full data submission; limitations with the Network Zone and questions about how it correlates to the bib.
UCSB: The reporting process is fragmented, with different departments responsible for specific statistics. The Music Library and Special Collections have their own workflows for reporting statistics in Schedules A and D; their data is usually maintained in Excel. An effort is needed to understand and harmonize data reporting practices across departments. Alma Analytics is largely utilized for general library statistics reporting. Starting last year, the Analytics Coordinator has been given responsibility for spearheading the entire statistics reporting process, which has led to efficient communication and timely reporting. Going forward, UCSB intends to bring the majority of statistics reporting workflows within the realm of Alma Analytics, which will require clarity of data definitions and collaboration with our partners in the form of shared reports, etc.
Outstanding: UCSD
@John Riemer will share his UCLA presentation, re: their local process to utilize Alma Analytics for 21/22 reporting.
4. Priorities for next meeting's discussion & further work are developed (Ellen, All; 15m)
Let's establish clarity. Potential goals for the final plan:
Meet the tight deadline.
Distribute to subgroups so that work can happen in parallel?
Clarity on campus & CDL needs from deliverables.
Alignment from committee members on realistic approaches to fulfilling the charges.
Agreement: We will spend December establishing shared agreement on how we're fulfilling this charge and what our final deliverables are.
Impact of material type issues: Can we all put our material types on the table (looking at the data we have) and consider how we might harmonize (including issues with doing so, potential actions/needs that might stem from it)?
Agreement: For those data points that are not viable or meaningful, this group could make a recommendation to stop counting/reporting (similar to how schedule B was dropped for 21/22 - it was a recommendation put forward to CoUL; we can do that again). But we are not charged to revamp what data we collect.
Do we want to compile how we all currently fulfill the UCL/OP data requirements for statistics, to see our differences/similarities? Or do we establish a central report now, and then seek to validate/change it?
Agreement: First we want to look at what we are all doing (how we've designed our reports - what data points, from which tables, we're using); where do we have commonalities in how we define/report? What are our differences?
How can we flag the gymnastics and workarounds? How can we assess whether we should stop doing the gymnastics? One outcome is doing things more simply.
How might we format and compare our compiled data? If we have a shared form/template where we cleanly compile our data (tidy data - saves us from parsing SQL), how do we organize it?
Regarding material type, could use historical reporting to identify migration issues and seek changes. We will also need to consult with SSCP, potentially local cataloguing teams, regarding current practices.
Hand counting - how can we move away from that? (We are empowered to make this recommendation - our first principle, assigned by CoUL and DOC, is: We seek to reduce unnecessary manual and duplicative work.)
How can we define what is good enough? This is about order of magnitude.
This is also about changing what people expect from statistics; how do we change perceptions and get folks more open to order of magnitude reporting (how do we set expectations)?
Shared reporting for Statistics is a goal; we haven’t yet talked about how this might be a shared service and what that shared service might look like (done centrally, and then connected to systemwide group)?
December is for work plan development.
Any recommendation to stop counting an element is in scope.
We need to determine how to collect and compare data about our current processes.
Shared reporting for Statistics is a goal; we’re charged to “recommend systemwide analytics harmonization policy, including a centralized reporting and analytics service.”
Action: All - review the work plan and consider how we should categorize and organize our work components.
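The shared tidy-data template discussed above could take a very simple shape: one row per campus per data point, with a column recording where the number came from (Alma Analytics vs. manual count), so campus reports can be compared without parsing each other's SQL. A minimal sketch in Python follows; all column names, campuses, and values here are hypothetical illustrations, not agreed definitions:

```python
import csv
import io

# Hypothetical tidy layout: one row = one campus's value for one data point.
FIELDS = ["campus", "schedule", "data_point", "value", "source", "notes"]

rows = [
    {"campus": "UCB", "schedule": "A", "data_point": "print_volumes",
     "value": "1200000", "source": "Alma Analytics", "notes": "microfiche estimated"},
    {"campus": "UCR", "schedule": "A", "data_point": "print_volumes",
     "value": "950000", "source": "Alma Analytics", "notes": ""},
]

# Writing the template out as CSV so any campus can fill it in.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)

def values_for(data_point, rows):
    """Collect each campus's reported value for one data point."""
    return {r["campus"]: r["value"] for r in rows if r["data_point"] == data_point}

print(values_for("print_volumes", rows))
```

With a layout like this, spotting commonalities and differences (the agreement above) becomes a filter over rows rather than a comparison of report designs.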
5. Wrap up (5m)
Review actions and decisions.
6. Parking Lot
Capture important topics for future discussion.
The SILS mission is to transform library services and operations through innovation and collaboration. The future is shared!
Question? Contact AskSILS-L@ucop.edu